We (HUB Ocean) have refactored the metadata schema of the data files stored in CDF and are considering reingesting the datasets based on the new capability. We plan to purge everything in the production environment for a clean start. I am wondering whether there is an admin role at Cognite who can perform a full cleanup of a tenant's environment, in this case HUB Ocean's oceandata environment. Or does Cognite have a script to do that? It would speed up our reingestion considerably. It would be great to have a chat before the purge is carried out. /Pinghua
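As far as I know there is no single admin call that wipes a whole project, so a purge usually means deleting each resource type through the API yourself. A rough, illustrative sketch with the Python SDK follows; the choice of resource types is an assumption about what the project contains, the `client` is an already-authenticated `CogniteClient`, and the SDK calls only live inside `purge()`, which is deliberately never invoked here:

```python
def chunked(ids, size=1000):
    """Split a list of ids into API-friendly batches."""
    return [ids[i : i + size] for i in range(0, len(ids), size)]


def purge(client):
    """Delete the main classic resource types in a project (illustrative only).

    `client` is an authenticated cognite.client.CogniteClient. Which resource
    types need clearing depends on what the project actually contains.
    """
    # Root assets: a recursive delete removes each whole hierarchy at once.
    roots = client.assets.list(root=True, limit=None)
    client.assets.delete(id=[a.id for a in roots], recursive=True)

    # Time series, events and files are deleted by id, in batches.
    for api in (client.time_series, client.events, client.files):
        ids = [r.id for r in api.list(limit=None)]
        for batch in chunked(ids):
            api.delete(id=batch)

    # RAW databases can be dropped together with their tables.
    for db in client.raw.databases.list(limit=None):
        client.raw.databases.delete(name=db.name, recursive=True)


# Sanity check of the batching helper (no CDF connection needed):
print(chunked(list(range(5)), size=2))  # [[0, 1], [2, 3], [4]]
```

Run against production only after a backup, and ideally after that chat with Cognite.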
I'm trying to create assets as described in the "## 2. Create The Asset Hierarchy" section of the hands-on tasks, but I'm getting an error, even though I have checked the solution and mine is exactly the same as the one you provided.

The task says: For each geographical region, create a corresponding CDF asset that is under the "global" root asset and is associated with the "world_info" data set.

My solution:

list = []
for region in df['region'].unique():
    asset = Asset(name=region, parent_id=14499569942375, data_set_id=621288550636820)
    list.append(asset)
client.assets.create(list)

The error message I'm getting when running the above code:

ValueError Traceback (most recent call last)
Cell In[100], line 6
  3 asset = Asset(name=region, parent_id=14499569942375, data_set_id=621288550636820)
  4 list.append(asset)
----> 6 client.assets.create(list)
File ~/workspace/using-cognite-python-sdk/.venv/lib/python3.10/site-packages/cognite/client/_api/assets.py:432, in AssetsAPI.create(self, asset) 402 …
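The pasted traceback is cut off before the actual ValueError message, so the root cause isn't visible from the post. Two things worth checking are whether `df['region'].unique()` contains NaN or empty values (the API rejects assets without a valid name) and whether `list` was rebound to something unexpected earlier in the notebook, since it shadows the built-in. A sketch of a defensive version, using plain dicts so it runs without pandas or the SDK (the sample region values are made up; the ids are the ones from the post):

```python
# Hypothetical stand-in for df['region'], including a missing value.
regions = ["Europe", "Asia", "Europe", None, "Africa"]

# De-duplicate and drop missing names before building assets.
unique_regions = sorted({r for r in regions if r})

# Plain dicts mirroring the Asset(...) keyword arguments from the post.
assets = [
    {"name": r, "parent_id": 14499569942375, "data_set_id": 621288550636820}
    for r in unique_regions
]
print([a["name"] for a in assets])  # ['Africa', 'Asia', 'Europe']

# With the SDK, avoiding the shadowed builtin, this would be roughly:
#   to_create = [Asset(**spec) for spec in assets]
#   client.assets.create(to_create)
```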
I am learning GraphQL queries in Cognite by following the documentation (About data modeling | Cognite Documentation). How can I add an additional filter condition based on the name value of the functionalLocationParent object?

query {
  listEquipment(filter: {and: [{catalogProfileDescription: {eq: "Compressors"}}]}) {
    items {
      externalId
      name
      description
      catalogProfileDescription
      functionalLocationParent {
        name
        description
      }
      cfihosEquipmentClass {
        name
        description
      }
    }
  }
}
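To the best of my knowledge, the generated filter for a direct relation such as functionalLocationParent matches on the related node's identifier rather than on arbitrary nested properties, so filtering by the parent's name directly may not be supported in the list filter. If you know the parent's externalId, a query along these lines should work (the value "FL-1000" is made up; verify the filter shape against your generated schema):

```graphql
query {
  listEquipment(
    filter: {and: [
      {catalogProfileDescription: {eq: "Compressors"}}
      # Assumption: direct relations filter on identifier, not nested fields.
      {functionalLocationParent: {externalId: {eq: "FL-1000"}}}
    ]}
  ) {
    items {
      externalId
      name
      functionalLocationParent { name description }
    }
  }
}
```

If nested property filters aren't available for your model version, common workarounds are to query from the parent type and traverse down to its equipment, or to filter on the parent name client-side.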
Hi team, I am trying to get details of broken files uploaded to CDF. Is there any way to query the actual file size from CDF using the Python SDK, without downloading the file? Please consider adding a feature to query file size via the Python SDK, so corrupted/broken files can be found. Thanks in advance.
The transformation is pointing to missing isString columns.
I am developing an outlier detection model that users will access through a Streamlit app. At the moment I am assessing whether deploying the model through the Streamlit page in CDF would be a good option. My issue is that I am not sure what the most efficient way to access and import the dataset into the app is. Would it be a good approach to create a data model and import the data from there, or to extract the data straight from RAW? Another idea I had was setting up a scheduled Cognite Function to download the data as a CSV and letting the app access that. I need the data to be updated a few times a week, and the "import" time in the app needs to be fairly quick for the user experience to be good. My dataset is around 1 million rows with three numeric and two non-numeric columns. For now, I have the data as a RAW table in CDF. Thanks!
How can I use machine learning models inside Cognite Data Fusion?
Hi all. I've been trying your tutorial on How to Connect CDF to Power BI, to no avail. I am not using subscriptions, since I do not have the permissions to create one right now. So I was just trying to execute via the Time Series API, or even to submit a POST via curl. But the problem is with the Power BI API's URL. Even though I try to make a POST using curl, it returns a 401 response. Can anyone give me a hand here? Do I need an admin to give me write permissions?
Hi team, I have the following questions. How does CDF handle units of measurement? Is there a concept of a storage unit system and a display unit system? E.g., if one time series dataset is in Celsius and others are in Fahrenheit and Kelvin, how does CDF store these, and does it offer functionality for conversion? If I want to build a chart with multiple time series whose data have different units, does CDF offer unit conversion on the fly so that everything can be displayed in one unit? I would like comprehensive documentation on this.
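CDF does have a units catalog, and the details of storage versus display units are best taken from the official documentation. Purely to illustrate the "convert on the fly to one display unit" idea asked about here, a minimal sketch that pivots temperature scales through Kelvin (the function names are my own, not a CDF API):

```python
# Convert between temperature scales via Kelvin as the pivot unit.
TO_KELVIN = {
    "C": lambda v: v + 273.15,
    "F": lambda v: (v - 32.0) * 5.0 / 9.0 + 273.15,
    "K": lambda v: v,
}
FROM_KELVIN = {
    "C": lambda v: v - 273.15,
    "F": lambda v: (v - 273.15) * 9.0 / 5.0 + 32.0,
    "K": lambda v: v,
}


def convert(value, unit, display_unit):
    """Convert a datapoint from its storage unit to the chosen display unit."""
    return FROM_KELVIN[display_unit](TO_KELVIN[unit](value))


# Mixed-unit series normalized to Celsius for a single chart axis:
series = [(100.0, "C"), (212.0, "F"), (373.15, "K")]
print([round(convert(v, u, "C"), 2) for v, u in series])  # [100.0, 100.0, 100.0]
```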
Hi, I was wondering where I can find information about how to set up permissions for a file extractor that reads files from only a few SharePoint sites, i.e., without giving the extractor access to all sites. I have solved it using the Graph API, but this is quite cumbersome, so I was wondering if we have documentation on how this can be set up. I'll post my solution below for clarity on how I solved the issue:

You will need to create two applications: one that is used in the file extractor, and one that grants access to the file extractor.

Register the File Extractor App:
- Go to the Azure Active Directory portal.
- Click "Azure Active Directory", then "App registrations".
- Click "New registration" to register a new app.

Grant API Permissions:
- Once your app is registered, click the app's name to go to its dashboard.
- Click "API permissions" in the left panel.
- Click "Add a permission".
- Choose "Mic…
I have one question related to 3D models. I would like to explore whether Reveal/Cognite supports programmatically changing the animation of a 3D model. What I am looking for: I have a 3D model of a pump and would like to change the animation speed depending on a speed I set from JavaScript code (basically a real-time speed that I get from my API). Is that possible? If so, could you guide me to example code showing how to do that? Thanks
I am reporting the following issue. Steps to reproduce:
1. Log in to Cognite (I used a non-prod/QA environment).
2. Go to Transform data.
3. Create a transformation by providing all required fields.
The transformation is created, but the process is very slow. Sometimes at the end of point 2 the Cognite web page becomes unresponsive. In general, degraded performance is also observed when testing groups and capabilities in Manage access. The page becomes unresponsive; please find attached a screenshot of this.
Greetings, I would like to get a view of which files uploaded to CDF through extractors have been only partially uploaded rather than completely uploaded. To do this, I wanted to write a Python script that compares the file sizes in the RAW metadata tables with the actual file sizes in CDF. The question is: is there a way to retrieve the size of a file in CDF directly through the Python SDK, without having to download the files locally? Thanks in advance
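As far as I know, the classic FileMetadata object does not carry a size field, but you can usually avoid a full download by issuing an HTTP HEAD request against the file's download link and reading the Content-Length header. A sketch of both pieces (the `remote_size` helper is my own name, the download link would come from the Files API's download-link endpoint, and the sample sizes below are made up):

```python
import urllib.request


def remote_size(download_url):
    """Read Content-Length from a HEAD request instead of downloading."""
    req = urllib.request.Request(download_url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return int(resp.headers["Content-Length"])


def find_broken(expected, actual):
    """External ids whose uploaded size differs from the RAW metadata size."""
    return sorted(
        ext_id
        for ext_id, size in expected.items()
        if actual.get(ext_id) != size
    )


# Sizes from the RAW metadata table vs sizes measured via remote_size(...)
# on each file's download link (all values here are invented):
expected = {"fileA": 1024, "fileB": 2048, "fileC": 4096}
actual = {"fileA": 1024, "fileB": 100, "fileC": 4096}
print(find_broken(expected, actual))  # ['fileB']
```

Note that whether the signed download URL answers HEAD requests depends on the underlying storage provider, so test it on one file first.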
Hi, I got an error while submitting my 3rd attempt. It gives an error when I submit my answers. Kindly help resolve this issue. Thank you, Amol
When I upload a file into the classic data model, how do I associate that file object with a Flexible Data Models (FDM) object, and how do I retrieve the file object's data using the GraphQL API from FDM?
I am looking for information related to 3D models. I would like to explore whether Reveal/Cognite supports programmatically changing the animation of 3D models uploaded to CDF. What I am looking for: I have a 3D model of a pump and would like to change the animation speed depending on a speed I get from my API (basically a real-time speed). Is that possible? If so, could you guide me to example code for how to do that? Also, the uploaded 3D model in CDF is not displayed the way the original model looked: after uploading to CDF it is displayed without colors and animations. Could you tell me why that is? Thanks
Hello! I was looking around and cleaning some data, and came upon the realization that I can't find out whether you can access Cognite resources that are not contained in a data set. That is, is there a way to find/fetch/list Cognite resources that are "floating free"? I have tried setting data_set_ids and data_set_external_ids to None, but this is the same as not having specified them.
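Since passing None just means "no filter" in the list calls, and as far as I know there is no server-side "data set is missing" filter on the classic list endpoints, one workaround is to list everything and filter client-side on the data_set_id attribute. A sketch with plain dicts standing in for SDK resource objects (with the SDK you would check `resource.data_set_id is None` the same way):

```python
def without_data_set(resources):
    """Keep only resources whose data_set_id is unset."""
    return [r for r in resources if r.get("data_set_id") is None]


# Stand-ins for what e.g. client.assets.list(limit=None) would return:
assets = [
    {"name": "pump-01", "data_set_id": 123},
    {"name": "orphan-asset", "data_set_id": None},
    {"name": "valve-07"},  # field never set at all
]
print([a["name"] for a in without_data_set(assets)])  # ['orphan-asset', 'valve-07']
```

For large projects, doing this over an iterator (`client.assets(...)`) rather than one giant list keeps memory use down.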
I have provided the required credentials in the config.yml file in the OpcUaExtractor config folder; however, when I run it I get this error. How can I fix it?
Hi team, I'm working on the OPC UA extractor. I have configured metadata-targets and metadata-mapping, but I'm not able to see the metadata along with the time series data. Please find the configuration in the attachment and a screenshot of how the time series data looks in CDF. Please let me know if any additional configuration is needed to see the metadata along with the time series data.
Hello, is it possible to test CogniteAuthError using the monkeypatch context manager? Regards, Hakim Arezki
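Yes: pytest's monkeypatch works as a context manager (`with monkeypatch.context() as m: m.setattr(...)`), so you can swap in an object whose call raises the error and assert on how your code reacts. A minimal stdlib-only sketch of the same idea, where the CogniteAuthError class is a stand-in (in a real test you would import it from cognite.client.exceptions) and the provider/login names are invented:

```python
class CogniteAuthError(Exception):
    """Stand-in for cognite.client.exceptions.CogniteAuthError."""


class FailingTokenProvider:
    """Fake credentials object that always fails, as a monkeypatched one would."""

    def token(self):
        raise CogniteAuthError("invalid client secret")


def login(provider):
    """Code under test: report whether authentication succeeded."""
    try:
        provider.token()
        return True
    except CogniteAuthError:
        return False


print(login(FailingTokenProvider()))  # False
```

In a pytest test, the equivalent would be something like `with monkeypatch.context() as m: m.setattr(my_module, "provider", FailingTokenProvider())`, then calling the code under test inside that block.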
Ensuring the security of PI credentials is critical in the Cognite PI extractor. In the context of Azure Key Vault, the requirement involves implementing robust measures to safeguard PI access credentials. This entails storing sensitive information, such as usernames and passwords, securely in Azure Key Vault, and configuring the extractor to read these credentials from the vault when needed, providing a secure and centralized method for managing and accessing PI authentication details. This approach enhances overall data security by minimizing the exposure of sensitive information and adhering to best practices for credential management in cloud environments. We have implemented this in other custom extractors; can you share the documentation for implementing it in the Cognite PI extractor?
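I can't vouch for the PI extractor's exact syntax, but several Cognite extractors document Azure Key Vault support where secrets are pulled into the YAML config with !keyvault tags. A sketch of that pattern follows; the section and field names are modeled on the other extractors' documentation and should be verified against the PI extractor docs for your version before use:

```yaml
# Assumed syntax, modeled on Azure Key Vault support in other Cognite
# extractors; verify section and field names against the PI extractor docs.
azure-keyvault:
  keyvault-name: my-keyvault
  authentication-method: client-secret
  tenant-id: ${KEYVAULT_TENANT_ID}
  client-id: ${KEYVAULT_CLIENT_ID}
  secret: ${KEYVAULT_CLIENT_SECRET}

pi:
  username: !keyvault pi-username
  password: !keyvault pi-password
```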
As per the picture below, we need to click "Create dataset" after navigating to "Use the Data Catalog". But I am unable to find this option for creating a dataset; can someone help?
When we select different date ranges, the granularity changes with the number of days selected: for 2 days, 3 min; for 3 days, 5 min. Can we get access to any documentation on how this pattern is designed?
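Charting tools typically pick the aggregation granularity so that the number of plotted points stays roughly constant regardless of the selected window. I don't have Cognite's actual rule, but here is an illustrative reconstruction that happens to reproduce the observed 2-day/3-min and 3-day/5-min behavior; the candidate list and the cap of about 1000 points are my assumptions, not documented values:

```python
# Candidate granularities in seconds: 1m, 2m, 3m, 5m, 10m, 15m, 30m, 1h.
CANDIDATES = [60, 120, 180, 300, 600, 900, 1800, 3600]


def pick_granularity(range_seconds, max_points=1000):
    """Smallest candidate granularity keeping the point count under max_points."""
    for g in CANDIDATES:
        if range_seconds / g <= max_points:
            return g
    return CANDIDATES[-1]


day = 86_400
print(pick_granularity(2 * day) // 60, "min")  # 3 min
print(pick_granularity(3 * day) // 60, "min")  # 5 min
```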
Is there a project available that we can use to review the function examples from the Cognite Functions training?