This discussion is dedicated to helping learners of the Cognite Data Fusion Fundamentals learning path succeed. If you’re struggling with the exercises in this learning path, try the tips & tricks below or post a comment with the challenge you’re facing. You can also post your own tips and respond to fellow learners’ questions. Cognite Academy’s instructors are also here to help.
I entered cognite-learn as the domain name, but after entering it I get the error: “An error occurred when looking up the details for this organization”.
Hi @Sanyogita Wable! I just checked and it works for me. Maybe you, @Dilini Fernando, can take a look as well?
Hi @Sanyogita Wable ,
I've also tested this, and it appears to be functioning correctly.
Could you please retry in an incognito window to ensure that any potential browser cache-related issues are ruled out?
Regards,
Kanchana
I’ve uploaded assets.csv into a data set and am trying to run the SQL script to create the assets, but I am getting the error: Request with id 87b97f14-7c9b-94b9-9e1f-129e2ed18250 to https://api.cognitedata.com/api/v1/projects/cdf-fundamentals/assets failed with status 400: Reference to unknown parent with externalId Salma:23-PT-92535.
The query runs fine in Preview mode but fails when I actually run it. The funny thing is that it processes a different number of rows each time it runs.
Here’s the query I’m using:
SELECT
concat('Salma:', loc) AS externalId,
IF(parent_loc = '' OR parent_loc IS NULL, '', concat('Salma:', parent_loc)) AS parentExternalId,
CAST(lastUpdatedTime AS STRING) AS name,
to_metadata(*) AS metadata,
description AS description,
6295592107437051 AS dataSetId
FROM `Salma`.assets
Any ideas what’s going wrong?
Hi @Salma Ghafoor
It seems to be due to the configuration of the transformation. The Target resource type needs to be “Asset Hierarchy” instead of “Assets”. The Assets target resource type does not recognize the parent external id attribute. You can change it from the transformations UI by clicking on the dropdown next to “Create or Update Assets”
See the pictures below.
Thanks Sofie! That did the trick
Hi,
I am going through the Working With CDF: Integrate module and have transformed the data. It looks like the 36 assets have been created, based on this output:
But I cannot see the assets in the Data Catalogue:
Do you know what’s wrong?
Actually, I found the mistake thanks to a previous post.
The transformation was wrong in the first place: dataSetId was supposed to be the actual numeric ID that I recorded earlier :D
I’m glad you figured it out, @Hannah Håland! Let us know if there is anything else we can help you with!
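For future reference, if you ever need to look up a data set’s numeric ID again (the same number you noted down from the UI earlier), here is a minimal sketch using the Cognite Python SDK. It assumes your project and client credentials are already configured; it just prints each data set’s numeric ID next to its name so you can copy the right number into the transformation’s dataSetId column:

from cognite.client import CogniteClient

client = CogniteClient()  # assumes project and credentials are already configured

# Print the numeric id next to each data set name so you can copy
# the right number into the transformation's dataSetId column.
for ds in client.data_sets.list(limit=None):
    print(ds.id, ds.name)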
Hi,
I am going through the module Working With CDF: Contextualize, and I have loaded the P&ID. Now I am trying to link it to my Aveva assets, but I cannot find my dataset in the Data set drop-down menu.
I have checked in the Data Catalog and the data set does exist in CDF.
How do I fix this?
Hi,
Yet again, I fixed my own problem!
I closed the browser, signed in again, and my dataset magically appeared :)
Thank you for sharing your solutions with the community, @Hannah Håland. This is incredibly valuable for other learners of Cognite Academy. Good luck with gaining your CDF Fundamentals certificate!
Hi!
I am taking the Fundamentals course and I am working on the exercise “Working With CDF: Contextualize”. The challenge I face is in the part “Contextualize P&IDs”.
My dataset with my assets does not show up in the drop-down. What mistake could I have made? (I have done my best to follow the training material.)
Best regards,
Bjarte
Hi @Bjarte Håtveit
I experienced the same myself this morning and I will have to talk to the engineering team about it.
I’m having issues with the transformation of the RAW data provided. Running this SQL script:
SELECT concat('Brad1971:',loc) as externalId, IF(parent_loc='' OR parent_loc IS NULL, '', concat('Brad1971:',parent_loc)) AS parentExternalId,
CAST(lastUpdatedTime AS STRING) AS name,
to_metadata(*) AS metadata,
description AS description,
2147974692941503 AS dataSetId
FROM `Brad1971`.assets
The preview is successful, but when I execute the query, it gives an error:
I’ve tried it multiple times, deleted the table to start over, and cleared the RAW table as well as the target table. No luck. Reviewing the RAW data table, I don’t see a missing or bad ID; I could trace the parent IDs all the way up the tree. Please advise.
Sorry... I had selected Asset and not Asset Hierarchy... all good.
Hi all, I am just running through the Working With CDF: Integrate course and am facing a problem with inserting the data points into the time series.
I am following all the steps and at the end I should be able to see data points in the IFSDB time series data set. However, there are no data points in any of the time series.
The run states 4,296 rows read and 15,428 timeseries read, so there should be data in IFSDB.values (the table referenced in the SQL query).
I also deleted the transformation once and did it all again, and changed the time interval to 5 years in case the data is older. Still, I can see no data.
Does anyone have an idea I can try?
FYI, here is the SQL query from the training:
SELECT
concat('FirstnameBirthyear:',dp.sensor) AS externalId,
cast(time_stamp/1000 as timestamp) AS timestamp,
cast(value AS double)
FROM IFSDB.values AS dp,
--selecting from _cdf.timeseries means we select from the timeseries we ingested to CDF earlier. We do this to make sure all the time series we try to add data points to actually exist
_cdf.timeseries AS ts WHERE CONCAT('FirstnameBirthyear:',dp.sensor) = ts.externalId
I found the problem. I accidentally deleted the “:” when creating the externalId for the time series, so the data points could not be matched. When I put the “:” back into the SQL, it worked!
@Charlotte Waese good that you figured it out!
Hi.
I have managed to create a set of time series that have the correct external ID prefix (Leo1987*) but are not linked to the data set, only to assets. I can browse them in Data Explorer but cannot delete them. At the same time, I have another set of time series linked to the data set Leo1987-IFSDB but not linked to the actual assets. Can someone please help me fix this?
Hi @Leonid Kuritsin
Can you please let me know how you are creating the time series? Are you using the API or the SDK?
You can delete the timeseries using the API/SDK. Please find the relevant documentation below.
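For example, here is a minimal sketch using the Python SDK, assuming your project and client credentials are configured. It uses Leo1987 as the external ID prefix and keeps only the time series that are not linked to any data set, which should match the stray ones you described:

from cognite.client import CogniteClient

client = CogniteClient()  # assumes project and credentials are already configured

# List time series whose external IDs start with the prefix, and keep only
# the ones that are not linked to any data set (the stray ones).
stray = [
    ts
    for ts in client.time_series.list(external_id_prefix="Leo1987", limit=None)
    if ts.data_set_id is None
]

# Delete them by internal id.
client.time_series.delete(id=[ts.id for ts in stray])

Double-check the list before deleting so you don’t remove the time series that are correctly linked to Leo1987-IFSDB.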
It looks like they were created by one of the transformations when I was trying to solve the optional exercise.
I will try to delete them using the API. Thank you very much!
@Leonid Kuritsin Thank you for the update. The documentation here will help you set the dataset_id on the time series when you create them using transformations.
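Alternatively, if you would rather keep the time series you already created instead of deleting and recreating them, here is a minimal sketch for attaching them to the data set with the Python SDK. It assumes your credentials are configured, that the SDK’s TimeSeriesUpdate supports setting data_set_id this way, and that the numeric data set ID and the external ID prefix below are placeholders for your own values:

from cognite.client import CogniteClient
from cognite.client.data_classes import TimeSeriesUpdate

client = CogniteClient()  # assumes project and credentials are already configured

data_set_id = 1234567890123456  # placeholder: your data set's numeric id

# Attach the existing time series (example prefix) to the data set.
ts_list = client.time_series.list(external_id_prefix="Leo1987", limit=None)
client.time_series.update(
    [TimeSeriesUpdate(id=ts.id).data_set_id.set(data_set_id) for ts in ts_list]
)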
Please let me know if you need further assistance.
I am going through the transformation part. I have run it a few times but still no luck. Initially, the error said that ‘parent_loc’ didn’t exist. When I looked at the uploaded assets.csv file in the RAW area, the column showed as ‘parent_loc,,’, so I deleted the table and re-uploaded the data. But it still shows ‘parent_loc,,’, so I just updated the SQL code to match the column name, and now I am getting another error. I’m not sure what’s wrong; can someone help? Much appreciated.
Hi @ashyassi! What you can do to get past this blocker is to read from another table.
If you do “FROM IFSDB.assets” instead of your own table, you will be reading from a table that doesn’t have the extra commas at the end. The issue with the “fix” you tried is that the transformation will not find any matches for the parent external ID: the external ID doesn’t have the extra commas, but the parent external ID column does, so they will never match.
Hello guys, I am not able to start the hands-on part of the CDF Fundamentals course because I am not able to upload the CSV file to create the first table. I drop the CSV file there, but it simply does not appear and nothing happens. I don’t know how to continue the course without being able to upload the data.