Hello, I am running a csv_extractor that loads data into RAW (visible in Raw Explorer). I run the extractor locally on my machine through cmd. How do I set up an extraction pipeline to monitor this extractor? I created an extraction pipeline from “Monitor extractor”, but I am not sure how to link it to my extractor. Regards, Diana
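For context, once an extraction pipeline exists, an extractor (or any script) can report run statuses against it through the Extraction Pipeline Runs API. A minimal sketch, where the pipeline external ID `my-csv-extractor` and all credentials are placeholders, not values from the post:

```python
def build_run_payload(pipeline_external_id: str, status: str, message: str = "") -> dict:
    """Build the request body for reporting an extraction-pipeline run.

    `status` is one of "success", "failure", or "seen".
    """
    item = {"externalId": pipeline_external_id, "status": status}
    if message:
        item["message"] = message
    return {"items": [item]}


def report_run(base_url: str, project: str, token: str, payload: dict) -> None:
    """POST the run to the Extraction Pipeline Runs endpoint (needs a valid token)."""
    import requests  # third-party; imported lazily so the builder above stays stdlib-only

    resp = requests.post(
        f"{base_url}/api/v1/projects/{project}/extpipes/runs",
        headers={"Authorization": f"Bearer {token}"},
        json=payload,
    )
    resp.raise_for_status()
```

After a run of the extractor, calling `report_run(...)` with a payload from `build_run_payload("my-csv-extractor", "success")` would make the run show up under the pipeline in CDF; the extractor config itself may also support referencing the pipeline directly, depending on the extractor version.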
What is “YOUR_SECRET_API_KEY”? How do I obtain this secret key? Please advise.
I am new to learning how to utilize Cognite Data Fusion. I apologize if this question has already been answered; if it has, please point me to the answer! Currently, I plan to use Cognite to establish a connection with a digital twin and use the Python SDK to import and extract data using custom-built models. However, I have a question regarding connecting the SDK to a private project. In the case of Open Industrial Data, we were provided with a Tenant ID, Client ID, and Base URL. Hence, I would like to ask whether there are any specific steps or requirements to connect to a private-access project using the Python SDK. I would greatly appreciate any assistance you can provide. Thank you!
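For reference, a private project is typically reached with OIDC client credentials from your own tenant, analogous to the Tenant ID / Client ID / Base URL trio used for Open Industrial Data. A minimal sketch, assuming a recent cognite-sdk version and an Azure AD tenant; every name below (tenant, cluster, project, app name) is a placeholder:

```python
def token_url(tenant_id: str) -> str:
    # Azure AD (Entra ID) v2.0 token endpoint for the tenant
    return f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"


def make_client(tenant_id: str, client_id: str, client_secret: str,
                cluster: str, project: str):
    """Build a CogniteClient for a private project using client credentials."""
    # Imported lazily so the helper above can be tested without cognite-sdk installed.
    from cognite.client import ClientConfig, CogniteClient
    from cognite.client.credentials import OAuthClientCredentials

    base_url = f"https://{cluster}.cognitedata.com"
    creds = OAuthClientCredentials(
        token_url=token_url(tenant_id),
        client_id=client_id,
        client_secret=client_secret,
        scopes=[f"{base_url}/.default"],
    )
    return CogniteClient(
        ClientConfig(client_name="my-app", project=project,
                     base_url=base_url, credentials=creds)
    )
```

The Tenant ID determines the token URL, the cluster determines the base URL and scope, and the project name selects the CDF project, so the same three pieces of information from the Open Industrial Data example carry over.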
Hi, I’m going through the Python SDK - CDF Transformation module in the DATA ENGINEER BASICS - TRANSFORM AND CONTEXTUALIZE course. While I’m able to create the Transformation object, I’m unable to run it due to some authentication issues. When I try to run it using the code below:

client.transformations.run(asset_transformation.id, wait=False)

I get the following error:

Transformation job could not be created.
Error code: 403
API error: Invalid source/destination credentials: Could not authenticate with the OIDC credentials. Please check your credentials.
Request ID: 1a470c41-9001-989b-8f97-a2aaefdfd098

Is it due to some recent change in the OIDC credentials defined in the notebook?
Incremental load version should be LongType or TimestampType. | code: 400 | X-Request-ID: XXXXXX
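For anyone hitting this error: the incremental-load mechanism in CDF transformations requires the version column used for change tracking to be a long or a timestamp, so a string date column must be cast first in the transformation SQL (for example to a timestamp, or to epoch milliseconds as a long). A small sketch of the long-value conversion in Python, assuming UTC date strings in a hypothetical format:

```python
from datetime import datetime, timezone


def to_epoch_ms(date_string: str, fmt: str = "%Y-%m-%d %H:%M:%S") -> int:
    """Convert a UTC date string to epoch milliseconds (a LongType-compatible value)."""
    dt = datetime.strptime(date_string, fmt).replace(tzinfo=timezone.utc)
    return int(dt.timestamp() * 1000)


print(to_epoch_ms("2023-03-02 05:51:00"))  # 1677736260000
```

The same idea applies inside the transformation itself: casting the version column to a long or timestamp before it is used for incremental load avoids the 400 error.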
Hi: I’m doing the DATA ENGINEER BASICS - INTEGRATE course. Trying to execute Notebook 1, I cannot get past the cell that handles authentication with Azure, namely:

creds = authenticate_azure()

Any help? P.S. I’m using Google Colab.
Hi team, please help with the current setup or process for using Azure DevOps to deploy a Cognite (CDF) Function (API) to a CDF project.
I’m trying to transform some time-series data (manufacturing pump pressures) into an asset hierarchy / numeric data. The SQL is not writing to the timestamp column, and I’m not seeing errors in Preview. I’ve tried MANY different SQL commands to convert the DateTimeStamp string column to a timestamp column/datatype, and none have worked. CSV files and the query are attached if anyone wants to try it. Help is greatly appreciated! Here is some of my data, edited in Excel, saved as CSV, and then imported into CDF RAW:

TagID           | Tagname         | Value    | DateTimeStamp
FCE1-HYD-PMP-3  | FCE1-HYD-PMP-3  | NULL     | 3/2/2023 3:41
FCE1-HYD-PMP-3  | FCE1-HYD-PMP-3  | 28.54443 | 3/2/2023 5:51
FCE1-HYD-PMP-3  | FCE1-HYD-PMP-3  | 35.68053 | 3/2/2023 5:51
FCE1-HYD-PMP-3  | FCE1-HYD-PMP-3  | 42.81664 | 3/2/2023 5:51
FCE1-HYD-PMP-3  | FCE1-HYD-PMP-3  | 49.95274 | 3/2/2023 5:51
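One thing worth noting: Spark (which CDF transformations run on) returns NULL from `to_timestamp` when the datetime pattern doesn’t match the data, and Preview does not flag that as an error, which matches the silent behavior described above. For strings like “3/2/2023 5:51”, a pattern with non-padded fields along the lines of 'M/d/yyyy H:mm' would be needed. The equivalent parse in plain Python, to sanity-check the format against the data:

```python
from datetime import datetime


def parse_stamp(s: str) -> datetime:
    # %m, %d and %H accept non-zero-padded values, matching strings like "3/2/2023 5:51"
    return datetime.strptime(s, "%m/%d/%Y %H:%M")


print(parse_stamp("3/2/2023 3:41"))  # 2023-03-02 03:41:00
```

If the Python parse succeeds but the transformation still writes NULLs, the mismatch is most likely in the Spark-side pattern rather than the data itself.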
I have a transformation that is to write to a new dataset, as seen here:

SELECT
  concat('CWS:', loc) AS externalId,
  IF(parent_loc = '' OR parent_loc IS NULL, '', concat('CWS:', parent_loc)) AS parentExternalId,
  CAST(lastUpdatedTime AS STRING) AS name,
  to_metadata(*) AS metadata,
  description AS description,
  7089382776719091 AS dataSetId
FROM `CWS-Assets-DB`.`CWS-Assets-Tbl`

The Preview seems to work fine. When I RUN it, I get the following error:

Request with id 96f4a082-9dde-9b41-8065-0b3cf0923197 to https://az-eastus-1.cognitedata.com/api/v1/projects/ra-istc-sandbox/assets/byids failed with status 403: Resource not found. This may also be due to insufficient access rights.

The permissions on the dataset for my user group are: raw:read, raw:write, raw:list, datamodels:read, datamodels:write, datamodelinstances:read, datamodelinstances:write, datasets:read, datasets:write, datasets:owner, timeseries:read, timeseries:write, files:read, files:write, events:read, events:write, sequences:read, sequences:write, 3d:read, 3d:create, 3d:update, 3d:delete, transfo
I am listing time series for a given asset and get a lot of results. I need to filter based on the time series external ID, but the only option is the “external_id_prefix” argument to the list function, so I can only build up the prefix left to right. Somewhere in the external ID is a parameter that I do not care about, and after it comes a parameter that I want a particular value of. Concrete example of external IDs:

IAA_Miros_Weather_Data_WIA_008
IAA_Miros_Weather_Data_WIB_008
IAA_Miros_Weather_Data_WIC_008
IAA_Miros_Weather_Data_WID_008
IAA_Miros_Weather_Data_WIE_008

I am interested in only getting time series with external IDs that contain “_WI” and end with “008”. Is there a way to list time series with a wildcard? Something like this:

client.time_series.list(
    asset_ids=[my_asset_id],
    limit=None,
    external_id_prefix="IAA_Miros_Weather_Data_WI*_008",
    partitions=4,
)

I could of course obtain the relevant time series by filtering after the fact. Something like this: result = cl
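Filtering after the fact with a regular expression is one way to express the wildcard locally. A minimal sketch with plain strings standing in for the listed time series (the IDs are taken from the example above, with two extra non-matching ones added for illustration):

```python
import re

external_ids = [
    "IAA_Miros_Weather_Data_WIA_008",
    "IAA_Miros_Weather_Data_WIB_008",
    "IAA_Miros_Weather_Data_WIC_007",  # wrong suffix: filtered out
    "IAA_Miros_Weather_Data_XYZ_008",  # no "_WI": filtered out
]

# Contains "_WI" somewhere and ends with "008"
pattern = re.compile(r"_WI.*008$")

matches = [xid for xid in external_ids if pattern.search(xid)]
print(matches)  # ['IAA_Miros_Weather_Data_WIA_008', 'IAA_Miros_Weather_Data_WIB_008']
```

With real time series objects, the list comprehension would test `pattern.search(ts.external_id)` on the results of `client.time_series.list(...)` instead; the server-side prefix filter can still narrow the candidate set before the local regex is applied.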
Hi team, I am trying to deploy the code into a new environment and am facing an issue because of black --check.
In the CDF docs I see there are PI and OPC UA connectors. Many Industry 4.0 community members see MQTT rather than OPC UA as the future of industrial IoT cloud-based communication, as it is lightweight, report-by-exception, client-driven, etc. If I want to onboard a facility that has an IoT MQTT gateway, how does Cognite recommend we integrate? I have used Azure IoT Hub and Azure Functions to connect and transform the data in the past. Do you have a reference architecture for this? Can we connect to the gateway directly with a CDF connector, or do you recommend using Azure IoT Hub and Functions (or similar on GCP/AWS)? Also, Azure has some neat tools like IoT Hub Device Provisioning Service; it would be nice to know your POV on if/how to utilize those. Thanks!