Great article, Pierre. Pierre also opened a product idea around that topic; if you are interested, please upvote it.
Thank you for the feedback Sunil
In general, we would also like to be able to create our own custom SQL functions as well.
Similar to the one for datasets, I guess: https://docs.cognite.com/cdf/integration/guides/transformation/write_sql_queries/#asset_ids
+1, or at least extend the function https://docs.cognite.com/cdf/integration/guides/transformation/write_sql_queries/#asset_ids to be able to use externalId as well. Using names can lead to unexpected results, since they are not unique.
Have you set up the secret in your local environment? "Make sure you've created an environment variable CLIENT_SECRET with the value of the client secret obtained from admin." There are multiple ways to do that: you can use an .env file (you can get more information here), or you can set the client secret using getpass (see more info here). The lines of code for that are in the notebook, but commented out. The line to check that the client is connected should be client.iam.token.inspect(). We will update the code in the notebook.
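For reference, a minimal sketch of the getpass approach mentioned above (the helper name is my own, not from the notebook): read CLIENT_SECRET from the environment, and only prompt interactively when it is missing.

```python
import os
from getpass import getpass


def get_client_secret() -> str:
    """Return the client secret from the CLIENT_SECRET environment
    variable, prompting interactively with getpass if it is not set."""
    secret = os.environ.get("CLIENT_SECRET")
    if not secret:
        secret = getpass("Enter client secret: ")
        os.environ["CLIENT_SECRET"] = secret  # cache for the rest of the session
    return secret


# After building the client with this secret, verify the connection with:
# client.iam.token.inspect()
```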
Hi, did you run through the first notebook (1_Authentication.ipynb)? That is where the function to create the client is defined.
Great to hear. Thank you for identifying this bug
Could you upgrade the package to the latest version? pip install cognite-replicator --upgrade. The version should be 1.2.3.
OK, apologies, I think you found a bug. We will look into it and fix it today.
What is the value of batch_size? It looks like it is a dictionary; its value is set in the config file, for example. len(ts_src) is an int and is printed above (408). batch_size should also be an int, and can be set to 10000, for example.
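To illustrate the type issue, here is a hypothetical validation helper (not part of cognite-replicator) that coerces a batch_size config value to an int and fails early with a clear message when a dict slips through:

```python
def normalize_batch_size(value, default=10000):
    """Coerce a batch_size config value to an int.

    Raises a clear error if the value is a dict (a common config
    mistake), instead of failing deep inside the replication loop.
    """
    if isinstance(value, dict):
        raise TypeError(f"batch_size must be an int, got a dict: {value!r}")
    if value is None:
        return default
    return int(value)
```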
That is strange; it looks like the client secret you are using does not have enough rights to list the events on publicdata. I would check with Cognite support, who provided you the key, to make sure you have the rights to view events. Otherwise, if you are comfortable using the Cognite SDK, you can perform a token inspect or try to list events manually to pinpoint the issue.
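As a sketch of the token-inspect route: the helper below scans a list of capability entries for eventsAcl READ access. The payload shape is simplified for illustration; check the actual response of client.iam.token.inspect() in your SDK version.

```python
def has_events_read(capabilities) -> bool:
    """Check a list of capability entries (as returned under the
    "capabilities" key of a token-inspect response; shape simplified
    here for illustration) for eventsAcl READ access."""
    for cap in capabilities:
        acl = cap.get("eventsAcl")
        if acl and "READ" in acl.get("actions", []):
            return True
    return False


# With a live client, the payload would come from:
# client.iam.token.inspect()
```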
Interesting article @Frank Danielsen and @Jan Inge Bergseth
Yes, if you set src_boolean_client_secret: True, then you need to specify a client_secret as an environment variable, like you did. There could be a lot of events, so it might take some time to replicate; the script is not hanging, but replicating in the background. Depending on the number of events at the source, it could take a considerable amount of time. In your case, there are over 40 million events in the publicdata project, which is why it takes some time.
As usual, great article Pierre!
Excellent post Pierre!
I can also recommend the guide the integration team has created regarding troubleshooting: https://docs.cognite.com/cdf/integration/guides/extraction/opc_ua/opc_ua_troubleshooting. The OPC UA extractor can also run as a Windows service or as a Linux executable, if that is your choice of infrastructure. More to come on this soon.
Regarding the ingestion and the sampling frequency there: this is not something we control as part of Open Industrial Data. The data is replicated from a data source we do not control, so it is very possible that something has changed over there.
Hi Anders, there is no fuzzy search on the externalId field, but we do support fuzzy search on the name and description: https://docs.cognite.com/api/v1/#tag/Time-series/operation/searchTimeSeries. Depending on the data model, do you have the possibility to filter on that information somewhere else, like the metadata field, for example? If not, filtering after the fact is a way to go as well.
You should install the Python package black via pip or Poetry: pip install black or poetry add black.
You can clone the repository, run black . locally in your environment, and push the result to the remote repository.
Hi, thank you for your question. We have done MQTT integration for multiple customers. We have used our extractor-utils library, which can be found at https://github.com/cognitedata/python-extractor-utils, and the extractor can be deployed as a Docker container. A simple configuration file containing connection information is read by the extractor to connect to MQTT and push data points to Cognite Data Fusion. We have also done GCP functions in the past triggering on messages from GCP IoT Core (the equivalent of IoT Hub in Azure), as you are suggesting in your question. Let me know if that answers your question and if you would like additional information.
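To make the MQTT-to-CDF flow concrete, here is a minimal sketch of the payload-mapping step such an extractor performs. The JSON payload format, field names, and external-ID scheme are assumptions for illustration, not the actual extractor's configuration.

```python
import json


def mqtt_payload_to_datapoints(topic: str, payload: bytes):
    """Map one hypothetical MQTT message to CDF-style datapoints.

    Assumes a JSON payload like:
        {"timestamp": 1700000000000, "values": {"temp": 21.5, "rpm": 900}}
    and derives each time series externalId from the topic plus the
    value name. Returns a list of (externalId, timestamp_ms, value).
    """
    msg = json.loads(payload)
    ts = msg["timestamp"]
    prefix = topic.replace("/", ":")
    return [(f"{prefix}:{name}", ts, value) for name, value in msg["values"].items()]


# In a real extractor, the resulting datapoints would then be queued
# and pushed to Cognite Data Fusion in batches via the SDK.
```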