Hi @Kristian Nymoen Here’s another reference that can assist you in using the Cognite API to upload your 3D models: https://developer.cognite.com/dev/guides/upload-3d-models/upload-3d/
Hi @Raluca Bala, I'm currently using the repositories listed below. However, both of them are similar, and the last update was four years ago.

https://github.com/cognitedata/interacting-with-open-industrial-data
https://github.com/cognitedata/open-industrial-data

If this is for training purposes, I would assume that the only publicly available data is within the 'publicdata' project in Cognite Data Fusion. If you happen to find something else, please share.

Regards, Andre
@Ben Petree, my mistake. I realized that the client secret is created in Azure.Thank you very much for your support.
Thanks @Ben Petree I will definitely take a look at the how-to. However, I am unsure about how to create the client secret on the Cognite side.
Hi @Rajendra Pasupuleti You can find an example of using the Cognite File Extractor for reading files in CSV format and uploading the content to CDF RAW at https://github.com/cognitedata/python-extractor-example.

Hope this helps!
André
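In case a quick illustration helps alongside that repository: the core of such an extractor is parsing each CSV record into a dict under a unique string key, which is the shape CDF RAW rows take. Below is a minimal stdlib sketch; the column names and the `key_column` choice are hypothetical, and the final upload call shown in the comment assumes the Cognite Python SDK's `client.raw.rows.insert`:

```python
import csv
import io

def csv_to_raw_rows(csv_text: str, key_column: str) -> dict:
    """Parse CSV text into {key: columns} dicts, the shape CDF RAW rows use."""
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = {}
    for record in reader:
        key = record.pop(key_column)  # RAW rows need a unique string key
        rows[key] = record
    return rows

# Hypothetical sample data, for illustration only
sample = "tag,value,unit\npump_01,3.4,bar\npump_02,7.1,bar\n"
rows = csv_to_raw_rows(sample, key_column="tag")
print(rows)
# The upload step would then look something like (needs real credentials):
# client.raw.rows.insert("my_db", "my_table", rows, ensure_parent=True)
```

The real extractor example in the repository adds state stores and upload queues on top of this, but the CSV-to-rows transformation is the part worth understanding first.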
Hi @Elka Sierra, It's not a use case at all; I was just wondering because we have our own PI extractors built on top of the PI SDK. We could contribute to Cognite PI extractors if they were open source, similar to OPC UA.

Thank you very much,
André
Great, @Everton Colling ! That was a viable solution, albeit a tough one. I'll take a look at the tutorial you mentioned. It would be fantastic to have your tutorial with a Cognite example. If I can contribute from my side, I will gladly share it with you. Thank you very much.
@andrelcalves - If the question is whether it would be possible to use a Push/Streaming/PubNub dataset right now, I believe the answer to be “no” as we do not expose the equivalent of a “streaming data flow” from CDF (and something would need to provide OData with the changing data information). If you’re asking whether a Push/Streaming/PubNub dataset could be a possible feature enhancement for our OData interface(s), that’s an interesting idea and something we may choose to do at some point. But it is unfortunately not something we can get started on until at the earliest the summer of 2024. Thanks @Thomas Sjølshagen
Hi @ipolomanyi, Is there a projected release date for when this feature will become available in the Power BI CDF connector? Out of curiosity, since the final solution is not yet available, could this be addressed by using a Push dataset, Streaming dataset, or PubNub streaming dataset?

Reference here: https://learn.microsoft.com/en-us/power-bi/connect-data/service-real-time-streaming
Cognite Team @Anita Hæhre, out of curiosity, since the final solution is not yet available, could this be addressed by using a Push dataset, Streaming dataset, or PubNub streaming dataset?

Reference here: https://learn.microsoft.com/en-us/power-bi/connect-data/service-real-time-streaming
As mentioned by @mathialo, you can simply refer to the pytest documentation for guidance. However, if you're looking for some examples, you can explore the Cognite GitHub repositories. Below is a simple example of a validation unit test in the Cognite Python SDK: https://github.com/cognitedata/cognite-sdk-python/blob/master/tests/tests_unit/test_utils/test_validation.py
It may sound like a simple question, but does it always extract a full data load, or is it prepared for Change Data Capture (CDC)?
Great news @Christian Mueller
@VamsiGrandhi Please let us know if it works for you, and we would appreciate it if you could share how you resolved it.

Thanks in advance
Hi @VamsiGrandhi,

Please check that you are using client authentication as “Send as Basic Auth header”. See my token configuration below:

baseUrl: https://api.cognitedata.com
Auth Url: https://login.microsoftonline.com/{{tenant-id}}/oauth2/v2.0/authorize
project: publicdata
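For context on what “Send as Basic Auth header” means at the HTTP level: the client ID and secret go into an `Authorization: Basic ...` header rather than the form body of the token request. Here is a hedged stdlib sketch of how such a client-credentials token request is assembled; the tenant ID, client ID, and secret are placeholders, and the scope shown follows the common CDF pattern but should be verified for your cluster:

```python
import base64
from urllib import parse, request

tenant_id = "00000000-0000-0000-0000-000000000000"  # placeholder
client_id = "my-client-id"                          # placeholder
client_secret = "my-client-secret"                  # placeholder

token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"

# "Send as Basic Auth header": credentials go in the Authorization header
basic = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
headers = {
    "Authorization": f"Basic {basic}",
    "Content-Type": "application/x-www-form-urlencoded",
}
# Scope pattern commonly used for CDF; confirm the API host for your project
body = parse.urlencode({
    "grant_type": "client_credentials",
    "scope": "https://api.cognitedata.com/.default",
}).encode()

req = request.Request(token_url, data=body, headers=headers, method="POST")
# request.urlopen(req) would return the JSON token response (needs real credentials)
```

If the token request fails with the Basic header, the usual fallback is sending `client_id` and `client_secret` as form fields in the body instead.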
I found it in the Cognite OPC-UA Extractor GitHub repository. I'll give it a try.

metrics:
  # Start a metrics server in the extractor for Prometheus scrape
  server:
    host:
    port: 0
  # Multiple Prometheus PushGateway destinations:
  push-gateways:
    - host:
      job:
      username:
      password:
  # Configuration to treat OPC-UA nodes as metrics.
  # Values will be mapped to opcua_nodes_NODE-DISPLAY-NAME in prometheus.
Thanks for the tutorial. How does the OPC-UA extractor handle reconnection after a disconnection in a subscription?
@Ishita Mathur It worked for me when I removed the body content and replaced it with {}, since I didn't want to pass any parameters.

Example: {{baseUrl}}/api/v1/projects/{{project}}/assets/list
Request body: {}

Status: 200 OK
Time: 289 ms
Size: 13.51 KB
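The same fix translates directly outside of a REST client as well: the assets `list` endpoint is a POST, and an empty JSON object is a valid “no filter” body. A hedged stdlib sketch of constructing that request (the base URL, project, and token are placeholders):

```python
import json
from urllib import request

base_url = "https://api.cognitedata.com"  # placeholder
project = "publicdata"                    # placeholder
token = "REDACTED"                        # placeholder bearer token

# An empty JSON object stands in for "no filter, no parameters"
body = json.dumps({}).encode()
req = request.Request(
    f"{base_url}/api/v1/projects/{project}/assets/list",
    data=body,
    headers={
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# request.urlopen(req) would return the asset list (needs a real token)
print(body)  # b'{}'
```

Sending no body at all is what typically triggers the error; an explicit `{}` keeps the request valid JSON.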
Thank you @mathialo