When I upload a file into the classic data model, how do I associate that file object with a Flexible Data Model (FDM) object, and then retrieve the file object's data from FDM using the Graph API?
Hi Team, following the cdf-fundamentals training, I have mapped the required details, but the data is displayed in a linear format rather than as trends.
I am reporting the following issue. Steps to reproduce:

1. Log in to Cognite (I used a non-prod / QA environment).
2. Go to Transform data.
3. Create a transformation by providing all required fields.

The transformation is created, but the process is very slow. Sometimes at the end of step 2 the Cognite web page becomes unresponsive. In general, I have also observed degraded performance when testing groups and capabilities in Manage access; the page becomes unresponsive there as well. Please find attached a screenshot of this.
A hyperlink into CDF does not navigate to the screen the hyperlink points to. Instead of the intended target, the user lands on the CDF default screen. E.g.: https://oq.fusion.cognite.com/oq-test/explore/search/timeSeries?cluster=westeurope-1.cognitedata.com&env=westeurope-1&journey=timeSeries-114626196540160&q=separator
type SimulationModel {
  modelName: String!
  nodes: [SimNode]
}

type SimNode {
  modelName: String
  input: [Key]
}

type Key {
  modelName: String
  unit: String
}

With the above sample schema, I am able to fetch all the SimNodes together with their Keys for a specified SimulationModel instance using this GraphQL query:

query MyQuery {
  listSimulationModel(filter: {externalId: {eq: "00000000-0000-0000-0000-000000000000"}}) {
    items {
      externalId
      modelName
      nodes(first: 100) {
        items {
          externalId
          modelName
          input {
            items {
              externalId
              modelName
              unit
            }
          }
        }
      }
    }
  }
}

How do I do the same with the APIs? Could you give an example? I am not sure which endpoint should be used (instances/list, instances/byids, instances/search, or instances/query).
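Of the endpoints listed, instances/query is the one designed for this kind of graph traversal (start at one node, follow a relation to connected nodes). Below is a sketch of the request payload shape only, not a verified request: the space name `sim_space`, the view version `v1`, and the assumption that `nodes` is a direct relation property are all illustrative placeholders, so check every identifier against your own data model and the Data Modeling API reference.

```python
def build_query_payload(model_external_id: str) -> dict:
    """Sketch of a POST /models/instances/query body that starts from one
    SimulationModel node and follows its 'nodes' relation to SimNode.
    All space/view identifiers below are illustrative placeholders."""
    sim_model_view = {"type": "view", "space": "sim_space",
                      "externalId": "SimulationModel", "version": "v1"}
    sim_node_view = {"type": "view", "space": "sim_space",
                     "externalId": "SimNode", "version": "v1"}
    return {
        "with": {
            # First result set: the one SimulationModel node we start from.
            "models": {
                "nodes": {
                    "filter": {
                        "equals": {
                            "property": ["node", "externalId"],
                            "value": model_external_id,
                        }
                    }
                }
            },
            # Second result set: hop from 'models' through the 'nodes' relation.
            "model_nodes": {
                "nodes": {
                    "from": "models",
                    "through": {"source": sim_model_view, "identifier": "nodes"},
                    "direction": "outwards",
                }
            },
        },
        "select": {
            "models": {"sources": [{"source": sim_model_view,
                                    "properties": ["modelName"]}]},
            "model_nodes": {"sources": [{"source": sim_node_view,
                                         "properties": ["modelName"]}]},
        },
    }

# The payload would be POSTed (with auth headers) to
# https://<cluster>/api/v1/projects/<project>/models/instances/query
payload = build_query_payload("00000000-0000-0000-0000-000000000000")
```

A further hop from SimNode through `input` to Key would follow the same pattern: another entry in `with` whose `from` is `model_nodes`.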
Hi, I am trying to access the data science training. It requires two credits, and when I hit Purchase the page freezes. Any help with this matter?
Exploring Cognite CDF
I was trying to upload a machine learning model file, in pickle format, of size 5 GB. I observed that whenever the file is uploaded, the size of the uploaded file is 969 MB: the entire file is not uploaded, only a part of it. Are there any restrictions on file upload size? Can we increase that restriction?
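When debugging a suspected truncated upload, it helps to have a cheap way to compare the local file against a re-downloaded copy by size and checksum. A minimal stdlib sketch (the file paths in the usage comment are placeholders, and this says nothing about what CDF's actual size limit is):

```python
import hashlib

def file_fingerprint(path: str, chunk_size: int = 1 << 20) -> tuple[int, str]:
    """Return (size_in_bytes, sha256_hexdigest), streaming in 1 MiB chunks
    so even a 5 GB file is never loaded into memory at once."""
    digest = hashlib.sha256()
    size = 0
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
            size += len(chunk)
    return size, digest.hexdigest()

# Usage: upload model.pkl, download it again, then compare fingerprints.
# original = file_fingerprint("model.pkl")
# downloaded = file_fingerprint("model_downloaded.pkl")
# assert original == downloaded, "upload was truncated or corrupted"
```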
I have also added a limit parameter in the SQL query, but somehow the limit parameter is not being recognized.
I was trying to create a Flow to refresh all views in one of my solution models in a project. All my transformations run individually without failing, but as soon as I run the Flow, only the first transformation runs; the second one fails with an unclear message, “Something went wrong”. I understand this is an EA feature as of now, but I see a lot of applications for it in one of our projects, hence I am trying to set up some flows. Could you please help check whether I am doing something wrong here?

Task ID: 1dfb440d-c678-43a4-8c67-0d98467694e8

Flow definition:

[
  {
    "externalId": "tr-tr-SDM-WellStatus-Entity",
    "type": "transformation",
    "name": "1706783603193",
    "parameters": {
      "transformation": {
        "externalId": "tr-tr-SDM-WellStatus-Entity",
        "concurrencyPolicy": "fail"
      }
    },
    "onFailure": "abortWorkflow",
    "dependsOn": []
  },
  {
    "externalId": "tr-tr-SDM-Well-Status-Hierarchy",
    "type": "transformation",
    "name": "1706783606
N/A
A similar question has been posted in another community; please refer to that.
Hi Team, I am getting an error message while running cogex init in the command prompt. Is there anything I am missing here? Can you guide me on that? Thanks and regards, Navyasri Indupalli
Hi colleagues, can someone let me know the following: how do I log in through my company, and is a marketplace subscription to Cognite mandatory? I need to do a demo project. Regards, Dev
Hi Team, I am facing some issues while using the above pipeline.yaml file setup to run the pipeline. Could you please help me with the DevOps setup so that I can understand the process? I will set up a call; please let me know your available timings. Regards, Nidhi N G
It has been observed that while transforming data from the CDF datapoints table, if a datapoint has a timestamp in or before the year 1970, the timestamp value is shown as null. How do we deal with this issue?
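One thing worth checking when timestamps at or before 1970 come out as null: such dates correspond to zero or negative Unix epoch values, and some conversion paths silently drop non-positive epochs. A small Python sketch of the round trip, purely illustrative and not specific to CDF internals:

```python
from datetime import datetime, timezone

def to_epoch_ms(dt: datetime) -> int:
    """Convert an aware datetime to Unix epoch milliseconds.
    Dates before 1970-01-01 UTC produce negative values."""
    return int(dt.timestamp() * 1000)

def from_epoch_ms(ms: int) -> datetime:
    """Inverse conversion; handles negative values on POSIX systems."""
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc)

d = datetime(1969, 12, 31, tzinfo=timezone.utc)
ms = to_epoch_ms(d)  # one day before the epoch, so a negative number
assert from_epoch_ms(ms) == d
```

If the transformation pipeline (or an intermediate cast) treats epoch values <= 0 as invalid, that would produce exactly the observed nulls for years at or before 1970.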
Hi, I have an application to read data from, and I have two options: use ODBC, or use OPC UA directly. What are the benchmarks or limitations for each? I have 30,000 data points to read, with sample times of 1 s, 5 s, and 10 s.
When I bring my chart into the canvas, the calculated values are not showing. I’m likely just missing a setting, but I cannot figure out which one. I hope someone can help me. Thanks.
Hi Team, I am getting a data set ID validation error while trying to deploy functions. I have previously deployed functions using the same data set ID on that particular instance, but now I am suddenly getting the error below, even though I have not made any changes to the existing setup. Please help me here. Regards, Nidhi N G
How do I get permission to extract data from the learning portal, in order to consolidate role-based training completions at an organization level (e.g. for Accenture)?
I am getting the error below while executing the Documentum extractor:

FATAL: Invalid YAML file: Both database and table must be set when metadata destination is RAW
com.cognite.connector.dctm.config.InvalidConfigException: Invalid YAML file: Both database and table must be set when metadata destination is RAW.

I have gone through the Cognite documentation for possible configuration options, but this is not mentioned there: there is no documented parameter for metadata destination, database, or table. Please help me with details on where to find this parameter info, or on how to configure it in YAML. Please find the contents of my config.yaml below:

version: 1
logger:
  console:
    level: INFO
cognite:
  # Read these from environment variables
  host: #${COGNITE_BASE_URL}
  project: #${COGNITE_PROJECT}
  idp-authentication:
    # token-url: ${COGNITE_TOKEN_URL}
    tenant: #${tenant}
    client-id: #${COGNITE_CLIENT_ID}
    secret: #${COGNITE_CLIENT_SECRET}
    scopes:
While using Python SDK 5.10.5 (and downgraded versions), I get an error like 'CogniteClient' object has no attribute 'datapoints'. With the latest version, I get an error when hitting the microservice. Please find the file below for your reference.
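If this is the standard cognite-sdk, that AttributeError is consistent with the v5 API reorganization, where the datapoints API moved under the time series API: code written against v4 (`client.datapoints.retrieve(...)`) needs updating to `client.time_series.data.retrieve(...)`. A sketch of the migrated call, where the external ID "my_ts" is an illustrative placeholder and the client is assumed to be configured elsewhere:

```python
from datetime import datetime, timezone

def fetch_day(client, external_id: str):
    """Fetch one day of datapoints using the SDK >= 5 API surface.

    Pre-v5 code called client.datapoints.retrieve(...), which on SDK >= 5
    raises: 'CogniteClient' object has no attribute 'datapoints'.
    The same retrieval now lives under client.time_series.data.
    """
    return client.time_series.data.retrieve(
        external_id=external_id,
        start=datetime(2024, 1, 1, tzinfo=timezone.utc),
        end=datetime(2024, 1, 2, tzinfo=timezone.utc),
    )

# Usage (requires a CogniteClient configured with valid credentials):
# from cognite.client import CogniteClient
# dps = fetch_day(CogniteClient(), "my_ts")
```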
Hello team, we have ingested 15,671 instances into an indexed model. While trying to view the data on the Data management tab in the Fusion UI, we get a 408 error. When we query the data through GraphQL, even with a limit of 1, we get the same response.

Error message:

{
  "title": "",
  "message": "Client request (POST https://westeurope-1.cognitedata.com/api/v1/projects/slb-pdf/models/instances/query) invalid: 408 Request Timeout. Text: \"{\n \"error\": {\n \"code\": 408,\n \"message\": \"Graph query timed out. Reduce load or contention, or optimise your query.\"\n }\n}\""
}

Details of the model:
space: gb_MovieDM
data model: Avocet data model