I am developing a custom app for semantic search using Cognite's FDM GraphQL endpoint and an LLM, and I have done some research using ChatGPT 3.5 to generate GraphQL queries from natural language. To my understanding, the Cognite Copilot feature found under the data model UX uses Azure OpenAI running in the Cognite subscription. How can I make use of the Azure OpenAI instance that runs in the Cognite subscription for custom app development?
I am looking for a function/query that can search JSON-type data against an array of values. The JSON can have nested keys. I tried using JSON_SEARCH or JSON_CONTAINS, but I get: Undefined function: JSON_SEARCH(data->'$[*]', JSON_ARRAY("abc")). Is there a way to search JSON object keys for a list of values? For example, given this JSON column data:

{"name": "Back", "address": {"location": {"country": [{"city": {"state": ["jkl", "fgh"]}, "type": "home"}, "empId": "e100043867", "FullName": "Back"}

and JSON_ARRAY = ["name", "state", "country"], how can I check whether the JSON_ARRAY values exist in the JSON data? Any help would be appreciated.
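Since the question boils down to "do these names occur as keys anywhere in a nested JSON document?", here is a minimal plain-Python sketch of that logic, outside SQL. The function names and the sample document are hypothetical, not part of any Cognite API:

```python
import json

def collect_keys(obj, found=None):
    """Recursively collect every key name appearing anywhere in a nested JSON structure."""
    if found is None:
        found = set()
    if isinstance(obj, dict):
        for key, value in obj.items():
            found.add(key)
            collect_keys(value, found)
    elif isinstance(obj, list):
        for item in obj:
            collect_keys(item, found)
    return found

def keys_present(data, wanted):
    """Return True if every name in `wanted` appears as a key somewhere in `data`."""
    return set(wanted) <= collect_keys(data)

# Simplified, well-formed version of the document from the post.
doc = json.loads(
    '{"name": "Back", "address": {"location": '
    '{"country": [{"city": {"state": ["jkl", "fgh"]}, "type": "home"}]}}}'
)
print(keys_present(doc, ["name", "state", "country"]))  # → True
```

The same traversal could be pushed into a UDF if the SQL engine in use supports registering one.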
We have a data model with:

type abc {
  id: String
  kind: String
  name: String
  description: String
  key: String
}

In another table, these columns are stored as row values:

kind   attributes
1.0.0  name
1.0.0  description
1.0.0  key
1.0.0  name

I need a query to join these column values (stored as rows) in one table with the solution model table data, grouped by id or kind. Since there can be any number of attributes, can we do this dynamically, matching the row's column value to the column name? Any help would be appreciated.
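A static SQL PIVOT needs the attribute names listed up front, so a fully dynamic rows-to-columns mapping usually happens in code. Here is a minimal plain-Python sketch of the idea; the row tuples and identifiers are hypothetical:

```python
# Hypothetical attribute rows in (id, attribute_name, attribute_value) form,
# mirroring a table where column names are stored as row values.
rows = [
    ("abc-1", "kind", "1.0.0"),
    ("abc-1", "name", "pump"),
    ("abc-1", "description", "main pump"),
    ("abc-2", "kind", "2.0.0"),
]

def pivot(rows):
    """Group rows by id and turn each (attribute, value) pair into a column, dynamically."""
    out = {}
    for rid, attr, value in rows:
        out.setdefault(rid, {})[attr] = value
    return out

print(pivot(rows)["abc-1"]["name"])  # → pump
```

The resulting dict-per-id can then be joined against the solution model records on `id` without knowing the attribute list ahead of time.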
Hi, could you advise me on how to fix the following error from Microsoft when I request a new access token through the "Authentication" tab in the Cognite API collection?

AADSTS70011: The provided request must include a 'scope' input parameter. The provided value for the input parameter 'scope' is not valid. The scope {{baseURL}}/.default is not valid. The scope format is invalid. Scope must be in a valid URI form <https://example/scope> or a valid Guid <guid/scope>.

I'm taking the course "Set up Postman using OpenID Connect" (https://lab.cognite.com/understanding-open-industrial-data/1328002). My Postman Authorization and Variables settings are attached.
Currently, we set the instance space to be the same as the data model's space. What is the purpose of a space? When would we need to create a space different from the data model's? Thanks :)
I'm trying to create a new numeric datapoint transformation. When I created it, the UI asked how it should handle NULL values (keep the existing value or delete it). I've set up my query with a CAST function to convert values to double. I understand this returns NULL when the cast is not possible, so I would expect the transformation to follow the NULL-handling setting, but instead it fails with the following message:

Column 'value' was expected to have type Double, but NULL was found

Can you please clarify the expectation for numeric datapoint transformations? Can they have NULL values as input or not, and what is the NULL-values setting actually for? Thanks
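Pending clarification from Cognite, one common workaround is to drop the rows whose values cannot be cast before they reach the datapoint write, i.e. the SQL equivalent of `where cast(value as double) is not null`. A plain-Python sketch of that filter (names and sample rows are hypothetical):

```python
def to_double(raw):
    """Mimic SQL CAST(raw AS DOUBLE): return a float, or None when the cast fails."""
    try:
        return float(raw)
    except (TypeError, ValueError):
        return None

rows = [("ts1", "42.5"), ("ts2", "not-a-number"), ("ts3", "7")]

# Keep only rows whose value casts cleanly, so no NULL reaches the datapoint write.
clean = [(ext_id, to_double(v)) for ext_id, v in rows if to_double(v) is not None]
print(clean)  # → [('ts1', 42.5), ('ts3', 7.0)]
```

Filtering this way sidesteps the "NULL was found" failure regardless of how the transformation's NULL setting turns out to behave.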
Not able to access the CDF portals:
Accenture - Cognite Data Fusion
Learn - Cognite Data Fusion
I was not able to find the time series data points associated with the seawater lift pump in the Data Quality hands-on labs.
SQL Transformations error. Please report this error to Cognite.
Job id: 91084409-b4e9-4e7c-b872-d1d26dcc2459 (cogniteId: 43334402)
Job id: 7b2d398b-4234-46a9-a377-95400ae3af48 (cogniteId: 43334400)
Job id: ff2d0960-b2df-4031-a48a-d2bc4b03e3c5 (cogniteId: 43334372)
How can I use machine learning models inside Cognite Data Fusion?
Request with id 1912a13b-f4d1-9e0d-a7df-056276ca269e to https://westeurope-1.cognitedata.com/api/v1/projects/****/timeseries/data failed with status 400: Expected numeric value for data point. How can I find out which external ID this error refers to?
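Since the 400 response does not name the offending series, one approach is to validate the payload client-side before posting. A minimal sketch, assuming a payload in the common `externalId`/`datapoints` shape (the sample IDs and values are hypothetical):

```python
from numbers import Number

# Hypothetical payload in the shape the timeseries/data endpoint expects.
payload = [
    {"externalId": "ts-a", "datapoints": [{"timestamp": 1700000000000, "value": 1.5}]},
    {"externalId": "ts-b", "datapoints": [{"timestamp": 1700000000000, "value": "bad"}]},
]

def bad_series(payload):
    """Return the externalIds whose datapoints carry a non-numeric value."""
    offenders = []
    for item in payload:
        if any(not isinstance(dp["value"], Number) for dp in item["datapoints"]):
            offenders.append(item["externalId"])
    return offenders

print(bad_series(payload))  # → ['ts-b']
```

Alternatively, posting one series per request narrows the failure down to a single external ID at the cost of extra round trips.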
I am looking for information related to 3D models. Does Reveal/Cognite support programmatic control of animation on 3D models uploaded to CDF? I have a 3D model of a pump and would like to change the animation speed depending on the real-time speed I get from my API. Is that possible? If so, could you point me to example code showing how to do it? Also, the uploaded 3D model in CDF does not display like the original: after uploading, it shows without colors and animations. Could you tell me why that is? Thanks
We are seeing 408 response codes from the GraphQL query API. Multiple combinations of filters are not working. For example, the query below returns 408; I think the problem is the array length of the filter entity.entityName. If the array length is 2 it works; otherwise it fails.

query MyQuery {
  listTimeSeriesRef(filter: {and: [
    {entity: {entityType: {in: ["NAVIGATOR", "SCREEN_CFG_REF"]}}},
    {property: {in: ["TUBING_TEMP"]}},
    {entity: {entityName: {in: ["Process.ReAllocation", "Analysis Point by Desks", "Downtime.GroupDowntime.DateTime.Subsystem"]}}}
  ]}) {
    edges {
      node {
        property
        entity {
          lastUpdated
          entityType
          entityName
        }
      }
    }
  }
}
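Until the timeout itself is explained, one pragmatic workaround consistent with the observation ("length 2 works") is to split the `in` list into small batches and issue one query per batch, merging the results client-side. A sketch of the batching step (the batch size and names are taken from the post; the per-batch query call itself is omitted as hypothetical):

```python
def chunked(values, size):
    """Split a filter list into smaller batches so each GraphQL `in` filter stays short."""
    for i in range(0, len(values), size):
        yield values[i:i + size]

names = [
    "Process.ReAllocation",
    "Analysis Point by Desks",
    "Downtime.GroupDowntime.DateTime.Subsystem",
]

# One GraphQL call per batch instead of a single large `in` filter.
batches = list(chunked(names, 2))
print(batches)
```

Each batch would then be substituted into the `entityName: {in: [...]}` filter of the original query.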
When I upload a file into the classic data model, how do I associate that file object with a Flexible Data Model (FDM) object and then retrieve the file object's data from FDM using the GraphQL API?
Hi team, following the cdf-fundamentals training, I have mapped the required details, but the data is displaying in a linear format, not as trends.
I am reporting the following issue. Steps to reproduce:
1. Log in to Cognite (I used a non-prod/QA environment).
2. Go to Transform data.
3. Create a transformation, providing all required fields.

The transformation is created, but the process is very slow, and sometimes at the end of step 2 the Cognite web page becomes unresponsive. In general, we have also observed degraded performance when testing groups and capabilities in Manage access; the page becomes unresponsive there too. Please find attached a screenshot of this.
Below is a screenshot of the error that occurs while running the transformation, followed by the query used. Preview works fine, but during the transformation run it fails with the above error saying startTime values cannot be greater than endTime values. According to the query that shouldn't be the case. Please suggest a solution. Sample data (data sheet) is attached for reference.

select
  `Notification_no` as externalId,
  `startTime` as startTime,
  `endTime` as endTime,
  `description` as description,
  `type` as type,
  1118188859138991 AS dataSetId
from (
  select
    concat('NO_', QMNUM) as `Notification_no`,
    --to_timestamp(QMDAT, 'M/d/yyyy') as `startTime`,
    CASE
      WHEN RIGHT(MZEIT, 2) = 'AM'
        THEN TO_TIMESTAMP(CONCAT(QMDAT, ' ', SUBSTRING(MZEIT, 1, LENGTH(MZEIT) - 3)), 'M/d/yyyy h:mm:ss')
      ELSE TO_TIMESTAMP(CONCAT(QMDAT, ' ', SUBSTRING(MZEIT, 1, LENGTH(MZEIT) - 3)), 'M/d/yyyy h:mm:ss') + INTERVAL 12 HOURS
    END AS `startTime`,
    CASE WHEN isnull(QMDAB) THEN
Hyperlinks into CDF do not navigate to the right screen. The user lands on the CDF default screen instead of the intended hyperlink target. E.g.: https://oq.fusion.cognite.com/oq-test/explore/search/timeSeries?cluster=westeurope-1.cognitedata.com&env=westeurope-1&journey=timeSeries-114626196540160&q=separator
type SimulationModel {
  modelName: String!
  nodes: [SimNode]
}

type SimNode {
  modelName: String
  input: [Key]
}

type Key {
  modelName: String
  unit: String
}

With the above sample schema, I am able to fetch all the SimNodes together with their Keys for a specified SimulationModel instance using GraphQL:

query MyQuery {
  listSimulationModel(filter: {externalId: {eq: "00000000-0000-0000-0000-000000000000"}}) {
    items {
      externalId
      modelName
      nodes(first: 100) {
        items {
          externalId
          modelName
          input {
            items {
              externalId
              modelName
              unit
            }
          }
        }
      }
    }
  }
}

How do I do the same with the APIs? Could you give an example? I am not sure which API should be used (instances/list, instances/byids, instances/search, or instances/query).
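For traversing related instances in one request, instances/query is the closest REST equivalent of the nested GraphQL query. Below is a sketch that only constructs a candidate request payload; the space name, view identifiers, and the exact payload shape are assumptions and should be verified against the Data Modeling API reference for POST /models/instances/query before use:

```python
import json

# Hypothetical view identifier for the SimulationModel type.
view = {"type": "view", "space": "my_space", "externalId": "SimulationModel", "version": "1"}

payload = {
    # Each entry under "with" names a result set; here we filter for one node.
    "with": {
        "models": {
            "nodes": {
                "filter": {
                    "equals": {
                        "property": ["node", "externalId"],
                        "value": "00000000-0000-0000-0000-000000000000",
                    }
                }
            }
        }
    },
    # "select" maps each result set to the view properties to return.
    "select": {
        "models": {"sources": [{"source": view, "properties": ["modelName", "nodes"]}]}
    },
}

print(json.dumps(payload, indent=2))
```

The actual HTTP call (with auth headers and the project URL) is omitted; the point is only the shape of the traversal request versus a flat instances/list call.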
Hi, I am trying to access the data science training. It requires two credits, and when I hit purchase the page freezes. Any help with this?
Exploring Cognite CDF
I was trying to upload a machine learning model file in pickle format, about 5 GB in size. Whenever the file is uploaded, the size of the uploaded file is 969 MB: the entire file is not uploaded, only a part of it. Are there any restrictions on file upload size? Can that restriction be increased?
I have also added a limit parameter in the SQL query, but somehow the limit parameter is not being recognized.
I was trying to create a flow to refresh all views in one of my solution models in a project. All my transformations run individually without failing, but as soon as I run the flow, only the first transformation runs; the second one fails with an unclear message, "Something went wrong". I understand this is an early-access feature as of now, but I see a lot of applications for it in one of our projects, so I am trying to set up some flows. Could you please help check whether I am doing something wrong here?

Task ID: 1dfb440d-c678-43a4-8c67-0d98467694e8

Flow definition:

[
  {
    "externalId": "tr-tr-SDM-WellStatus-Entity",
    "type": "transformation",
    "name": "1706783603193",
    "parameters": {
      "transformation": {
        "externalId": "tr-tr-SDM-WellStatus-Entity",
        "concurrencyPolicy": "fail"
      }
    },
    "onFailure": "abortWorkflow",
    "dependsOn": []
  },
  {
    "externalId": "tr-tr-SDM-Well-Status-Hierarchy",
    "type": "transformation",
    "name": "1706783606