Hello community, we have the option of scheduling Cognite Functions to run periodically. But is there any way to trigger a Cognite Function when a particular event occurs? Thank you.
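For context: absent a built-in event trigger, one common workaround is a scheduled function that polls for events created since its last run and reacts to each new one. Below is a minimal sketch of the polling-window logic; the handler shown in comments is hypothetical (function external ID, state-passing shape, and field names are assumptions, not a confirmed Cognite API pattern).

```python
def poll_window_ms(last_seen_ms: int, now_ms: int) -> dict:
    """Created-time window covering events that appeared since the last poll."""
    return {"min": last_seen_ms + 1, "max": now_ms}

# Hypothetical scheduled-function handler (untested sketch):
# def handle(client, data):
#     window = poll_window_ms(data["last_seen_ms"], now_ms=data["now_ms"])
#     for ev in client.events.list(created_time=window, limit=None):
#         client.functions.retrieve(external_id="my-reaction-fn").call(data={"event_id": ev.id})
```

The key design point is carrying `last_seen_ms` between runs so no event is processed twice or missed.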
Hi, while attempting to run

SELECT
  workpackage_desc AS description,
  to_timestamp(completion_time/1000) AS endTime,
  to_timestamp(start_time/1000) AS startTime,
  type,
  subtype,
  concat('FirstnameBirthyear:', key) AS externalId,
  1234567890 AS dataSetId
FROM IFSDB.maintenance
WHERE to_timestamp(completion_time/1000) > to_timestamp(start_time/1000)

I get this error: Verify that the 'IFSDB.maintenance' relation exists | code: 400 | X-Request-ID: 019f535a-a30f-9184-b3d3-966184e9d36b

Note that I substituted my name and birth year and changed the data set ID. My assumption: I'm sure I have an IFSDB.maintenance RAW table (though I'm not sure if I skipped something in the lesson...).
I have to build two types of extractors:
1. A PI extractor to connect to the PI server, fetch the data, and ingest it into CDF.
2. A SharePoint Online extractor to extract data from files in SharePoint Online and ingest it into CDF.

I wanted to know the general construct for building extractors. Mainly I want to account for scenarios where the PI server is not available and the PI extractor is unable to fetch data. How should these situations be handled and incorporated in the code while building the extractor? Also, how should monitoring be handled while the extractor runs? Are there sample code repos that can be referenced to get a complete picture of building extractors?
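For the source-unavailable scenario, the usual pattern is retry with capped exponential backoff around the fetch call, so a PI outage causes the extractor to wait and retry rather than crash. A minimal generic sketch (not specific to any Cognite library; the exception type and schedule parameters are illustrative assumptions):

```python
import time

def backoff_delays(base=1.0, factor=2.0, max_delay=300.0, attempts=8):
    """Exponential backoff schedule in seconds, capped at max_delay."""
    return [min(base * factor ** i, max_delay) for i in range(attempts)]

def fetch_with_retry(fetch, delays):
    """Call `fetch` until it succeeds, sleeping per the backoff schedule.

    Re-raises the last error once the schedule is exhausted, so the failure
    can be surfaced to monitoring instead of being silently swallowed.
    """
    for i, delay in enumerate(delays):
        try:
            return fetch()
        except ConnectionError:
            if i == len(delays) - 1:
                raise
            time.sleep(delay)
```

In practice each retry attempt would also be logged (and, if you use extraction pipelines in CDF, reported as a run status) so operators can see the source outage rather than just missing data.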
We are seeing multiple 408 response codes from the GraphQL query API. There are multiple combinations of filters that do not work. For example, the query below returns 408; I think the problem is the array length of the filter entity.entityName. If the array length is 2 it works, otherwise it does not.

query MyQuery {
  listTimeSeriesRef(filter: {and: [
    {entity: {entityType: {in: ["NAVIGATOR", "SCREEN_CFG_REF"]}}},
    {property: {in: ["TUBING_TEMP"]}},
    {entity: {entityName: {in: ["Process.ReAllocation", "Analysis Point by Desks", "Downtime.GroupDowntime.DateTime.Subsystem"]}}}
  ]}) {
    edges {
      node {
        property
        entity {
          lastUpdated
          entityType
          entityName
        }
      }
    }
  }
}
While going through the academy course, many pages are not displaying properly, starting with the “Check your knowledge” page under the section “Introduction to entity matching” and continuing through the remainder of the course.
Hi team, please enable and configure the Air setup in the CDF project below: Accenture-demo-dev, under cluster EUROPE1-GOOGLE. Please let me know if you need any details.
We are having performance issues with event retrieval whenever there are many events, even below 500. Can you help me understand how partitioning or pagination can be done? We need to perform an operation on each event after retrieval and send it as an API response. I tried the below:

for unstructure_insight in client.events(type="Insight", limit=None, partitions=10):
    # do something with unstructure_insight in each partition, in parallel, to reduce response time

I observed that if there are 64 events, all 64 events are retrieved in one execution. How can we get one partition at a time and process the first partition in parallel while the second partition is being retrieved, to reduce time?
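One way to overlap processing with retrieval is to iterate the events in chunks and hand each chunk to a worker pool as soon as it arrives, so chunk N is processed while chunk N+1 is still being fetched. A sketch of that pattern (the `client.events(..., chunk_size=...)` usage in the comment is an assumption about the SDK's chunked-iteration mode; the pool helper itself is generic):

```python
from concurrent.futures import ThreadPoolExecutor

def process_in_chunks(source_iter, handle_chunk, max_workers=4):
    """Submit each chunk for processing as soon as it is retrieved.

    Results are returned in chunk order; processing of earlier chunks
    overlaps with retrieval of later ones.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(handle_chunk, chunk) for chunk in source_iter]
        return [f.result() for f in futures]

# Hypothetical usage against CDF (untested sketch):
# results = process_in_chunks(
#     client.events(type="Insight", limit=None, chunk_size=100),
#     handle_chunk=lambda chunk: [do_something(ev) for ev in chunk],
# )
```

Note that `partitions` parallelizes retrieval on the server side but still yields items as one stream, which matches the observation that all 64 events arrive in one execution.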
A request with id 1912a13b-f4d1-9e0d-a7df-056276ca269e to https://westeurope-1.cognitedata.com/api/v1/projects/****/timeseries/data failed with status 400: Expected numeric value for data point. How can I find out which external ID this error refers to?
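Since the 400 response does not name the offending series, one approach is to pre-validate the payload client-side before inserting, flagging any series whose values are not numeric. A sketch, assuming the insert payload is a list of dicts with `externalId` and `(timestamp, value)` datapoint tuples (that shape is an assumption for illustration, not the only one the SDK accepts):

```python
def find_non_numeric(batches):
    """Return the externalIds whose datapoint values are not numeric.

    Booleans are rejected too, since they pass isinstance(..., int)
    but are not valid numeric datapoint values.
    """
    bad = []
    for batch in batches:
        if any(not isinstance(v, (int, float)) or isinstance(v, bool)
               for _, v in batch["datapoints"]):
            bad.append(batch["externalId"])
    return bad
```

The alternative is to split the insert into one request per series and catch the API error, which identifies the culprit at the cost of extra requests.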
I am able to browse ds-basics, cdf-fundamentals …, etc.
Hello team, we have ingested 15671 instances into an indexed model. While trying to view the data on the Data Management tab in the Fusion UI, we get a 408 error. When we query the data through GraphQL, we get the same response even with a limit of 1.

Error message:
{
  "title": "",
  "message": "Client request (POST https://westeurope-1.cognitedata.com/api/v1/projects/slb-pdf/models/instances/query) invalid: 408 Request Timeout. Text: \"{ \"error\": { \"code\": 408, \"message\": \"Graph query timed out. Reduce load or contention, or optimise your query.\" } }\""
}

Details of the model:
space: gb_MovieDM
data model: Avocet data model
If I have view B which implements view A, in the definition of view B do I need to list the properties (e.g. container mappings) that are “inherited” from A? I noticed that if I don’t list them, this works fine the first time when view B is created (I can see the inherited properties automatically included), but if I try to update view B (even if nothing changes) the call fails saying that there is a change in the property list and the version needs to be changed. I’m assuming this is because the inherited properties are missing from the definition. Is this behavior by design and I have to explicitly list the properties, or would it be considered a defect?
Hi, may I know how to schedule a refresh in Power BI for Cognite? I am facing the error below. When I try to schedule, I see the option is disabled for me. I tried to install the gateway locally as well, but the installation failed. While establishing the connection with CDF in Power BI Desktop, I get a pop-up about a third-party connector. Does that mean the Power BI service doesn't support scheduled refresh for Cognite? Please help. TIA. Regards, Arati
Hello, it’s fairly easy to get all edges via the REST API, but how can I filter them by start node? It’s not clear from the documentation how to create such a filter; any other approach using the REST API would also help.

{
  "instanceType": "edge",
  "limit": 1,
  "filter": {
    "property": [],
    "value": "value"
  }
}

My filter is not correct here, but what I want is to filter all edges where "startNode": {"space": "name-space", "externalId": "some-id"}.
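For reference, the edge's start node is addressed as the property path `["edge", "startNode"]` in the instances filter language, compared with an `equals` filter against a direct-relation reference. A sketch of the request body for POST /models/instances/list (limit and field names other than the filter itself are illustrative):

```python
def start_node_filter(space: str, external_id: str) -> dict:
    """Request body for /models/instances/list selecting edges whose
    startNode equals the given {space, externalId} reference."""
    return {
        "instanceType": "edge",
        "limit": 1000,
        "filter": {
            "equals": {
                "property": ["edge", "startNode"],
                "value": {"space": space, "externalId": external_id},
            }
        },
    }
```

The same body shape should work as the `filter` argument wherever the API accepts instance filters; verify the exact path against the current API reference.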
Hello team, I have deleted some events from Cognite, but I can still see them, and when I click on one I get “resource not found”. Can anyone please help with what the issue might be? I am also trying to delete all of them, but that fails because of the previous issue. PFA.
Could you please suggest how to limit source?
When following CDF Fundamentals - Working with CDF: Integrate, I accidentally performed the events and time series transformations using the data set ID of the AvevaNet data set instead of the IFSDB data set. I don’t know how to undo this. Any help would be much appreciated.
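One way to recover from a wrong data set assignment is a partial update that sets the correct dataSetId on the affected resources. A sketch of the JSON items for POST /events/update (the IDs here are placeholders; whether your course project permits this update depends on your capabilities, so treat this as an assumption to verify):

```python
def reassign_payload(event_ids, correct_data_set_id):
    """Build partial-update items for POST /events/update that move
    each event to the correct data set."""
    return [
        {"id": i, "update": {"dataSetId": {"set": correct_data_set_id}}}
        for i in event_ids
    ]

# Hypothetical usage: list events in the wrong data set, then update them.
# events = client.events.list(data_set_ids=[WRONG_ID], limit=None)
# client.post(".../events/update", json={"items": reassign_payload([e.id for e in events], CORRECT_ID)})
```

Alternatively, re-running the transformation with the correct data set ID overwrites the assignment for rows matched by externalId, which may be simpler in the course setting.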
I was trying to upload a machine learning model file in pickle format, 5 GB in size. I observed that whenever the file is uploaded, the size of the uploaded file is 969 MB; the entire file is not uploaded, only part of it. Are there any restrictions on size for file upload? Can we increase that restriction?
Following https://github.com/cognitedata/cognite-sdk-js/tree/master/samples/react/msal-browser-react, I am trying to run npm install and then

REACT_APP_CDF_PROJECT=... REACT_APP_AZURE_TENANT_ID=... REACT_APP_AZURE_APP_ID=... npm start

and got the following error:

node:internal/modules/cjs/loader:1024
  throw err;
  ^
Error: Cannot find module '/Users/kevin.peng/code/cognite/cognite-sdk-js/samples/react/msal-browser-react/start'
    at Function.Module._resolveFilename (node:internal/modules/cjs/loader:1021:15)
    at Function.Module._load (node:internal/modules/cjs/loader:866:27)
    at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:81:12)
    at node:internal/main/run_main_module:22:47 {
  code: 'MODULE_NOT_FOUND',
  requireStack: []
}
type SimulationModel {
  modelName: String!
  nodes: [SimNode]
}
type SimNode {
  modelName: String
  input: [Key]
}
type Key {
  modelName: String
  unit: String
}

With the above sample schema, I am able to fetch all the SimNodes together with their Keys for a specified SimulationModel instance using GraphQL:

query MyQuery {
  listSimulationModel(filter: {externalId: {eq: "00000000-0000-0000-0000-000000000000"}}) {
    items {
      externalId
      modelName
      nodes(first: 100) {
        items {
          externalId
          modelName
          input {
            items {
              externalId
              modelName
              unit
            }
          }
        }
      }
    }
  }
}

How can I do the same with the APIs? Could you give an example? I am not sure which API should be used (instances/list, instances/byids, instances/search, instances/query).
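Of those endpoints, instances/query is the one designed for multi-hop traversals like this, since it lets one result set feed the next (root node, then its edges, then the nodes the edges point to). Below is a rough sketch of such a request body; the `with`/`select` structure is my understanding of the query API, and every space/view name is a placeholder, so check it against the API reference before relying on it:

```python
def query_nodes_via_edges(view: dict, root_external_id: str) -> dict:
    """Sketch of a /models/instances/query body: select a root node by
    externalId, follow its outgoing edges, then select the target nodes.

    `view` is a view reference like
    {"type": "view", "space": "...", "externalId": "SimNode", "version": "1"}.
    """
    return {
        "with": {
            "models": {"nodes": {"filter": {"equals": {
                "property": ["node", "externalId"],
                "value": root_external_id,
            }}}},
            "model_edges": {"edges": {"from": "models"}},
            "sim_nodes": {"nodes": {"from": "model_edges"}},
        },
        "select": {
            "sim_nodes": {"sources": [{"source": view, "properties": ["*"]}]},
        },
    }
```

instances/list and instances/byids return one level at a time, so reproducing the nested GraphQL result with them requires a round trip per hop.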
Currently, we make the instance space the same as the data model space. What is the purpose of a space? When would we need to create one that is different from the data model’s space? Thanks :)
While using Python SDK 5.10.5 and downgraded versions, I get an issue like 'CogniteClient' object has no attribute 'datapoints'. With the latest version, I get an error while hitting the microservice. Please find the attached file for your reference.
I am following the steps in the course, but I am still getting an error.
Hi team, I am getting a data set ID validation error while trying to deploy functions. I have previously deployed functions using the same data set ID on that particular instance, but now I am suddenly getting the error below, even though I have not made any changes to the existing setup. Please help me here. Regards, Nidhi N G
The change in the requiredness of one of the fields is not consistently validated across all the inherited types.

The below works and the data model is published (Data Model Version 1):

interface Person @view(version: "1") {
  name: String
}
type Actor implements Person @view(version: "1") {
  name: String
  didWinOscar: Boolean
}

Next step: the below doesn't work, and I get the error "Can not change the nullability of the property. We do not support modifying a field's 'required'ness at the moment.":

interface Person @view(version: "2") {
  name: String!
}
type Actor implements Person @view(version: "2") {
  name: String!
  didWinOscar: Boolean
}

Next step: the below works:

interface Person @view(version: "2") {
  name: String
}
type Actor implements Person @view(version: "2") {
  name: String!
  didWinOscar: Boolean
}
Hi, I am trying to run transformations-cli locally with the command below (the current directory has the manifest.yaml and transformation.sql files):

transformations-cli deploy .

I am getting this error:

Deploying transformations...
Failed to parse transformation config, please check that you conform to the required fields and format: Invalid config: can not match type "dict" to any type of "destination" union: typing.Union[cognite.transformations_cli.commands.deploy.transformation_types.DestinationType, cognite.transformations_cli.commands.deploy.transformation_types.DestinationConfig, cognite.transformations_cli.commands.deploy.transformation_types.RawDestinationConfig, cognite.transformations_cli.commands.deploy.transformation_types.SequenceRowsDestinationConfig, cognite.transformations_cli.commands.deploy.transformation_types.AlphaDMIDestinationConfig]

Here is the destination section of the manifest file:

destination:
  viewSpaceExternalId: my-model-space-id
  viewExternalId: CDFTimeSeries
  viewVersion: 0_2