Hi,

In the documentation it is mentioned that 10,000 time series is the limit for each subscription. Is this a hard limit, or can it be raised for specific use cases? We are using a CDF workflow, and its first function reads data point subscriptions to get the time series and data points. For our use case, 10,000 time series per subscription is too small: we have 10,000 wellbores with 200 properties each (10,000 × 200 = 2 million time series). Any recommendations?
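If the per-subscription limit cannot be raised, one workaround is to partition the time series across multiple subscriptions. A minimal sketch of the partitioning arithmetic, in plain Python with no CDF SDK calls (all names here are illustrative, not from the API):

```python
import math

SUB_LIMIT = 10_000   # time series per subscription, per the documentation
WELLBORES = 10_000
PROPERTIES = 200

total = WELLBORES * PROPERTIES                 # 2,000,000 time series
subscriptions_needed = math.ceil(total / SUB_LIMIT)
print(subscriptions_needed)                    # 200

# Helper: split a list of external IDs into subscription-sized batches.
def partition(external_ids, limit=SUB_LIMIT):
    return [external_ids[i:i + limit] for i in range(0, len(external_ids), limit)]

# Small illustrative run (25 IDs, batches of 10 -> 3 batches):
demo_ids = [f"ts-{i}" for i in range(25)]
print(len(partition(demo_ids, limit=10)))      # 3
```

At full scale this would mean maintaining roughly 200 subscriptions, so whether that is operationally acceptable is part of the question for the Cognite team.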
Suppose I want to perform the activities below in a workflow:

Function 1 → Dynamic Task 1 (set of tasks) → Function 2

Function 2 will only start after all the dynamic tasks are completed. Is there any way for Function 2 to run in parallel, once per output coming from each dynamic task? Even if this is only possible without dynamic tasks in the workflow, please let me know those ways. The outputs of the dynamic tasks do not depend on each other, so it should be fine to run Function 2 independently for each output.

Also, please provide an example of a workflow definition that uses the output of a dynamic task in Function 2. For example, say I want to use the output of the dynamic task "step-2-dynamic-task-get-records" in "step-3-print-records". This is the workflow definition I am using:

{
  "items": [
    {
      "workflowExternalId": "nk-cdf-workflow2",
      "version": "5",
      "workflowDefinition": {
        "description": "",
        "tasks": [
          {
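For discussion, here is a sketch of what the follow-up task might look like as a Python dict. The "${taskExternalId.output}" reference expression is my assumption based on the style of the Workflows documentation, and the function external ID is hypothetical; please verify both against the current API:

```python
import json

# Hypothetical follow-up task that consumes the dynamic task's output.
# The "${...}" reference syntax is an ASSUMPTION, not confirmed API syntax.
print_records_task = {
    "externalId": "step-3-print-records",
    "type": "function",
    "parameters": {
        "function": {
            "externalId": "print-records-fn",  # hypothetical function
            "data": {
                # pass the dynamic task's output into this function
                "records": "${step-2-dynamic-task-get-records.output}",
            },
        }
    },
    "dependsOn": [{"externalId": "step-2-dynamic-task-get-records"}],
}
print(json.dumps(print_records_task, indent=2))
```

The open question remains whether the engine can fan this out into one Function 2 run per dynamic-task output, rather than a single run over the combined output.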
Hi,

I created a workflow using the Flows option with only 2 transformations in it, and I see inconsistent behavior when it runs. Most of the time it fails with the error below; sometimes it succeeds.

Task 0db08faf-5701-4de7-ac53-1725b7948869 failed with status: FAILED and reason: 'Unknown reason'

https://slb.fusion.cognite.com/slb-pdf/flows/nk-cdf-osdu-flow?cluster=westeurope-1.cognitedata.com&env=westeurope-1

I have also observed this error:

Task faeabd7c-681f-468c-bfa3-a6d17380a497 failed with status: FAILED and reason: 'Transformation job is already running and failure mode is set to fail on running'
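The second error suggests the flow triggers a transformation whose previous job is still running. A generic wait-then-trigger pattern (not CDF-specific; the callables here are hypothetical stand-ins for whatever status/trigger calls the API provides) looks like:

```python
import time

def trigger_when_idle(is_running, trigger, poll_seconds=1, timeout_seconds=600):
    """Wait until no job is running, then trigger a new one.

    `is_running` and `trigger` are caller-supplied callables (hypothetical;
    in practice they would wrap the transformations API).
    """
    waited = 0.0
    while is_running():
        if waited >= timeout_seconds:
            raise TimeoutError("previous transformation job never finished")
        time.sleep(poll_seconds)
        waited += poll_seconds
    return trigger()

# Toy usage: pretend the previous job finishes after two polls.
state = {"polls_left": 2}
def fake_is_running():
    state["polls_left"] -= 1
    return state["polls_left"] >= 0

result = trigger_when_idle(fake_is_running, lambda: "job-started", poll_seconds=0)
print(result)  # job-started
```

This does not explain the 'Unknown reason' failure, which still needs investigation on the Cognite side.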
Hi,

When the data is in CDF RAW we can use the is_new function with lastUpdatedTime in a transformation, and this works fine, but how can we use that function for data coming from an FDM model? Let's say we populate an FDM domain model from CDF RAW, taking only incremental data via is_new in the transformation.

This is the domain model (space: dm):

type Entity @view(space: "dm", version: "1") {
  name: String
}

This is the transformation query that populates the Entity type:

select distinct
  concat('wellbore:', WellUWI) as externalId,
  WellName as name
from `db1`.`wellbores`
where is_new("db1:wellbores", lastUpdatedTime)

This works, and we see only the incremental data from CDF RAW in the domain model. Now, this is the solution model (sol_model):

type WELLBORE {
  name: String
}

This is the transformation query to populate the WELLBORE type (I could directly map the name of WELLBORE in the solution model to Entity in the domain model, but for this example I am ign
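For context on what is being asked for: is_new is essentially a named watermark. It remembers the highest lastUpdatedTime seen under a key and filters out rows at or below it on the next run. A plain-Python sketch of that idea (conceptual only, not the actual implementation, which also handles buffers and failed runs):

```python
# Conceptual mimic of is_new("key", lastUpdatedTime): remember the highest
# timestamp seen per key, and only pass through strictly newer rows.
watermarks = {}  # key -> last seen timestamp

def is_new(key, rows, ts_field="lastUpdatedTime"):
    last = watermarks.get(key, float("-inf"))
    fresh = [r for r in rows if r[ts_field] > last]
    if fresh:
        watermarks[key] = max(r[ts_field] for r in fresh)
    return fresh

run1 = is_new("db1:wellbores", [{"id": 1, "lastUpdatedTime": 100}])
run2 = is_new("db1:wellbores", [{"id": 1, "lastUpdatedTime": 100},
                                {"id": 2, "lastUpdatedTime": 200}])
print([r["id"] for r in run1], [r["id"] for r in run2])  # [1] [2]
```

The question, then, is which timestamp column (if any) plays the role of lastUpdatedTime when the transformation source is a data model rather than a RAW table.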
Hi,

I am observing the error below (in bold) in the extractor log for a time series query:

2023-01-24 03:31:03.345 UTC [INFO ] QueryExecutor_0 - Starting extraction of avocet-timeseries
2023-01-24 03:31:03.345 UTC [DEBUG ] QueryExecutor_0 - Query: 'SELECT ITEM_ID + PROPERTY_TYPE+'Avocet' as "externalId", START_DATETIME as timestamp, PROPERTY_VALUE as value FROM ITEM_PROPERTY1 WHERE START_DATETIME is not null and PROPERTY_VALUE is not null and LAST_UPDT_DATE is not null ORDER BY LAST_UPDT_DATE ASC'
2023-01-24 03:31:03.612 UTC [INFO ] QueryExecutor_0 - No more rows for avocet-timeseries. 2 rows extracted in 0.267 seconds
2023-01-24 03:31:03.873 UTC [DEBUG ] ThreadPoolExecutor-9_0 - https://westeurope-1.cognitedata.com:443 "POST /api/v1/projects/slb-pdf/timeseries/data HTTP/1.1" 200 22
2023-01-24 03:31:03.875 UTC [DEBUG ] ThreadPoolExecutor-9_0 - HTTP/1.1 POST https://westeurope-1.cognitedata.com/api/v1/projects/slb-pdf/timeseries/data 200
2023-01-24 03:31:03.877 UTC [INFO ] QueryExecutor_0
Hi,

I am trying to run transformations-cli locally with the command below (the current directory has the manifest.yaml and transformation.sql files):

transformations-cli deploy .

I am getting this error:

Deploying transformations...
Failed to parse transformation config, please check that you conform required fields and format: Invalid config: can not match type "dict" to any type of "destination" union: typing.Union[cognite.transformations_cli.commands.deploy.transformation_types.DestinationType, cognite.transformations_cli.commands.deploy.transformation_types.DestinationConfig, cognite.transformations_cli.commands.deploy.transformation_types.RawDestinationConfig, cognite.transformations_cli.commands.deploy.transformation_types.SequenceRowsDestinationConfig, cognite.transformations_cli.commands.deploy.transformation_types.AlphaDMIDestinationConfig]

Here is the destination section of the manifest file:

destination:
  viewSpaceExternalId: my-model-space-id
  viewExternalId: CDFTimeSeries
  viewVersion: 0_2
  i
Hi,

I have created an FDM model with a JSONObject field. In the transformation query I refer to that field, for example:

SELECT
  Series_Title as externalId,
  Series_Title as name,
  Overview as description,
  int(Gross) as gross,
  float(IMDB_Rating) as imdbRating,
  int(Runtime) as runTime,
  int(Released_Year) as releasedYear,
  '{"key1":"value1","key2":{"key3":12,"key4":"hello","key5":{"key6":10}}}' as data
FROM movies.movies

The FDM model is:

type Movie {
  name: String!
  description: String
  watchedIt: Boolean
  imdbRating: Float
  releasedYear: Int
  runTime: Int
  gross: Int
  actors: [Actor]
  data: JSONObject
}

The transformation is failing with this error:

Unknown property type Json.

Maybe I am missing something. Can you please help?
Hi,

I see the error below in the extractor log. I hope that even if this happens, the request will be retried and there will be no data loss. Please correct me if I am wrong, or let me know if there is any extractor configuration that can be changed to avoid this.

2023-03-07 05:42:04.233 UTC [WARNING ] QueryExecutor_3 - Too many concurrent requests in pod | code: 503
Hi,

Let's say I have the data model below in FDM, with 2 types, Device and Sensor, where Device refers to Sensor:

Device
  - name: String
  - sensors: [Sensor]

Sensor
  - name: String
  - timestamp: Timestamp
  - value: Float

I have populated data into this model through a transformation. I want to know how to get the latest timestamp and value for a particular device and sensor through an FDM GraphQL query.

Example:

Device: d1
Sensor:
  name: s1, timestamp: 2023-01-01 01:00, value: 10.0
  name: s1, timestamp: 2023-01-02 01:00, value: 20.0
  name: s1, timestamp: 2023-01-03 01:00, value: 30.0

I am expecting the latest timestamp to be 2023-01-03 01:00, with value 30.0.
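Independent of the exact GraphQL syntax, "latest" here just means sorting the sensor readings by timestamp descending and taking the first one (in query languages this is usually sort plus limit/first 1). A plain-Python sketch of the desired result, using the example data above:

```python
from datetime import datetime

# Sensor readings for device d1 / sensor s1 (from the example above).
readings = [
    {"name": "s1", "timestamp": datetime(2023, 1, 1, 1, 0), "value": 10.0},
    {"name": "s1", "timestamp": datetime(2023, 1, 2, 1, 0), "value": 20.0},
    {"name": "s1", "timestamp": datetime(2023, 1, 3, 1, 0), "value": 30.0},
]

# "Latest" = maximum by timestamp.
latest = max(readings, key=lambda r: r["timestamp"])
print(latest["timestamp"], latest["value"])  # 2023-01-03 01:00:00 30.0
```

The open question is how to express that sort-and-take-first in the FDM GraphQL query itself.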
Hi,

I have data in a RAW table in this format:

Table1:
id  p1    p2    p3
1   2.0   3.0   4.0
2   10.0  15.0  20.0

I want to transform it into the following format through a transformation:

id  property  value
1   p1        2.0
1   p2        3.0
1   p3        4.0
2   p1        10.0
2   p2        15.0
2   p3        20.0

So I have written the following transformation query:

SELECT id, property, value
FROM `db1`.`Table1`
UNPIVOT (value FOR property IN (p1, p2, p3)) UP

This query works in SQL studio, but in the Cognite transformation UI I see the error below:

Mismatched input 'FOR' expecting {')', ',', '-'}(line 3, pos 15)

Can you please help me understand what is missing?
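For reference, the operation being attempted is an unpivot ("melt"): each property column becomes its own row. A plain-Python sketch of that reshaping (this shows the intended semantics, not a fix for the SQL parser error):

```python
# Melt/unpivot: turn wide rows (one column per property) into long rows
# (one row per id/property pair) -- the reshaping the UNPIVOT query intends.
def unpivot(rows, id_col, value_cols):
    return [
        {"id": r[id_col], "property": c, "value": r[c]}
        for r in rows
        for c in value_cols
    ]

wide = [
    {"id": 1, "p1": 2.0, "p2": 3.0, "p3": 4.0},
    {"id": 2, "p1": 10.0, "p2": 15.0, "p3": 20.0},
]
long_rows = unpivot(wide, "id", ["p1", "p2", "p3"])
print(long_rows[0])   # {'id': 1, 'property': 'p1', 'value': 2.0}
print(len(long_rows)) # 6
```

In Spark SQL dialects that lack UNPIVOT, the same reshaping is often written with the stack() table-generating function, e.g. stack(3, 'p1', p1, 'p2', p2, 'p3', p3) as (property, value); it may be worth checking whether the transformation backend supports that form.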