Hi @Jørgen Lund, do you have any update on this feature? Can we try it in any CDF project?
Just an additional pointer: our projects are backed by SAuth identity, not Azure AD.
@Jørgen Lund Thanks for your response. We have scheduled our current transformations to run hourly from the domain model to the solution model. As we ingest more data into the domain model, the domain-to-solution transformations have been taking longer, since they always process all the data. The scheduled transformation from RAW to the domain model works fine, as it only considers new data.
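One option to look into for making the domain-to-solution transformation incremental is the `is_new` filter that CDF Transformations provide for scheduled SQL runs: it only passes rows whose tracked column advanced since the last successful run. A minimal sketch, assuming the source exposes a monotonically increasing `lastUpdatedTime` column; the table name and cursor label below are placeholders, not from this thread:

```sql
-- Placeholder names: "domain_entities" and the cursor label are illustrative.
-- is_new remembers the highest lastUpdatedTime seen on the last successful
-- scheduled run and filters out rows at or below it.
select *
from domain_entities
where is_new("domain_to_solution_cursor", lastUpdatedTime)
```

Whether `is_new` is usable against your specific domain-model source (as opposed to RAW) is worth confirming in the Transformations documentation.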
Thanks @Jason Dressel for the update.
It looks like the extractor has a minimum accepted date, which is why there is an issue storing the incremental state. Checked with the dates below:
1971-01-01 05:29:59.000 → error in logs: "Discarding 1 datapoints due to bad timestamp or value"
1971-01-01 05:30:00.000 → no errors
So it appears to work when the timestamp is >= 1971-01-01 05:30:00.000 (UTC+5:30 timezone). If the extractor treated the source timestamp as UTC, it should have worked from 1971-01-01 00:00:01.000 onwards. Does this mean the extractor considers the source timezone?
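For context (worth verifying against the CDF datapoints API documentation): CDF rejects datapoint timestamps before 1971-01-01 00:00:00 UTC, i.e. 31536000000 ms since the Unix epoch. Assuming the extractor does convert source timestamps to UTC, the observed boundary of 05:30:00 in UTC+5:30 is exactly that UTC minimum, which a quick check confirms:

```python
from datetime import datetime, timezone, timedelta

# CDF's minimum datapoint timestamp: 1971-01-01 00:00:00 UTC,
# i.e. 365 days = 31536000000 ms after the Unix epoch.
CDF_MIN_MS = 365 * 24 * 3600 * 1000

ist = timezone(timedelta(hours=5, minutes=30))  # UTC+5:30 source timezone

def epoch_ms(dt: datetime) -> int:
    """Milliseconds since the Unix epoch for a timezone-aware datetime."""
    return int(dt.timestamp() * 1000)

ok = datetime(1971, 1, 1, 5, 30, 0, tzinfo=ist)    # accepted boundary observed above
bad = datetime(1971, 1, 1, 5, 29, 59, tzinfo=ist)  # rejected by the extractor

print(epoch_ms(ok) == CDF_MIN_MS)   # True: the local boundary is exactly the UTC minimum
print(epoch_ms(bad) < CDF_MIN_MS)   # True: one second earlier falls below it
```

So the behavior is consistent with the extractor converting to UTC and CDF itself refusing pre-1971 timestamps, rather than the extractor honoring the source timezone.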
Hi, in our case the LAST_UPDT_DATE column (the incremental column) and START_DATETIME (the actual physical timestamp of the property) are different. The query above seems to assume both are the same, so the query below works under that assumption:

SELECT * FROM (
    SELECT ITEM_ID + PROPERTY_TYPE + 'Avocet5050' as "externalId",
           CAST(LAST_UPDT_DATE AS DATETIME) as timestamp,
           PROPERTY_VALUE as value
    FROM ITEM_PROPERTY
    WHERE START_DATETIME is not null and PROPERTY_VALUE is not null
) as table1
WHERE table1.{incremental_field} is not null
  and table1.{incremental_field} >= {start_at}
ORDER BY table1.{incremental_field} ASC

incremental-field: timestamp

What changes are required to treat START_DATETIME as the actual physical timestamp of the measurement and LAST_UPDT_DATE as the incremental field (i.e. when that measurement was updated in the database table)?
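One way to separate the two roles is to alias START_DATETIME as the datapoint timestamp while filtering and ordering on LAST_UPDT_DATE, and to point the extractor's incremental-field at a LAST_UPDT_DATE alias. A minimal SQLite sketch of that pattern; the table, columns, and values are made up, and whether the db-extractor accepts extra output columns for a TIME_SERIES destination should be verified:

```python
import sqlite3

# Hypothetical miniature of ITEM_PROPERTY: the datapoint timestamp comes from
# START_DATETIME, while LAST_UPDT_DATE only drives incremental extraction.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE ITEM_PROPERTY (
    ITEM_ID TEXT, PROPERTY_VALUE REAL,
    START_DATETIME TEXT, LAST_UPDT_DATE TEXT)""")
con.executemany("INSERT INTO ITEM_PROPERTY VALUES (?,?,?,?)", [
    ("item1", 1.0, "2023-01-01 00:00:00", "2023-01-10 00:00:00"),
    ("item1", 2.0, "2023-01-02 00:00:00", "2023-02-10 00:00:00"),
])

start_at = "2023-02-01 00:00:00"  # stands in for the extractor's {start_at}

rows = con.execute("""
    SELECT ITEM_ID        AS externalId,
           START_DATETIME AS timestamp,   -- physical time of the measurement
           PROPERTY_VALUE AS value,
           LAST_UPDT_DATE AS incr         -- incremental column, separate from timestamp
    FROM ITEM_PROPERTY
    WHERE START_DATETIME IS NOT NULL
      AND PROPERTY_VALUE IS NOT NULL
      AND LAST_UPDT_DATE >= ?
    ORDER BY LAST_UPDT_DATE ASC
""", (start_at,)).fetchall()

print(rows)  # only the row updated after start_at, timestamped by START_DATETIME
```

The key point: the WHERE clause and incremental-field use LAST_UPDT_DATE, while the emitted `timestamp` column carries START_DATETIME.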
Hi, here is the request id: 099cefef-6bc9-98c0-8122-68076440353c
I tried that as well, but I still see the same issue.
Hi, I have verified the credentials again, and I use the same credentials to call the APIs. Here is the complete error:

> transformations-cli deploy .
Deploying transformations...
...\Python311\Lib\site-packages\cognite\transformations_cli\commands\deploy\transformations_api.py:95: UserWarning: Feature DataModelStorage is in beta and still in development. Breaking changes can happen in between patch versions.
  return Instances(
Credentials for cli-test-transformation write failed to validate: Unauthorized | code: 401 | X-Request-ID: xxxxx
Hi, I tried that; now I see a different error (unauthorized), which looks like a permission issue:

  return Instances(
Credentials for cli-test-transformation write failed to validate: Unauthorized | code: 401
Hi, thanks for your reply. It looks like I am still missing something:

> transformations-cli deploy .
Deploying transformations...
Failed to parse transformation config, please check that you conform required fields and format: Invalid config: Wrong type for field "destination" - got "{'view_space_external_id': 'test-data-model-space', 'view_external_id': 'Entity', 'view_version': 2, 'instance_space_external_id': 'test-data-model-space', 'type': 'instances'}" of type dict instead of DestinationType, DestinationConfig, RawDestinationConfig, SequenceRowsDestinationConfig, DMIDestinationConfig, InstancesDestinationConfig, RawDestinationAlternativeConfig

Here is the entire contents of the manifest file:

externalId: tr-cli-test-transformation
name: cli-test-transformation
query: >-
  file: transformation.sql
destination:
  type: instances
  view_space_external_id: test-data-model-space
  view_external_id: Test
  view_version: 0_2
  instance_space_external_id: test-data-model-space
ignoreNullFields: true
sha
Hi, thanks for your reply. I tried with the latest transformations-cli (2.3.0) but I see the same error. I generated the manifest file from the CDF UI. Can you please provide an example of a manifest file with the destination set to an FDM model? Thanks.
Hi, if I create the following model, then a list of time series works. It looks like we probably can't refer to a list of CDF native TimeSeries directly; it has to be wrapped in another type. I hope my understanding is correct.

type Device {
  name: String
  sensors: [NewTimeSeries]
}

type NewTimeSeries {
  timeseriesrecord: TimeSeries
}
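For contrast, a sketch of the direct form, which per the observation above does not seem to be supported for lists:

```graphql
# Direct reference - per the observation above, a list of native TimeSeries
# apparently cannot be used in a field without a wrapper type.
type Device {
  name: String
  sensors: [TimeSeries]
}
```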
Hi Anders, for this query:

query MyQuery {
  getEntityById(
    instance: {spaceExternalId: "device_data_model_new", externalId: "4b7a198df8b14a52b7f0386d595a18f9:8881e9bc3bb24b62895526e4d0c7a9b1"}
  ) {
    edges {
      node {
        name
        sensors {
          dataPoints {
            timestamp
            value
          }
          externalId
        }
      }
    }
  }
}

I see the error below:

{ "errors": [ { "message": "Cannot read properties of undefined (reading 'split')", "stack": "TypeError: Cannot read properties of undefined (reading 'split')\n at splitLinesAuto (https://slb.fusion.cognite.com/apps/cdf-solutions-ui/v.eedfcff5a199bd41b940646351c42488cae36052/vendors-node_modules_graphiql_react_dist_codemirror_es_js.js:773:21)\n at Object.splitLines (https://slb.fusion.cognite.com/apps/cdf-solutions-ui/v.eedfcff5a199bd41b940646351c42488cae36052/vendors-node_modules_graphiql_react_dist_codemirror_es_js.js:7071:16)\n at Object.<anonymous> (https://slb.fusi
Hi Anders, please ignore the above query for the device type and see the query below for the Device type, as with the above I was getting a duplicate external id error. The SQL query for the Device type transformation is as follows:

select
  array(cdftimeseries.externalId) as sensor,
  device.NAME as name,
  concat(device.ID, ":", substring_index(cdftimeseries.externalId, ':', -1)) as externalId
from `devicedb`.`devices` device
left join _cdf.timeseries as cdftimeseries
  on device.ID = substring_index(cdftimeseries.externalId, ':', 1)
where cdftimeseries.dataSetId = 248572714504585

considering that the externalId for a Device is, for example: device1:row1
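The join and the new externalId both hinge on Spark SQL's substring_index. A quick pure-Python equivalent (illustrative only) shows what the two calls extract from a timeseries externalId like device1:s1:row1:

```python
def substring_index(s: str, delim: str, count: int) -> str:
    """Mimic Spark SQL's substring_index: everything before the count-th
    delimiter when count > 0, or everything after the count-th delimiter
    from the right when count < 0."""
    parts = s.split(delim)
    if count > 0:
        return delim.join(parts[:count])
    return delim.join(parts[count:])

ext_id = "device1:s1:row1"
print(substring_index(ext_id, ":", 1))   # device1 -> join key matched against device.ID
print(substring_index(ext_id, ":", -1))  # row1    -> suffix appended to the Device externalId
```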
Hi Anders, thanks for your reply. When we do the SQL transformation for Device, in the transformation window I see the sensor type as Array<String>: nullable.

The SQL query for the CDF timeseries destination type, used to create the timeseries records, is as follows:

select
  concat(ID, ":", SensorName, ":", key) as name,
  concat(ID, ":", SensorName, ":", key) as externalId,
  false as isString,
  248572714504585 as dataSetId
from `devicedb`.`devices`

This has created timeseries records with externalIds like:
device1:s1:row1
device1:s2:row2

The SQL query for the Device type transformation is as follows:

select
  array(cdftimeseries.externalId) as sensor,
  device.NAME as name,
  device.ID as externalId
from `devicedb`.`devices` device
left join _cdf.timeseries as cdftimeseries
  on device.ID = substring_index(cdftimeseries.externalId, ':', 1)
where cdftimeseries.dataSetId = 248572714504585

Do you think this looks ok? I wanted to confirm whether I am referring to the timeseries externalId correctly in the Device type transformation.
Hi Anders, thanks for your reply. In the GraphQL query, I am using getDeviceById, giving the externalId for a specific device, referring to a particular sensor on that device through a filter, and selecting timestamp and value. That gives me all timestamps and values; the expectation is to get only the latest timestamp and value. Sample query:

query MyQuery {
  getDeviceById(
    instance: {spaceExternalId: "data_model_new", externalId: "aa21ba4e2a804aeaa3b954fa057b8990"}
  ) {
    edges {
      node {
        name
        externalId
        sensors(filter: {name: {eq: "s1"}}) {
          edges {
            node {
              externalId
              name
              timestamp
              value
            }
          }
        }
      }
    }
  }
}

On this comment, "Note that you should probably not [use] FDM as a time series database, but rather use the time series database we do provide": can you please elaborate? Can we use the CDF native TimeSeries object in an FDM data model type? As of now, I am extractin
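As an aside, if the sensors were native CDF time series rather than rows stored in FDM, the datapoints API has a dedicated latest-datapoint endpoint, so no client-side work is needed. With datapoints already fetched, picking the latest is just a max over timestamps; a tiny sketch with made-up sample data:

```python
# Sample (timestamp_ms, value) pairs standing in for one sensor's datapoints.
datapoints = [
    (1700000000000, 10.5),
    (1700000600000, 11.2),
    (1700000300000, 10.9),
]

# The latest datapoint is simply the pair with the greatest timestamp.
latest = max(datapoints, key=lambda p: p[0])
print(latest)  # (1700000600000, 11.2)
```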
Thank you Dilesha, this works.
The actual timestamp is present in the column START_DATETIME, but as you mentioned above I modified it to LAST_UPDT_DATE in the select query. Here is the query section in the config:

- name: timeseries
  database: sqldb # same as above database
  query: SELECT ID + TYPE as "externalId", LAST_UPDT_DATE as timestamp, PROPERTY_VALUE as value FROM ITEM_TABLE WHERE START_DATETIME is not null and PROPERTY_VALUE is not null and LAST_UPDT_DATE is not null and LAST_UPDT_DATE >= CAST('{start_at}' AS DATETIME2) ORDER BY LAST_UPDT_DATE ASC
  incremental-field: LAST_UPDT_DATE
  schedule: "*/1 * * * *"
  initial-start: '2020-01-01'
  destination-type: TIME_SERIES
Thanks Pierre for your reply. Instead of START_DATETIME I used LAST_UPDT_DATE for the timestamp in the select, but I still see the same error.
Thanks @Vu Hai Nguyen for the quick response. I have a follow-up question on using the TimeSeries type in an FDM data model. Is it the native CDF timeseries resource type? Can we link an FDM field to an existing timeseries resource to refer to its data? If so, how can I use that in a transformation to link the timeseries data?