Question

Save data extracted with the OPC UA extractor to the RAW staging area in CDF, or link received timeseries to existing CDF assets

  • 16 July 2024
  • 4 replies
  • 58 views

Hello, is it possible to send the timeseries data to the RAW staging area in CDF instead of writing directly to CDF timeseries, or to link the timeseries directly to the existing assets in CDF? It is not clear to me how to do this with the current config.yml parameters. If I'm not mistaken, there is a node-map parameter that can be used to link timeseries to assets. Can anyone provide an example of this?
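For reference, in the OPC UA extractor's config.yml the node-map option sits under the extraction section and, as I understand it, maps an external ID of your choosing onto a specific OPC UA node (i.e. it overrides the generated external ID rather than linking to assets directly). The IDs below are placeholders, and the exact field names should be checked against the extractor's full example config:

```yaml
extraction:
  # Override the external ID generated for specific OPC UA nodes.
  # The key is the external ID you want; the value identifies the node.
  node-map:
    my-pump-temperature:                # placeholder external ID
      node-id: i=1234                   # placeholder node identifier
      namespace-uri: urn:example:opcua  # placeholder namespace URI
```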

Thank you,

Raluca

4 replies


I have looked through the sources and the full config file in the repository, and am confident in stating the following:

You can use RAW as the target for the data describing the time series (the metadata), but you cannot do the same for the data points; those have to be sent to the Time Series data points service.

In general, we do not recommend extracting time series data to RAW from any source, because of the data point volumes we typically see (i.e. we essentially consider this an anti-pattern in the Cognite product division).

The TimeSeries service is designed to receive and return very large numbers of data points. We have run workloads ingesting more than 40 million data points per second in our shared production environments without any negative SLA/SLO impact. The RAW (staging) service, by contrast, is designed to handle large rows of columnar data.

What is the use case that makes you want to move the data points to a service like RAW rather than directly into the TimeSeries service?
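As a hedged sketch of the RAW-metadata option described above: in the extractor's config.yml this is set under the cognite section. The database and table names below are placeholders, and the field names should be verified against the full example config in the repository:

```yaml
cognite:
  # Write asset and timeseries *metadata* to RAW tables instead of
  # clean CDF resources. Data points still go to the Time Series service.
  raw-metadata:
    database: opcua               # placeholder RAW database name
    assets-table: assets          # placeholder table for asset metadata
    timeseries-table: timeseries  # placeholder table for timeseries metadata
```

With this set, as I understand it, the extractor still creates minimal timeseries in CDF to receive the data points, while the descriptive metadata lands in the given RAW tables.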


Hi @Thomas Sjølshagen ,

Thank you for your detailed answer. I don't necessarily need to save the data in RAW, but ideally I need to link the timeseries directly to the existing assets I have in CDF. How can I do this? Is it possible before pushing the data to CDF timeseries, or do I need to use a transformation afterwards to link the timeseries in CDF to their respective assets?

Thank you,

Raluca


Hello, 

I've used a transformation afterwards to link the timeseries in CDF to their respective assets. I thought I would need to run this transformation over and over again, but it seems that incoming timeseries with datapoints from the OPC UA extractor find the existing timeseries in CDF and update them with new datapoints.

Thank you,

Raluca



At Radix, we've also characterized sending time series data directly to the RAW layer as a Cognite anti-pattern. Your message reinforces our thoughts on this matter. Thanks for sharing the information, @Thomas Sjølshagen!
