Is the following use case a good fit for Data Workflows?
- A Cognite Function that reads data from Time Series and writes "event-like" data into Data Modeling
- continuous processing for metric type A
- ideally one execution per minute, but more importantly no overlapping executions (i.e., in case an execution takes longer than a minute)
- passing state from one function execution to the next
- hourly processing for metric type B
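The state-passing requirement can be sketched with the standard Cognite Functions entry point `handle(client, data)`: the previous execution's output is fed in as input, and the returned dict becomes the state for the next run. The key name `last_processed_ms` and the one-minute window are illustrative assumptions, not part of any Cognite API.

```python
# Sketch of state passing between function executions, assuming the usual
# Cognite Functions convention of a `handle(client, data)` entry point that
# returns a JSON-serializable dict as its output.

def handle(client, data):
    # `data` carries state from the previous execution; fall back to a
    # default start on the very first run (key name is hypothetical).
    last_processed_ms = data.get("last_processed_ms", 0)
    window_end_ms = last_processed_ms + 60_000  # advance a one-minute window

    # ... read Time Series data in (last_processed_ms, window_end_ms],
    # derive "event-like" data, and write it into Data Modeling here ...

    # The returned dict is the execution's output, which the workflow can
    # pass as input state to the next execution.
    return {"last_processed_ms": window_end_ms}
```

Each execution only ever advances from the state it was handed, so a missed or slow run simply resumes from the last committed window instead of skipping data.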
Idea:
Two Data Workflows:
- A workflow with an hourly trigger that runs a Cognite Function to calculate metric B.
- A workflow with a daily trigger that runs a Cognite Function which recursively outputs a dynamic task to calculate metric A. Each run of the Function outputs the information for the next execution (i.e., the timestamp for the next run, the state, and other info). At the end of the day, the dynamic task no longer outputs a next-execution timestamp, and the workflow completes.
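The recursive dynamic-task idea for metric A can be simulated in plain Python: each run returns either the timestamp of the next run or no timestamp once the day is over, at which point the driving loop (the workflow) stops. All names, the one-minute cadence, and the one-day horizon are illustrative assumptions, not Cognite SDK calls.

```python
# Minimal simulation of the "recursive dynamic task" pattern: the function
# schedules its own next run by emitting a timestamp, and the workflow
# completes once no timestamp is emitted.

DAY_MS = 24 * 60 * 60 * 1000
STEP_MS = 60 * 1000  # target cadence: one execution per minute

def run_metric_a(state):
    """One execution: process a window, then schedule (or not) the next run."""
    now_ms = state["next_run_ms"]
    # ... compute metric A for the minute ending at now_ms ...
    next_ms = now_ms + STEP_MS
    if next_ms >= state["day_start_ms"] + DAY_MS:
        return {"next_run_ms": None}  # no next task -> workflow completes
    return {"next_run_ms": next_ms, "day_start_ms": state["day_start_ms"]}

def run_daily_workflow(day_start_ms):
    """Drive executions until the function stops emitting a next timestamp."""
    state = {"next_run_ms": day_start_ms, "day_start_ms": day_start_ms}
    executions = 0
    while state["next_run_ms"] is not None:
        state = run_metric_a(state)
        executions += 1
    return executions
```

Because each dynamic task only starts after the previous one completes, this pattern also satisfies the no-overlap requirement: a slow execution simply delays the next one rather than running concurrently with it.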