
Feasibility Assessment: Create "Continuous" processing with Cognite Functions and Data Workflows

  • April 30, 2024
  • 4 replies
  • 79 views

Is the following use case a good fit for Data Workflows?

  • A Cognite Function that reads data from Time Series and writes "event-like" data into Data Modeling
    • continuous processing for metric type A
      • ideally an execution every minute, but more importantly no overlapping executions (i.e., if an execution takes longer than a minute, the next one must not start before it finishes)
      • passing state from one Function execution to the next (see the sketch after this list)
    • hourly processing for metric type B
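
A minimal sketch of what the state-passing could look like in a Cognite Function handler; the "last_processed_ts" field is just an illustrative name, not an established API:

```python
# Minimal sketch of a state-passing Cognite Function handler.
def handle(client, data):
    # State from the previous execution arrives via the input data;
    # fall back to an empty state on the very first run.
    state = (data or {}).get("state", {})
    last_processed_ts = state.get("last_processed_ts")

    # ...read Time Series since last_processed_ts and write event-like
    # instances into Data Modeling here...

    # The return value must be JSON-serializable; whatever is returned
    # can be handed to the next execution as its input state.
    return {"state": {"last_processed_ts": last_processed_ts}}
```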

Idea:
Two Data Workflows:

  1. A workflow executed hourly that triggers a Cognite Function to calculate metric B.
  2. A workflow executed daily that triggers a Cognite Function which recursively outputs a dynamic task to calculate metric A. The Function outputs the information needed for the next run (i.e., timestamp for the next execution, state, and other info). At the end of the day the dynamic task would not output a timestamp for the next execution, and the workflow would complete. The execution trigger could also be daily. A sketch of such a workflow definition follows below.
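
A minimal sketch of how the daily workflow with a dynamic task might be defined with the Cognite Python SDK; the external IDs and the "nextTasks" response field are assumptions for illustration:

```python
from cognite.client import CogniteClient
from cognite.client.data_classes.workflows import (
    DynamicTaskParameters,
    FunctionTaskParameters,
    WorkflowDefinitionUpsert,
    WorkflowTask,
    WorkflowVersionUpsert,
)

client = CogniteClient()

# First task: a Cognite Function that calculates metric A once and returns,
# among other things, the task definition(s) for the next iteration
# (an empty list once the day is done).
calculate = WorkflowTask(
    external_id="calculate_metric_a",
    parameters=FunctionTaskParameters(external_id="metric_a_function"),
)

# Dynamic task: its subtasks are resolved at runtime from the previous
# task's output via a workflow expression reference. "nextTasks" is an
# assumed field name in the Function's response.
next_iteration = WorkflowTask(
    external_id="next_iteration",
    parameters=DynamicTaskParameters(
        tasks="${calculate_metric_a.output.response.nextTasks}"
    ),
    depends_on=["calculate_metric_a"],
)

client.workflows.versions.upsert(
    WorkflowVersionUpsert(
        workflow_external_id="metric_a_daily",
        version="v1",
        workflow_definition=WorkflowDefinitionUpsert(
            tasks=[calculate, next_iteration],
            description="Daily workflow driving iterative metric A calculation",
        ),
    )
)
```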

Best answer by Dilini Fernando

4 replies

Addition:

  • Another option could be to make a CDF task the last task in the workflow definition: after the Function has completed its execution, the CDF task triggers a new run/execution of the same workflow (see the sketch after this list).
  • It’s important to keep the restrictions and limits around workflows in mind.
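
A minimal sketch of such a re-triggering CDF task; the resource path for starting a workflow run and the referenced output field are assumptions, so check the Workflows API reference:

```python
from cognite.client.data_classes.workflows import CDFTaskParameters, WorkflowTask

# Last task of the workflow: call the CDF API to start a new run of the
# same workflow, so each run schedules its successor. The resource path is
# relative to /api/v1/projects/{project}; the exact path and body shape
# here are assumptions based on the workflow-run endpoint.
retrigger = WorkflowTask(
    external_id="retrigger_self",
    parameters=CDFTaskParameters(
        resource_path="/workflows/metric_a_daily/versions/v1/run",
        method="POST",
        body={"input": {"state": "${calculate_metric_a.output.response.state}"}},
    ),
    depends_on=["calculate_metric_a"],
)
```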

Jørgen Lund
Seasoned Practitioner
  • Product Manager
  • May 8, 2024

Probably a bunch of different ways this could be solved. I think it also depends on how you define the Function that does the metric calculation.

  • Option A: one Function execution calculates the metric multiple times. The Function is then responsible for iteratively calculating the metric within the duration limits of a Function call AND returning some state information before it shuts down (sketched below). Benefit: fewer Function executions needed, and probably more performant, since you have fewer Function start-ups.
  • Option B: one Function execution calculates the metric once, returns state, and shuts down. Benefit: simpler Function.

For both, you could potentially have a workflow with 99 sequential calls to the Function, ending with a task that triggers the next execution of the same workflow.
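
A minimal sketch of Option A, assuming a hypothetical calculate_metric_a() helper and an illustrative time budget:

```python
import time

# Self-imposed budget, kept below the Functions call-duration limit
# (the exact limit depends on your setup; this number is illustrative).
BUDGET_SECONDS = 8 * 60


def calculate_metric_a(client, state):
    # Hypothetical helper: read Time Series since the last processed
    # timestamp in `state`, write event-like instances into Data Modeling,
    # and return the updated state.
    return state


def handle(client, data):
    state = (data or {}).get("state", {})
    started = time.time()

    # Option A: iterate within the budget, then shut down cleanly.
    while time.time() - started < BUDGET_SECONDS:
        state = calculate_metric_a(client, state)
        time.sleep(60)  # aim for roughly one calculation per minute

    # Return state so the next execution can pick up where this one left off.
    return {"state": state}
```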

This is somewhat of an edge case for data workflows, and implementation would require experimentation. We’re interested in getting feedback from you as you proceed, so feel free to leave it here!


Dilini Fernando
Seasoned Practitioner

Hi @Christian Flasshoff,

Did the above help you? 


Dilini Fernando
Seasoned Practitioner
  • Answer
  • June 12, 2024

Hi @Christian Flasshoff,

As of now, I’m closing this topic. Please feel free to start a new post if you have any questions.