Is the following use case a good fit for Data Workflows?

A Cognite Function that reads data from Time Series and writes "event-like" data into Data Modeling:
- continuous processing for metric type A: ideally an execution every minute, but more importantly no overlapping executions (i.e., if an execution takes longer than a minute), with state passed from one function execution to the next
- hourly processing for metric type B

Idea: two Data Workflows:
1. Hourly execution of a workflow that triggers a Cognite Function to calculate metric B.
2. Daily execution of a workflow that triggers a Cognite Function which recursively outputs a dynamic task that calculates metric A. The Function outputs the information for the next run (i.e., timestamp for the next execution, state, and other info). At the end of the day the dynamic task would not output a timestamp for the next execution, and the workflow would complete. The execution trigger could also be daily.
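The recursive dynamic-task pattern above can be sketched in plain Python (this is not the Cognite SDK; the handler signature and the `next_execution`/`state` field names are hypothetical illustrations). Each run processes metric A, returns updated state, and includes a timestamp for the next run only while the day is not over; omitting the timestamp is what ends the recursion. Because each task is only scheduled after the previous one returns, executions cannot overlap even when a run takes longer than a minute.

```python
from datetime import datetime, timedelta


def calculate_metric_a(state: dict) -> dict:
    """Hypothetical per-minute processing step for metric A.
    Reading Time Series and writing event-like instances to
    Data Modeling are elided; only the state update is shown."""
    return {"processed_count": state.get("processed_count", 0) + 1}


def dynamic_task_handler(data: dict) -> dict:
    """One execution of the recursive dynamic task.

    Returns the payload for the next task. If 'next_execution' is
    absent, the recursion is treated as complete (end of day)."""
    now = datetime.fromisoformat(data["now"])
    end_of_day = now.replace(hour=23, minute=59, second=0, microsecond=0)
    state = calculate_metric_a(data.get("state", {}))
    out: dict = {"state": state}
    next_run = now + timedelta(minutes=1)
    if next_run <= end_of_day:
        # Schedule the next run. Since the next task only starts after
        # this one returns, there is no risk of overlapping executions.
        out["next_execution"] = next_run.isoformat()
    return out
```

A run at 10:00 would return `next_execution = "…T10:01:00"` plus the carried state; a run at 23:59 would return only the state, completing the workflow.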
In order to request aggregates from large views in Data Modeling from PowerBI, it would be beneficial to have the /aggregate endpoint exposed in OData (and, as a result, in PowerBI). Otherwise all the data needs to be downloaded to PowerBI first and the aggregates performed locally, which lowers performance and is not feasible for views with millions of instances.
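The performance difference being requested can be sketched in plain Python (this is not the Cognite SDK or the OData connector; both function names are hypothetical): the local fallback transfers every instance before aggregating, while a server-side aggregate call would transfer only the result.

```python
def aggregate_locally(download_instances, prop):
    """PowerBI-style fallback: pull every instance, then aggregate.
    Network and memory cost scale with the number of instances."""
    rows = download_instances()  # potentially millions of rows over the wire
    values = [row[prop] for row in rows]
    return {"count": len(values), "avg": sum(values) / len(values)}


def aggregate_server_side(aggregate_endpoint, prop):
    """With /aggregate exposed through OData, only the small result
    object crosses the wire, regardless of view size."""
    return aggregate_endpoint(aggregates=["count", "avg"], prop=prop)
```

For a view with millions of instances, `aggregate_locally` moves the whole dataset to the client, whereas `aggregate_server_side` moves a single small response.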