Hi! We have made the following updates available for Workflows.

Support for Workflows in the Cognite Python SDK

The Cognite Python SDK now has native support for Workflows. Documentation can be found here. We have also updated the example notebook, which you can find attached as a file below. (Note: Download the file and remove the .txt postfix to open it as a Jupyter Notebook.)

New Access Management Capabilities

You can now configure access to Workflows directly in Access Management in CDF. Navigate to "Manage" > "Manage access" and, for the relevant Groups, add capabilities to read/write Workflows as needed (see screenshot below). This replaces the experimental ACLs that were initially required to interact with the service. The capabilities can also be added to a group using the SDK directly, e.g.:

```python
from cognite.client.data_classes import Group

capabilities = [
    {"workflowOrchestrationAcl": {"actions": ["READ", "WRITE"], "scope": {"all": {}}}}
]
group = Group(name="Workflow Orchestration", capabilities=capabilities)
# Assumes an authenticated CogniteClient instance named `client`;
# the group is created via the IAM API.
client.iam.groups.create(group)
```
👋 Hello everyone!

We're thrilled to announce that support for incremental loading from Data Models is now available in Transformations. This enables you to craft transformations that efficiently process only new or modified data since the last successful run, rather than the entire dataset. The result is enhanced performance for your transformation jobs, particularly when dealing with large datasets where incremental updates are relatively small.

We highly recommend leveraging this capability for all relevant transformation tasks. More details on how to use this feature can be found in our documentation.

Let us know if you have questions or feedback!
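Conceptually, incremental loading works by keeping a checkpoint from the last successful run and only processing rows modified after it. The sketch below illustrates the idea in plain Python; it is not the Transformations engine itself, and `Row` and `incremental_batch` are illustrative names:

```python
from dataclasses import dataclass


@dataclass
class Row:
    external_id: str
    last_updated_time: int  # epoch milliseconds, as CDF reports modification times


def incremental_batch(rows, checkpoint):
    """Return only rows modified after the last successful run,
    plus the new checkpoint to persist for the next run."""
    fresh = [r for r in rows if r.last_updated_time > checkpoint]
    new_checkpoint = max((r.last_updated_time for r in fresh), default=checkpoint)
    return fresh, new_checkpoint


rows = [Row("a", 100), Row("b", 250), Row("c", 300)]
# Only "b" and "c" changed since the checkpoint at t=200
batch, checkpoint = incremental_batch(rows, checkpoint=200)
```

The payoff is exactly what the announcement describes: when only a small fraction of a large dataset changes between runs, each run touches only that fraction.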
Currently, CDF Workflows has no built-in scheduling mechanism (it is on our roadmap). A simple way to implement schedule-based execution of a workflow is to leverage Cognite Functions. Below is a two-step example to explain how it works.

1. Create the Cognite Function that will act as the workflow trigger

Note that you need to specify the client_credentials parameter inside the call to client.workflows.executions.trigger for the authentication to work at runtime.

```python
# Enter credentials and instantiate client
from cognite.client import CogniteClient

cdf_cluster = ""    # "api", "greenfield", etc.
cdf_project = ""    # CDF project name
tenant_id = ""      # IdP tenant ID
client_id = ""      # IdP client ID
client_secret = ""  # IdP client secret

client = CogniteClient.default_oauth_client_credentials(
    cdf_project, cdf_cluster, tenant_id, client_id, client_secret
)


# Define Function handle
def handle(client, data, secrets):
    from cognite.client.data_classes import ClientCredentials

    # Trigger the workflow. The workflow external ID and version are assumed
    # to arrive via `data`, and the IdP client secret via `secrets`.
    execution = client.workflows.executions.trigger(
        data["workflow_external_id"],
        data["workflow_version"],
        client_credentials=ClientCredentials(data["client_id"], secrets["client-secret"]),
    )
    return {"execution_id": execution.id}
```
Update! See this post for an updated example notebook based on the Cognite Python SDK.

We want to make it easy for you to interact with CDF Workflows through Python. Therefore, we've created a Jupyter Notebook that demonstrates the service using Python and a simple example. Native support in the Cognite Python SDK will be released shortly. Try it out, and feel free to leave feedback or questions in the comments!

Note: Download the file below and remove the .txt postfix to open the file as a Jupyter Notebook.
Hi all!

Data Workflows (renamed from CDF Workflows) is now in Public Beta, as part of the latest release of Cognite Data Fusion. See the Product Release Spotlight post for more information. We have also added a new feature for subworkflows, which you can read more about here.

You can find updated documentation at Cognite Docs and the Cognite API spec, replacing all earlier versions. Note that the user interface in Fusion remains in Alpha pending the development of a new version. If you lack access to the user interface, get in touch with a Cognite representative or leave a comment below.

Let us know if you have any questions or comments!
Hey there! We are thrilled to announce the Early Adopter Beta release of CDF Workflows.

CDF Workflows is a managed process orchestration service within CDF. Using CDF Workflows, you can effectively coordinate the order and timely execution of interdependent processes such as Transformations, Functions, requests to CDF, and dynamic tasks.

Key Highlights

Declarative Workflow Definition
We have designed Workflow Orchestration with a declarative approach: users specify the tasks to be orchestrated, their input parameters, and the desired execution order. We believe this intuitive approach reduces complexity and lowers the bar for adoption.

Orchestrating four different processes
The Workflow Orchestration service supports the orchestration of four crucial types of processes: Transformations, Functions, CDF requests, and dynamic tasks. This versatility ensures that your workflows can leverage a wide range of functionality, optimizing productivity.

Versioning Capabilities
Managing changes to workflow definitions over time is supported through built-in versioning.
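The declarative model above boils down to tasks plus their dependencies; the orchestrator derives a valid execution order from what you declare rather than from imperative code. The sketch below illustrates that idea with Python's standard-library topological sorter. The task names and dict shape are illustrative, not the exact API schema:

```python
from graphlib import TopologicalSorter

# A declarative workflow definition: each task names what it runs and which
# tasks it depends on. Names and parameters here are illustrative.
workflow = {
    "externalId": "daily-pipeline",
    "tasks": [
        {"externalId": "extract", "type": "function", "dependsOn": []},
        {"externalId": "transform", "type": "transformation", "dependsOn": ["extract"]},
        {"externalId": "contextualize", "type": "function", "dependsOn": ["extract"]},
        {"externalId": "notify", "type": "cdf", "dependsOn": ["transform", "contextualize"]},
    ],
}

# The orchestrator's job, reduced to its essence: turn declared dependencies
# into a valid execution order.
graph = {t["externalId"]: set(t["dependsOn"]) for t in workflow["tasks"]}
order = list(TopologicalSorter(graph).static_order())
# "extract" runs first, "notify" last; the middle two have no mutual
# dependency, so a real orchestrator could run them in parallel.
```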
For many operators, power trading remains more of an art than a science, still relying heavily on heuristics and trader intuition. But as markets evolve and deviate from historical patterns, the tools and methods for making smarter trades must adapt in order to maintain or grow profitability and become a sustainable source of competitive advantage.

In this webinar, we talk about our experience of integrating data science into forecasting and trading analysis, and how we used Cognite Data Fusion to enable analysts and traders to work iteratively in a data science process across a range of analytical topics within the trading domain. Check it out and leave your comments below! What do you see as the biggest opportunities and challenges within power trading analytics?
Hi! We have made the following updates available for Workflows.

Optional Tasks

Defining non-critical tasks in a workflow as optional, meaning that in the event of a task failure the workflow itself should continue, has been a much-requested feature. You can now set the policy for how to handle failures and timeouts for each task in a workflow. By default, the workflow will fail if a task fails (after retries) or if the task times out. Alternatively, you can set the parameter to skip the task in the event of a failure or timeout (after retries). This means that the workflow execution will continue even when the task does not complete successfully. The feature is documented in the API specification here (navigate items > workflowDefinition > tasks > onFailure).

Concurrency Policies for Transformations

If a job is already running for a given Transformation in CDF, a new job cannot be started concurrently (it fails with the error message "A job already runs for this transform"). To better handle this situation, you can now configure how a Transformation task behaves when a job is already running for that Transformation.
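The effect of the per-task failure policy can be sketched as a tiny executor. This is an illustration of the behavior described above, not the service's implementation, and the exact policy value names are an assumption here:

```python
# Illustrative executor showing the effect of a per-task onFailure policy.
# Treat the policy names ("abortWorkflow", "skipTask") as assumed values;
# consult the API specification for the authoritative enum.
def run_workflow(tasks):
    """tasks: list of (name, succeeds, on_failure) tuples."""
    completed, skipped = [], []
    for name, succeeds, on_failure in tasks:
        if succeeds:
            completed.append(name)
        elif on_failure == "skipTask":
            skipped.append(name)  # non-critical task: workflow continues
        else:  # default behavior: the whole workflow fails
            return {"status": "failed", "completed": completed, "skipped": skipped}
    return {"status": "completed", "completed": completed, "skipped": skipped}


result = run_workflow([
    ("load", True, "abortWorkflow"),
    ("optional-enrichment", False, "skipTask"),  # non-critical task fails
    ("publish", True, "abortWorkflow"),
])
# the workflow completes even though "optional-enrichment" failed
```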