Master the fundamental skills required for industrial data operations, integration, and contextualization in Cognite Data Fusion.
Essential knowledge required for all subsequent courses.
Master the core concepts of data ingestion, contextualization, and the foundational architecture of Cognite Data Fusion. This path sets the stage for advanced engineering tasks.
Master the ingestion of data from various industrial sources using standard Extractors and API integrations.
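A minimal sketch of the API-based ingestion path, assuming an already authenticated CogniteClient (see the authentication sketch further below); the database, table, and row contents are hypothetical:

```python
from cognite.client import CogniteClient


def stage_equipment_rows(client: CogniteClient) -> None:
    """Write a couple of example rows into a hypothetical RAW staging table."""
    rows = {
        "pump-001": {"name": "Pump 001", "site": "Oslo", "manufacturer": "ACME"},
        "pump-002": {"name": "Pump 002", "site": "Bergen", "manufacturer": "ACME"},
    }
    client.raw.rows.insert(
        db_name="src_erp",        # staging database (hypothetical)
        table_name="equipment",   # staging table (hypothetical)
        row=rows,                 # key -> column dict
        ensure_parent=True,       # create the database/table if they do not exist
    )
```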
Utilize CDF Transformations to convert data from the RAW layer into the Core Data Model.
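A hedged sketch of creating such a Transformation with the Python SDK. For brevity it targets the classic asset resource via TransformationDestination.assets() rather than a data model view (data-model destinations are configured along the same lines), and every identifier and the SQL below are hypothetical:

```python
from cognite.client import CogniteClient
from cognite.client.data_classes import Transformation, TransformationDestination


def create_equipment_transformation(client: CogniteClient) -> Transformation:
    """Create a simple Transformation that reads from a RAW staging table."""
    query = """
        select
          key  as externalId,
          name as name,
          site as description
        from `src_erp`.`equipment`
    """
    return client.transformations.create(
        Transformation(
            external_id="tr_equipment_from_raw",
            name="Equipment from RAW",
            query=query,
            destination=TransformationDestination.assets(),
            conflict_mode="upsert",      # insert new items, update existing ones
            ignore_null_fields=True,     # do not overwrite fields with nulls
        )
    )
```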
Structure and manage data using containers, views, and semantic layer best practices.
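As one illustration, a container holding two typed properties might be defined as follows; the space, container, and property names are hypothetical, and views exposing these properties are declared in the same style with ViewApply:

```python
from cognite.client import CogniteClient
from cognite.client.data_classes.data_modeling import (
    ContainerApply,
    ContainerProperty,
    Float64,
    Text,
)


def create_pump_container(client: CogniteClient):
    """Define a small container with typed, nullable properties."""
    container = ContainerApply(
        space="sp_example_model",       # hypothetical schema space
        external_id="Pump",
        properties={
            "manufacturer": ContainerProperty(type=Text(), nullable=True),
            "ratedPowerKw": ContainerProperty(type=Float64(), nullable=True),
        },
    )
    return client.data_modeling.containers.apply(container)
```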
Understand the concept of the asset hierarchy and perform basic linking of CogniteTimeSeries and files to CogniteAssets.
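A small sketch of the linking idea, using the classic asset-centric resources for brevity; in the Core Data Model the same link is expressed as a direct relation from the CogniteTimeSeries instance to the CogniteAsset instance. The external IDs, name, and unit are hypothetical:

```python
from cognite.client import CogniteClient
from cognite.client.data_classes import TimeSeries


def link_timeseries_to_asset(client: CogniteClient, asset_external_id: str) -> TimeSeries:
    """Create a numeric time series and link it to an existing asset."""
    asset = client.assets.retrieve(external_id=asset_external_id)
    if asset is None:
        raise ValueError(f"No asset with external id {asset_external_id!r}")
    return client.time_series.create(
        TimeSeries(
            external_id=f"{asset_external_id}:discharge_pressure",
            name="Discharge pressure",
            asset_id=asset.id,   # the link to the parent asset
            unit="barg",
        )
    )
```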
Demonstrate basic proficiency with the Cognite Python SDK and Toolkit for simple read/write operations and authentication.
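A minimal sketch of client-credentials authentication followed by one read and one write call; the tenant, project, cluster, and credential values are placeholders:

```python
from cognite.client import ClientConfig, CogniteClient
from cognite.client.credentials import OAuthClientCredentials
from cognite.client.data_classes import Asset


def make_client() -> CogniteClient:
    """Build an authenticated CogniteClient using OAuth client credentials."""
    credentials = OAuthClientCredentials(
        token_url="https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token",
        client_id="<client-id>",
        client_secret="<client-secret>",
        scopes=["https://api.cognitedata.com/.default"],
    )
    config = ClientConfig(
        client_name="getting-started-examples",
        project="<cdf-project>",
        base_url="https://api.cognitedata.com",
        credentials=credentials,
    )
    return CogniteClient(config)


def simple_read_write(client: CogniteClient) -> None:
    """One basic read and one basic write against CDF."""
    # Read: list a handful of time series.
    for ts in client.time_series.list(limit=5):
        print(ts.external_id, ts.name)
    # Write: create a single asset (name and external id are hypothetical).
    client.assets.create(Asset(external_id="example-asset", name="Example asset"))
```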
Properly define and apply Spaces to manage data ownership and access control.
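For example, a team-owned space might be created like this; the space identifier, name, and description are hypothetical, and access control is then scoped to the space through group capabilities:

```python
from cognite.client import CogniteClient
from cognite.client.data_classes.data_modeling import SpaceApply


def create_team_space(client: CogniteClient):
    """Create a space that groups instances owned by a single team."""
    return client.data_modeling.spaces.apply(
        SpaceApply(
            space="sp_maintenance_data",
            name="Maintenance data",
            description="Instances owned by the maintenance engineering team",
        )
    )
```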
Deep dive into specific capabilities once you have the core skills.
Master complex pipelines, data reconciliation, and scalable stream processing techniques.
Build and maintain automated pipelines for linking assets, time series, and documents, as sketched below.
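A rough sketch of one building block for such a pipeline, the SDK's entity-matching API, assuming small in-memory source and target lists; the field values are made up, the exact parameters are an assumption about the API, and a real pipeline would persist the confirmed matches:

```python
from cognite.client import CogniteClient


def match_timeseries_to_assets(client: CogniteClient) -> None:
    """Fit a simple entity-matching model and print predicted matches."""
    # Hypothetical time series (sources) and assets (targets).
    sources = [{"id": 1, "name": "21PT1019"}, {"id": 2, "name": "21PT1020"}]
    targets = [{"id": 101, "name": "21-PT-1019"}, {"id": 102, "name": "21-PT-1020"}]

    model = client.entity_matching.fit(
        sources=sources,
        targets=targets,
        match_fields=[("name", "name")],  # compare the name fields
        feature_type="bigram",
    )
    job = model.predict(sources=sources, targets=targets, num_matches=1)
    for item in job.result["items"]:
        print(item["source"], "->", item["matches"])
```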
Deep dive into advanced schema design, versioning, and complex query patterns.
Implement fine-grained access control, security best practices, and data governance policies.