Overview:
This quick start guide provides a comprehensive introduction to implementing and working with the Quick Start Enterprise Data Model, which extends the Cognite core data model (CDM), and to contextualization workflows in industrial environments. It combines data modeling fundamentals with practical contextualization techniques to help you build robust industrial data management solutions.
This guide walks you through the concept of the Quick Start Enterprise data model and the steps to download, configure, and deploy QuickStart using the Cognite Toolkit.
What is the Quick Start Enterprise Data Model (EDM)?:
- A ready-to-use data model for Cognite Data Fusion that standardizes core industrial data like Assets, Time Series, Work Orders, etc.
- Speeds up projects by replacing manual setup with a modular, deploy-ready structure that cuts onboarding time from weeks to hours.
- Provides a scalable, reusable architecture so developers can focus on building applications instead of infrastructure.
Why is the Quick Start Enterprise Data Model Needed?
- Accelerates Implementation: Eliminates the "cold start" phase by providing a ready-made schema, moving projects from architecture to data ingestion in hours.
- Ensures Scalability: Uses a modular, "mix-in" architecture that allows the foundation to grow with the business without requiring expensive structural overhauls.
- Standardizes Data Language: Creates a consistent definition for Assets, Time Series, Files, etc. across the enterprise, ensuring that apps built for one site work for all others.
- Optimizes Developer Performance: Leverages the Data Modeling Service (DMS) for high-speed GraphQL queries, providing developers with a structured and predictable API experience.
- Reduces Engineering Overhead: Automates the deployment and maintenance of complex data relationships via the `cdf-toolkit`, significantly lowering the long-term total cost of ownership.
Module Architecture:
```
qs_enterprise_dm/
├── data_modeling/
│   ├── containers/                         # 46 container definitions
│   │   ├── Asset.Container.yaml
│   │   ├── Equipment.Container.yaml
│   │   ├── WorkOrder.Container.yaml
│   │   ├── WorkOrderOperation.Container.yaml
│   │   ├── WorkOrderOperationConfirmation.Container.yaml
│   │   ├── MaintenanceOrder.Container.yaml
│   │   ├── Notification.Container.yaml
│   │   ├── Operation.Container.yaml
│   │   ├── FileRevision.Container.yaml
│   │   ├── Reportable.Container.yaml
│   │   ├── ... (36 more)
│   │   └── FunctionalLocation.Container.yaml
│   ├── views/                              # 39 view definitions
│   │   ├── Asset.view.yaml
│   │   ├── Equipment.view.yaml
│   │   ├── WorkOrder.view.yaml
│   │   ├── WorkOrderOperation.view.yaml
│   │   ├── WorkOrderOperationConfirmation.view.yaml
│   │   ├── MaintenanceOrder.view.yaml
│   │   ├── Notification.view.yaml
│   │   ├── Operation.view.yaml
│   │   ├── FileRevision.view.yaml
│   │   ├── Reportable.view.yaml
│   │   ├── ... (29 more)
│   │   └── FunctionalLocation.view.yaml
│   ├── qs-enterprise.datamodel.yaml        # Full enterprise data model
│   ├── qs-enterprise-search.datamodel.yaml # Search-optimized data model
│   ├── schema.space.yaml                   # Schema space (enterpriseSchemaSpace)
│   ├── enterprise.instance.space.yaml      # Enterprise instance space
│   └── site.instance.space.yaml            # Site instance space
├── default.config.yaml                     # Default configuration variables
└── module.toml                             # Module metadata
```
CDF Spaces:
The module provisions three spaces to separate schema definitions from instance data:
| Space Variable | Default Value | Purpose |
|---|---|---|
| `enterpriseSchemaSpace` | `sp_enterprise_process_industry` | Holds all view and container definitions (the data model schema) |
| `enterpriseInstanceSpace` | `sp_enterprise_instance` | Stores enterprise-wide data instances shared across sites |
| `siteInstanceSpace` | `sp_site_instance` | Stores site-specific data instances |
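To illustrate how the spaces interact, here is a minimal sketch of a DMS node-instance payload, assuming the default space names and a hypothetical `pump-001` asset: the instance itself is written to an instance space, while its `sources` entry references the Asset view that lives in the schema space.

```python
import json

SCHEMA_SPACE = "sp_enterprise_process_industry"  # holds views and containers
INSTANCE_SPACE = "sp_enterprise_instance"        # holds the data instances

# Hypothetical node instance typed by the enterprise Asset view.
payload = {
    "items": [{
        "instanceType": "node",
        "space": INSTANCE_SPACE,         # where this instance is stored
        "externalId": "pump-001",
        "sources": [{
            "source": {                  # view reference points at the schema space
                "type": "view",
                "space": SCHEMA_SPACE,
                "externalId": "Asset",
                "version": "v1",
            },
            "properties": {"name": "Pump 001"},
        }],
    }]
}
print(json.dumps(payload, indent=2))
```

This separation is what lets a single schema space serve many instance spaces (enterprise-wide and per-site) without duplicating view definitions.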
Data Models:
1. Enterprise Data Model (DataModel)
The primary data model that combines all 39 enterprise views together with 31 CDM base views into a single queryable model. It includes:
- All CDM interface and base types from `cdf_cdm` (CogniteAsset, CogniteEquipment, CogniteTimeSeries, etc.)
- All custom enterprise views from the schema space (Asset, Equipment, WorkOrder, etc.)
2. Enterprise Search Data Model (DataModelSearch)
A lightweight subset of the enterprise model designed for search use cases. It includes the most commonly queried CDM types and their corresponding enterprise views:
- CDM types: CogniteAsset, CogniteEquipment, CogniteTimeSeries, CogniteFile, CogniteAssetClass, CogniteAssetType, CogniteEquipmentType, CogniteFileCategory
- Enterprise views: Asset, Equipment, TimeSeries, FileRevision, Notification, WorkOrder, AssetClass, AssetType, EquipmentType, FileCategory
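Both models are served over the DMS GraphQL API. The sketch below builds the versioned GraphQL endpoint path for a deployed model and an illustrative query; the `listAsset` field and filter shape are assumptions about the generated schema, so inspect the deployed model in Fusion for the exact field names.

```python
def graphql_endpoint(project: str, space: str, model: str, version: str) -> str:
    """Path of the per-version GraphQL endpoint CDF exposes for a data model."""
    return (f"/api/v1/projects/{project}/userapis/spaces/{space}"
            f"/datamodels/{model}/versions/{version}/graphql")

# Endpoint for the search-optimized model, assuming the default names.
endpoint = graphql_endpoint(
    "my-project", "sp_enterprise_process_industry", "DataModelSearch", "v1")

# Illustrative search query (field names are assumptions, not the exact schema).
query = """
query SearchAssets($prefix: String!) {
  listAsset(filter: {name: {prefix: $prefix}}, first: 10) {
    items { externalId name }
  }
}
"""
print(endpoint)
```

The search model keeps this generated schema small, which is why it is the better target for interactive search queries.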
Enterprise Views:
Core Asset & Equipment Views
| View | Description |
|---|---|
| Asset | Physical assets with hierarchical parent/root/path structure, linked to equipment, files, activities, and time series |
| FunctionalLocation | Hierarchical structures representing specific positions where assets are installed or functions are performed |
| Equipment | Physical devices or supplies linked to an asset, with equipment type, files, activities, and time series |
| AssetClass | Classification categories for assets |
| AssetType | Type definitions for assets |
| EquipmentType | Type definitions for equipment |
Maintenance & Work Management Views
| View | Description |
|---|---|
| MaintenanceOrder | Formal requests to perform maintenance tasks such as repair, inspection, or servicing. Links to operations, assets, equipment, and time series |
| Operation | A specific part of the work included in a maintenance order (a work order item). Linked to a maintenance order and assets |
| Notification | Formal records to report maintenance issues, defects, or requests. Links to a maintenance order and asset |
| WorkOrder | Work orders with operations, linked to assets and equipment |
| WorkOrderOperation | Individual operations on a work order with confirmations |
| WorkOrderOperationConfirmation | Tracks actual vs. planned work on operations with fields for actual work, forecast work, costs, and timing |
| Activity | Activities happening over a time period, linked to assets, equipment, and time series |
Data & File Views
| View | Description |
|---|---|
| TimeSeries | Series of data points in time order, linked to assets, equipment, and a unit |
| FileRevision | Documents and files with custom properties for facility, unit, line, and file revision |
| FileCategory | Category classification for files |
| Unit | Units of measurement for time series |
3D & Visualization Views:
| View | Description |
|---|---|
| 3DModel | Top-level 3D model resources |
| 3DObject | Individual 3D objects |
| 3DRevision | Revisions of 3D models |
| 3DTransformation | 3D transformation matrices |
| CADModel | CAD-specific 3D models |
| CADRevision | CAD model revisions |
| CADNode | Individual nodes in a CAD model |
| PointCloudModel | Point cloud 3D models |
| PointCloudRevision | Point cloud model revisions |
| PointCloudVolume | Volumes within point clouds |
| CubeMap | Cube map textures for 3D environments |
360-Degree Image Views:
| View | Description |
|---|---|
| 360Image | Individual 360-degree images |
| 360ImageAnnotation | Annotations on 360 images |
| 360ImageCollection | Collections of 360 images |
| 360ImageModel | 360 image model resources |
| 360ImageStation | Stations from which 360 images are captured |
Cross-Cutting Views:
| View | Description |
|---|---|
| SourceSystem | Standardized representation of source systems (e.g., SAP, PI) |
| Annotation | General annotations on resources |
| DiagramAnnotation | Annotations specific to diagrams |
| Reportable | Standalone cross-cutting view providing shared `sys*` properties |
Key Relationships
```
                          +-------+
                          | Asset |
                          +-------+
     parent/children   files   equipment   activities   timeSeries
                         |         |            |            |
                         v         v            v            v
                   +----------+ +-----------+ +----------+ +------------+
                   | FileRev. | | Equipment | | Activity | | TimeSeries |
                   +----------+ +-----------+ +----------+ +------------+

+--------------+        +--------------+        +-----------+
| Notification |------->| Maintenance  |<-------| Operation |
|              |        |    Order     |        |           |
+--------------+        +------+-------+        +-----------+
                               |
                      assets, equipment,
                          timeSeries

+-----------+      +--------------------+      +--------------------------------+
| WorkOrder |<-----| WorkOrderOperation |<-----| WorkOrderOperationConfirmation |
|           |      |                    |      | (actual/forecast work, costs)  |
+-----------+      +--------------------+      +--------------------------------+
```
- Asset is the central entity, connected to children (hierarchy), equipment, files, activities, time series, and 3D objects
- FunctionalLocation mirrors the Asset hierarchy but represents functional positions in a facility
- Notification triggers MaintenanceOrder, which contains Operations
- WorkOrder contains WorkOrderOperations, each tracked by WorkOrderOperationConfirmations
- Reportable provides cross-cutting `sys*` properties (site, unit, tags found/linked) shared by Activity, TimeSeries, MaintenanceOrder, and Operation
- All major entities carry a `source` relation to SourceSystem and a `UUID` custom property
Configuration
Default Variables (default.config.yaml)
```yaml
# Enterprise spaces
enterpriseSchemaSpace: sp_enterprise_process_industry
enterpriseInstanceSpace: sp_enterprise_instance
siteInstanceSpace: sp_site_instance

# Data model configuration
organizationName: Enterprise
enterpriseDataModelId: DataModel
enterpriseDataModelVersion: v1
enterpriseSearchDataModelVersion: v1

# CDM version
cdmDataModelVersion: v1

# Reserved word prefix (for views starting with numbers or reserved words)
reservedWordPrefix: Enterprise_
```
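To see why `reservedWordPrefix` exists: GraphQL type names cannot start with a digit, so views such as 3DModel need a prefix when exposed through the GraphQL API. A sketch of how such a prefix is typically applied (the function is illustrative, not the Toolkit's implementation):

```python
RESERVED_WORD_PREFIX = "Enterprise_"

def graphql_type_name(view_external_id: str) -> str:
    """Prefix view names that would be invalid GraphQL type names."""
    if view_external_id[0].isdigit():
        return RESERVED_WORD_PREFIX + view_external_id
    return view_external_id

print(graphql_type_name("3DModel"))  # -> Enterprise_3DModel
print(graphql_type_name("Asset"))    # -> Asset
```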
Key Configuration Points
| Variable | Purpose | Example Override |
|---|---|---|
| `enterpriseSchemaSpace` | Space for all schema definitions | `sp_myorg_process_industry` |
| `enterpriseInstanceSpace` | Space for enterprise instance data | `sp_myorg_instance` |
| `siteInstanceSpace` | Space for site-specific instance data | `sp_mysite_instance` |
| `organizationName` | Prefix used in data model names | `MyOrg` |
| `enterpriseDataModelVersion` | Version tag for enterprise views | |
| `reservedWordPrefix` | Prefix for views whose names start with numbers (e.g., `3DModel` becomes `Enterprise_3DModel`) | `MyOrg_` |
| `sourceName` | Name of the primary source system | `Houston AVEVA PI` |
Environment Override
Override defaults in your environment config file (config.<env>.yaml):
```yaml
variables:
  modules:
    qs_enterprise_dm:
      enterpriseSchemaSpace: sp_myorg_process_industry
      enterpriseInstanceSpace: sp_myorg_instance
      siteInstanceSpace: sp_mysite_instance
      organizationName: MyOrg
      reservedWordPrefix: MyOrg_
      sourceName: Houston AVEVA PI
```
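Conceptually, environment values layer on top of `default.config.yaml`: a key defined in both places takes the environment value, and everything else keeps its default. A sketch of that layering (not the Toolkit's actual implementation):

```python
# Defaults shipped with the module (subset).
defaults = {
    "enterpriseSchemaSpace": "sp_enterprise_process_industry",
    "organizationName": "Enterprise",
    "reservedWordPrefix": "Enterprise_",
}

# Values from config.<env>.yaml (subset).
overrides = {
    "enterpriseSchemaSpace": "sp_myorg_process_industry",
    "organizationName": "MyOrg",
}

# Override wins where both define a key; defaults fill the rest.
effective = {**defaults, **overrides}
print(effective["organizationName"])    # -> MyOrg
print(effective["reservedWordPrefix"])  # -> Enterprise_
```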
Getting Started
Prerequisites
- CDF project with data modeling enabled
- CDF Toolkit (`cdf`) CLI installed and configured
- Your project contains the standard `cdf.toml` file
- Valid authentication to your target CDF environment
- cognite-toolkit >= 0.7.33
- The data plugin enabled in your `cdf.toml` file
Step 1: Enable External Libraries
Edit your project's cdf.toml and add:
```toml
[alpha_flags]
external-libraries = true

[library.cognite]
url = "https://github.com/cognitedata/library/releases/download/latest/packages.zip"
checksum = "sha256:4dd36677bdf0dc6464d283134c943360fe9bf6948ff645049ccb063d1c095379"
```
This allows the Toolkit to retrieve official library packages, including the Quickstart DP package.
📝 Note: Replacing the Default Library
By default, a Cognite Toolkit project contains a [library.toolkit-data] section pointing to https://github.com/cognitedata/toolkit-data/....
These two library sections cannot coexist. To use this Deployment Pack, replace the [library.toolkit-data] section with the [library.cognite] section shown above.
The library.cognite package includes all Deployment Packs developed by the Value Delivery Accelerator team (RMDM, RCA agents, Context Quality Dashboard, etc.).
⚠️ Checksum Warning:
When running cdf modules add, you may see a warning like:
```
WARNING [HIGH]: The provided checksum sha256:... does not match downloaded file hash sha256:...
Please verify the checksum with the source and update cdf.toml if needed.
This may indicate that the package content has changed.
```
This is expected behavior: the package checksum changes with every release, so the value in this documentation may be outdated. The package will still download successfully despite the warning.
To resolve the warning, copy the new checksum value shown in the warning message and update your cdf.toml with it. For example, if the warning shows sha256:da2b33d60c66700f..., update your config to:
```toml
[library.cognite]
url = "https://github.com/cognitedata/library/releases/download/latest/packages.zip"
checksum = "sha256:da2b33d60c66700f..."
```
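If you want to verify a downloaded packages.zip yourself rather than trust the warning message, the checksum is a plain SHA-256 digest in the format cdf.toml uses. A small stdlib helper:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute a file's checksum in the sha256:<hex> format used by cdf.toml."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large archives don't load fully into memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return "sha256:" + h.hexdigest()
```

Compare the returned value against the `checksum` line in your cdf.toml.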
Step 2: Add the Module
Run:
```
cdf modules add
```
This opens the interactive module selection interface.
Step 3: Select the Quickstart Enterprise Data Model Deployment Pack
From the menu, select:
First select: Data models: Data models that extend the core data model (3)
Second select: Quick Start Enterprise Data Model
This downloads the qs_enterprise_dm module with its containers, views, spaces, and data model files.
Step 4: Verify Folder Structure
After installation, your project should contain:
```
modules
└── models
    └── qs_enterprise_dm
```
If you see this structure, qs_enterprise_dm has been successfully added to your project.
If you want to add more modules, continue with yes (Y); otherwise choose no (N). Proceed with creation (Y) to generate the folder structure and files for your selected modules.
Step 5: Deploy to CDF
Build and deploy as usual:
```
cdf build
cdf deploy --dry-run
cdf deploy
```
After deployment, the qs_enterprise_dm models, containers, and views will be available in your CDF environment.
Testing the Quickstart Package:
Add the SourceSystem modules:
```
cdf modules add
```
Select: Source Systems: Source systems modules
This downloads all the test data to your local repository.
Overview of Folders:
```
modules
├── sourcesystem
│   ├── cdf_pi
│   ├── cdf_sap_assets
│   ├── cdf_sap_events
│   └── cdf_sharepoint
└── models
    └── qs_enterprise_dm
```
1. Update the Configuration File:
Update the configuration files config.<env>.yaml for any variables that are not set. Environment variables must also be updated with Client ID and Client Secret values for the different data sources, such as Aveva PI and SAP, which can be found under the sourcesystem module's variable declarations. For now, the Client ID and Client Secret values for these sources use IDP_CLIENT_ID and IDP_CLIENT_SECRET.
Update the following variables in the configuration file:
- `<my-project-env>`: Replace with your CDF project name for that environment.
- `<GROUP_SOURCE_ID>`: Replace with your designated Group Object ID. For testing, you can use your TK service principal group's SourceId, but this is not recommended for production.
- `<RUN_WORKFLOW_USER_ID>`: Replace with your email address or a user who can trigger workflows.
2. Deploy to CDF:
- Populate the variables in the configuration file, then run `cdf build`. If warnings appear, resolve them and rebuild using the same command.
  ⚠️ Note: Ignore warnings such as `WARNING [LOW]: Module 'cdf_pi' has non-resource directories: ['upload_data']...`
- Optionally, dry-run the deployment using `cdf deploy --dry-run` to validate everything before deploying to your CDF instance.
- Deploy to your CDF instance using `cdf deploy`.
3. Upload Test Data to CDF:
In your cdf.toml file, verify that the data plugin is enabled:
```toml
[plugins]
data = true
```
Run the following commands to upload the synthetic data:
```
cdf data upload dir modules/sourcesystem/cdf_pi/upload_data
cdf data upload dir modules/sourcesystem/cdf_sap_assets/upload_data
cdf data upload dir modules/sourcesystem/cdf_sap_events/upload_data
cdf data upload dir modules/sourcesystem/cdf_sharepoint/upload_data
```
Verify the data upload in Integrate > Staging in CDF.
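A common failure mode is pointing an upload command at a path that does not exist (for example when the modules live under an organization directory). A small hypothetical helper that only emits commands for `upload_data` directories that are actually present:

```python
from pathlib import Path

# Source-system modules listed in this guide.
MODULES = ["cdf_pi", "cdf_sap_assets", "cdf_sap_events", "cdf_sharepoint"]

def upload_commands(root: str = "modules/sourcesystem") -> list[str]:
    """Build a `cdf data upload dir` command per existing upload_data dir."""
    cmds = []
    for module in MODULES:
        data_dir = Path(root) / module / "upload_data"
        if data_dir.is_dir():
            cmds.append(f"cdf data upload dir {data_dir}")
    return cmds

for cmd in upload_commands():
    print(cmd)
```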
⚠️ Note:
1. These upload_data directories contain Manifest.yaml files with hardcoded table and database names. If you change table or database names in the configuration file, update the corresponding Manifest.yaml files as well.
2. If you are maintaining your modules under the organization directory, then add the organization directory name to the start of the upload_data directory path.
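If you rename tables or databases in the configuration file, a quick way to find which manifests still reference an old name is a plain-text scan. This is a rough sketch: it assumes the files are literally named `Manifest.yaml` and simply searches their text, so adapt it to the actual manifest structure in your repository.

```python
from pathlib import Path

def manifests_mentioning(root: str, name: str) -> list[str]:
    """Return Manifest.yaml files under root whose text contains the given name."""
    hits = []
    for manifest in Path(root).rglob("Manifest.yaml"):
        if name in manifest.read_text():
            hits.append(str(manifest))
    return sorted(hits)

# Example: list manifests still referencing an old database name.
print(manifests_mentioning("modules/sourcesystem", "old_db_name"))
```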
4. Trigger Workflow:
Once the data is uploaded, trigger the ingestion workflow from the Data Workflows UI in CDF. This populates the data model and builds the connections between assets, equipment, work orders, etc.
Support:
For troubleshooting or deployment issues:
- Refer to the Cognite Documentation.
- Contact your Cognite support team.
- For any queries related to the deployment pack, write to us on our Slack channel #topic-deployment-packs.