Data Modeling provides you with the flexibility to define your own Industrial Knowledge Graphs based on relevant industry standards, your organization's own data structures and use cases, or a combination of these. Large, and often complex, Industrial Knowledge Graphs may be needed to represent the full extent of your industrial data across disciplines. An important aspect of these knowledge graphs is being able to explore and iterate on both their data and their structure.

With this release, we are enhancing the Data Modeling user interface in Cognite Data Fusion to better display the underlying concepts that power your industrial knowledge graph. This increased visibility into your models gives you a better understanding of, and more confidence in iterating on, the structure of your data model.

Space-centric exploration of containers and views: explore containers and view information for the data model, together with a data management summary and sidebar.
Context

This write-up describes a basic setup for writing data points to Cognite Data Fusion using Apache NiFi. The source data is power consumption readings from the HAN interface of a power meter in a residential fuse box.

The starting point is an existing MQTT broker that receives data from an existing MQTT client device. We will use Apache NiFi to consume the MQTT messages, extract the readings, and write them continuously to CDF. The end goal is a live-updated time series within CDF, with power readings every 2.5 seconds, that can be used for analysis or automation within CDF, or simply for visualization.

NiFi flow overview

The picture above shows the complete NiFi flow. From left to right: consuming MQTT messages, transforming, and writing to Cognite Data Fusion. The flow handles approximately 120 data points per five minutes, which corresponds to the power meter outputting one reading every 2.5 seconds. The NiFi flow makes use of the following processors:

Processor    Purpose
ConsumeMQTT
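Between consuming the MQTT message and writing the datapoint, the flow needs to pull a timestamp and a power value out of each message. The sketch below shows that extraction step in plain Python, assuming a simple JSON payload with `ts` (epoch milliseconds) and `power_w` (active power in watts) fields; real HAN readers publish vendor-specific formats, so the field names here are illustrative assumptions, not the actual payload of this setup.

```python
import json

def extract_datapoint(payload: bytes) -> tuple:
    """Turn one MQTT message body into a (timestamp_ms, value) pair.

    The payload layout (keys "ts" and "power_w") is an assumed,
    illustrative format -- adapt it to what your meter publishes.
    """
    msg = json.loads(payload)
    timestamp_ms = int(msg["ts"])      # epoch milliseconds
    value = float(msg["power_w"])      # active power in watts
    return (timestamp_ms, value)

# Example message as it might arrive from the broker:
point = extract_datapoint(b'{"ts": 1700000000000, "power_w": 1234.5}')
# point == (1700000000000, 1234.5)
```

In the NiFi flow itself this kind of transformation would typically live in a JoltTransformJSON or script processor rather than standalone Python; the point is only to show what the transformation step has to produce before the datapoint can be written to a CDF time series.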
This guide describes how to run the Cognite DB Extractor in a separate Docker container to fetch data from Microsoft SQL Server into Cognite Data Fusion.

Prerequisites

- A running instance of MS SQL Server and valid credentials to SELECT rows from the table in question
- A Docker host that can access the MS SQL Server
- A running CDF project with an Azure AD service principal with capabilities to write to CDF

1 - Prepare Docker image

Cognite provides the database extractor as a Docker image published to Docker Hub, requiring just the addition of an ODBC driver. Since we are connecting to MS SQL Server, we will install the drivers provided by Microsoft for Debian 11, using this Dockerfile:

Note! Go to https://hub.docker.com/r/cognite/db-extractor-base/tags to see the latest version of the Docker image.

FROM cognite/db-extractor-base:2.5.0-beta5
RUN curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add - && \
    curl https://packages.microsoft.com/config/debian/11/prod.list > /
If a user includes https:// or http:// in the Override Azure Tenant field, the URL to Azure AD will not include a correct tenant and login is prevented. This has caused login issues for at least a few users. The request is simply that the form be modified to strip the protocol scheme, i.e. remove http:// or https:// automatically if present.
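The requested normalization is straightforward. A minimal sketch of the stripping logic, here in Python for illustration (the actual form is presumably implemented in the front end, and the function name `normalize_tenant` is my own):

```python
def normalize_tenant(value: str) -> str:
    """Strip a leading protocol scheme from the tenant field, if present.

    Hypothetical helper illustrating the requested behavior; not part
    of the actual Cognite Data Fusion codebase.
    """
    v = value.strip()
    for scheme in ("https://", "http://"):
        if v.lower().startswith(scheme):
            return v[len(scheme):]
    return v

# normalize_tenant("https://mytenant.onmicrosoft.com")
#   -> "mytenant.onmicrosoft.com"
# normalize_tenant("mytenant.onmicrosoft.com") is returned unchanged.
```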