
We are currently exploring potential use cases with our team, and one of them involves implementing data quality techniques. Could anyone share their experience with the best approach within the Cognite Platform for running data quality checks that ensure all fields in critical tables are populated correctly?
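To make the question concrete, the kind of check I have in mind looks roughly like the sketch below, run against a RAW table with the Cognite Python SDK. The database, table, and column names are placeholders, and I am not sure this is the idiomatic way to do it in CDF, which is exactly what I am asking about:

```python
# Rough sketch of a completeness check against a CDF RAW table.
# The database/table names and the required columns are placeholders.
from cognite.client import CogniteClient

REQUIRED_COLUMNS = ["asset_id", "timestamp", "value"]  # hypothetical critical fields

def find_incomplete_rows(client: CogniteClient, db_name: str, table_name: str) -> list[str]:
    """Return the keys of RAW rows where any required column is missing or empty."""
    bad_keys = []
    for row in client.raw.rows.list(db_name=db_name, table_name=table_name, limit=None):
        columns = row.columns or {}
        if any(columns.get(col) in (None, "") for col in REQUIRED_COLUMNS):
            bad_keys.append(row.key)
    return bad_keys

# Usage (assumes an already-configured client and an existing RAW table):
# client = CogniteClient()
# print(find_incomplete_rows(client, "my_db", "critical_table"))
```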

In the past, we implemented a similar solution with a different vendor, using a system that blocks erroneous data from flowing through our data pipelines by applying validation and integrity checks based on declarative quality expectations.
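For context, the declarative pattern we used there looked roughly like this vendor-neutral sketch, where a set of expectations is evaluated against each batch and a failing batch is rejected before it moves further down the pipeline; the column names and thresholds are invented for illustration:

```python
# Minimal sketch of a declarative-expectations gate on a batch of data.
# Column names and thresholds are made up for illustration only.
import pandas as pd

EXPECTATIONS = {
    "asset_id": lambda s: s.notna().all(),             # must always be populated
    "timestamp": lambda s: s.notna().all(),            # must always be populated
    "value": lambda s: s.between(0, 1_000_000).all(),  # plausible numeric range
}

def validate_batch(df: pd.DataFrame) -> None:
    """Raise if the batch violates any declared expectation, stopping the pipeline."""
    failures = [name for name, check in EXPECTATIONS.items()
                if name not in df.columns or not check(df[name])]
    if failures:
        raise ValueError(f"Batch rejected, failed expectations: {failures}")

# Example: this batch would be rejected because asset_id contains a null.
# validate_batch(pd.DataFrame({"asset_id": [1, None], "value": [10, 20],
#                              "timestamp": ["2024-01-01", "2024-01-02"]}))
```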

Thank you in advance.
