Solved

Cognite data deployment architecture

  • 19 July 2023
  • 4 replies
  • 132 views

Please share the Cognite deployment architecture diagram. How is storage maintained on public clouds across various regions?


Best answer by Jason Dressel 27 July 2023, 19:04



Hi Mohammad,

Thanks for the question. Can you elaborate a little more so that I can give you a concrete answer? Are you interested in how we deploy our software, the technologies we use to store CDF data in the cloud, or how we maintain those data sources?

Regards, 

Scott

Hi Scott,

Sorry for the late response; somehow I missed this reply.

I would like to get answers on both topics you mentioned.

 

Regards,

Mohammad Imran


@Mohammad Imran,

Cognite Data Fusion (CDF) is a microservices-based SaaS application deployed today on the three major cloud service providers (Azure, Google Cloud, and most recently AWS). CDF consists of 300+ microservices that are continuously deployed, as many as 1,000 times per week, across development, staging, and production environments. The infrastructure is predominantly Kubernetes based. Storage is a polyglot system of technologies chosen by data type and by availability, scalability, and performance requirements. Storing and serving time series data, for instance, involves a combination of storage technologies (FoundationDB, Kafka, PostgreSQL, Elasticsearch). You can read more about availability, business continuity, and backups at the following link:
  https://docs.cognite.com/cdf/trust/security/availability_continuity
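To make the polyglot idea concrete, here is a minimal sketch of a time series write path that routes each kind of data to the backend suited to it. This is purely illustrative: the class, its methods, and the in-memory stand-ins are hypothetical, not Cognite's actual implementation; the real backends named above (Kafka for the ingest log, FoundationDB for ordered point storage, Elasticsearch for metadata search) are replaced here by simple Python structures so the routing logic is runnable.

```python
import bisect
from collections import defaultdict


class TimeSeriesStore:
    """Toy polyglot write path: each data shape goes to a different store.

    All names here are illustrative stand-ins, not a real CDF API:
    - _log      ~ an append-only ingest log (role played by Kafka)
    - _points   ~ an ordered key-value store (role played by FoundationDB)
    - _metadata ~ a searchable index (role played by Elasticsearch)
    """

    def __init__(self):
        self._log = []                    # append-only durable ingest log
        self._points = defaultdict(list)  # per-series points, kept time-sorted
        self._metadata = {}               # series descriptions for search

    def create_series(self, external_id, description=""):
        # Series metadata goes to the search index, not the point store.
        self._metadata[external_id] = description

    def insert(self, external_id, timestamp, value):
        # Every write lands on the ingest log first for durability/replay...
        self._log.append((external_id, timestamp, value))
        # ...then into the ordered store, keyed by (timestamp, value),
        # so out-of-order arrivals still end up time-sorted.
        bisect.insort(self._points[external_id], (timestamp, value))

    def range_query(self, external_id, start, end):
        # Reads are served from the ordered store: points in [start, end).
        pts = self._points[external_id]
        lo = bisect.bisect_left(pts, (start,))
        hi = bisect.bisect_left(pts, (end,))
        return pts[lo:hi]

    def search(self, term):
        # Free-text lookup hits only the metadata index.
        return [xid for xid, desc in self._metadata.items() if term in desc]
```

The design point the sketch illustrates is that no single backend serves every access pattern well: an append-only log is good at durable ingest, an ordered store at time-range scans, and an inverted index at search, which is why the answer above lists several storage technologies for one data type.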

Hope this helps,
-Jason


Hi @Mohammad Imran, did the above help you?
