A use case we encounter at Cognite is writing data back to SAP. This can happen in several contexts where the main goal is to create or update data in SAP (e.g. work orders, notifications) based on the analysis of data stored in Cognite Data Fusion. The ability to write back to SAP lets users make decisions without going back and forth between applications, which can save a lot of time. It is also less error-prone than manually filling in fields in SAP based on what you read in Cognite Data Fusion.
This use case, which is all about automating processes, fits squarely into Industry 4.0. In addition, SAP is one of the most widely used ERP systems in industry. As a side note: we are talking about SAP today, but the same would be possible with other ERPs, as long as they expose an API we can send requests to.
For example, in a maintenance context: when analyzing data from your industrial machinery, you might notice that one of your machines needs maintenance. Instead of going to SAP, searching for that specific machine, and adding a work order for it, you could do it directly from your analytics solution. You could even have machine learning models that detect machines needing maintenance.
From a technical perspective, your “write-back application” first needs to be authenticated on both the Cognite Data Fusion and SAP sides. Depending on the use case, the application sends requests either directly to a SAP API or to a facade API. Both streaming and batch processing can be supported, again depending on the use case and its needs: in CDF, the source data can be Events and Raw for streaming, and almost any object type for batch processing. The application then implements the logic, for example a mapping between CDF and SAP objects, along with any other processing needed before sending requests to SAP, so that the data ends up created or updated in SAP.
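To make the mapping step concrete, here is a minimal Python sketch of translating a CDF maintenance event into a work-order payload. All field names, codes, and the endpoint mentioned in the comments are illustrative assumptions, not a real SAP or Cognite API contract; a real application would derive them from the actual SAP service it targets.

```python
# Hedged sketch: mapping a CDF event (as a dict) to a SAP-style work order
# payload. Field names, order type codes, and priority codes are assumptions
# for illustration only.

def map_event_to_work_order(event: dict) -> dict:
    """Translate a CDF maintenance event into a hypothetical SAP work order payload."""
    metadata = event.get("metadata", {})
    return {
        "OrderType": "PM01",  # assumed code for corrective maintenance
        "ShortText": event.get("description", "")[:40],  # assumed short-text length limit
        "FunctionalLocation": metadata.get("functional_location", ""),
        # Map the CDF event subtype to an assumed SAP priority code.
        "Priority": {"high": "1", "medium": "2", "low": "3"}.get(
            event.get("subtype", "medium"), "2"
        ),
    }

if __name__ == "__main__":
    cdf_event = {
        "description": "Vibration above threshold on pump P-101",
        "subtype": "high",
        "metadata": {"functional_location": "SITE-A/PUMP-P101"},
    }
    payload = map_event_to_work_order(cdf_event)
    print(payload)
    # The write-back application would then send this payload to the SAP
    # (or facade) API over its authenticated session, e.g. with an HTTP POST.
```

The mapping is deliberately a pure function: keeping the CDF-to-SAP translation separate from authentication and HTTP calls makes it easy to unit-test and to reuse for both streaming and batch flows.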
Does that example sound familiar? Have you ever worked on something similar? Let me know your thoughts.