Operational Digital Twin: How data contextualization provides a complete, actionable understanding of industrial operations

  • 3 March 2021

A digital twin can be one of the most useful, insightful tools to drive industrial innovation. While the digital twin concept is no longer new, the scope of the term continues to expand with technological advancement, particularly in the realm of industrial IoT. Over time, digital twins have morphed to meet the practical needs of users. In Oil & Gas, for example, the possibilities of condition-based monitoring and predictive maintenance have amplified the need for a digital representation of both the past and present condition of an object or system.

Gartner predicts that “by 2023, 33% of owner operators of homogeneous composite assets will create their own digital twins, up from less than 5% in 2018” while “at least 50% of OEMs’ mass-produced industrial and commercial assets will directly integrate supplier product sensor data into their own composite digital twins, up from less than 10% today.”1 In the same report, Gartner indicates that digitalization will motivate industrial companies to integrate and even embed their digital twins with one another to increase their own competitiveness. To do that, industrial companies need to make some sound strategic decisions now to lay a firm-but-flexible foundation for digital success.

It is possible for an organization to enhance the overall understanding of its operations by putting all OT and IT data through a contextualization pipeline to create an operational digital twin. This next frontier in the digital twin space utilizes scalable cloud architecture to enable a crucial decoupling of individual models (e.g., applications, simulation models, analytics) from separate source systems, reversing the unnecessary complexity established in point-to-point integrations. Oil & Gas companies that deploy an operational digital twin will finally have true control over their data: the ability to understand where it comes from, how reliable it is, and how to enrich it over time. They will also be the first ones to scale successful solutions on top of that data, which must be the priority of any digitalization initiative.
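
To make the decoupling concrete, here is a minimal sketch of models consuming one platform interface instead of wiring point-to-point against each source system. All class names, tags, and record keys are invented for illustration.

```python
# Illustrative only: models consume one platform interface rather than
# integrating point-to-point with each separate source system.

class SourceSystem:
    """Stand-in for a historian, ERP, or document store."""
    def __init__(self, name: str, records: dict):
        self.name = name
        self.records = records

class DataPlatform:
    """Single access layer that every model and application talks to."""
    def __init__(self, sources: list):
        self._index = {}
        for src in sources:
            for key, value in src.records.items():
                # One shared namespace across all sources.
                self._index[f"{src.name}:{key}"] = value

    def get(self, key: str):
        return self._index[key]

historian = SourceSystem("historian", {"23-PT-92512": [101.2, 101.4]})
erp = SourceSystem("erp", {"WO-1042": "Replace pump seal"})
platform = DataPlatform([historian, erp])

# A simulation model and a maintenance app share the same interface:
print(platform.get("historian:23-PT-92512"))
print(platform.get("erp:WO-1042"))
```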

Introducing the Operational Digital Twin

An operational digital twin is the aggregation of all possible data types and data sets, both historical and real-time, directly or indirectly related to a given physical asset or set of assets, in a single data platform. The collected data must be clean and contextualized, linked in a way that mirrors how things are or would be linked in the real world, and made consumable depending on the use case.
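
As a rough illustration of this definition, the sketch below links one asset to several of the data types that describe it. The field names and example tags are invented, not a real schema.

```python
from dataclasses import dataclass, field

# Illustrative data model: one asset node linked to every data type that
# describes it, mirroring real-world relationships.

@dataclass
class Asset:
    name: str
    time_series: list = field(default_factory=list)   # sensor tags
    documents: list = field(default_factory=list)     # P&IDs, manuals
    events: list = field(default_factory=list)        # maintenance logs
    children: list = field(default_factory=list)      # sub-equipment
    model_3d: str = ""                                # 3D model reference

pump = Asset(
    name="SEA-WATER-LIFT-PUMP-A",
    time_series=["23-PT-92512", "23-TT-92533"],
    documents=["PID-023-REV-C.pdf"],
    events=[{"type": "maintenance", "description": "seal replaced"}],
    model_3d="valhall_module_23.gltf",
)

# Any use case can start from the asset and fan out to the relevant data.
print(pump.time_series, pump.documents[0])
```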

Below, we will outline the prerequisites and benefits of constructing an operational digital twin using examples already at work in the field.

There should be no limit to the amount or types of data included in this operational digital twin. Every data type, from time series to P&IDs to 3D models to maintenance logs to weather data, adds context that brings the operational digital twin closer to representing the true industrial reality of the asset(s). This vital foundation gives the owners of the data a complete, useful overview and allows authorized users, whether internal or external, to streamline the creation of models for individual components, equipment, and processes, because all the relevant data already exists in a single, accessible virtual space.

The operational digital twin allows for data consumption based on the use case. Any model the user creates can draw on the live streaming data that exists there, enriching the space by feeding its own insights or derived information (e.g., synthetic temperature or flow information created by a simulator for equipment where no real sensor exists) back into the twin. Combined with live and historical data, these insights on equipment behavior shore up the operational digital twin, making it even more complete and useful for the future.
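
A minimal sketch of that feedback loop, with an invented tag-naming convention and placeholder arithmetic standing in for a real simulator:

```python
# Sketch of a model writing derived ("synthetic") values back into the
# twin. The twin dict, tag names, and formula are illustrative assumptions.

twin = {
    "23-TT-92533": [80.1, 80.4, 80.9],   # real temperature sensor
}

def virtual_downstream_temp(upstream):
    # Placeholder physics: a real simulator would supply this relationship.
    return [t - 2.5 for t in upstream]

# The derived series is stored alongside measured data, enriching the twin
# for every later consumer.
twin["23-TT-92533:simulated-downstream"] = virtual_downstream_temp(
    twin["23-TT-92533"]
)
print(twin["23-TT-92533:simulated-downstream"])
```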

Building an Operational Digital Twin

Industrial companies have so far invested in aggregating their data and making it available to their personnel, usually via a cloud data warehouse setup. To build an operational digital twin, this collected data must be put through a contextualization step, a process that combines automated and manual work.
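
The automated half of that step is often an entity-matching problem: link raw sensor tags to asset names, and queue low-confidence matches for human review. A toy sketch, with invented tags and an arbitrary threshold:

```python
from difflib import SequenceMatcher

# Toy contextualization pass: match raw source-system tags to asset names
# by string similarity; anything below the threshold goes to a human.

assets = ["23-PT-92512", "23-TT-92533", "23-VG-9101"]
raw_tags = ["VAL_23-PT-92512:X.Value", "VAL_23-TT-92533:X.Value"]

def best_match(tag):
    scored = [(a, SequenceMatcher(None, tag, a).ratio()) for a in assets]
    return max(scored, key=lambda pair: pair[1])

for tag in raw_tags:
    asset, score = best_match(tag)
    if score >= 0.6:                       # threshold is a judgment call
        print(f"auto-linked {tag} -> {asset} ({score:.2f})")
    else:
        print(f"queued for manual review: {tag}")
```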

A strong operational digital twin requires:

• Multiple data sets & data types (unlimited)

• Multiple relationships between data (unlimited) 

• Underlying principles of data vitality, data openness, and data accessibility 

These requirements are inspired directly by the needs of the human users of the technology, who care about different kinds of data and need different ways to navigate and view it. 

Multiple data sets & data types 
With openness as a default, both in data sharing and in information exchange across the organization, Aker BP was able to expand its digitalization initiatives quickly and efficiently. First, they deployed Cognite Data Fusion (CDF) across all five of their operational assets and prepared to ingest data, one block at a time, in a strategic order. To choose that order, they collected hundreds of use cases across a wide variety of domains. With the help of domain experts, they categorized the cases and prioritized them based on a number of criteria, including how quickly a case could likely be solved and how high the expected return on investment might be. Top priorities for Aker BP fell into the following categories: Smart Maintenance, Production Optimization, Digital Worker, HSE, and Drilling & Wells.
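
A prioritization like that can be made explicit with a simple scoring function. The use cases, scores, and weights below are invented purely to illustrate the mechanics:

```python
# Illustrative use-case ranking: weigh expected return against time to
# solve. Every number and name here is made up for the example.

use_cases = [
    {"name": "pump condition monitoring", "roi": 8, "time_to_solve": 3},
    {"name": "3D field-worker navigation", "roi": 6, "time_to_solve": 6},
    {"name": "drilling report search", "roi": 5, "time_to_solve": 2},
]

def priority(case):
    # Higher ROI and faster delivery rank first; weights are a judgment call.
    return 0.7 * case["roi"] - 0.3 * case["time_to_solve"]

for case in sorted(use_cases, key=priority, reverse=True):
    print(f"{priority(case):5.2f}  {case['name']}")
```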

Beginning with Smart Maintenance cases, Aker BP identified sensor data (time series) and maintenance logs as the data types to liberate first. The developers who deployed CDF used targeted ingestion APIs to extract the sensor and maintenance data from their separate source systems and duplicate it in the cloud. To meet the needs of their Digital Workers operating offshore, Aker BP added 3D models and maintenance logs. Each set of use cases helped to identify new layers of data to liberate and add to CDF. By the end of 2018, after an initial year-long push to liberate four decades’ worth of operational data, their CDF tenant contained 600,000 time series, 1.3 trillion data points, 400,000+ documents, and 4 million events, and all these numbers were on the rise. Liberating the data and contextualizing it is the first step. Usefulness then requires that the data is findable when a user needs it.
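
In outline, the extraction pattern is simple: poll the source system for new records and duplicate them in the cloud. A schematic sketch with stand-in classes rather than real connector or ingestion APIs:

```python
import time
from itertools import count

# Schematic extractor: read new records from a source system and duplicate
# them in the cloud platform. Both classes are stand-ins, not real APIs.

class SourceSystem:
    def __init__(self):
        self._clock = count()

    def fetch_since(self, cursor):
        """Return records newer than cursor (here: one fake reading)."""
        t = next(self._clock)
        return [{"tag": "23-PT-92512", "t": t, "value": 100 + t}], t

class CloudPlatform:
    def ingest(self, rows):
        print(f"ingested {len(rows)} rows: {rows}")

source, platform = SourceSystem(), CloudPlatform()
cursor = None
for _ in range(3):                 # a real extractor would loop forever
    rows, cursor = source.fetch_since(cursor)
    platform.ingest(rows)
    time.sleep(0.1)
```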

Multiple relationships between data 
Different users will require different functionality from the operational digital twin; this is a matter of how they navigate and view the data.

Components and equipment often share more than one link or relationship in the real world. A pump has a geographical link with the pipe attached to it, as well as with any other equipment in the immediate area. The same pump has an upstream and downstream flow link with different valves and pipes. And finally, the pump has a logical link with all other pumps on the oil platform, regardless of their respective physical or flow locations.
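
One way to picture these overlapping links is as a typed edge list, where the same pump participates in several kinds of relationships at once. The topology below is invented:

```python
# The same pump, three relationship types. Names and topology are invented.

edges = [
    ("PUMP-A", "PIPE-101", "geographic"),   # physically adjacent
    ("PUMP-A", "VALVE-17", "flow"),         # upstream/downstream link
    ("PUMP-A", "PUMP-B", "logical"),        # same equipment class
]

def neighbors(node, relation):
    return [b for a, b, r in edges if a == node and r == relation]

print(neighbors("PUMP-A", "flow"))      # ['VALVE-17']
print(neighbors("PUMP-A", "logical"))   # ['PUMP-B']
```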

An operational digital twin can handle a variety of possible structures to meet the needs of its users:

Support Engineer  
Needs to service a particular equipment package on an oil platform. They only care about this package. When they approach the operational digital twin, a support engineer wants to be able to navigate by system, major equipment, and the individual components they deal with every day. An asset hierarchy is a logical way to frame the data in this case. 

Production Engineer 
Cares primarily about production optimization. They want to know about the flow of oil. Understanding equipment types is part of that equation, but only in terms of oil flow. The production engineer does not need to see all the valves on the oil platform, nor do they need a 3D model to help them find the individual valve whose sensor data is running in their model. Rather, they want to be able to combine layers of sensor data and event data to run anomaly detection models and/or set alerts related to temperature and pressure, for example. In this case, a graph database structure would be more useful. 

Electrical Engineer 
Wants to know whether critical equipment is receiving enough stable power. The layout of the system is important to them, but only as it pertains to the flow of electricity, not oil. 
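
The payoff of storing multiple relationship types is that each persona can traverse the same twin differently. A toy contrast between the support engineer’s hierarchy view and the production engineer’s flow view, with invented data:

```python
# One twin, two views. The hierarchy and flow maps below are invented.

hierarchy = {                     # parent -> children (asset hierarchy)
    "PLATFORM": ["SYSTEM-23"],
    "SYSTEM-23": ["PUMP-A", "VALVE-17"],
}
flow = {"VALVE-17": "PUMP-A", "PUMP-A": "PIPE-101"}   # upstream -> downstream

def subtree(node):
    """Support-engineer view: everything under one system."""
    yield node
    for child in hierarchy.get(node, []):
        yield from subtree(child)

def flow_path(node):
    """Production-engineer view: follow the oil downstream."""
    while node in flow:
        node = flow[node]
        yield node

print(list(subtree("SYSTEM-23")))    # ['SYSTEM-23', 'PUMP-A', 'VALVE-17']
print(list(flow_path("VALVE-17")))   # ['PUMP-A', 'PIPE-101']
```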

Alive, Open and Accessible 

The operational digital twin must be dynamic, flexible enough to meet the needs of a growing variety of users and models. This goes beyond populating the twin with a truly comprehensive set of contextualized operational data. A successful operational digital twin will also be constructed on principles of data vitality, data openness, and data accessibility. In other words, no matter how deep and wide it becomes, the twin must remain alive, open, and accessible. 

Alive: As live data becomes an expectation, the question of latency will be a key differentiator for the best operational digital twins. The lower the latency, the more closely the digital twin reflects the industrial reality. Milliseconds can be the difference between theory and practical use of an application in the field.

Open: There are many ways of creating value on top of the operational digital twin. Models and applications for internal use are the main drivers, of course, but there are numerous opportunities for value creation in opening the operational digital twin up for connection with other external, proprietary digital twins. 

While concerns regarding intellectual property are reasonable in this increasingly open digital age, the potential rewards for finding a flexible solution to issues of digital twin integration should outweigh the instinct to remain shut off from the rest of the value chain. 

Having true control over one’s operational data includes the capability to share data when and with whom it makes the most business sense. To reap the benefit of the openness described above, certain best practices are required to make the data accessible to the people authorized to use it, both internally and externally.

Accessible: The quality of any data model is contingent upon the quality of the data it runs on, and the more eyes you have on your data, the better the quality control. It’s important for internal people to be able to use the API easily, without unnecessary friction (e.g., new account registrations or extra passwords). Offshore workers and other domain experts will spot errors that data scientists won’t see.

Accessibility is also essential for potentially powerful external partnerships. To integrate with proprietary data models from an OEM (as in the Aker BP-Framo case described below), the owner of the operational digital twin should use open APIs for consumption on top of the industrial data platform. This means sending private, secure API keys to authorized users, but also providing full, open documentation on the APIs themselves.

Taking a proactive approach to data accessibility can have other reverberating benefits. One of the biggest obstacles to data-driven innovation is lack of access to high-quality data. An operational digital twin is, by definition, a comprehensive set of cultivated data offering as close a digital representation of industrial reality as possible. Much of the data in the operational digital twin will be proprietary and necessarily private. But some of that data will be benign, and by opening it up to curious third parties, it’s possible to stimulate unplanned innovation that may well impact your industry and the ecosystem around it.
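
In practice, external consumption often looks like the hedged sketch below: an authorized partner sends a privately issued key with each request against documented endpoints. The base URL, path, and header name are placeholders, not any specific product’s API:

```python
import requests

# Placeholder endpoint and header: consult the platform's open API
# documentation for the real ones.

API_KEY = "issued-privately-to-the-authorized-partner"
BASE_URL = "https://api.example-platform.com"

response = requests.get(
    f"{BASE_URL}/timeseries/data",
    headers={"api-key": API_KEY},   # private key sent with each request
    params={"externalId": "23-PT-92512", "start": "1d-ago", "end": "now"},
    timeout=30,
)
response.raise_for_status()
print(response.json())
```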

In the summer of 2018, Aker BP launched the Open Industrial Data Project, providing the largest live industrial data set in the world to any interested party.

The data originates from a single compressor on the Valhall oil platform in the North Sea. The stream is organized and made available via Aker BP’s operational digital twin through the Open Industrial Data website free of charge. Registered users receive an API key that provides access to the live data set, an application to navigate and visualize the contextualized compressor data, and user-friendly toolkits to help users at every level of technological savvy get started with the project. The hope is that sharing this live data will accelerate innovation in predictive maintenance, condition monitoring, and other applications, directly benefiting Aker BP’s operations, but also improving the health and outlook of the industrial ecosystem on the Norwegian Continental Shelf.
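
Getting started might look roughly like the sketch below, assuming the API-key sign-up flow described above. The calls follow older cognite-sdk releases; exact signatures vary by version, so treat this as an approximation rather than current reference code:

```python
from cognite.client import CogniteClient

# Approximate legacy-style setup; newer SDK versions authenticate
# differently. The key placeholder comes from the Open Industrial Data
# registration flow described above.

client = CogniteClient(
    api_key="<key-from-open-industrial-data-registration>",
    project="publicdata",
    client_name="my-oid-experiment",
)

# Browse a few of the contextualized compressor time series.
for ts in client.time_series.list(limit=5):
    print(ts.name)
```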

Example Use Case

A mobile worker filling a tank on an oil platform may previously have been required to radio the control room to gauge the level of oil in the tank during the process. Now they can use a handheld device to watch a digital twin of the tank fill in real time, provided they can depend on the streaming data coming through with very low latency.

Aker BP opted to open its operational digital twin to integration with a digital twin from Framo, a global provider of pumping systems. The goal was to enable a new business model: the first performance-based contract between an operator and an OEM on the Norwegian Continental Shelf. The historical model was a typical transaction, wherein Aker BP purchased Framo’s pumps outright and handled operation and standard maintenance, while Framo provided calendar-based maintenance and was on call for advice and special service if something went wrong. In the new “SMART” contract (signed in August 2018), Framo’s relationship to Aker BP changed for the first time in decades.

Framo maintains responsibility for the maintenance of all its pumps across Aker BP assets, including a system of seawater lift pumps that cool the production systems (already in place on the Ivar Aasen platform). With the performance-based contract, Framo obtained continuous access to live data from those pumps for the first time. This shift allowed Aker BP and Framo to fully align their priorities for the first time. Aker BP wants their pumps up and running as long and as efficiently as possible, avoiding costly unforeseen breakdowns and ensuring that maintenance is performed only when necessary, driven by data rather than by the calendar. Framo also wants the pumping system to run optimally, because they are effectively selling uptime in the new contract and need to hit a number of enumerated KPIs based on advanced analytics. All of this is only possible now that Aker BP can quickly and easily share live operational data for the pumping system directly with Framo. Under this new contract, both parties recognize data as a prime resource and actively exchange it to extract value on either side. With excellent security protocols in place, intellectual property remains safe.
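
A KPI of the kind such a contract enumerates can be as simple as uptime computed from a pump status signal. The sample data below is invented:

```python
# Toy uptime KPI from a pump status signal: (hour, running) samples.

status = [(0, 1), (1, 1), (2, 0), (3, 1), (4, 1), (5, 1)]

running_hours = sum(flag for _, flag in status)
uptime_pct = 100.0 * running_hours / len(status)

# Measured uptime would be compared against the contract's KPI target.
print(f"uptime: {uptime_pct:.1f}%")   # -> uptime: 83.3%
```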

Conclusion

Historically, a digital twin has had a single dimension of contextualization, built to solve the one use case it was specifically created to answer. When one creates an operational digital twin, the richness of the data describing the industrial reality allows the user to create many more correlations between data points. Data needs to be combined and contextualized to solve problems. An operational digital twin is one tool that Oil & Gas companies can use to gain a greater level of understanding of, and control over, their data and their operations, discovering insights to optimize operations, increase uptime, and revolutionize business models. This is why contextualization is crucial to the creation of the operational digital twin, making it flexible enough and scalable enough to handle the complexities of current operations and to anticipate the demands of data-driven operations in the digitalized industrial future.

