Industry 4.0 and digital twins promise many benefits, including efficient lot-size-one production and a variety of process optimizations. However, successfully implementing these concepts requires a sophisticated IT infrastructure. Based on these requirements, this blog post describes how such a solution can be realized by leveraging NetApp® and Fraunhofer best practices.
Industry 4.0 offers a plethora of use cases and solutions for many different kinds of challenges. Unlike previous industrial revolutions, it does not introduce a single new technology; instead it encompasses a more fundamental change: the end-to-end digitalization of manufacturing processes. It is driven mainly by innovations in plant software systems and by the transfer of knowledge and methods from the IT sector to manufacturing.
In our previous blog post, “Industry 4.0 and digital twins,” we explained the digital twin (DT) as a core component of Industry 4.0 that describes all relevant assets in an I4.0 production environment.
As described in detail in the FHG blog “Building the Industry 4.0 IT Infrastructure for Digital Twins,” four requirements can be formulated for an I4.0 IT infrastructure.
The Industry 4.0 solution described in this blog post is a commercial offering based on the open-source Eclipse BaSyx middleware, which adds data management from NetApp and implementation as well as support from objective partner AG.
This blog post focuses on the distributed data management architecture that enables the deployment and management of a DT in a hybrid cloud environment. A couple of years ago NetApp introduced the concept of the data fabric as an architecture and set of data services that provide consistent capabilities across a choice of endpoints spanning on-premises and multiple cloud environments.
Every user of a data fabric can decide whether and where data should be stored; most choose a hybrid solution. Moving all data into the cloud can quickly saturate the available network bandwidth, while a purely on-premises deployment lacks the flexibility of a hybrid approach. On-premises resources offer a significant timing advantage: with a large number of devices and high sampling frequencies, transmitting raw data to the cloud is often not feasible. Controlling machines also requires reliable, low-latency network connections, which cannot be guaranteed if the control algorithms are deployed entirely in the cloud. Cloud resources are preferable for varying load profiles, for example when large amounts of data need to be analyzed.
Eclipse BaSyx encapsulates Industry 4.0 components in containers and thus permits users to deploy digital twins both on on-premises devices and on cloud resources, depending on use cases and requirements. Furthermore, containerization enables quick redeployment of components, so users can react quickly to changing requirements by reconfiguring their infrastructure. The components of the Industry 4.0 data fabric delivered by NetApp ensure that all digital-twin data is safe, manageable, movable, and accessible, whether it resides in the data center, at the edge, or in a public cloud. For example, laws or regulations might demand reliable archiving of data. NetApp Astra™ Trident provides persistent storage for all BaSyx containers (see the Trident documentation).
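To make the persistence idea concrete, the following sketch builds a Kubernetes PersistentVolumeClaim manifest such as a BaSyx container might reference. The StorageClass name `ontap-nas` and the claim name are illustrative assumptions, not a fixed NetApp convention; in a real cluster the manifest would be applied with `kubectl` or a Kubernetes client, with Trident provisioning the backing volume.

```python
# Sketch: a PersistentVolumeClaim manifest for a BaSyx container, assuming a
# Trident-backed StorageClass named "ontap-nas" (hypothetical name).
def basyx_pvc(name: str, size_gi: int, storage_class: str = "ontap-nas") -> dict:
    """Build a Kubernetes PVC manifest as a plain dict."""
    return {
        "apiVersion": "v1",
        "kind": "PersistentVolumeClaim",
        "metadata": {"name": name},
        "spec": {
            "accessModes": ["ReadWriteOnce"],
            "storageClassName": storage_class,
            "resources": {"requests": {"storage": f"{size_gi}Gi"}},
        },
    }

pvc = basyx_pvc("basyx-aas-server-data", 10)
print(pvc["spec"]["resources"]["requests"]["storage"])  # 10Gi
```

Because the claim is decoupled from the container image, a BaSyx component can be redeployed elsewhere in the cluster and reattach to the same data.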
For our reference architecture for BaSyx, we tested NetApp StorageGRID® object storage as the on-premises data repository. StorageGRID supports seamlessly moving data between on-premises and public cloud storage to optimize data availability, protection, performance, and cost. In Industry 4.0 scenarios, the data management infrastructure must store enormous volumes of data of many different types in distributed locations, and it must support centralized access. With its S3 interface, StorageGRID offers the possibility of integrating Industrial Internet of Things (IIoT) data into ERP and other business applications as well as into public clouds. The possibility of saving each object with metadata is another advantage of using an object store. This centralized access, together with metadata information, dramatically reduces data collection and preparation efforts for artificial intelligence (AI) and machine learning (ML) analytics.
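The metadata advantage can be sketched as follows. The snippet builds the parameter set for an S3 PutObject call (as used, for example, with boto3 against a StorageGRID endpoint); the bucket/key layout and metadata keys are illustrative assumptions, not a NetApp convention.

```python
# Sketch: attaching machine metadata to an IIoT object destined for an
# S3-compatible store such as StorageGRID. Names are illustrative.
import json

def iiot_object(plant: str, machine: str, timestamp: str, payload: dict) -> dict:
    """Build the parameters for an S3 PutObject call (e.g. via boto3)."""
    return {
        "Bucket": f"iiot-{plant}",
        "Key": f"{machine}/{timestamp}.json",
        "Body": json.dumps(payload).encode(),
        # User-defined metadata travels with the object and can later be used
        # to select training data, cutting preparation effort for AI/ML.
        "Metadata": {"plant": plant, "machine": machine, "schema": "sensor-v1"},
    }

params = iiot_object("mainz", "press-07", "2024-01-01T12:00:00Z",
                     {"temperature_c": 71.3, "pressure_bar": 4.2})
print(params["Key"])  # press-07/2024-01-01T12:00:00Z.json
```

In practice these parameters would be passed to an S3 client's `put_object` call; the key point is that the metadata is stored alongside the raw payload rather than in a separate database.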
Our Industry 4.0 solution provides a set of prepackaged components that simplifies the creation of Industry 4.0 scenarios. The creation of digital shadows and digital twins is supported by asset administration shells (AAS) and submodels. The AAS is the digital representative of a physical or nonphysical real-world asset. It provides access to all relevant asset data and services, resulting in scalable and maintainable architectures.
A scalable architecture is able to react quickly to quantitative requirement changes. For example, it supports changes in the numbers of users or data points that must be processed. The solution architecture was built with scalability in mind; each Industry 4.0 component is available as a container image with explicit background storage that can be shared between containers. Thus scaling the number of devices and products is as easy as deploying a new container.
Maintainability describes the ability to modify an existing Industry 4.0 architecture to adapt it to new and changing needs. Thus, the core components of our Industry 4.0 solution can be dynamically deployed or redeployed to enable rapid adaptation to changing environments. The provided off-the-shelf images can easily be configured through text-based property files, as documented in the Eclipse BaSyx wiki. Additionally, a comprehensive API supports changing or deleting an existing AAS; in the same manner, a new AAS can be uploaded and registered without stopping the system.
By leveraging an AAS registry, applications can discover AAS and access data and services from their submodels. Through an event mechanism, applications are kept up to date and are informed about changes to registered AAS.
In contrast to the consumer Internet of Things (IoT), Industry 4.0 has specific network requirements. For example, distributed control loops, realized by a Supervisory Control and Data Acquisition (SCADA) system, typically have hard real-time constraints: data must be available within milliseconds, and data access must be possible with a predictable delay. Devices create massive amounts of data, estimated to be in the range of several terabytes (TB) per day per plant, and this number is expected to grow by a factor of 5 to 10 within the next few years.
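A back-of-the-envelope calculation shows how quickly a plant reaches these volumes. All figures below are illustrative assumptions, not measurements from a specific site.

```python
# Rough estimate of per-plant data volume; all inputs are assumptions.
devices = 500                 # sensors and machines in one plant
samples_per_second = 100      # sampling frequency per device
bytes_per_sample = 512        # payload plus protocol overhead

bytes_per_day = devices * samples_per_second * bytes_per_sample * 86_400
terabytes_per_day = bytes_per_day / 1e12
print(f"{terabytes_per_day:.1f} TB/day")  # 2.2 TB/day
```

Even these modest assumptions land in the multi-terabyte-per-day range quoted above, which is why raw-data upload to a public cloud is rarely practical.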
Implementing a purely public-cloud-based solution would require uploading all data to the public cloud before it could be used (for example, for controlling the manufacturing process) or analyzed. Such a solution would most likely introduce unpredictable latency, which is not acceptable for real-time control of manufacturing. At the other extreme, deploying all applications at the edge is not feasible because of the resource constraints of edge nodes, which are at odds with the requirements of big data analytics and AI solutions.
As a result, a hybrid approach was chosen for the reference architecture. It allows data to be preprocessed at the edge and latency-sensitive data to be stored in an on-premises cloud, while still leveraging the benefits of a public cloud for less frequently changing data or for archiving.
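The hybrid placement rule can be sketched as a simple decision function: latency-sensitive data stays at the edge, warm data stays on premises, and cold data moves to the public cloud. The thresholds below are illustrative assumptions, not values from the reference architecture.

```python
# Sketch of the hybrid data-placement rule; thresholds are assumptions.
def placement(max_latency_ms: float, days_since_last_change: int) -> str:
    if max_latency_ms < 10:
        return "edge"            # hard real-time, e.g. control loops
    if days_since_last_change < 30:
        return "on-premises"     # warm data, predictable access latency
    return "public-cloud"        # cold or archive data

print(placement(5, 0))    # edge
print(placement(50, 3))   # on-premises
print(placement(50, 90))  # public-cloud
```

In a real deployment such a policy would drive tiering features like StorageGRID's cloud storage pools rather than application code, but the decision inputs are the same.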
The combination of the BaSyx middleware architecture with a data fabric delivered by NetApp offers all the flexibility required for diverse IIoT use cases, without sacrificing control, security, or manageability.
We will share more insights in future blog posts.
Since March 2012, Jürgen Hamm has held the position of Solutions Architect SAP at NetApp Germany. In this role, Hamm focuses on advising customers and partners on IT infrastructures, network technologies, SAP technologies, and virtualization under VMware. Hamm builds cross-functional teams to ensure the successful execution of SAP-related customer projects in DACH (Germany, Austria, and Switzerland). Jürgen Hamm is also pushing ahead the development of NetApp’s value offering in the Internet of Things (IoT), the expansion into new groups of customers, and a changed go-to-market for NetApp. The IoT coffee machine showcase is just one of several demos that Hamm set up to demonstrate NetApp’s role in the IoT. Before joining NetApp, Jürgen Hamm worked as a technical consultant at the IT consultancies GOPA and Novasoft, starting in 1998. He is a state-certified technician in the field of automation and production engineering.