Unlocking Value in the Digital Era
Industry 4.0, also known as the Fourth Industrial Revolution, refers to the integration of advanced technologies such as the Internet of Things (IoT), artificial intelligence (AI), and big data into industrial processes. A key enabler of Industry 4.0 is the Industrial Internet of Things (IIoT), which describes the collection and analysis of vast amounts of data from sensors and devices on the factory floor. When properly formatted, this data can be used to optimize processes, reduce downtime, and improve product quality.
In our experience supporting users and integrators, device connectivity protocols, IT/OT orchestration, transactional system data integration, and data cleansing, normalization, and contextualization are key supporting factors that must be considered.
The scale and wide-reaching scope of Industry 4.0 industrial operations require efficient, performant, and inclusive communication technologies. The open protocols MQTT Sparkplug B and Open Platform Communications Unified Architecture (OPC UA) are leaders in addressing standardization and interoperability in IIoT. Our intention is not to determine the superior communication solution but rather to discuss their characteristics and aspects worth considering.
MQTT is a lightweight communications transport protocol suited to limited-bandwidth networks and to applications in which multiple clients and devices share data in a many-to-many arrangement. It enables clients to publish and subscribe to data through cloud- or premise-hosted brokers that manage the data. MQTT alone does not define how data is organized in packets, known as the payload, which creates risks of interoperability issues and vendor lock-in due to vendor-specific payload formats.
MQTT Sparkplug B extends basic MQTT with a standardized payload format that users, integrators, and suppliers can use to define models for interchanging data, albeit not as well defined as OPC UA information models.
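As a sketch of what that standardization looks like, the snippet below composes a topic following the published Sparkplug B namespace, `spBv1.0/<group_id>/<message_type>/<edge_node_id>[/<device_id>]`. The metric dictionary is a simplified stand-in: real Sparkplug B payloads are binary Protocol Buffers, and the group, node, and metric names here are invented for illustration.

```python
# Sketch: building a Sparkplug B topic and a simplified metric payload.
# Real Sparkplug B payloads are binary Protocol Buffers; plain dicts are
# used here only to illustrate the standardized structure.

SPARKPLUG_NS = "spBv1.0"
MESSAGE_TYPES = {"NBIRTH", "NDEATH", "DBIRTH", "DDEATH",
                 "NDATA", "DDATA", "NCMD", "DCMD"}

def sparkplug_topic(group_id, message_type, edge_node_id, device_id=None):
    """Compose a topic per the Sparkplug B namespace:
    spBv1.0/<group_id>/<message_type>/<edge_node_id>[/<device_id>]."""
    if message_type not in MESSAGE_TYPES:
        raise ValueError(f"unknown Sparkplug message type: {message_type}")
    parts = [SPARKPLUG_NS, group_id, message_type, edge_node_id]
    if device_id:
        parts.append(device_id)
    return "/".join(parts)

def make_metric(name, value, ts_ms):
    """A simplified stand-in for one Sparkplug metric entry."""
    return {"name": name, "value": value, "timestamp": ts_ms}

# Hypothetical device data message for a press on Line 1:
topic = sparkplug_topic("PlantA", "DDATA", "Line1Gateway", "Press01")
payload = {"timestamp": 1700000000000,
           "metrics": [make_metric("Pressure/PSI", 87.2, 1700000000000)],
           "seq": 3}
```

Because every party agrees on the topic layout and metric structure, a subscriber can interpret data from any compliant publisher without vendor-specific decoding.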
Users of MQTT need to consider smart MQTT clients and brokers that go beyond just moving data around. A smart client or broker will handle multiple payload formats over the same connection, support emerging standards such as the IETF draft JSON Schema in addition to Sparkplug B, be able to automatically extract MQTT topic data into tags for consumption via standards like OPC, and handle propagation of data connection quality status.
If control decisions rely on MQTT data, it is imperative for the smart client or broker to address the management of missed and out-of-sequence messages, guarantee message order preservation, and address the handling of failed writes. If network downtime and lost data are a concern, the smart broker or client must support store and forward. For robust security, the smart client and broker must enable designation of read-only data.
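As a rough illustration of the sequence handling and store-and-forward behaviors described above, the sketch below reorders out-of-sequence messages by sequence number (Sparkplug B sequence numbers wrap at 256) and buffers publishes while disconnected. A production client or broker is far more involved; the class and variable names here are invented for illustration.

```python
from collections import deque

class OrderedDeliveryBuffer:
    """Hold messages that arrive out of order and release them only in
    sequence. Mirrors the Sparkplug B sequence-number wrap at 256."""

    def __init__(self, expected_seq=0):
        self.expected = expected_seq
        self.pending = {}    # seq -> payload, held until its turn
        self.delivered = []

    def receive(self, seq, payload):
        self.pending[seq] = payload
        # Release every consecutive message now available.
        while self.expected in self.pending:
            self.delivered.append(self.pending.pop(self.expected))
            self.expected = (self.expected + 1) % 256

class StoreAndForward:
    """Queue publishes while offline; flush them in order on reconnect."""

    def __init__(self, publish_fn):
        self.publish_fn = publish_fn
        self.connected = False
        self.backlog = deque()

    def publish(self, topic, payload):
        if self.connected:
            self.publish_fn(topic, payload)
        else:
            self.backlog.append((topic, payload))

    def on_connect(self):
        self.connected = True
        while self.backlog:
            self.publish_fn(*self.backlog.popleft())
```

Detecting a gap that never fills (a truly lost message) and propagating failed writes back to the publisher are the harder parts a real smart broker must also solve.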
The OPC UA standards are another means of providing a standardized framework for data exchange and communication between diverse industrial systems, devices, and applications. An evolution of the OPC Classic standards, the OPC UA standards define a secure integrated means of exchanging a wide range of industrial data, along with standardized information models with well-defined namespaces for interchange of data, often in specific vertical industries.
OPC UA information models are published and available from the OPC Foundation with XML definition files to rapidly empower client and server applications to share the industry-specific data in the model. Sophisticated users and integrators can define and publish their own information models to exchange plant data within their businesses, or with supply chain or other partners. This data encompasses not only the raw data but also historical and event data and metadata, including details about data sources, data quality, and interrelationships between data points.
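The idea that a modeled node carries metadata, quality, and relationships along with its raw value can be sketched in a few lines. This is not the address-space API of any particular OPC UA SDK, just an illustrative stand-in with invented names.

```python
from dataclasses import dataclass, field

@dataclass
class ModelNode:
    """Illustrative stand-in for an information-model node: the value
    travels with metadata, quality, and typed relationships."""
    browse_name: str
    value: object = None
    quality: str = "Good"          # e.g. Good / Uncertain / Bad
    engineering_unit: str = ""
    references: dict = field(default_factory=dict)  # ref type -> [nodes]

    def add_reference(self, ref_type, target):
        self.references.setdefault(ref_type, []).append(target)

# A boiler object that "HasComponent" a temperature variable:
boiler = ModelNode("Boiler01")
temp = ModelNode("OutletTemperature", value=182.5, engineering_unit="degC")
boiler.add_reference("HasComponent", temp)
```

A client that understands the model can browse from `Boiler01` to its components and know the unit and quality of each value without out-of-band agreements.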
If an OPC UA information model is available for your industry, you may consider using it. If your HMI, SCADA, historian, or MES does not yet support OPC UA information models, visual integration middleware solutions are available to transform the information model data into OPC UA data access or OPC Classic interfaces for integration into those systems.
For applications requiring many-to-many communications, OPC UA Publish/Subscribe (Pub/Sub) offers an efficient transport for data of all types, raw or organized, in OPC UA information models. OPC UA Pub/Sub is an extension of the OPC UA protocol. Unlike basic OPC UA, which uses a client/server model, it uses a secure, multicast-based model for simultaneous distribution of data to multiple subscribers. It can also be used over an MQTT transport, enabling businesses to benefit from the best of both worlds, leveraging the strong standardization of OPC UA information models with OPC UA Pub/Sub for data distribution, and the flexibility and simplicity of MQTT.
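As a sketch of that combination, the function below assembles a JSON NetworkMessage of the general shape defined by the OPC UA PubSub JSON mapping, ready to hand to any MQTT client for publishing. Treat the exact field set as illustrative rather than normative, and the publisher and writer identifiers as invented.

```python
import json
import uuid

def ua_data_network_message(publisher_id, writer_id, payload_fields):
    """Sketch of an OPC UA PubSub JSON NetworkMessage suitable for an
    MQTT transport. Field names follow the general JSON message mapping
    in the OPC UA PubSub specification; shape is illustrative."""
    return json.dumps({
        "MessageId": str(uuid.uuid4()),
        "MessageType": "ua-data",
        "PublisherId": publisher_id,
        "Messages": [{
            "DataSetWriterId": writer_id,
            "Payload": payload_fields,
        }],
    })

# Hypothetical gateway publishing two modeled values:
msg = ua_data_network_message("PlantA-Gateway", 101,
                              {"Temperature": 182.5, "Pressure": 87.2})
# The string in `msg` could now be published to an MQTT topic with any
# MQTT client library.
```

Subscribers that understand the underlying information model can decode the payload fields with full context, while the broker sees only ordinary MQTT traffic.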
In any implementation, there will be systems and devices to connect that do not natively implement MQTT/OPC UA in any form. Strategies to connect and integrate standard PLC and control protocols, legacy devices, and nonstandard protocol devices must be addressed early, either through device replacement or integration software. The MQTT and OPC standards have empowered a market supply of off-the-shelf software with visual configuration interfaces to connect just about anything with a serial or Ethernet connection and a documented, published protocol to the OPC and MQTT standards.
There is an active industry trend to create and enable a Unified Name Space (UNS) to empower efficient information interchange in support of real-time decision-making. The use of a UNS helps to break down information silos, allowing businesses to collect and analyze data from a wide range of sources, providing a more complete picture of operations. The UNS does not necessarily live in one place but rather in a distributed environment involving all applications.
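A common convention is to organize the UNS as an ISA-95-style hierarchy (enterprise/site/area/line/asset). The sketch below, with invented level and metric names, shows the idea of publishing values into such a namespace and browsing one branch of it; a real UNS would typically live in MQTT broker topics rather than an in-memory dictionary.

```python
def uns_path(enterprise, site, area, line, asset, metric):
    """Build an ISA-95-style Unified Namespace path. These hierarchy
    levels are a common convention, not a formal standard."""
    return "/".join([enterprise, site, area, line, asset, metric])

class UnifiedNamespace:
    """Minimal in-memory stand-in for a UNS: current values keyed by
    hierarchical path, fed by many publishers."""

    def __init__(self):
        self.values = {}

    def publish(self, path, value):
        self.values[path] = value

    def browse(self, prefix):
        """Return every value under one branch of the hierarchy."""
        return {p: v for p, v in self.values.items()
                if p.startswith(prefix)}

uns = UnifiedNamespace()
uns.publish(uns_path("Acme", "Plant1", "Packaging", "Line3",
                     "Palletizer", "Rate"), 42.0)
```

Any consumer, from a SCADA screen to a cloud analytics job, can subscribe to the branch it cares about without knowing which device or protocol produced the data.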
The tools used to gather information that goes into the model should empower the standardization efforts. OPC UA, UA information models, UA Pub/Sub, MQTT, and MQTT Sparkplug B are technologies that meet those requirements, enabling and empowering technologies to get the data into the UNS. Middleware supported by suppliers with deep expertise in the field who are willing to have open conversations is critical to supporting UNS implementations as the bridge between proprietary device protocols and open standard protocols.
Ultimately, the decisions on what standards and tools are best for an application should stem from considerations about the nature of devices, the protocols they support, the protocols other software and business systems in the application support, and the availability of well-supported, off-the-shelf protocol conversion middleware. Other considerations include robust security, scale, performance, flexibility to accommodate future growth, and the existence of ecosystems of devices, software, and suppliers that are committed to supporting them.
Orchestrating OT and IT
The convergence of operational technology (OT) and information technology (IT) is also vital for dismantling data silos, facilitating uninterrupted data exchange, empowering real-time analysis, supporting predictive maintenance and quality control, and more. The modern capacity to automate processes reduces the risk of human error and lets businesses concentrate on more value-added activities. A flexible, event-driven middleware and visual workflow tool can play a pivotal role in achieving these goals.
Modern implementers expect integration without writing custom code and a visual working environment. They want to construct basic and complex workflows to exchange data and automate processes that are unique to their operations, test them, and then deploy them at scale. Since Industry 4.0 applications often involve massive amounts of data sources, the solution should scale using templates and mass imports. Users should expect a tool to support a wide array of protocols for OT, cloud, and IT, including OPC UA, OPC Classic, MQTT, REST, ERP interfaces, and databases to ensure seamless data flow throughout the entire industrial ecosystem. By leveraging open standards-based tools such as OPC servers, they should also be able to reach IT systems using protocols such as SNMP.
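The event-driven, trigger-plus-steps pattern such tools implement can be sketched in plain code. Everything here, the trigger name, the steps, and the unit conversion, is hypothetical; a visual workflow tool generates the equivalent from drag-and-drop configuration rather than code.

```python
class Workflow:
    """Sketch of event-driven middleware: a trigger condition gates a
    chain of steps that transform data on its way between systems."""

    def __init__(self, trigger, steps):
        self.trigger = trigger   # callable(event) -> bool
        self.steps = steps       # list of callables(data) -> data

    def handle(self, event, data):
        if not self.trigger(event):
            return None          # event does not match; do nothing
        for step in self.steps:
            data = step(data)
        return data

# Example: when a (hypothetical) batch-complete event fires, convert
# grams to kilograms and shape a record for an ERP insert.
wf = Workflow(
    trigger=lambda ev: ev == "batch_complete",
    steps=[lambda d: {**d, "qty": d["qty_raw"] / 1000},   # g -> kg
           lambda d: {"batch": d["batch"], "qty_kg": d["qty"]}],
)
record = wf.handle("batch_complete", {"batch": "B42", "qty_raw": 1500})
```

Templates in a real tool amount to stamping out many such workflows with different tag bindings, which is why mass import matters at Industry 4.0 scale.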
When considering the choice and implementation of a visual workflow tool, businesses should reflect on the following aspects:
- Current integration challenges: Evaluate the IT and OT landscape to identify pain points that can deliver rapid time-to-value if eliminated, such as data silos, inefficiencies, or lack of real-time data access.
- Find rapid time-to-value: By starting with quick successes that bring meaningful business results, support and funding for the more complex, longer-term gains are more easily obtained. A visual workflow tool will empower rapid prototyping and success, and templates and imports should be available for scaling up to the larger implementations funded by the early wins.
- Scalability and adaptability: A suitable tool should be scalable and adaptable in performance, deployment options, and configuration, allowing businesses to accommodate future growth, emerging technologies, and evolving industry standards.
- Security and reliability: The tool should ensure secure data transmission and storage and provide reliable performance in diverse industrial environments.
- Ease of use and maintainability: A user-friendly interface, troubleshooting tools, and robust support for various protocols and systems can reduce the learning curve and long-term maintenance efforts.
Once the devices are connected and the IT and OT technologies are orchestrated, one key step remains: bridging transactional business and operational data, then cleansing and contextualizing it.
Bridging the data divide—transforming to value
Transforming data into information adds value by providing actionable insights and enabling informed decision-making, driving operational efficiencies, and supporting continuous improvement initiatives. We discussed OT data access and OT-to-IT orchestration, but the gap between operational and business or transactional databases and systems, often called the "data divide," also needs to be taken into account.
Bridging this so-called data divide requires a comprehensive solution that goes beyond the typical connectivity above. Essential functionality for such a solution includes cleansing, normalizing, and contextualizing, and providing data to various consumers. This work is more efficiently done closer to the data source, rather than sending potentially invalid or unaggregated data to the cloud for analytics, machine learning, and other value additions.
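A minimal sketch of that edge-side cleansing: drop missing samples and implausible values, then aggregate before anything leaves the site. The range limits and summary statistics chosen here are illustrative.

```python
def cleanse_and_normalize(readings, lo, hi, unit_scale=1.0):
    """Filter out dropouts (None) and out-of-range spikes, apply a unit
    conversion, and summarize. Returns None if nothing valid remains."""
    valid = [r * unit_scale for r in readings
             if r is not None and lo <= r <= hi]
    if not valid:
        return None
    return {"count": len(valid),
            "min": min(valid),
            "max": max(valid),
            "avg": sum(valid) / len(valid)}

# Raw pressure samples with a dropout (None) and an out-of-range spike:
summary = cleanse_and_normalize([87.1, None, 9999.0, 86.9], lo=0, hi=200)
```

Sending the four-field summary instead of every raw sample is exactly the bandwidth and quality win of doing this work near the source rather than in the cloud.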
When evaluating such data normalization, data cleansing, information delivery, and bridging solutions, the following items must be considered:
- Breaking down data silos: Data silos occur when information is confined within separate systems, making it difficult to access and analyze. An ideal solution will integrate disparate systems, centralizing data from multiple OT, IT, and business sources into a unified platform. By providing a holistic view of the plant floor, the solution should enable organizations to optimize operations, identify inefficiencies, and make more informed decisions.
- Streamlining decision-making and KPI delivery: Advanced analytics and reporting tools are crucial for organizations to gain valuable insights from their data. Empowered operators and managers should not have to wait on developers to create the reports they need to answer their questions. Real-time self-service dashboards that provide key performance indicators (KPIs) and other critical metrics should be a part of the solution, empowering relevant team members to monitor performance, identify trends and act.
- Reducing manual data collection: Manual data collection is time-consuming and prone to errors. The solution should automate data collection, reducing the need for manual intervention and ensuring accurate, consistent information. This empowers plant floor personnel to focus on more strategic tasks, increasing overall productivity. Where manual data collection is still required, the solution should provide an intuitive, browser-based, spreadsheet-like user interface to capture such data, complete with an audit trail and necessary security to protect data integrity.
- Scalability and flexibility: The solution should be designed to adapt to the unique needs of each organization. Ideally, the solution's modular architecture should allow for seamless integration with existing OT, IT, and business systems, and an ability to scale as business requirements evolve. The solution should support various communication protocols and interfaces, making it compatible with a wide range of devices and applications, including OT- and IT-related protocols, and transactional enterprise databases.
- The value of data analytics: Data analytics plays a crucial role in transforming raw data into actionable insights. An ideal solution will collect and centralize data near the source but also cleanse and contextualize it, ensuring that the information is accurate, relevant, and ready for analysis locally or in the cloud. Then, the solution can serve as a single source of truth for various data consumers, such as managers, engineers, executives, and other applications, that need to make data-driven decisions that improve the business.
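As one concrete example of the KPIs such dashboards deliver, Overall Equipment Effectiveness (OEE) is commonly computed as availability times performance times quality. A minimal sketch, with made-up shift numbers:

```python
def oee(planned_min, downtime_min, ideal_rate_per_min,
        total_count, good_count):
    """Overall Equipment Effectiveness: availability x performance x
    quality, each expressed as a fraction of the ideal."""
    run_time = planned_min - downtime_min
    availability = run_time / planned_min
    performance = total_count / (ideal_rate_per_min * run_time)
    quality = good_count / total_count
    return availability * performance * quality

# Hypothetical shift: 480 planned minutes, 60 down, ideal 10 parts/min,
# 3780 parts made, 3591 good.
score = oee(480, 60, 10, total_count=3780, good_count=3591)
```

A self-service dashboard recomputes this continuously from the cleansed, centralized data, so operators see the effect of a slowdown or scrap run without waiting on a developer.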
Success in Industry 4.0 initiatives requires consideration of many factors. Device connectivity, OT to IT orchestration, bridging data divides to business/transactional systems along with cleansing and contextualization before performing advanced analytics are key enablers that must not be overlooked.
Software Toolbox stands as an experienced partner for clients and integrators seeking to address these key enablers. Established in 1996, the company offers an extensive array of open, standards-based tools that function collectively as solutions or as value-enhancing supplements to enterprise vendor applications. Our versatile offerings, including communication drivers, OPC server and client components and toolkits, and data visualization tools, allow for seamless integration of disparate systems, enabling clients to unlock the true value of their data. Software Toolbox’s commitment to providing outstanding customer support and training ensures that clients have the necessary resources to implement, maintain, and expand their Industry 4.0 capabilities. Ultimately, Software Toolbox’s expertise and dedication to client success serve as the foundation for the digital transformation that fuels the future of industrial automation.
Images courtesy of Software Toolbox Inc.
This feature comes from the ebook AUTOMATION 2023 Volume 3: IIoT & Industry 4.0.