HiveMQ CTO – why MQTT is gaining traction in industrial automation and IT interop

George Lawton · January 29, 2024
Summary:
HiveMQ CTO weighs in on why MQTT is picking up steam for enterprise IoT deployments in factory automation, logistics, and pharmaceutical workflows. Key drivers include a unified namespace and tested industrial automation interoperability.


Over the last thirty years, researchers, vendors, and enterprises have developed over a dozen different IoT messaging specifications and standards. Increasingly, enterprises are turning to MQTT, a protocol initially developed in 1999 to monitor oil pipelines. 

The protocol has seen a surge of interest in factory automation, logistics, pharmaceutical workflows, connected cars, building automation, and connected appliances at scale. That’s according to HiveMQ CTO Dominik Obermaier, who began exploring ways to use MQTT to connect IoT apps over a decade ago. He says:

I think until 2018, we didn't do any industrial, it was typically IoT use cases. And then, since 2018, and especially since 2022, it really added more steam. A lot of industry consortia picked up MQTT as an alternative to OPC UA, which is one of the dominant open protocols there. The thing about industrial spaces is that there is so much proprietary technology because no vendor wants to be interoperable with somebody else. And then our customers said, ‘Let's explore MQTT because it's an open protocol. It's simple. So, it should be easy to get started, we don't need a lot of overhead.’

According to Transforma Insights research, the number of IoT connections is expected to grow from 15.1 billion in 2023 to 29.9 billion in 2030 for enterprise applications such as track and trace, fleet monitoring, and smart meters. Obermaier says MQTT is becoming a de facto standard owing to its scalability to as many as 200 million clients, low client footprint, security, reliability over sporadic networks, and support for bi-directional connectivity. Additionally, it is gaining ground as a popular complement or replacement for the OPC UA protocol widely used in factory automation.
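
Those properties show up directly in the programming model. Below is a minimal sketch of bi-directional publish/subscribe using the open source paho-mqtt Python client; the broker address and topic names are hypothetical placeholders rather than anything specific to HiveMQ.

```python
# Minimal MQTT pub/sub sketch (pip install paho-mqtt). The broker
# address and topics below are hypothetical placeholders.
import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, reason_code, properties):
    # Subscriptions are (re)established on every connect, which is part
    # of what makes MQTT reliable over sporadic networks.
    client.subscribe("plant/line1/#", qos=1)

def on_message(client, userdata, msg):
    print(f"{msg.topic}: {msg.payload.decode()}")

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_connect = on_connect
client.on_message = on_message
client.connect("broker.example.com", 1883)

# The same lightweight connection can also publish, so the link is
# bi-directional rather than request/response.
client.publish("plant/line1/cell3/temperature", "21.7", qos=1)
client.loop_forever()
```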

Over the last couple of years, HiveMQ has seen a surge of interest in MQTT beyond its traditional use cases, as a nervous system bridging IT and operational technology (OT) silos. For example, it can help streamline workflows spanning ERP, CRM, manufacturing execution systems (MES), supply chain management, and the supervisory control and data acquisition (SCADA) systems that control and monitor industrial devices and processes. Thanks to an architecture more efficient than HTTP’s, it can also save millions of dollars per year in communication costs.

HiveMQ has started hiring many experts in industrial automation to support this trend, since it represents one of the largest opportunities for the 25-year-old protocol. At the same time, MQTT remains widely used behind the scenes in mobile apps, connected cars, and home appliances.

Unified Namespace

In 2005, Walker Reynolds, now president of Intellic Integration, was looking for a better way to interconnect data across multiple IoT infrastructures at a salt mining operation in Texas. At the time, workers had to drive to the control rooms in different facilities. He conceived a Unified Namespace (UNS) to automatically bring the required data into Excel spreadsheets customized for various workflows. He started out using Dynamic Data Exchange (DDE) to automate the flow of data from different systems into the spreadsheets.

In 2014, Reynolds started experimenting with MQTT, and the idea caught the attention of the wider MQTT community. Cirrus Link Solutions, which builds industrial automation solutions, released Sparkplug v1 as an open source specification for this kind of namespace, and handed off stewardship to the Eclipse Foundation in 2019. Since then, the specification has gained broad traction across the industrial automation community.
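
Part of Sparkplug’s appeal is that it removes ambiguity by standardizing both the MQTT topic layout and the payload encoding (protocol buffers). Here is a small sketch of the Sparkplug B topic namespace; the group, edge node, and device names are hypothetical.

```python
# Sparkplug B fixes the topic layout on top of MQTT:
#   spBv1.0/<group_id>/<message_type>/<edge_node_id>[/<device_id>]
# Message types include birth/death certificates (NBIRTH, DBIRTH, ...)
# and data messages (NDATA, DDATA).
def sparkplug_topic(group_id, message_type, edge_node_id, device_id=None):
    parts = ["spBv1.0", group_id, message_type, edge_node_id]
    if device_id is not None:
        parts.append(device_id)
    return "/".join(parts)

# A hypothetical device announcing itself, then reporting data:
print(sparkplug_topic("PlantA", "DBIRTH", "Line1Gateway", "Press07"))
# -> spBv1.0/PlantA/DBIRTH/Line1Gateway/Press07
print(sparkplug_topic("PlantA", "DDATA", "Line1Gateway", "Press07"))
# -> spBv1.0/PlantA/DDATA/Line1Gateway/Press07
```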

UNS also represents a massive paradigm shift for the industrial automation industry, which has long been comfortable integrating devices and apps directly with one another. The traditional model imagined a discrete hierarchy spanning field devices, automation, manufacturing operations management, and business planning and logistics, in which each layer could only talk to the one directly above or below it. UNS upends this model by allowing any app, control system, or device to talk to any other.
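
In MQTT terms, the namespace is usually modeled as a semantic topic hierarchy, often along ISA-95 lines (enterprise/site/area/line/cell). A brief illustration using paho-mqtt’s topic-matching helper; the organization names are hypothetical.

```python
import paho.mqtt.client as mqtt

# A hypothetical ISA-95-style UNS path: enterprise/site/area/line/cell/metric
topic = "acme/houston/stamping/line1/press07/temperature"

# '+' matches exactly one level and '#' matches the rest of the tree, so
# any authorized consumer can tap the namespace at whatever granularity
# it needs, with no point-to-point integration required.
for sub in ("acme/houston/+/+/+/temperature",  # every temperature at one site
            "acme/houston/stamping/#",         # everything in one area
            "acme/#"):                         # the entire enterprise
    print(sub, "->", mqtt.topic_matches_sub(sub, topic))  # True in all cases
```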

As it turns out, connecting the silos is as much a cultural battle as an architectural one, Obermaier explains: 

I'm always surprised when I am with a customer of how big the clash actually was. It was a cultural shift because the people working in factories are very different from those working in IT. So, it's very hard to get data from a manufacturing line into the data center and vice versa because, on one side, you have vendors who are building silos. On the other side, you also have a cultural shift. If you look at organizational structures, you have people who are running the factories and people who are running the company. And usually, even until the VP level, they don't talk to each other.

Of course, some security and controls are warranted. OT teams don’t want business executives pushing their finely tuned equipment into overdrive to meet a pending deadline, and business teams don’t want low-level employees accessing sensitive company information. UNS shines in providing a way to model, store, and secure data across systems at a high level of granularity. The MQTT broker in the middle lets teams express the appropriate data-sharing rules in one unified software layer rather than in one or more custom integrations between layers. The upshot is that teams can craft a new data connection in a few weeks that previously required eighteen months.
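
What those centralized rules look like varies by broker (HiveMQ, for example, handles authorization through its extension framework), but the idea can be sketched as topic-scoped permissions evaluated in one place. The client roles and topic patterns below are hypothetical.

```python
import paho.mqtt.client as mqtt

# Hypothetical topic-scoped permissions, enforced centrally at the
# broker instead of inside each point-to-point integration.
ACL = {
    "scada-line1":   {"read": ["acme/houston/+/line1/#"],
                      "write": ["acme/houston/+/line1/#"]},
    "erp-reporting": {"read": ["acme/#"], "write": []},  # read-only, plant-wide
}

def allowed(client_id, action, topic):
    # Permit the action only if one of the client's patterns matches.
    return any(mqtt.topic_matches_sub(pattern, topic)
               for pattern in ACL.get(client_id, {}).get(action, []))

print(allowed("erp-reporting", "read",  "acme/houston/stamping/line1/oee"))  # True
print(allowed("erp-reporting", "write", "acme/houston/stamping/line1/cmd"))  # False
```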

OPC UA

The industrial automation community has a long history of trying to make it easier for machines to talk to various apps. The Open Platform Communications (OPC) Foundation began work on its Unified Architecture (OPC UA) standard in 2003, which now defines standardized data models for over sixty types of industrial equipment. The OPC historical timeline notes that OPC was conceived at a time when Microsoft operating systems dominated the industrial automation landscape, and automation vendors began to use Microsoft’s COM and DCOM. The original standards used Microsoft’s Object Linking & Embedding (OLE) for Process Control, which was the initial meaning of the OPC abbreviation.

However, Obermaier contends that while OPC standards are open, implementations are not necessarily interoperable. This can happen when equipment or software vendors add proprietary pieces on top to lock in customers, and it is not unique to the OPC community. HiveMQ is a member of the OPC Foundation, but Obermaier raises concerns about some of the ways OPC standards get implemented:

Taking open standards and then plugging something proprietary on top of them is wrong. And this is also something I'm very passionate about. It's just plain wrong. I haven't seen a single customer say, ‘Oh, you know what, it would be awesome if this open protocol would have something proprietary, so we can never switch vendors.’ I've never seen a customer say that. But vendors still tried to do that. And this is happening with OPC UA PubSub. This is what many of these hardware vendors are trying to do. I think we need to understand what worked for the open internet since the 80s and the lessons learned and apply them to a business because the lessons are the same. And I think some companies get it. And I think this will be a competitive advantage, and some people will get burned, and this will cost a lot of money.

He points to the example of one large automotive company that reached out after struggling to get live production data from its factories around the globe. Although the company was a staunch supporter of OPC UA, it could not easily pull comparable data from factory equipment because the programmable logic controller (PLC) stacks varied by country. It turned to MQTT and a UNS to help consolidate information pulled from the different technology stacks.

The future of MQTT

The future of MQTT will be driven by enterprises looking for better ways to bridge their data silos and take advantage of innovations in AI. Obermaier says the company’s decision to focus on infrastructure, which happens to support the needs of industrial automation, ended up being a happy accident. He explains:

When you talk about our company vision and our mission, MQTT isn't even part of that. MQTT is something we focus on, but what we're really building is a central nervous system. The overall markets are not good for many companies, but in our case, companies are still investing in this kind of new architecture. So, this is something that was, in hindsight, pretty good for us as an infrastructure company.

A new version of MQTT for sensor networks (MQTT-SN) is coming that is especially suited to low-power wide-area networks, 5G, and autonomous driving use cases. Obermaier is also seeing a lot of tailwinds driven by interest in AI. He believes AI is overhyped in the short run but underappreciated in the long run. He says:

I personally don't care about ChatGPT and other things so much at the moment, as other people do. What I do care about is that we need more and more data in order to make autonomous decisions based on data. And for now, humans are playing around with generative AI, but I believe the main AI interfaces will continue to be machines, not humans. Of course, the human wants the sexy stuff, but one very important element is the machine data. 

It's getting more and more important to have the data where it needs to be as soon as possible. A lot of groundbreaking technology is still missing, especially when it comes to streaming AI and other things. I'm hoping we're not the company that needs to solve that. But if nobody else is solving that, we will be solving that in the market. What I know is HiveMQ is a central nervous system for feeding all of the data for the AI world.

My take

Two things strike me about the rise of MQTT buoyed by UNS. First, it is already helping enterprises to unify data naming conventions across disparate things and enterprise apps. Second, the success of this combination speaks to the value of interoperability rather than just openness. 

In the early 1990s, I worked on the nervous system for the Biosphere II project in Arizona, where we locked eight people in a self-contained world for two years, materially isolated from Earth (Biosphere I). At one point, I was tasked with bringing in data from about 2,500 sensors connected variously to gateways from Motorola or Hewlett-Packard, each using different naming conventions to boot. It took the better part of two months just to harmonize the naming conventions. And I still had my doubts about many of the sensors, such as moisture sensors that could sometimes vary wildly from ones a few feet away or at different depths.

Everything was wired up over a coaxial Ethernet network connected by dedicated hubs, which was another nightmare to manage and configure. A few months after I left, I went to my first Interop conference, where I was blown away by the size of Cisco’s coming-out party, thanks to the massive interest in and growth of the emerging TCP/IP protocol and the routers it required. The thing driving this growth was the focus on large-scale interoperability plugfests for the somewhat untested TCP/IP protocol, where everyone had to demonstrate their kit working together.

Around the same time, telcos, government agencies, and others were pushing for the Open Systems Interconnection (OSI) model with seven layers of openness. Unfortunately, the OSI promoters did not test out the interoperability of their stack in the same public way as the TCP/IP community. Over the intervening three decades, TCP/IP has evolved into the de facto Internet standard, while the OSI stack has faded into memory, other than its seven-layer diagram. 

To be clear, the OPC Foundation website says:

OPC is the interoperability standard for the secure and reliable exchange of data in the industrial automation space and in other industries. It is platform independent and ensures the seamless flow of information among devices from multiple vendors.

It is important to note that the OPC Foundation released OPC UA PubSub in 2018 to “encourage further interaction between OPC UA and communications protocols such as MQTT.” Reynolds, who widely uses OPC UA as a base layer in industrial automation products, also expresses frustration about how the standard gets implemented in actual equipment and software. This is why he sees MQTT taking off as an alternative, particularly when connecting to cloud, ERP, and CRM apps. In this 2023 interview, Reynolds says:

The problem is, in the implementation of OPC UA, they made so much of the standard optional. Many of those features are never implemented, especially in the hardware layer.

The fact that enterprises are looking outside the OPC UA community for industrial automation interoperability suggests it may not have lived up to some aspects of this promise. If this is indeed the case and the OPC community wants to keep pace with MQTT and UNS, it might not hurt to hold a few more open industrial automation plugfests like Interop. 
