New lamps and old lamps together - when legacy can make bleeding edge work much better
- New technology needn’t be assumed to supersede legacy systems, especially if the starting position is that new and old are used together, each enhancing the capabilities and value of the other
With cloud services now mainstream, TIBCO used its recent NOW conference to fill out the character and content of its own growing offering, and to extend its back-story in ways that deliver fresh backwards compatibility. This allows legacy on-premise tools to move forward into a new cloud-native status, giving long-standing users of those tools a richer palette of cloud tools with which to work.
The fractured nature of time-zone online conferences turned into an option to beard a couple of the company’s senior lions – Chief Operating Officer Matt Quinn and Chief Technology Officer Nelson Petracek – in their respective dens for a discussion on what TIBCO is doing, and where it is going.
On the subject of what it is doing, particularly in its growing cloud services, Quinn took the ball and ran with it. The company has been building out its range of tools for some four years now, and is continuing to come up with cloud-native versions of offerings that have long been key parts of its integration pitch. This year has seen one of its longest-standing tools, the Enterprise Message Service (EMS), gain a cloud-native sibling, EMS-X, he said:
“We've sold a truckload of EMS for many, many years. That's been historically only on-premise, but now EMS-X allows us to deploy to the cloud. Not only that, it is going to be available as a service, which will be the other big aspect of it. TIBCO’s early success in the SOA era was largely due to two products, Business Works and EMS, with Business Works as the engine and EMS the pipes. Business Works has been up in the cloud now for a couple of years, called TIBCO Cloud Integration. And EMS-X kind of completes that story for customers looking to start that journey to SaaS.”
The company has also been busy getting a cloud native version of WebFocus ready to roll. The company acquired WebFocus creator Information Builders towards the back end of last year, and has now managed to transform and integrate that firm’s tech.
According to Quinn, this has involved a lot of effort around its roadmap, specifically around cloud delivery. He sees this as a particularly important development for WebFocus customers, as it will unlock new opportunities to work more directly with the wider TIBCO tools.
I asked him what he thought of the position put forward earlier this year by VC firm Andreessen Horowitz, which suggested that companies that start in the cloud rarely stay there, often moving back on-premise because cloud costs are difficult to manage. Does this, I asked, move them more towards the breadth of tools and services available from TIBCO? His response:
I am always suspicious. When people zig when everyone else is zagging, I always assume that they've got something to sell me. Go back 15 or 16 years, there were a lot of people who were early movers into the cloud, when cloud really started to become a thing, and a lot claimed that they were going to move all in on cloud. The reality is always somewhere in the middle.
So in his view, going into the cloud to save money is the wrong starting position, unless the operation can quickly be scaled up to a very large one, at which point a business can start to see indirect savings, such as not having to directly manage infrastructure or hire the experts needed to do it. Quinn also thinks people forget that the tooling and underlying services that come with SaaS generally lower the barrier to entry for productivity:
I look at the cloud as the Swiss Army knife. If I need something that's really, really specific in one way, I might find it. But if I just needed the Swiss Army knife, which a lot of people do, then the performance characteristics and the cost characteristics, and everything else like that, can be superior, but you've got to look at it.
One area of growing importance to the company is Poly Cloud operations, in which not just complete user operations are run on the cloud services best suited to their needs, but individual processes are distributed across a range of appropriate cloud services. Quinn is seeing larger users starting to think explicitly along such lines, though he pointed out that many more are closer to it than they realise: simply by choosing one service provider over another for a particular task, they are implicitly learning to manage multiple environments, contracts, and vendor relationships:
We have companies out there now that are at the highest level, telling their board exactly how much is going to be in the cloud, and how much is going to be in colocation, and how much is going to be in the data centers. And I think that's a really good way of explaining to the board: have the costs shift between those various buckets.
Labs for play time and real developments
One of the important areas for future development of the TIBCO product line is the emergence of edge computing, not least because one of its key requirements is set to play to one of the company’s long-standing strengths – integration. For CTO Nelson Petracek, a significant tool in this process is the company's focal point for new developments, TIBCO Labs, which has just been enhanced with the introduction of the Labs Gallery. This provides interactive demos and a chance for users to play with new developments in IoT, blockchain, edge analytics, Augmented Reality, and composable apps. The interactive element allows users to test current Labs projects on tailored use cases, join a development program, or recommend new areas for innovation:
Labs has a few different goals. One, of course, is to work with customers and partners, to help them address different business problems. The second aspect is based on those conversations (with users): we develop projects, usually open source, and then we see where they go from there.
One such project that has now emerged into the light of day is API Management, based on Mashery, which was acquired from Intel some five years ago and about which little has been heard since. Edge computing is an area that is heavily geared to APIs and their management, and according to Petracek, the way APIs are being used has broadened beyond delivering functions or microservices. APIs are starting to be used on top of data: today, users want to expose data as APIs, which fits rather well with edge computing.
He sees Mashery, now part of the Connect portfolio, as having been a bit ahead of its time because it started as a SaaS product. It now also runs on-premise, which fits well with certain types of APIs, especially those associated with data. This has particular relevance to edge computing, where there is as much affinity to on-premise operations as to cloud operations, and where moving very high volumes of small datasets – often metadata – around highly distributed hybrid on-prem/cloud environments will be a key component:
It's bringing it into our cloud platform, it's tying everything nicely together, it's recognising the fact that APIs are not just about services and functions, it's a much broader area now. API Management helps with how do I model APIs, how do I build these APIs, and how do I deploy and manage them? And how do I do that anywhere and everywhere across all these different form factors?
API Management provides an abstraction layer that makes it possible to transform or represent data in a more consumable form before it is exposed as a set of APIs. It also allows users to apply certain types of transforms, analytics, data quality checks and data masking, providing control over areas such as access and who sees what data.
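To make the idea concrete, here is a minimal sketch of the kind of transform-and-mask step described above. This is an illustrative assumption, not TIBCO's actual implementation; the field names, masking rule and helper functions are all hypothetical:

```python
# Hypothetical sketch of an API abstraction layer: transform an internal
# record and mask sensitive fields before exposing it via an API.

def mask_email(email: str) -> str:
    """Hide the local part of an e-mail address, keeping the domain."""
    local, _, domain = email.partition("@")
    return local[0] + "***@" + domain

def to_api_view(record: dict) -> dict:
    """Reshape an internal record into a more consumable API form."""
    return {
        "customer": record["name"].title(),         # presentation transform
        "contact": mask_email(record["email"]),     # data masking
        "region": record.get("region", "unknown"),  # data-quality default
    }

internal = {"name": "jane doe", "email": "jane.doe@example.com"}
print(to_api_view(internal))
# → {'customer': 'Jane Doe', 'contact': 'j***@example.com', 'region': 'unknown'}
```

The point of the abstraction layer is that consumers of the API only ever see the masked, transformed view, while access to the raw record stays under central control.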
This can also play an important role in managing how data – and, more importantly, metadata about the data held out at the edge – flows through the many process stages between individual sensors, intermediate management and analysis hubs, and the regional and central data centers that will make up the information and business management heart of any business.
The NOW conference also saw the launch of another Labs project as a product – Cloud Discover. This adds process mining to the portfolio in response to user requests, at a time when process mining is becoming a hot topic even though the technology has been around for some time. Petracek said:
Now organizations, as they go through their digital transformation strategies, as they're doing things like building digital twins, one of the things they need to understand is ‘what are my processes doing currently?’ Not how they're documented, but how are they really working? And how many variations of those processes do I have? I didn't realise there's that many ways to create a purchase order, for example.
This is becoming increasingly important, not only in ensuring processes are both optimal AND effective, but also in establishing whether they are correct. In many areas of business, governance is a vital element and, as Petracek observed, if governance says there is one way of running a process, that is the way it must be carried out, even if there are 19 much better ways of doing it. Process mining helps find such variations, saving businesses significant aggravation should they be unearthed some other way.
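The core of what Petracek describes – discovering how many distinct variants of a process actually occur – can be sketched in a few lines. This is a toy illustration, not Cloud Discover itself; the event log and activity names are invented for the purchase-order example above:

```python
# Toy process-mining sketch: group an event log by case, then count how
# many distinct activity sequences (process variants) actually occur.
from collections import Counter

event_log = [  # hypothetical purchase-order events, ordered per case
    ("PO-1", "create"), ("PO-1", "approve"), ("PO-1", "pay"),
    ("PO-2", "create"), ("PO-2", "pay"),                 # skips approval
    ("PO-3", "create"), ("PO-3", "approve"), ("PO-3", "pay"),
]

traces = {}
for case_id, activity in event_log:
    traces.setdefault(case_id, []).append(activity)

variants = Counter(tuple(trace) for trace in traces.values())
for variant, count in variants.items():
    print(" -> ".join(variant), ":", count, "case(s)")
```

Run against a real log, the same grouping would reveal both the documented path and the undocumented shortcuts – here, the case that skips the approval step – which is exactly the governance gap process mining is meant to expose.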
There seems to be a fundamental law emerging as cloud becomes mainstream, only to be extended in scope by edge computing, while legacy systems continue to chunter away as always: nu-tec never replaces old-tec. They simply enhance and extend the scope of each other. TIBCO is a case in point. A doyen of IT integration and related tools that have glued systems together effectively for 25 years or more, it is emerging as possibly one of the key players in addressing a core issue: how the 'soup' of the farthest outer reaches of the network edge and the 'nuts' of the mega-cloud and on-premise data centers get screwed together effectively across an ever-extending distance.