Tibco sets out to unify the way to AI/Cloud/Edge flexibility
- Summary:
- If the buzzwords du jour are 'fail fast' and 'fail often', and if failure is now (as NASA has for years insisted) just a qualified success, users are going to need to work in an environment where the 'suck it and see' approach is easily engineered. And that is what Tibco is setting out to provide.
I am not personally aware of any market survey that has identified the split between data analytics applications with a long-term, constant-use lifecycle and those that are one-off, or at best sporadic-use, tools serving a specific need at a specific time. Perhaps more to the point, what would the split be if users had the option to readily create such applications, run them, and then put them to the sword with a clear conscience?
My best guess is that the percentage looking for that type of analytics offering would be high, if there were indeed an easy way of creating such applications. And that has been the trouble: pulling together all the data files that might be required for an analysis is often an extremely difficult job, especially if the person doing the work is not the owner of the domain where the data is stored.
Wading through long lists of data files identified only by obscure, domain-specific naming conventions is normally enough to stop a project dead in its tracks, and will certainly delay its development and implementation. Yet there is an increasing need in practical data analytics for what some refer to as a 'three-day app': thought up and scoped out on a Wednesday, produced on a Thursday, run on a Friday with the results assessed, and quietly erased sometime over the following weekend.
For most businesses this remains on the 'nice idea' list, but is for now unachievable. The time taken to identify the data required to make any analysis of real value is still normally measured in weeks, not days or even hours.
Short-circuiting this issue is one of the main new developments announced at this week's Tibco Now conference, held in London: the new Cloud Metadata toolset. Built on EBX, it is being launched as a lightweight SaaS offering on the TIBCO Connected Intelligence Cloud, with beta testing on the service due to start this coming November.
Unify, then unify some more
It is part of a new thrust the company brought out front and centre at Tibco Now: the need for users to unify their operations around their data, and therefore to have the right tools both to exploit it better now and to develop and engineer new ways of exploiting that data into the future.
Users are getting used to the idea of cloud-native applications that exploit the capabilities and functionality of cloud services, and often see them described as something very different from the more established legacy applications they already use. This has the potential to create yet another silo model that CIOs need to address. It also runs up against other terminology, such as hybrid cloud, which is assumed to provide an environment where legacy and cloud-native applications can co-exist.
Tibco looks to take this a step further, however. Its key objective is to build a cloud-native infrastructure, with tools specifically designed for building cloud business and process management from the ground up as well as from the top down. And, accepting that cloud-native applications are predominantly open source, the company is adopting the open-source model for much of its newer development work.
For example, the new Messaging Manager 1.0.0, which includes an Apache Kafka Management Toolkit, provides a command-line interface with predictive intelligence that simplifies the setup and management of Kafka, delivering unified, real-time data feeds with high throughput and low latency. The messaging components use a common management plug-in for easier continuous integration and deployment.
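For readers less familiar with Kafka, the sketch below shows what such a unified feed looks like at code level. To be clear, this is not Tibco's Messaging Manager toolkit, whose commands are not documented here; it is a minimal Go example using the open-source segmentio/kafka-go client, and the broker address and topic name are illustrative assumptions.

```go
// Minimal Kafka producer/consumer sketch using segmentio/kafka-go.
// NOT the Tibco Messaging Manager toolkit; broker address and topic
// name are invented for illustration.
package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"github.com/segmentio/kafka-go"
)

func main() {
	ctx := context.Background()

	// Producer: publish a reading onto the shared, real-time feed.
	writer := &kafka.Writer{
		Addr:     kafka.TCP("localhost:9092"), // assumed local broker
		Topic:    "sensor-readings",           // assumed topic name
		Balancer: &kafka.LeastBytes{},
	}
	defer writer.Close()

	if err := writer.WriteMessages(ctx, kafka.Message{
		Key:   []byte("machine-42"),
		Value: []byte(`{"temp_c": 71.3}`),
	}); err != nil {
		log.Fatalf("write: %v", err)
	}

	// Consumer: any number of downstream apps can read the same feed
	// independently, which is what makes the feed "unified".
	reader := kafka.NewReader(kafka.ReaderConfig{
		Brokers: []string{"localhost:9092"},
		GroupID: "analytics-app",
		Topic:   "sensor-readings",
		MaxWait: time.Second,
	})
	defer reader.Close()

	msg, err := reader.ReadMessage(ctx)
	if err != nil {
		log.Fatalf("read: %v", err)
	}
	fmt.Printf("key=%s value=%s\n", msg.Key, msg.Value)
}
```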
The Internet of Things (IoT) also features in the unifying play Tibco is making: the company has introduced machine-to-machine communication via the OPC Foundation's Unified Architecture (OPC UA) in its Streaming software. This is particularly visible in the latest version of Project Flogo. Integrating with Tibco's existing solutions, the Project Flogo Streaming User Interface is claimed to allow developers to build resource-efficient, smarter real-time stream-processing apps at the edge or in the cloud, improving the productivity of expert IT resources.
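Again for illustration rather than as Tibco's own API, the following self-contained Go sketch shows the general shape of resource-efficient stream processing at the edge: readings are aggregated in a tumbling window so that only summaries leave the device. The sensor source and window length are invented for the example and bear no relation to Project Flogo's actual interfaces.

```go
// Generic edge stream-processing sketch: tumbling-window aggregation
// of simulated machine readings. Not Project Flogo code.
package main

import (
	"fmt"
	"math/rand"
	"time"
)

type reading struct {
	device string
	tempC  float64
}

// sensorSource simulates machine-to-machine readings arriving at the edge.
func sensorSource(out chan<- reading) {
	for {
		out <- reading{device: "machine-42", tempC: 60 + rand.Float64()*20}
		time.Sleep(200 * time.Millisecond)
	}
}

func main() {
	in := make(chan reading, 64)
	go sensorSource(in)

	window := time.NewTicker(5 * time.Second) // tumbling 5-second window
	var sum float64
	var count int

	for {
		select {
		case r := <-in:
			sum += r.tempC
			count++
		case <-window.C:
			if count > 0 {
				// Only the aggregate leaves the edge device, keeping
				// bandwidth use and cloud-side load low.
				fmt.Printf("window avg temp: %.1f C over %d readings\n",
					sum/float64(count), count)
			}
			sum, count = 0, 0
		}
	}
}
```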
Lastly, to further strengthen Tibco's contribution to the open-source community, the firm introduced an open-source specification, CatalystML, to capture data transformations and consume machine-learning artefacts in real-time, high-throughput applications.
AI as the foundation
There is, of course, something circular about the combination of unification strategies and AI: the former provides the platform on which the latter can thrive and be far more exploitable, while the mere existence of the latter makes the former an imperative for any business. The company now plans to ingrain AI into everything it does.
For example, the AutoML extension for Tibco Labs can speed the development and selection of AI workflows, while new Process Mining capabilities make it possible for customers to discover, improve, and predict process behaviour from data event logs produced by operational systems.
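Process mining sounds more exotic than its core mechanic is: at its simplest, it derives a process map from an event log by counting which activity directly follows which within each case. The Go sketch below illustrates that idea with invented order-processing events; it is not Tibco's Process Mining implementation.

```go
// Minimal process-discovery sketch: build a "directly-follows" map from
// an event log. The events are invented for illustration; real logs
// would come from operational systems.
package main

import (
	"fmt"
	"sort"
)

type event struct {
	caseID   string
	activity string
}

func main() {
	// Events are assumed to be ordered by timestamp within each case.
	log := []event{
		{"order-1", "Create"}, {"order-1", "Approve"}, {"order-1", "Ship"},
		{"order-2", "Create"}, {"order-2", "Reject"},
		{"order-3", "Create"}, {"order-3", "Approve"}, {"order-3", "Ship"},
	}

	// Count directly-follows transitions, e.g. "Create -> Approve".
	last := map[string]string{}
	counts := map[string]int{}
	for _, e := range log {
		if prev, ok := last[e.caseID]; ok {
			counts[prev+" -> "+e.activity]++
		}
		last[e.caseID] = e.activity
	}

	// Print the discovered process map, most frequent transitions first.
	keys := make([]string, 0, len(counts))
	for k := range counts {
		keys = append(keys, k)
	}
	sort.Slice(keys, func(i, j int) bool { return counts[keys[i]] > counts[keys[j]] })
	for _, k := range keys {
		fmt.Printf("%-20s x%d\n", k, counts[k])
	}
}
```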
My take
There is something about the word 'unify' which smacks to me of restriction, tight control and the loss of agility and flexibility. But the other side of that coin is that if flexibility is actually to work – if application A is to work with Service X just because it might be the answer to all a company’s prayers – then the ability to have a unifying underpinning to a company’s IT infrastructure is essential. It is the only way such experiments have a chance of working. And with the growth of the cloud, the coming of computing out to the edge using 5G comms, and the real arrival of proper AI, such experiments will surely become the order of the day, not an occasional outburst of lunacy.