LzLabs defines software to eat the world's legacy mainframes

By Phil Wainewright, July 6, 2016
Swiss startup LzLabs releases software that it says will let enterprises move legacy mainframe applications and datasets unchanged to the cloud

Just as medieval alchemists sought in vain to discover the secret of transmuting base metal into gold, today's information technologists have struggled for years to transform legacy mainframe applications to new platforms.

Many of the world's largest enterprises still run vital aspects of their business on software that was written many decades ago, by programmers who have long since drawn their pensions. Locked into costly mainframe hardware platforms, these archaic systems of record are becoming more and more challenging to connect into the modern digital systems that engage today's consumers with ever-faster, more responsive interactions.

At last, though, there's a glimmer of light at the end of the tunnel. The mainstreaming of application containerization technology is starting to provide the tools to move applications off the original mainframe hardware platforms into virtual containers that run on modern cloud computing platforms, and therefore on far cheaper, commodity hardware. In an interview published this week, Accenture's head of technology services confirmed to me that the IT services giant is already helping its clients to do this:

It is even possible today: I can take an IBM mainframe [application] and containerize that, move it into the cloud and run it on x86. It is possible, we are running it for three clients.

Trouble is, these are complex projects which involve recompiling the original application and repairing or refactoring elements of the original code so that it will work in the new environment. There's no magic philosopher's stone that allows for a straight port from the old mainframe to the shiny new cloud container. Until now, that is, according to LzLabs, a five-year-old startup based in Switzerland.

Software defined mainframe

This week sees the first release of what LzLabs calls the "first ever software defined mainframe." The company has written its own equivalents of all the main subsystems of the mainframe environment — CICS transaction processing, VSAM data access, DB2 databases — to run on Linux, so that an entire mainframe application can run its original COBOL or PL/1 code and datasets unchanged on the platform. Partnerships with Red Hat and Microsoft mean the virtual mainframe can be deployed in datacenters on Enterprise Linux or on the Azure public cloud.

LzLabs' chairman and founder Thilo Rockmann outlined the concept to me when we met earlier this year:

We are able to take the executable form and move it over to a Linux or x86 environment and run it unchanged in a native format ...

These [subsystem] components only have to be compatible on the interface. In a sense the containerization approach that is out there is a key element. There is no IBM or any other third party code in the software that we are delivering. It is our code.

We create a virtual machine environment around the software so that it feels like it hasn't been moved.

Because LzLabs has written its own versions of the subsystems, its approach offers significant savings on the licensing costs that would still be payable under the more cumbersome recompilation route, hitherto the only option when moving mainframe applications to the cloud. As CEO Mark Cresswell confirms:

We are not moving any licensed software onto our device. We have faithfully reproduced all the subsystems people use to host this software.

At no stage during any of the development work that we've done have we sought to emulate or copy what went on before. These are entirely new creations of our company.

The platform also includes mainframe-compatible job control and security, migration tools, storage management facilities and system utilities. A partnership with tool provider COBOL-IT allows modification of the COBOL code if desired.

Cost isn't the only benefit of moving these applications into a modern environment. Much more important is gaining the flexibility to integrate the systems of record with more innovative applications. The platform allows for original, proprietary APIs to be replaced with more contemporary alternatives, which in turn makes it easier to gradually modernize the original application one function at a time. Cresswell explains:

The first step for our customers is that they can migrate those existing applications, with minimal risk, onto a much lower cost platform, thereby freeing up money to enable them to transform them into a more modern form, if they wish to.

Once the customer applications are running inside the software-defined mainframe, we have capabilities that can enable them to be accessed as a service. At which point they then become more interoperable with systems that are very microservice based.

You can also then begin to gracefully migrate individual parts of an application suite into non-legacy languages. So you can migrate them from COBOL and PL/1 into things like Java, what have you — and preserve the interoperability that existed prior to that transformation.

LzLabs quotes statistics showing that more than 5,000 of the world's largest companies continue to rely on mainframes, with over 70% of global commercial transactions still occurring on mainframe-based systems, and a quarter of a trillion lines of COBOL code still in use. Now, says the company, they can use its platform "to transfer their applications to modern systems with no code rewrite, recompilation, or data reformatting."

My take

The demise of the mainframe has been prematurely announced so many times over the past few decades that the continued survival of the species has come to seem almost inevitable. But enterprise leaders are under pressure to find new ways to enable modernization and innovation at a speed commensurate with the demands of a connected digital world. I sense the emergence of a shift in sentiment — a renewed willingness to consider the use of containerization as a means of finally ending reliance on these venerable workhorses.

But to be able to take decades-old applications and run them unchanged in the cloud or on Linux x86 servers sounds so much like alchemy that skepticism is justified. Before enterprises are willing to entrust those systems to the LzLabs technology, they'll want to thoroughly test whether it really can deliver all that is promised.

While there may be light at the end of the tunnel, there's perhaps a warning in that LzLabs has chosen to name this first release Gotthard, after the world's longest, deepest rail tunnel, which the Swiss declared open last month. Perhaps the message is that while this is a swift, new route, it's still quite a distance before you'll reach your destination.

But in an era when, in the words of tech industry luminary Marc Andreessen, software is eating the world, it should not really be a surprise to discover that the original mainstay of enterprise computing is itself under threat of being sucked up into the cloud.