German bank NORD/LB embraces rapid change with Kafka and Confluent

Derek du Preez, April 29, 2022
Summary:
NORD/LB is undergoing a transformation programme worth approximately 500 million euros - part of that is unleashing data across the organization to embrace rapid change.

(Image of a NORD/LB building, sourced via the Confluent website)

NORD/LB is one of Germany’s largest commercial banks, with total assets of more than EUR 126 billion, and is also part owned by the German government. The bank is undergoing an extensive transformation programme - which was triggered by the 2008 financial crisis - and is investing half a billion euros up until 2024 to adopt new systems and ways of working. 

Key to this transformation is implementing new IT systems and tools, and NORD/LB has decided to adopt Apache Kafka, running on Confluent Cloud. The aim is to unleash the bank’s data, allowing the organization to build products and services more quickly and to embrace rapid change more effectively.

Whilst NORD/LB wasn’t directly impacted by the 2008 financial crisis, given that it didn’t have huge investments in real estate, the long-term fallout has been significant. The bank has a heavy stake in shipping investments, which suffered as shipping demand fell in subsequent years.

Things hit a pivotal point in 2017, when NORD/LB needed to be refinanced to meet its cash requirements. However, because the bank is part-owned by the German government, it couldn’t simply issue new shares like other private banks, nor could it receive taxpayers’ money due to EU state aid rules.

As a result, the government had to act as a private investor would and implement some drastic measures. Speaking at Kafka Summit in London this week, Sven Wilbert, Data Manager at NORD/LB, said: 

The effect was that 50% of people were laid off, there was a 50% cost reduction, and a total renewal of the IT infrastructure. That was actually good for us, it was a good thing. The core banking system is totally changing. We are getting rid of the old, centralized Db2 data warehouse, replacing that with a financial service data model. 

We are changing about 60% of all our infrastructure and IT systems, either changing them substantially or replacing them with new ones. And we are also thinking about new ways to integrate data. 

Part of this is moving to an event-driven architecture based on Confluent, allowing the bank to access siloed data. Wilbert said: 

Of course we do have a big data warehouse, based on Cloudera. Of course we have different relational databases. But what we are now doing is using Kafka as the transporting system to get data from all of our systems into our new central SAP core banking application. 

While doing that, we can also transport the same data into our big data platform. Because we are not just streaming the data out of the system in the raw, bare form, we are also changing the way data integration works. 

Normally you get the data from the system, at some point you just transform it, then you have some normalized stuff, and then you just reship everything. What we want to do is distribute the data integration work throughout the system.

Change is no longer scary

The Kafka architecture allows the producer of a data object to publish it through the Kafka system, and any consumer of that data can then tap into the stream and transform it into whatever it needs. NORD/LB began using Kafka on premise, but has made the decision to adopt Confluent Cloud by the end of the year. Wilbert added: 
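That publish-and-subscribe pattern can be illustrated with a minimal in-memory sketch. To be clear, this is not the real Confluent or Kafka client API, and the topic, account and field names are invented for illustration; it only shows the shape of the idea Wilbert describes: one producer ships events once, and each consumer reads the same stream and transforms it for its own purposes.

```python
# In-memory sketch of the publish/subscribe pattern described above.
# NOT the confluent-kafka client API; all names are illustrative only.

class Topic:
    """An append-only event log, loosely like a single-partition Kafka topic."""
    def __init__(self):
        self.log = []

    def produce(self, event: dict):
        self.log.append(event)

    def read_from(self, offset: int):
        """Each consumer reads from its own offset, at its own pace."""
        return self.log[offset:]

# The producer (e.g. a source banking system) ships raw events once.
payments = Topic()
payments.produce({"account": "DE-001", "amount_eur": 250})
payments.produce({"account": "DE-002", "amount_eur": 1200})

# One consumer reshapes the stream for a core banking application...
core_banking_view = [
    {"acct": e["account"], "cents": e["amount_eur"] * 100}
    for e in payments.read_from(0)
]

# ...while another consumer of the SAME stream feeds an analytics platform.
analytics_view = [e["amount_eur"] for e in payments.read_from(0)]

print(core_banking_view)
print(sum(analytics_view))
```

The key property is that the transformation work sits with each consumer rather than in one central integration layer, which is what Wilbert means by distributing the data integration work throughout the system.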

We know we can’t do it ourselves on premise. We don’t have the manpower, the skills, and it’s just more expensive to do it ourselves. It doesn’t mean cloud is cheaper, but in our case going for Kafka-as-a-service [Confluent Cloud] is cheaper for us in different ways. 

Confluent comes into play because we can’t do it ourselves. We need someone that is trustworthy as a partner, not only for the implementation phase, but also when we come to run the thing. Because if this thing fails, we won’t be able to do anything, because it will transport all of our data between systems. 

We want to get rid of point to point between IT systems - not by force, but by having the data in Kafka and having it more easily available. Other systems can hook up to the Kafka stream that they need and just take the data that they need, at the pace they need. And we don’t actually have to do it ourselves. 
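The point-to-point problem Wilbert describes is essentially combinatorial: if every producing system integrates directly with every consuming system, the number of connections grows multiplicatively, whereas with a shared event log each system connects once. A back-of-the-envelope illustration (the system counts here are invented, not NORD/LB’s figures):

```python
# Rough illustration of why a shared event log reduces integration effort.
producers, consumers = 12, 20   # invented counts, for illustration only

point_to_point_links = producers * consumers   # every pair wired directly
shared_log_links = producers + consumers       # each system connects once

print(point_to_point_links)  # direct integrations to build and maintain
print(shared_log_links)      # connections to the shared log
```

With these example numbers, 240 bespoke integrations collapse to 32 connections, which is why the bank expects other systems to simply hook up to the streams they need.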

In the past, a centralized team of approximately 50 to 70 people at the bank has maintained this entire environment. Once Confluent Cloud is adopted, this will shrink to a small center of excellence of four or five people, who will support projects and other teams using Kafka. 

The final ambition, as NORD/LB approaches the end of its current investment programme in 2024, is that the bank is able to build quickly and that change is no longer seen as ‘taboo’ in the organization. Wilbert said: 

The biggest challenge we are facing today regarding data is getting access to the data. So once you have access to the data, and then build up your application, it’s so much faster when you can easily access the data. 

You can easily hook up to the Kafka stream, into whatever you are using, and then the data scientists can immediately start building the model. Then you have the infrastructure-as-code element, where you build up your environment in a containerized form, and then it’s easy to upload. So all of this ‘having to prepare the event of bringing something into production’ - all of that is just not there anymore. 

In the past, change was something dangerous. Now change is something that we want to do, we want to be faster. My boss said a cool statement in the management meeting last week - change is becoming the new standard. We are embracing change more and technology is supporting us to get there. 
