10x Banking makes personalized financial services a reality with Confluent and Kafka

By Derek du Preez, April 26, 2022
Summary:
10x Banking is aiming to reinvent the industry with its SaaS banking platform, where real-time event streaming data, enabled by Confluent, is at its core.

(Image of the 10x Banking logo, sourced via the 10x Banking website)

10x Banking was founded by Antony Jenkins, the former CEO of financial services giant Barclays, after he saw first-hand how difficult it is to make technology changes in an industry still by and large running on mainframe technology. The company was established as a SaaS core banking provider, offering a modern cloud-based platform to replace legacy systems in the financial sector. 

Central to 10x’s platform is Confluent, a company built around the Apache Kafka ecosystem that enables organizations to use real-time event streaming data. Confluent aims to provide buyers with the tooling to create a ‘central nervous system’ of real-time data, or what CEO Jay Kreps describes as ‘data in motion’.

Confluent believes that as architectures have changed, and as more and more cloud-based applications are deployed, users need real-time insight into data and event changes in order to better serve customers. 10x Banking’s use of Confluent underpins a platform that it hopes will enable its clients, the banks, to build better, more personalized experiences for their own customers. 

We got the chance to speak to Stuart Coleman, Head of Product and Engineering for Data & Analytics at 10x Banking, at Kafka Summit in London this week about the company’s use of Confluent and its future ambitions for using real-time event streaming data. Commenting on 10x’s founder’s philosophy, Coleman said: 

[Jenkins’] frustration was with how difficult it was to make technology change. The repercussions from that is how difficult it was to make product change and actually build relevant financial services products for customers, because of this inertia that there is in the banking sector. 

10x was founded in 2016 and its clients currently include Chase UK and Westpac in Australia. When the company started building its core platform, it quickly realized that it would still only be one component of what is a huge ecosystem of banking technology in the industry. 

Whilst 10x aims to replace core banking platforms, banks still need technologies for regulatory reporting, data warehousing, risk, compliance, and a whole host of other functions. That realization led to the decision to use Confluent. Coleman said: 

It was very clear to us early on that we needed to make sure that the data was liberated from the core. So pretty much from day zero of the architecture, we made a decision that we were going to make sure that all of that data was published as events. And of course, there's been iterations of the architecture of how that worked and so on, but that's always been a core principle.

10x evaluated several event-driven technologies to do this, which happened to be around the same time that Confluent Cloud was released. The team was conscious that, as a SaaS business, it would need to run a large number of Kafka clusters to serve its clients. The resources required to run those clusters in-house would have been too burdensome, distracting 10x from its core product, which led to the decision to use Confluent Cloud. 
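To make that ‘everything is published as events’ principle concrete, the sketch below shows roughly what publishing a core banking change to a Confluent Cloud cluster looks like with the standard Java Kafka client. It is not taken from 10x’s codebase; the endpoint, credentials, topic name and event payload are illustrative assumptions.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class DomainEventPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Confluent Cloud endpoints use TLS with SASL/PLAIN API-key authentication.
        props.put("bootstrap.servers", "<broker-endpoint>:9092");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"<api-key>\" password=\"<api-secret>\";");
        // Durability settings appropriate for a system-of-record event stream.
        props.put("acks", "all");
        props.put("enable.idempotence", "true");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Hypothetical topic and payload: an account-opened event keyed by account ID.
            ProducerRecord<String, String> record = new ProducerRecord<>(
                "core-banking.account-events",
                "acc-12345",
                "{\"type\":\"AccountOpened\",\"accountId\":\"acc-12345\",\"currency\":\"GBP\"}");
            producer.send(record);
        }
    }
}
```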

The use of Kafka and Confluent has two dimensions to it, in terms of the benefit for 10x customers. Coleman explained: 

[The first is that customers will need to be] satisfying their existing processes and reporting needs inside their existing mothership organization. So if I'm a bank, I need a lot of data to be able to satisfy my data contracts to the larger organization. 

So, giving them that real time event stream so that they didn't have data locked in a vendor, means that they don’t have to rely on some batch process to extract that on a nightly basis. They can basically take a real-time feed of all of the data that is happening. They could build their own general ledgers, their own risk systems, their own reporting, whatever they needed to do. It was a very easy way for them to integrate a new banking platform into their core systems.

And secondly, Coleman added: 

As a bank, as a 10x client, I want to be able to build some distinguishing features, some unique features to me. Having all of that data available in real time means that suddenly I can build my own customer experiences on top of the core banking system. 

I can start thinking about: how can I do personalized product offers? Personalized communications? Can I do special cashback offers? I suddenly have all that data available to me, to do these kinds of things. Although it may not be part of the 10x offering, I have the power to do that.
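In both cases, the bank is essentially a consumer of the real-time stream. As a hedged illustration, not 10x’s code, the sketch below shows a downstream system (here an assumed general ledger feed) reading transaction events with the standard Java Kafka consumer rather than waiting for a nightly batch extract; the topic and consumer group names are assumptions.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class LedgerFeedConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "<broker-endpoint>:9092");
        // One consumer group per downstream system, so the ledger, risk and
        // reporting feeds each keep their own position in the stream.
        props.put("group.id", "general-ledger-feed");
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("core-banking.transaction-events"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // In a real integration this would post the entry to the bank's
                    // general ledger instead of printing it.
                    System.out.printf("account=%s event=%s%n", record.key(), record.value());
                }
            }
        }
    }
}
```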

Future ambitions

Coleman said that the use of event streaming data doesn’t come without its challenges, particularly in the banking industry where consistency and correctness are paramount. As a result, 10x is still making use of traditional databases to provide a system of record. He said: 

I think [a traditional database] is still the right technology choice, to be honest, in the banking system. The biggest challenge for us was, how do we ensure that we make the event like a first class citizen? Not something that's published on a best efforts basis, because you can't tolerate the best efforts basis, it has to be absolute. 

So we had to make sure that we had a reliable system, where we could ensure that the domain databases and the event layer were consistent. We actually use the database as a, what we call, a message outbox. 

We ensure the consistency of the data writes by writing both the physical microservice data, but also an event into the same database, and then those events are asynchronously published. That means that you never have a situation where there's something in the database which isn't on an event, or vice versa. 
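The approach Coleman describes is commonly known as the transactional outbox pattern. The sketch below illustrates the idea under assumed table names and schema (10x has not published its own): the domain write and the event row are committed in a single database transaction, and a separate relay process then publishes the outbox rows to Kafka asynchronously.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class TransferService {
    // Writes the domain state and the corresponding event in ONE database
    // transaction; a separate relay reads the event_outbox table and publishes
    // its rows to Kafka asynchronously, so the database and the event layer
    // can never disagree.
    public void recordTransfer(String accountId, long amountMinorUnits) throws Exception {
        try (Connection db = DriverManager.getConnection("jdbc:postgresql://localhost/core")) {
            db.setAutoCommit(false);
            try (PreparedStatement upd = db.prepareStatement(
                     "UPDATE accounts SET balance = balance - ? WHERE id = ?");
                 PreparedStatement outbox = db.prepareStatement(
                     "INSERT INTO event_outbox (aggregate_id, event_type, payload) VALUES (?, ?, ?)")) {
                upd.setLong(1, amountMinorUnits);
                upd.setString(2, accountId);
                upd.executeUpdate();

                outbox.setString(1, accountId);
                outbox.setString(2, "TransferRecorded");
                outbox.setString(3,
                    "{\"accountId\":\"" + accountId + "\",\"amount\":" + amountMinorUnits + "}");
                outbox.executeUpdate();

                db.commit(); // both rows land, or neither does
            } catch (Exception e) {
                db.rollback();
                throw e;
            }
        }
    }
}
```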

However, 10x also has a number of future plans for its use of Confluent and event streaming data. One is exploring how its platform can help customers make better use of that event data through aggregation and analytics. Coleman said: 

We have all of this event driven technology, but it's very much like a point in time. I've made a transaction. I've just opened a new account. It’s a snapshot in time. What we're looking to do is be able to actually build aggregates in history of how you've behaved using banking. Banks have such a rich seam of data about all of the transactions that the customer is doing, and they're not really making use of it to offer products that would be useful to the customer. When you start aggregating that data in interesting ways, it's really powerful. 

Where it kind of falls over is, you have dimension explosion. So if I want to see the aggregate spend of a customer by merchant, that is probably shopping at hundreds of merchants, where there's 1000s of merchants in the world, and I need to aggregate all of that data by each different dimension, it just explodes. 

So we're using an analytical database called Pinot, which has a direct plugin into this event stream, where the domain is made available on the topic, and you can build those views yourself in a specialized database. 
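To give a flavour of the kind of view Coleman is describing, here is a hedged example of querying an aggregate (spend by merchant for one account) from Apache Pinot using its open-source Java client; the broker address, table and column names are assumptions for illustration, not details of 10x’s deployment.

```java
import org.apache.pinot.client.Connection;
import org.apache.pinot.client.ConnectionFactory;
import org.apache.pinot.client.ResultSet;
import org.apache.pinot.client.ResultSetGroup;

public class SpendByMerchant {
    public static void main(String[] args) {
        // Connect to a Pinot broker; the host, table and columns are illustrative.
        Connection pinot = ConnectionFactory.fromHostList("pinot-broker:8099");
        ResultSetGroup results = pinot.execute(
            "SELECT merchant, SUM(amount) AS total_spend "
            + "FROM transaction_events "
            + "WHERE accountId = 'acc-12345' "
            + "GROUP BY merchant ORDER BY total_spend DESC LIMIT 20");
        ResultSet rows = results.getResultSet(0);
        for (int i = 0; i < rows.getRowCount(); i++) {
            // Print each merchant alongside the aggregated spend.
            System.out.println(rows.getString(i, 0) + " -> " + rows.getString(i, 1));
        }
    }
}
```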

Equally, 10x is looking at how Confluent Cloud can be used to improve its resilience posture by enabling multi-region failover. This could strengthen its market position, easing customer concerns about moving to the cloud and helping to satisfy regulators. Coleman explained: 

There’s a lot of pressure from regulators at the moment. Yes, it’s positive to move to the cloud, but there is this systemic risk of having lots and lots of banks with a single provider. So we're looking at how we can use the event system to actually perform a multi-regional failover. 

We are looking at the cluster linking capability of Confluent Cloud so that we can have a warm Kafka cluster ready to cut over to, and how we could feed some events or systems from that cluster. So suddenly I don’t have to do database copies and migrations, I can actually just repopulate my data stores from the event stream. 
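That last idea, repopulating data stores from the event stream rather than copying databases, can be sketched with a plain Kafka consumer that replays a mirrored topic from the beginning. The broker, topic name and single-partition assumption below are illustrative only, not a description of 10x’s setup.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.StringDeserializer;

public class StoreRebuilder {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "<warm-region-broker>:9092");
        props.put("group.id", "store-rebuilder");
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Read the mirrored topic from offset zero rather than from a committed
            // position, so the local data store is rebuilt from the full history.
            TopicPartition partition = new TopicPartition("core-banking.account-events", 0);
            consumer.assign(List.of(partition));
            consumer.seekToBeginning(List.of(partition));

            boolean caughtUp = false;
            while (!caughtUp) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                caughtUp = records.isEmpty(); // stop once the replay reaches the tail
                for (ConsumerRecord<String, String> record : records) {
                    // In practice this would upsert into the failover region's data store.
                    System.out.printf("rebuilding %s -> %s%n", record.key(), record.value());
                }
            }
        }
    }
}
```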
