How to make the most of Apache Kafka and Confluent Cloud - three customer perspectives
Three digital leaders talk about how their businesses are using Confluent Cloud – and they explain how other organizations can benefit, too.
Digital leaders who want to explore the benefits of Confluent Cloud should find a strong business case and engage with experts inside and outside the organization.
That’s the conclusion of three IT professionals who told diginomica at the recent Kafka Summit in London how their businesses implemented cloud-based technology from Confluent, the company that offers a commercial platform built on open-source Apache Kafka.
For those interested in exploring Confluent Cloud, there are three key takeaway lessons: work with specialist support, get stakeholder buy-in, and focus on education.
Work with specialist support
Michael Holste, System Architect at Deutsche Post DHL, says that two years ago his team started investigating how to improve an internal system that distributes shipping-event data for parcels across Germany. The system transported about 170 million messages per day, peaking at 5,000 messages per second. However, it was based on legacy technology and couldn’t scale, he explains:
We knew we needed to have another solution for message transport. So, we focused on Apache Kafka, and especially Confluent. We had support from Confluent. We had an architect who helped us design the solution. He was very helpful because he had deep insight into all specialities of Apache Kafka that we couldn't see.
At first, Deutsche Post DHL ran the new system on-premises in a data center in Frankfurt. About a year and a half ago, the company established two clusters in the Azure cloud. Now, Confluent is hosted in the cloud and the company is transferring 200 million messages per day online. He says the business has about 50 systems that deliver data and another 50 that consume it. Rather than queuing data, Confluent keeps the business up to date on a range of concerns – from distribution to sorting – and allows people to self-serve, says Holste:
It works fine. It was a huge effort to switch from queues to topics, because now the users have to decide themselves what they need and what they want, so it was a big mind shift. It was really helpful to have the support of Confluent at that point because we wouldn't have been able to support that mind shift.
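The mind shift Holste describes can be sketched in a few lines of plain Python (no Kafka cluster involved – this is a hypothetical in-memory stand-in, not Deutsche Post DHL's actual code). With a queue, each message is routed to exactly one consumer by whoever runs the broker; with a Kafka-style topic, events are appended to a shared log and every consuming team reads it independently, deciding for itself what it needs:

```python
from collections import defaultdict

class TopicLog:
    """Minimal in-memory stand-in for a Kafka topic: an append-only log
    that each consumer group reads at its own pace - unlike a queue,
    where every message is delivered to exactly one consumer."""
    def __init__(self):
        self.log = []                    # append-only record of events
        self.offsets = defaultdict(int)  # per-group read position

    def produce(self, event):
        self.log.append(event)

    def consume(self, group):
        """Return events the group has not yet seen; advance its offset."""
        start = self.offsets[group]
        self.offsets[group] = len(self.log)
        return self.log[start:]

# Shipping events are published once ...
topic = TopicLog()
topic.produce({"parcel": "P1", "status": "sorted"})
topic.produce({"parcel": "P2", "status": "out for delivery"})

# ... and each consuming system self-serves the data it needs.
sorting_view = topic.consume("sorting-dashboard")
billing_view = topic.consume("billing")

# Both groups see the full stream - no central team routing messages.
assert sorting_view == billing_view
```

The point of the sketch: with topics, adding a 51st consuming system requires no change to the producer or to anyone else's consumption, which is what makes self-service possible.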
Deutsche Post DHL is now thinking about how to use the cluster for other purposes, such as master data management. The aim will be to create a data mesh that helps distribute insight to all parts of the company. When it comes to best-practice techniques, Holste says other digital leaders should create a tight, structured approach:
Have the support of an architect at Confluent. He helped people to change their mindsets and see things they didn't see before. It’s also important to be sure of what you want to do with the technology.
Get stakeholder buy-in
Paul Makkar, Global Director of the Datahub at Saxo Bank, says his firm operates a single, hybrid on-premises stack based in Copenhagen. The bank pushes about $20 billion in transactions a day and offers 70,000 trading instruments to institutional, white-label and retail clients. However, Saxo had challenges around scaling and wanted to take advantage of the cloud, so it turned to Kafka and Confluent. Makkar recalls:
We wanted to create a self-service, easy-to-consume platform for domain teams. We don't want to be the guys in the middle, telling them how to do stuff in intricate detail. We wanted to give them what they need in terms of connectors and topics and to organize that by domain team. We have a series of back-end processes that mean they can self-serve what they actually need through a cluster.
With stakeholder buy-in, Saxo is developing a data mesh-like approach to data. The adoption of Confluent pre-dated Makkar’s arrival at the firm two and a half years ago, but he was in situ when the contract came up for renewal. He says there weren’t any other competitors that provided a similar all-in-one solution:
The reality of Kafka is that it sits in a world where people aren't going to adopt it by just suddenly switching over to be completely event driven. You will always need to integrate some databases – and Confluent is really the only show in town.
Makkar advises other digital and business leaders who are thinking of using Confluent to get high-level stakeholder buy-in and to find a path that makes sense for the organization:
You might move slowly, take your learnings, and get engagement and interest going. When technical and domain teams see success with new technology, they’re keen to understand more. But I think it's really hard, especially with critical systems, to do what we've done unless you've got high-level buy-in.
Focus on education
Gustavo Ferreira, Tech Lead Software Engineer at financial services specialist Curve, says his organization was previously using the open-source message broker RabbitMQ, but was spending too much time maintaining the self-managed system. Two years ago, the team started exploring solutions to its architectural challenges before selecting Kafka. Ferreira says:
It’s log-based, so it opens up new patterns and opportunities that you wouldn't have with other technologies. Also, there’s a huge community, so you have more people thinking about the same problems and how to solve them with technology. You also have connectors with Kafka, which allows you to move data from one store to another. And that for us was very important because we wanted to do streaming.
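The connectors Ferreira mentions are typically declared rather than coded: Kafka Connect moves data between a topic and an external store based on a JSON configuration. As a rough illustration only (the connector name, topic, and database URL below are hypothetical, and the exact options should be checked against Confluent's JDBC sink documentation), streaming a topic into a relational database might look like this:

```json
{
  "name": "payments-postgres-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "topics": "payments",
    "connection.url": "jdbc:postgresql://db.example.com:5432/analytics",
    "auto.create": "true",
    "insert.mode": "upsert",
    "pk.mode": "record_key"
  }
}
```

Because the pipeline is configuration, a small team can stand up new data flows without writing and maintaining bespoke integration code – which is the resourcing argument Ferreira makes for a managed service.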
After deciding to go with Kafka, Curve opted for Confluent’s fully managed solution as the team wanted to deliver the most value with the least resources. One of the principal engineers had worked with Confluent before and was impressed, adds Ferreira:
And after two years of being with Confluent, we've never had an incident and the support is excellent. You can go very high level and very deep. Support is like insurance. You don't usually need it. But when you need it, it's there and they can help you – and it doesn't matter how deep the problem, you’re going to have someone that can help.
Today, Ferreira says the combination of Kafka and Confluent Cloud is helping the business reach its desired destination, suggesting it’s “the backbone” of all asynchronous communications. For other digital leaders who want to take a similar route, he says teams need to know how to make the most of the benefits that Kafka and Confluent provide:
If you are serious about having a new technology in your company, you need to have a group of two or three people – even if it's a temporary team – where they are focused on learning the technology and integrating it with other systems.