Data streaming is key to managing real-time risk
- Summary: Getting insights from data at speed is a challenge for businesses with lagging legacy systems. Confluent’s Peter Pugh-Jones makes the case for transformation to manage risk.
It’s 15 years since the 2008 banking crisis rocked the pillars of the world’s biggest and most influential financial institutions. Today, the specter of the market turmoil that claimed some notable scalps still haunts the industry.
One of the many causes identified in the aftermath was a lack of visibility regarding banking institutions’ exposure to trades, loans and other financial instruments. In other words, too many banks simply didn’t know where they stood financially until it was too late.
Today, just as then, banking, share trading, lending and foreign exchange are continuous 24/7 operations that follow the sun around the globe. But instead of running risk analysis continuously, all too often the number-crunching is still done in batches, often overnight.
In this scenario, banks only get a sense of their overall exposure the following day. And even then, the information may be incomplete or out of date.
In 2008, this approach meant that banks, especially those with offices around the world, were sometimes over-exposed to particular stocks, currencies or politically unstable regions. It also meant they were on the back foot in the event of a financial or geopolitical shock.
Speeding up batch processing helps to improve visibility and mitigate risk
To mitigate the risk, banks have been searching for ways to speed up batch processing. What used to happen once a day can now — thanks to advances in technology — be achieved multiple times each day. But despite the giant strides made, it’s still batch processing. It’s still ‘data delayed’ rather than ‘data in motion’.
That is why more and more financial institutions are turning to data streaming. Instead of waiting to process financial data in batches, each new piece of information is processed and analyzed continuously, the moment a trade is made.
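To make that contrast concrete, here is a minimal sketch of per-trade processing using Confluent’s Python client, confluent-kafka. The broker address, topic name, message fields and exposure limit are illustrative assumptions, not any bank’s actual setup; the point is that exposure is updated the moment each trade arrives rather than in an overnight batch.

```python
# Minimal sketch: continuously updating risk exposure per instrument as each
# trade event arrives, instead of recomputing it in a nightly batch job.
# Topic name, schema and the exposure limit below are illustrative assumptions.
import json
from collections import defaultdict

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # assumed broker address
    "group.id": "risk-exposure",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["trades"])              # hypothetical topic of trade events

exposure = defaultdict(float)               # running net exposure per instrument

try:
    while True:
        msg = consumer.poll(1.0)            # wait up to 1s for the next trade
        if msg is None or msg.error():
            continue
        trade = json.loads(msg.value())     # e.g. {"instrument": "...", "notional": ...}
        exposure[trade["instrument"]] += trade["notional"]
        # The figure is current as of this trade, so limit checks can fire
        # immediately rather than after tomorrow's batch run.
        if abs(exposure[trade["instrument"]]) > 1e8:  # illustrative limit
            print(f"limit breach on {trade['instrument']}: {exposure[trade['instrument']]:.0f}")
finally:
    consumer.close()
```

Keying trade events by instrument would preserve per-instrument ordering across partitions, which is what makes a running exposure figure like this trustworthy.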
For instance, Euronext, the first pan-European exchange spanning Belgium, France, Ireland, the Netherlands, Portugal and the UK, built an event-driven trading platform for markets across multiple European countries to support high-volume, high-speed trading and provide clients with access to real-time data.
The trading platform, Optiq®, delivers a tenfold increase in capacity and an average performance latency as low as 15 microseconds for order round trips as well as for market data. Philippe Pujalte, Infrastructure and Operations Director at Euronext, said:
“It was a massive project, and a highly reliable message brokering infrastructure that could scale just by adding more nodes as our business needs changed was key to the entire effort.”
The performance requirements for this messaging infrastructure were equally stringent, Pujalte added:
“We needed the streaming platform to be able to ingest up to a million orders per second, with real-time latency in the millisecond range.”
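Throughput at that scale is usually approached through batching, compression and careful acknowledgment settings on the producer side. The sketch below shows standard Kafka producer knobs via the confluent-kafka Python client; the values, broker address, topic and message format are illustrative assumptions, not Euronext’s actual configuration.

```python
# Sketch: a throughput-oriented Kafka producer configuration. These are
# standard librdkafka settings; the values are illustrative, not Euronext's.
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "localhost:9092",  # assumed broker address
    "linger.ms": 5,             # wait briefly so batches fill up
    "batch.size": 131072,       # 128 KiB batches amortize per-request overhead
    "compression.type": "lz4",  # cheap compression raises effective throughput
    "acks": "1",                # leader-only acks trade durability for latency
})

def on_delivery(err, msg):
    # Called asynchronously once the broker confirms (or rejects) the write.
    if err is not None:
        print(f"delivery failed: {err}")

# Publish an order event, keyed by instrument so per-instrument ordering holds.
producer.produce(
    "orders",                   # hypothetical topic
    key="EURUSD",
    value=b'{"side": "buy", "qty": 1000000}',
    on_delivery=on_delivery,
)
producer.flush()                # block until outstanding messages are delivered
```

The design tension is visible in the settings themselves: larger batches and lighter acknowledgments raise throughput, while the linger window is kept to a few milliseconds so latency stays within the range Pujalte describes.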
Legacy systems still holding back digital transformation
Despite the step change in risk analysis and speed of response that streaming makes possible, some institutions today still rely on spreadsheets to carry out parts of their analysis.
Part of that is down to legacy systems, especially among some of the more traditional institutions that have yet to adopt an architecture capable of streaming. It’s fair to say that cost is also a barrier. But it’s also symptomatic of a business mindset reluctant to embrace change.
And yet, for those that do embrace data streaming to manage real-time risk, the benefits are plain to see.
With 75 million customers and a market cap of more than $38B, Bank Rakyat Indonesia (BRI) is the largest bank in Indonesia and the largest microfinance institution in the world.
As part of its digital transformation program, it opted for an event-driven architecture to power real-time analytics for credit scoring, fraud detection, and merchant assessment services.
Making the jump from batch processing to data streaming
Among the innovations driven by event streaming are a system that detects anomalous customer transactions in real time (sketched below), an ISO 27001-certified open API that connects BRI with a digital ecosystem of partners, an early warning system that identifies customers at risk of payment default, and a micro-lending app that helps BRI reach previously untapped market segments.
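As a hedged illustration of what an in-stream anomaly check can look like, the sketch below flags transactions that deviate sharply from a customer’s own history, using Welford’s online mean/variance update. BRI’s actual models are not public, so the topic name, message schema and threshold here are assumptions rather than a description of its system.

```python
# Sketch: flagging anomalous customer transactions in-stream using a running
# per-customer mean/variance (Welford's algorithm). Schema and threshold are
# illustrative assumptions, not BRI's production logic.
import json
import math
from collections import defaultdict

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # assumed broker address
    "group.id": "fraud-check",
    "auto.offset.reset": "latest",
})
consumer.subscribe(["transactions"])        # hypothetical topic

stats = defaultdict(lambda: [0, 0.0, 0.0])  # per customer: count, mean, M2

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    txn = json.loads(msg.value())           # e.g. {"customer": "...", "amount": ...}
    n, mean, m2 = stats[txn["customer"]]
    n += 1
    delta = txn["amount"] - mean
    mean += delta / n
    m2 += delta * (txn["amount"] - mean)    # Welford's online variance update
    stats[txn["customer"]] = [n, mean, m2]
    std = math.sqrt(m2 / (n - 1)) if n > 1 else 0.0
    # Flag amounts more than four standard deviations from this customer's
    # history, once enough transactions have been seen to trust the stats.
    if n > 10 and std > 0 and abs(txn["amount"] - mean) > 4 * std:
        print(f"anomalous transaction for {txn['customer']}: {txn['amount']}")
```

A real deployment would persist the per-customer state and emit alerts to another topic instead of printing, but the shape is the same: each transaction updates the model and is scored on arrival.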
Kaspar Situmorang, Executive Vice President, Digital Center of Excellence at BRI, said:
“With event streaming and the ability to capitalize on real-time analytics in merchant services, we tripled agent banking sales.”
The first step for any financial institution to improve its real-time risk and trading visibility is to accept that data streaming — rather than batch processing — is the way forward. That may seem simple. But for those banks with their computing architecture firmly rooted in the last century, this may not be as easy as it sounds.
This is particularly true for those that have based IT spending on tactical decisions, implementing tech to address a specific problem, rather than on a more strategic approach that fits into the bigger picture.
Sometimes this shift can come from within the existing management team, with no outside prompting beyond the realization that real-time data streaming is the way forward.
More often than not, though, such radical changes only accompany a change of personnel at the very top of a bank.
Either way, the motivation should be the same: to embed real-time risk analysis within the banking and financial services sector and prevent a repeat of the 2008 crisis.
Find further information on the new paradigm in financial services data infrastructure.