EDF UK is using the Snowflake Data Cloud to build integrated foundations for machine-learning innovations that will help its customers save energy and money.
Alex Read, Senior Manager of Data Platforms at EDF UK, was keen to help the utility firm move away from a plethora of legacy systems and towards a more integrated view of enterprise data. At the start of 2022, his team started building a one-stop shop for customer analytics on top of the Snowflake Data Cloud.
Information from across the business is now collected in an AWS S3 data lake and consolidated in the Data Cloud. EDF uses the Snowpark development framework to help data scientists bring machine-learning models into production on AWS SageMaker. This integrated approach makes it easier to understand the requirements of customers and to develop new products, including via machine-learning techniques. Read explains:
We're now producing a higher volume of products – about four times as many data science products as we delivered historically. That's enabling us to target customers better. It's enabled us to help customers who are most financially vulnerable, to set up appropriate strategies to manage them if they get into difficulties, and to predict that before it happens.
This ability to understand customer requirements and then develop new data-led products is in sharp contrast to the situation before the introduction of Snowflake. Read says senior stakeholders at the business had previously backed investment in a range of machine-learning initiatives with little tangible reward:
Historically, data science was a massive challenge for us pre-Snowflake. Data was not in good shape – it was scattered across multiple solutions. We’d have to bring data from a myriad of sources across the organisation, push it to multiple other data solutions, and stand up the infrastructure and environments to support that work.
The organisation needed to support a range of platforms, with multiple potential points of failure. Rather than rely on this mix of bespoke and legacy systems, Read and his senior business colleagues opted, after an RFP exercise, to use the Snowflake platform to bring data assets together:
We selected Snowflake because of its ability to scale. Also, we liked its ability to support data science use cases coming down the track. We analysed other vendors, but Snowflake stood out. It was cloud-agnostic and covered all our use cases.
Snowflake is now the heartbeat of a consolidated enterprise data platform that relies on fewer systems and services than before. As well as Snowflake and AWS technology, EDF UK uses tactical solutions, such as Matillion for data extraction and transformation, Collibra for data cataloguing and governance, and Apache Airflow for orchestration. Read says this collection of integrated data technologies will help EDF UK to stay competitive in a challenging sector:
Our organisation is currently going through a massive transformation programme. We analysed some of our start-up competitors, who will naturally be a lot leaner and meaner than we can be as a massive organisation that has been around for decades. And as part of that process, we realised that we needed to change the way we work because our cost to serve customers was comparatively high.
Read’s team spent 2022 implementing Snowflake and the other technologies in the EDF UK data stack. He estimates that about 90% of legacy platforms have now been migrated to Snowflake, and envisages that Snowflake will be EDF UK’s sole enterprise data platform by the end of Q1 2023. Implementing the data platform has not been without its challenges – especially given the range of sources to consolidate:
We've rebuilt our entire data platform that supports EDF. The fact that we've conquered that in 12 months is a good achievement. The big part of our challenge has just been understanding different ways of working in legacy and bringing that to life in the Snowflake world.
The data team minimised costs and ensured integration by focusing on two key phases: lift and shift, then optimisation. After moving data and jobs to the new platform, the team worked with specialists at Snowflake to develop processes that ensure everything works as it should. Read recalls:
All of that effort gives us the ability to better optimise our costs. But cost optimisation isn't a one-off. It's an ongoing process, and we're always reviewing what we do with Snowflake, how we structure jobs, and how we analyse them.
While the team continues to hone its approach, Read recognises most work relating to the underlying data infrastructure is now complete. After spending 12 months building the platform, his team now has the challenge of thinking about what comes next:
The mission statement we came up with was ‘putting data power into the hands of the business.’ We've got this great platform and now people need to use it. It's about how we can integrate Snowflake and our other technologies to make sure that our users can self-serve.
According to Read, the data team is already using the platform to help the business benefit from access to new machine-learning products:
It’s allowing us to offer better service to our customers and make sure we have product propositions that support our customer base and business. The whole approach is about using data to make us a leaner operation, with a bunch of data science products that enable us to work with our customers.
Examples of these products include Energy Hub, an analytics tool, available on the web and as an app, that allows customers to monitor and control their energy use. The data team has also worked with the National Grid to build a ‘winter turndown’ tool, which shows how customers can cut energy use and potentially reduce bills. Read explains:
When it looks at usage, the tool makes some predictions on what it thinks are your biggest energy items. Customers can start making decisions on how to reduce their energy demand. That's a big benefit in the current economic environment.
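As a rough illustration of the kind of prediction Read describes, the sketch below ranks estimated appliance-level consumption to surface a household's biggest energy items. This is a minimal, hypothetical example – the appliance figures, the flat unit price, and the ranking logic are invented for illustration and are not EDF UK's actual model:

```python
# Illustrative sketch only: rank estimated appliance consumption to flag
# the biggest energy items, in the spirit of the 'winter turndown' tool.
# All appliance data and the tariff below are invented assumptions.

from dataclasses import dataclass


@dataclass
class ApplianceEstimate:
    name: str
    kwh_per_week: float  # estimated weekly consumption in kWh


def biggest_energy_items(estimates, top_n=3, unit_price_gbp=0.30):
    """Return the top-N appliances by estimated weekly cost (name, GBP)."""
    ranked = sorted(estimates, key=lambda e: e.kwh_per_week, reverse=True)
    return [
        (e.name, round(e.kwh_per_week * unit_price_gbp, 2))
        for e in ranked[:top_n]
    ]


usage = [
    ApplianceEstimate("electric heating", 85.0),
    ApplianceEstimate("tumble dryer", 15.0),
    ApplianceEstimate("washing machine", 6.5),
    ApplianceEstimate("lighting", 4.0),
]

for name, weekly_cost in biggest_energy_items(usage):
    print(f"{name}: ~£{weekly_cost}/week")
```

A production version would of course infer these estimates from smart-meter readings rather than take them as inputs, but the shape of the output – a ranked list a customer can act on – is the same.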
Read’s team will be working on other developments through 2023. For digital leaders who are thinking about using Snowflake and other cloud-based technologies as part of an enterprise data strategy, he has straightforward advice:
Understand at the earliest part of the process where your different components sit and what you want them to do. Look at the minimal number of technologies you need to arrive at the outcomes you need to support, and take it from there. We can now serve data better and we have fewer vendors to interact with.