John Lewis Partnership creates a one-stop-shop for data using Snowflake technology
- Summary:
The retailer now has a cloud-based platform for standardised data analytics tooling.
John Lewis Partnership is using Snowflake’s technology platform to bring disparate data sources together and deliver analytical insight to people who work across the business.
Barry Panayi, Chief Data and Insight Officer at John Lewis, explains how this tight grip on data is allowing the retailer to make faster and more effective decisions on behalf of its customers.
Panayi joined John Lewis in March 2021. While the decision to implement Snowflake pre-dated his arrival, he’s played a crucial role in ensuring the technology is embedded across the organisation. Panayi says Snowflake provides a one-stop shop for a single version of the truth at John Lewis, which is in sharp contrast to how things were in the not-so-distant past:
It’s the place to go to get data. That sounds quite simple, but data was just everywhere before. While it was controlled and we were using a lot of resources to keep track of multiple versions of things, understanding who owned the data and what it meant was difficult. Trying to get an inventory of the data we held was hard, and people were using copies for each individual use case.
Snowflake now sits at the heart of a tightly controlled data ecosystem at John Lewis. The technology provides a cloud-based platform upon which Panayi’s team can standardise data analytics tooling for users across – and even outside – the business:
People can access data, if we say they can, and we can control it. Then they can do all the cool stuff they need to do. We already share data with some suppliers because we've got that kind of relationship with the consumer-packaged goods companies.
More than a data warehouse
According to Panayi, the key to the successful implementation of Snowflake is a recognition that the platform must be seen as more than a data warehouse. While the technology brings data together, he says its real strength comes in helping the business do more with information – and that’s something that can be tough for people in all kinds of positions to recognise:
Originally, I don't think we were getting everything we could out of the relationship with Snowflake. But now, we are much more on the right end of the spectrum – understanding that you can't think about Snowflake as just another data warehouse. Instead, you need to think of it completely differently, which is a real mindset shift for people, especially data engineers.
Panayi cautions that using Snowflake can be a big step change for people who are used to working with traditional data warehouse systems. Rather than cutting and pasting data from one system to another, Snowflake works across IT services to create a cloud-based system of record that can be used to drive decision-making processes:
The data engineers have learnt about Snowflake and are excited about it. Snowflake enables us to use our cloud platform like a cloud platform as opposed to just a cheaper method of storage. People must recognise that the cloud isn't just a server with a really long cable. You’ve got to think about it as something a bit different.
John Lewis’s data stack includes a range of other important technologies. As well as using Google Cloud Platform, the retailer runs Tableau business intelligence on top of Snowflake. Panayi’s team also uses the specialist data tools dbt and Collibra. Most coding takes place in Python, and the team is beginning to explore machine-learning tools. Panayi says the team is investigating other Snowflake features but is focused on building firm foundations:
Where we are in our journey now is we need to get all our data in Snowflake and get people consuming it. I'd like a firmer integration with things like data quality and our Collibra tool, which deals with metadata and other high-level elements. I'm focused on making sure that we're doing things properly and at the right pace rather than biting off more than we can chew.
Leadership
Prior to working for John Lewis, Panayi was data chief at Lloyds Banking Group. He has also held senior digital leadership roles at Bupa and Virgin. Having worked for a range of companies across multiple sectors, he says it’s important to apply leadership lessons from the past and build a strong, cloud-based platform for data-led change:
I've been involved in loads of these things that have crashed and burned because people try and do everything at once. So, once the data is in good shape, then we can do the next thing because just decommissioning old technology produces cost savings that are worth going after.
Panayi estimates the first stage of his strategy – getting data into good shape – will be finished during the next 12 months. Once that work is complete, his team will start thinking about how it can use some of Snowflake’s sector-specific offerings, such as the Retail Data Cloud and Snowflake Marketplace, to make more use of its data. Right now, he says the benefits of the Snowflake implementation are clear:
Everything's a lot quicker. Data scientists can work quicker and we can build data pipelines quicker. I am the risk owner for data for the whole business. I need to confidently and very quickly say where the data is and what people are doing with it. And ideally, I don't want to spend a lot of time on that. So, having it all in one place makes sense.
Panayi advises other business and IT leaders who are thinking about implementing Snowflake to focus on the difference between the technology platform and traditional data warehouse technologies. Following advice from a trusted colleague to make sure he knew more about the system, he completed Snowflake certification. That’s quite a step for a senior digital leader to take, but Panayi says the investment in time paid off:
It’s about getting a little bit of dirt under your fingernails until it becomes the norm because I don't think you can assume that everyone in the tech organisation will know Snowflake. Now when I go into the executive board and they ask me, ‘what is all this?’, I can explain why it's different from old-fashioned data migrations, where they probably ended up pouring millions and millions of pounds in to get a different flavour of what they already had.