Denny’s is a US institution, providing diner-style food to hungry customers since 1953. What started as a coffee shop in Lakewood, California, has grown into a franchise operation spanning over 1,500 restaurants across the United States. However, despite its long history, the Denny’s Corporation is adapting to new technologies and is thinking about how it can harness real-time data sourced from new cloud-based point-of-sale (POS) systems across its estate.
The decision to move to a new POS system was taken in 2021, and the rollout is still in progress, but the change prompted a number of internal questions about the possibilities of streaming data directly from restaurants to gain better insights into products, customers, staff and the restaurants themselves. However, Denny’s was grappling with an aging data warehouse and intelligence stack that seemed unfit for the future forecasting needs of the business.
We got the chance to speak with Joey Fowler, Senior Director of Technical Services at Denny’s, about the company’s ambitions for a more sophisticated, cloud-based data ingestion stack that would enable better intelligence for executives. Commenting on the ongoing move to the cloud-based POS, Fowler said:
It opens up so many possibilities, so many great possibilities, as far as the data we can get and what we can achieve. But then there are challenges on the other side of that real time data, in terms of making sure that we've got a portfolio that can handle receiving all of that information, processing it efficiently.
Denny’s data warehouse team had been using Tableau for reporting, but had also “cobbled together” a number of different technologies to make sure it could get information ingested into the warehouse. In 2022, the team started reporting to Fowler and was asking questions along the lines of: how can we scale, and what do we need to meet our future ambitions? This questioning came just as the company’s existing database and intelligence portfolio started to creak. Fowler said:
On top of the challenges that we were having in not being able to be scalable from a database perspective, we were also experiencing a lot of slowness, breakdowns and congestion with ingesting the data into the warehouse.
Hungry for change
With all this in mind, Fowler made the decision to move to Snowflake, bring in Power BI, and adopt Confluent for its Apache Kafka tooling. The need was clear, as Fowler explained:
We were looking at over 8 million transactions per day, and our cobbled together system just was not making it. It’s like you start talking badly about that car, and then it breaks down. Don’t say it out loud, because if you do, it is going to break down.
And so that's kind of what happened, we were talking out loud and saying we were making all these changes and then it just felt like it's starting to limp even worse.
It’s worth noting that the 8 million transactions per day figure was based on just the 55 restaurants that have so far migrated to the cloud-based POS system. With over 1,500 restaurants in total, the demands on the technology were only going to get more intense. Fowler said:
What we were looking for was a solution that was going to be able to handle the volume in a timely manner, that was going to be scalable, that would be easy for the developers to learn, and to be able to use quickly, pretty much out of the box. And of course, with the Confluent connectors, we get that.
But along with that, also the ability to be able to do custom development.
So that's kind of how we got here, to find a solution that's going to be able to do that ingest, because that's where our focus is right now. And do it fast and not take hours. To process and not break down in the middle, meaning we would have to reach out to the business and say ‘oh, by the way, we're going to be late’.
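Fowler didn’t go into configuration detail, but for illustration, a Confluent-managed Kafka Connect pipeline of the kind she describes is typically defined declaratively. The sketch below shows what a Snowflake sink connector configuration for POS transaction events might look like; the connector name, topic, database, schema, user and key values here are placeholders for this example, not details Denny’s has shared:

```json
{
  "name": "pos-transactions-to-snowflake",
  "config": {
    "connector.class": "com.snowflake.kafka.connector.SnowflakeSinkConnector",
    "topics": "pos.transactions",
    "snowflake.url.name": "example_account.snowflakecomputing.com:443",
    "snowflake.user.name": "KAFKA_CONNECT_USER",
    "snowflake.private.key": "<private-key>",
    "snowflake.database.name": "POS_RAW",
    "snowflake.schema.name": "EVENTS",
    "buffer.count.records": "10000",
    "buffer.flush.time": "60",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "com.snowflake.kafka.connector.records.SnowflakeJsonConverter"
  }
}
```

The buffering settings control how many records accumulate, and for how long, before a flush to Snowflake - the kind of out-of-the-box tuning knobs Fowler points to as an alternative to hand-built ingestion code.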
Foundations for the future
Fowler said that the move to the new Snowflake, Power BI and Confluent stack has been fairly straightforward - albeit with a couple of hiccups that she puts down to minor configuration problems. But broadly, her team is happy and the transition was smooth. The biggest challenge has been maintaining business-as-usual whilst taking on multiple ambitious projects. She added:
I would say the biggest challenge is, of course, priorities and needs don't stop as you're trying to make these sweeping changes or bringing in new things. The business doesn't stop, so you still have the other work to do too.
But then I would say also, some of those challenges are just making sure to find the time for the team to also learn what they don't know about the different tools that we're implementing.
In terms of the data being ingested, and the insights gleaned, Denny’s is currently looking at product mix, food costs, labor information, and training data within the restaurants. However, the plan for the future, as the system scales, is to put better intelligence in the hands of executives and to adopt more predictive intelligence. Fowler said:
In 2024, the focus is going to be for us to start some forecasting and all of that good stuff, with a huge view on dashboards.
Where we're going is predictive analysis and thinking through, ‘okay, if this happens, what if?’. And getting to better forecasting. I don't think we do a great job of forecasting the way we do it right now, and so that's part of the conversation that I recently had with our business users.
But for the time being, Denny’s is focused on scaling up the operations and getting the teams comfortable with the different tools, with an eye on becoming more efficient and doing things differently.
We spoke to Fowler at Confluent’s annual user event in San Jose this week, where she said that she’s also considering the role of AI in the new platform and is using her time to understand the role the different providers in the mix can play. She added:
I would say for me, I don't know what I don't know. So I do want to dig in a bit deeper to understand, when we start thinking about AI, and I’m thinking about AI and all those different things from a Snowflake perspective.
Now, we’ve got to think about it from a Kafka/Confluent perspective. And just how does all of that work together? What does Kafka bring to the table from an AI perspective versus what I know that I can do in Snowflake? How does Kafka play into that? Where's their seat at the table, so that we are taking full advantage?
We’re still in a place where I feel like I'm still learning and needing to understand those things that I don't know.