Georgia-Pacific cuts complex data modelling times in half with SAS
Summary:
You may not think that making toilet paper demands an intense use of data, but Georgia-Pacific sees competitive advantage in running models more quickly within SAS.
Even if you haven't heard of Georgia-Pacific, you'll very likely have relied upon the products it makes - particularly during the COVID-19 pandemic. As one of the world's leading makers of tissue, the company has seen a 120% increase in demand during the health crisis as households continue to stockpile toilet paper.
And making tissue, pulp and packaging is big business. The company has 30,000 employees, operates out of 180 locations worldwide and has invested $9.4 billion into its operations since 2013.
However, whilst you might not guess it at first, the use of data is key to Georgia-Pacific's operations and market competitiveness. Speaking this week at SAS's Global Forum virtual event, Roshan Shah, VP for collaboration and support center operations at Georgia-Pacific, explained how the company is predicting outages and improving safety via the use of complex data models.
In addition to this, by making use of the SAS platform, Georgia-Pacific has been able to further democratise the use of data and reduce complex modelling times by more than half.
Shah explained that the use of data in decision making becomes particularly important when you consider how hard it is to retain a highly knowledgeable workforce. He said:
At the highest level of our organisation, the message was pretty simple. We need to rely on data, AI, and more fact-based decisions in general, right? What we've been very aggressively pushing is how do we integrate more data and analytics into our everyday decisions? I'll give you one example: over the last decade we've lost a lot of our subject matter experts and folks who knew how to run our equipment really well, via attrition, retirement, or otherwise. Replacing that is pretty tough. When you have 30% of your workforce that knows how to make that equipment work and you continue to lose that - particularly as you look forward - it becomes even tougher.
That's why being able to bridge that knowledge gap - without necessarily spending decades in the mill, but really starting to look at IoT data, and then of course integrating AI on it, to figure out what matters and what should be done differently, to stay ahead of trouble - is so important. Whether it's a process upset, safety, or the competitive landscape, digital transformation is not really an 'option', it's really a necessity.
Three core verticals
Georgia-Pacific has been using the SAS platform for over a year, focused on three particular areas. The company divides its entire manufacturing operation into three verticals - process health, asset health and, finally, safety.
Shah said that Georgia-Pacific, unsurprisingly, has a lot of equipment, which it needs to operate as smoothly as possible in order to reduce risk. Across all of that equipment, terabytes and petabytes of data are produced. He said:
That's where we're looking at and actively using [SAS]. In fact, today, I think we have about 1,900 models that run multiple times a second. Each one of them is deployed in the SAS platform to help us in each of those three buckets [process, asset and safety] we talked about. We constantly need to do more with those models or make better models.
Therein lies the challenge - data science as a skill set is really hard to hire for. There aren't that many [data scientists] in the world, and most folks who do know how to write in Python and R are really good model builders, not necessarily good programmers.
Georgia-Pacific has been using SAS to move from being very Python- and R-specific to a more user-friendly, GUI-based platform, where you don't necessarily need a highly trained data scientist. Essentially, according to Shah, it's about democratising that process. He said:
You can have multiple folks going into the platform and being able to build and deploy those models really, really quickly. Something that we used to struggle with in the past is we would build a model, but then we would hand that off to IT. Not that there's anything wrong with that, it's just we didn't know how to add all the additional things - such as error handling - that need to go with it.
What we've been able to do with SAS is enable a citizen data scientist, or an engineer, to build those models - and that's what really matters. They can go from taking complex time series data to building a neural net within minutes and hours. They can go and deploy it.
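To put that in perspective, the kind of model a citizen data scientist would otherwise have to hand-code in Python looks roughly like the sketch below: a small neural network trained on windowed time series readings, with large prediction errors flagged for review. The sensor data, window size and threshold here are purely illustrative assumptions, not Georgia-Pacific's actual pipeline or anything generated by SAS.

```python
# Illustrative sketch only: a hand-coded version of the kind of time series
# model Shah describes building through a GUI. The sensor readings, window
# size and threshold are hypothetical, not Georgia-Pacific's.
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_windows(series, window=24):
    """Turn a 1-D sensor series into (lagged window -> next value) pairs."""
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X, y

# Hypothetical vibration readings sampled from a motor, one value per minute.
rng = np.random.default_rng(0)
vibration = np.sin(np.linspace(0, 60, 2000)) + rng.normal(0, 0.1, 2000)

X, y = make_windows(vibration)
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
model.fit(X, y)

# Large gaps between predicted and observed values are treated as anomalies.
residuals = np.abs(model.predict(X) - y)
threshold = residuals.mean() + 3 * residuals.std()
print(f"{(residuals > threshold).sum()} windows flagged for review")
```

The point Shah makes is that this sort of scripting, plus the error handling and deployment plumbing around it, is exactly what the GUI-based approach takes off the engineer's plate.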
Benefits
Shah said that Georgia-Pacific wouldn't be as far along in its data journey without the use of SAS, which has allowed the company to deliver quicker time to value. He added that the use of the platform has changed expectations within the organisation, where if people aren't getting access to data quickly and delivering models in production, questions are asked about 'why not?'.
Shah said that it has changed Georgia-Pacific's paradigm in terms of taking a complex piece of work and integrating it into daily practice. He said:
We don't care so much about it being perfectly accurate. We look for time to value of money. These folks are able to do that, which in turn frees up the data scientists to solve the much more complex problems. If you look at it from IT's perspective, it really frees them up to focus more on, how do you make data available? How do you clean that data? How do you figure out the key-value pairs and historical perspective that you need to piece together so someone can build those models much more easily? How do you make consumption easy?
Shah said that Georgia-Pacific has seen "seven and eight figure value" that wouldn't have been delivered otherwise. This comes from such insights as being able to predict when something is going to break a month in advance of it doing so, rather than waiting for it to happen. He explained:
In the manufacturing world, time series data is perhaps the most critical. And we're able to consume that data in real time. Consider, for example, a small motor that sits in a remote facility. We are able to get that data within seconds of it being generated and process it through a model, to figure out whether that was an anomaly, whether we are about to have some sort of unplanned outage - we are able to do that in real time.
The fact that we can do that in a GUI manner and quickly deploy it has given us a huge comparative advantage. It used to take us on average about twelve weeks to take a complex model, get it built, deploy it, and put it in production. I think we could easily say that that's gone to, on average, about three to six weeks, sometimes even shorter. That's pretty unprecedented.
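For a rough sense of what "processing a reading through a model within seconds" involves, the sketch below scores each incoming motor reading against a rolling baseline and flags outliers as they arrive. It is an illustrative simplification under assumed thresholds and a simulated feed, not the models Georgia-Pacific actually runs in SAS.

```python
# Minimal sketch of the real-time pattern Shah describes: score each sensor
# reading as it arrives and flag likely anomalies. The threshold, window and
# data feed are hypothetical, for illustration only.
from collections import deque
from statistics import mean, stdev

class MotorAnomalyScorer:
    def __init__(self, window=120, z_threshold=4.0):
        self.history = deque(maxlen=window)   # rolling baseline of readings
        self.z_threshold = z_threshold

    def score(self, reading: float) -> bool:
        """Return True if this reading looks anomalous versus recent history."""
        is_anomaly = False
        if len(self.history) >= 30:           # wait for a minimal baseline
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(reading - mu) / sigma > self.z_threshold:
                is_anomaly = True
        self.history.append(reading)
        return is_anomaly

scorer = MotorAnomalyScorer()
for reading in [0.9, 1.1, 1.0, 0.95, 1.05] * 10 + [3.5]:   # simulated feed
    if scorer.score(reading):
        print(f"Possible unplanned-outage precursor: reading={reading}")
```

In practice the scoring logic would be one of the 1,900 deployed models Shah mentions, fed by the plant's IoT data stream rather than a hard-coded list, but the shape of the problem - a baseline, a live reading, and a decision within seconds - is the same.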