Operationalizing AI — making the tech accessible, applicable and repeatable

By Rick Rider, October 22, 2019
Summary:
Enterprises have started operationalizing AI. Infor's Rick Rider explains how to make this powerful technology accessible, applicable and repeatable.


This is unlike the hype that has surrounded other technologies. Artificial Intelligence (AI) and Machine Learning (ML) have unprecedented capabilities to transform the way companies run their businesses, interact with customers, and create competitive advantage. To date, however, the number of successful enterprise use cases has been limited, and such projects have been driven primarily by data scientists and business analysts.

The complexity of deployment has made the concepts hard to design and roll out with consistent results. For some, gaps between the vision and reality have been wide. Fortunately, a new generation of more user-friendly, results-driven AI platforms and tools is emerging. The technology is now being operationalized, packaged and made accessible to a wider range of organizations and skill sets. This new breed of AI implementation accelerators will help organizations achieve the potential they have been envisioning.

The meandering road of AI evolution

In the early phases of digital projects, artificial intelligence and machine learning were often coupled with other applications, such as the Internet of Things (IoT), advanced Business Intelligence (BI), and predictive analytics. AI was the means to an end. But, too often, proof-of-concept projects had to be designed from the ground up for each application, stretching the time to Return on Investment (ROI) beyond expectations.

Initiatives had several places where they could take a wrong turn. One of the most common was the detour through overwhelming mountains of data. Lack of easy-to-use tools for consuming and analyzing the data could send projects into a tailspin. Even more basic than that, the quality of data could also be called into question, making stakeholders doubt the reliability of any findings. MIT Sloan writes about the many challenges:

Artificial intelligence (AI) and cognitive technologies are burgeoning, but few companies are yet getting value from their investments. The reason, in our view, is that many of the projects companies undertake aren’t targeted at important business problems or opportunities. Some projects are simply too ambitious — the technology isn’t ready, or the organizational change required is too great.

Product readiness, as the article suggests, can be an issue. Most innovative enterprise software solutions cycle through a maturity model. They typically start with theoretical concepts and test cases before they evolve into mature solutions. While many solutions eventually find their market, the risk of becoming shelf-ware always looms over the solutions that are hard to adopt or bring minimal value. Enterprise AI technology has been drifting in these uncertain waters as implementation strategists struggle to find the right repeatable recipe.

A recent Forbes article discusses the challenges and why AI and ML adoption has been slow in the enterprise. In addition to requiring the expertise of data scientists and business analysts, organizations must have reliable data and an infrastructure for data analysis in place, the article suggests:

Data is often the slowest and most expensive component of the ML modeling process. In order to avoid 'garbage in, garbage out,' it's helpful to ensure that you have access to robust data that is labeled properly. With robust data and proper labeling, a model can be accurately trained to identify patterns, such as characteristics of a fraudulent credit card transaction, or to make an effective marketing offer or product recommendation. This requires businesses to not only make sense of their own data but also to have the infrastructure to easily integrate first-party data with third-party data.
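To make the quote concrete, here is a minimal sketch of the supervised-learning pattern it describes: properly labeled transactions go in, a fraud classifier comes out. The file name, column names, and model choice are illustrative assumptions, not any specific vendor's implementation.

```python
# Minimal sketch of the pattern the quote describes: labeled transactions
# in, a fraud classifier out. Data shape and columns are assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical first-party data: each row is a transaction and
# 'is_fraud' is the human-verified label ("robust data, labeled properly").
transactions = pd.read_csv("transactions.csv")
features = pd.get_dummies(
    transactions[["amount", "merchant_category", "hour_of_day"]],
    columns=["merchant_category"],
)
labels = transactions["is_fraud"]

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, stratify=labels, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Garbage in, garbage out: this report is only as trustworthy
# as the labels used above.
print(classification_report(y_test, model.predict(X_test)))
```

The sketch makes the quote's dependency visible: every downstream metric rests on the quality of the labels fed in at the top.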

Although some companies have been able to overcome such challenges, the harsh reality is that only a very small percentage of models developed by data scientists have been deployed in production environments. The International Institute for Analytics, in its 2019 Analytics Predictions and Priorities report, estimates that less than 10% of AI test projects are deployed into full-scale production. This startling statistic could be disappointing to the many enterprises that had placed blind faith in the transformative potential of the science. Or, for others, it could be motivation to try harder.

What’s changed?

Enterprises have demanded improvements to implementation processes. This last mile seems to be where many of the complications arise. Heavily modified solutions, patched add-on modules that don't integrate well, and clunky reporting tools can all get in the way.

Upgrading to a modern, end-to-end ERP solution is often the prerequisite to starting an effective AI project. This step improves the data infrastructure and makes it easier to harmonize data across several departments, divisions and locations. It removes silos of data — and silos of thinking — that are so detrimental to a full-scale enterprise AI project.
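As a rough illustration of what that harmonization means in practice, the sketch below normalizes two hypothetical divisional extracts into one shared schema. The file names, columns, and conversion rate are invented for the example, not drawn from any real ERP.

```python
# Illustrative sketch of the harmonization an end-to-end solution enables:
# siloed extracts from two divisions normalized to one schema.
# File names, column names, and the currency factor are assumptions.
import pandas as pd

us_orders = pd.read_csv("us_division_orders.csv")  # order_id, qty, usd_total
eu_orders = pd.read_csv("eu_division_orders.csv")  # OrderNo, Quantity, EurTotal

# Map each silo's local vocabulary onto one shared schema.
eu_orders = eu_orders.rename(
    columns={"OrderNo": "order_id", "Quantity": "qty", "EurTotal": "total"}
)
eu_orders["total"] = eu_orders["total"] * 1.10  # assumed EUR->USD rate
us_orders = us_orders.rename(columns={"usd_total": "total"})

# One harmonized table: the prerequisite for any enterprise-wide AI model.
orders = pd.concat([us_orders, eu_orders], ignore_index=True)
print(orders.groupby("order_id")["total"].sum().head())
```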

Once the enterprise solution is in place, AI tools can be applied with greater ease. Some forward-thinking solution providers have responded to the market need by developing tools to operationalize AI. Now, critical components of AI applications are available as a Platform as a Service (PaaS). This toolkit gives users, even those without data-modeling experience, the ability to combine pre-defined algorithms and advanced BI templates to build AI use cases that fit their particular needs. The software can be deployed in days, not months, and without requiring legions of consultants.
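The article does not document a specific platform API, so the following is a generic, hypothetical sketch of the pattern such a toolkit implies: pre-defined algorithms exposed as a catalog, so a user supplies only data and a task name rather than building models from scratch.

```python
# Hypothetical illustration of "pre-defined algorithms" as a catalog that
# users configure instead of writing models by hand. Not a real PaaS API.
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingRegressor

# A small in-code "catalog" standing in for a platform's pre-built algorithms.
# Each entry is a factory, so every use case gets a fresh, untrained pipeline.
ALGORITHM_CATALOG = {
    "classification": lambda: Pipeline([
        ("scale", StandardScaler()),
        ("model", LogisticRegression(max_iter=1000)),
    ]),
    "regression": lambda: Pipeline([
        ("scale", StandardScaler()),
        ("model", GradientBoostingRegressor()),
    ]),
}

def build_use_case(task: str) -> Pipeline:
    """Return a ready-to-train pipeline for a user-chosen task name."""
    return ALGORITHM_CATALOG[task]()

# A user without data-modeling experience supplies only data and a task name:
#   pipeline = build_use_case("classification")
#   pipeline.fit(X_train, y_train)
```

The design choice mirrors the article's claim: the modeling decisions live in the catalog, so deployment becomes configuration rather than data science.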

Use-case examples

Flint Hills Resources — this company is a leader in refining, chemicals, and biofuels and ingredients. Based in Wichita, Kansas, it has more than 4,000 employees and operates primarily in the Midwest. The company’s capital projects and acquisitions have totaled more than $15 billion since 2002. Flint Hills Resources produces a diverse range of fuels and ingredient products for many household and commercial goods, including gasoline, diesel, jet fuel, asphalt, ethanol, biodiesel, olefins, polymers, aromatics and base oils.

It is turning to AI solutions to better manage its inventory. By accurately predicting when maintenance on its refining and manufacturing equipment will be needed, the company can take advantage of cost-effective stocking patterns — all the way down to the part and component level. This means it’s not over-stocking ‘just in case’ and won’t be caught with stock-outs when a critical piece of machinery needs to be serviced. Controlling inventory saves capital, while supporting preventive maintenance strategies. Spare part inventory represents a multi-billion-dollar problem/opportunity across the industry. Chris Dahl, CIO of Flint Hills Resources, says:

The combination of AI and EAM (Enterprise Asset Management) can give us better insights into our asset health and help transform the way we do maintenance at our facilities.
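A hedged sketch of the inventory logic this use case implies: trigger a spare-part order only when a predicted service date approaches, rather than stocking 'just in case'. The function, safety buffer, and lead times are illustrative assumptions, not Flint Hills Resources' actual system.

```python
# Hypothetical part-level stocking driven by predicted maintenance:
# order a spare only when the predicted service date approaches.
# All names and numbers are illustrative.
from datetime import date, timedelta
from typing import Optional

def should_reorder(predicted_service_date: date,
                   on_hand: int,
                   lead_time_days: int,
                   safety_days: int = 7,
                   today: Optional[date] = None) -> bool:
    """Reorder if waiting any longer risks the part arriving too late."""
    today = today or date.today()
    arrival = today + timedelta(days=lead_time_days)
    return on_hand == 0 and arrival >= predicted_service_date - timedelta(days=safety_days)

# Example: a pump seal predicted to need service in 25 days, 21-day lead time.
# The part would arrive inside the 7-day safety window, so order now.
print(should_reorder(date.today() + timedelta(days=25), on_hand=0, lead_time_days=21))
```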

Acushnet Company — a global leader in the design, development, manufacture and distribution of performance-driven golf products. The company is best known for Titleist, the game's leading performance equipment brand, having earned the trust of tour professionals and club professionals, as well as competitive amateurs and dedicated golfers worldwide.

The company had challenges obtaining visibility into optimal demand forecasts for existing products and new product launches. Forecasting issues led to inconsistencies in its supply chain and manufacturing operations. Turning to AI, though, offered a solution, says Pete Marshall, CIO:

It provided Acushnet the infrastructure to democratize AI at the enterprise level. This platform opens the doors for us to leverage data to its fullest potential.
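As a rough sketch of the class of problem Acushnet describes, classical exponential smoothing on monthly unit sales shows the shape of a demand forecast. The data file and seasonality settings are assumptions for illustration, not the company's actual model.

```python
# Minimal demand-forecasting sketch: Holt-Winters exponential smoothing
# on hypothetical monthly unit sales. File and settings are assumptions.
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Monthly units shipped for one product line (assumed columns: month, units).
sales = pd.read_csv("monthly_units.csv", parse_dates=["month"], index_col="month")

# Additive trend and yearly seasonality suit a business with seasonal demand.
model = ExponentialSmoothing(
    sales["units"], trend="add", seasonal="add", seasonal_periods=12
).fit()

# Forecast the next two quarters so supply chain and manufacturing can plan.
print(model.forecast(6))
```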

CERN — the European Organization for Nuclear Research, based in Geneva, Switzerland. CERN's research facility is home to the Large Hadron Collider, the world's largest particle accelerator, used in international research on the origin of the universe. The Laboratory, established in 1954, has become a prime example of international collaboration for the sake of science.

The complex equipment used in the research requires diligent preventive maintenance. The facility was having trouble filtering out the 'noise' of inconsequential device readings (air flow) and was shutting down devices more often than needed. Turning to AI technology helped by predicting air-flow trends for individual devices. The facility maintenance team could move from time-based preventive maintenance to predictive maintenance. With more precise predictions, CERN could reduce the number of false alarms. And with more detailed analysis of individual devices, trends could be correlated with additional factors, such as device location or age. David Widegren, Head of Asset and Maintenance Management at CERN, sums up:

The AI Platform is allowing us to quickly investigate real ML outcomes connected to our EAM (Enterprise Asset Management) solution and operationalize results.
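A minimal sketch of the noise-filtering idea in the CERN example, assuming time-stamped air-flow readings per device: smooth each device's signal and flag only sustained deviations, so momentary blips stop triggering unnecessary shutdowns. Column names, window size, and threshold are illustrative assumptions.

```python
# Hypothetical noise filtering for predictive maintenance: compare each
# device's readings to its own rolling trend and flag large deviations.
# Column names, window sizes, and the threshold are assumptions.
import pandas as pd

readings = pd.read_csv("airflow_readings.csv", parse_dates=["timestamp"])

def flag_anomalies(device: pd.DataFrame, window: int = 48, z: float = 3.0) -> pd.Series:
    """Flag readings more than z rolling standard deviations from the trend."""
    trend = device["airflow"].rolling(window, min_periods=window).mean()
    spread = device["airflow"].rolling(window, min_periods=window).std()
    return (device["airflow"] - trend).abs() > z * spread

# Per-device analysis also lets trends be correlated with location or age later.
for device_id, device in readings.groupby("device_id"):
    alerts = flag_anomalies(device.sort_values("timestamp"))
    print(device_id, int(alerts.sum()), "sustained deviations")
```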

Key takeaways

AI technology is one of the defining elements of digital transformation, and it holds great potential. The challenges in enterprise deployment can be addressed through modern implementation tools that give users the starter packs and ease of use they need. Enterprises should not give up on AI, but should look for platforms that will help them deploy solutions. Summarizing the issues well, McKinsey & Company offers this advice:

Meeting this challenge requires organizations to prepare their leaders, business staff, analytics teams, and end users to work and think in new ways — not only by helping these cohorts understand how to tap into AI effectively, but also by teaching them to embrace data exploration, agile development, and interdisciplinary teamwork.

The article goes on to explain that hiring a few data scientists or turning to consultants cannot solve the AI challenges either:

Quick-fix tactics aren’t enough to transform an organization into one that’s fully AI-driven and capable of keeping up with the blazing pace of change in both technology and the nature of business competition that we’re experiencing today.

Instead, enterprises need to embrace a curious mindset and provide tools that enable users to explore use cases for analyzing data, predicting outcomes, and harnessing insightful patterns. Thanks to new implementation accelerators, this AI vision is not only possible, but also practical.