Appian - where AI meets RPA and Process Mining

Brian Sommer, May 12, 2023
The recent Appian World show was a masterclass in everything process technology, but the insights regarding AI were big news, too. Attendees also got a lot of insight into Appian's process mining, data fabric and other capabilities. Here's the recap.

An image showing how process design and process execution are supporting AI
(Image sourced via Appian)

Last week, Appian held its annual user conference in San Diego. The event reflected the growing size of Appian and a number of changes (notably AI) impacting buyers AND sellers of software. 

Attendance was pegged at over 1,500 people on-site and many more virtually. The company, by the way, is approximately twice as big as it was in 2019. The last Appian World I attended in person was in May 2013; it was a fraction of the size (and number of sessions) of this event. See this event writeup of that show for contrast and technology themes.

The name badges this year were interesting to read. I saw lots of developers and partners. But there were also people with Knowledge Management, Process Improvement and other job titles that are often scarce at ERP shows. I also met with a number of partner firm personnel. Appian, by dint of its large customers and their complex processes, is apparently a vendor that large integrators, consultants, process improvement specialists, etc. find attractive. 

Complexity is something Appian customers possess. There were numerous demonstrations and breakout sessions whose speakers came from federal government agencies, insurance firms, life sciences and other entities with long, complex processes. These Appian customers and prospects are not satisfied with inordinately slow, potentially error-prone and partially manual processes. These businesses want better processes to improve the customer (or user) experience, reduce costs, increase productivity, and/or gain competitive advantage. 

This was also a show where Appian developers and customer super-users were highly revered. There was a $10,000 prize for the super-developer who won Appian's version of a hackathon (i.e., the Live Build Challenge). Even the PR team felt I needed to meet with the lead cheerleader of the developer nation at Appian (it was a pleasure to do so, by the way).

The headlines

Appian is now branding itself as the “End to End Process Platform”. This branding reflects its use of new AI/ML/large language model technologies with RPA (robotic process automation) and process mining to solve complex business process challenges more quickly than ever. Watching demonstrations of these combined capabilities made me think that Appian’s toolset is really a “Business Productivity Generator”. I’m not trying to start a branding war with that remark, but I think it’s important for prospective Appian customers to see the company as more than a tools provider and more as a firm that helps move organizations to a new level of performance. In my experience, people buy outcomes, not tools.

“AI as my co-pilot” came up in numerous executive one-on-ones, keynotes, etc. Appian executives wanted all of their developer community to realize that newer AI, low code, process mapping, etc. tools will not lead to mass layoffs of programmers/developers. We saw several demonstrations where:

  • AI tools could generate SAIL statements and other coding content. However, because no AI tool can know what a new process step, form, report, etc. will need in its entirety, a human being is needed to supplement, adjust or correct these machine-generated products.
  • AI tools can do a great job of debugging & desk checking code. They can find potential syntax problems in seconds where mere mortals might spend hours doing so and still not catch everything. 
  • The better a developer is at asking an AI tool a question, the better the results will be. 
  • A developer is needed to finesse the automated content. These tools are like “chop and copy” reuse tools on steroids. 
  • The tools provided several suggestions and reference material excerpts that a human must peruse, making a value judgment as to which sources or suggestions are most appropriate.
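The automated desk-checking described above can be illustrated with a toy sketch. This is plain Python using the standard `ast` module to flag machine-generated snippets that fail a syntax check, so a human reviewer can prioritize them; it is an assumption-laden stand-in, not Appian's SAIL tooling, which goes well beyond syntax.

```python
import ast

def needs_human_review(snippets):
    """Flag AI-generated Python snippets that fail a basic syntax check.

    A toy stand-in for automated desk-checking -- real tooling
    (including Appian's SAIL checks) goes far beyond syntax.
    """
    flagged = []
    for name, code in snippets.items():
        try:
            ast.parse(code)  # machine-speed syntax check
        except SyntaxError as err:
            flagged.append((name, f"line {err.lineno}: {err.msg}"))
    return flagged

# Hypothetical AI-generated suggestions: one clean, one broken.
suggestions = {
    "good": "total = sum(x * 2 for x in range(10))",
    "bad":  "total = sum(x * 2 for x in range(10)",  # missing paren
}
print(needs_human_review(suggestions))
```

The machine catches the mechanical error in milliseconds; the human still decides whether the "good" snippet actually does what the process step needs.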

I believe that humans will want to and need to work with these tools. I’m good with that concept. What I wonder, though, is what next year’s Appian World event is going to look like. I suspect that over the next 12 months Appian and its customers will create a far larger number of process insights, automations, etc., and there won’t be nearly enough time at the show to highlight even a small percentage of what will likely be some outstanding new creations.

The Data Fabric is what Appian calls its ability not just to stitch together disparate data but also to see how this information is (or could be) used in a process. Process performance and usage statistics are identified via process mining technology. Data is analyzed to see what information is being used in different processes and process steps. The Data Fabric helps companies:

  • (Re-)discover the data they already have and create a more complete data model that processes can exploit more fully
  • Create new applications much faster than before with no-code integrations and codeless data modeling
  • Develop highly secure applications
  • Reduce application maintenance costs

Process HQ is Appian’s long-range vision of process mining. The technology provides a nice graphical view of a process, statistics from process mining technology, workflow logic, process mining insights, etc. The software then shows where new effort/programming is needed to improve the business rules and outcomes. 

One of the more interesting aspects of this capability is that users can compare before-and-after process results/performance statistics to see whether bottlenecks, throughput, etc. actually improved and are now at acceptable levels.
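The before/after comparison described above can be sketched in a few lines. This is generic Python with invented timestamps and an imagined "approve" bottleneck, not Process HQ's actual API: compute average time per step from an event log, then diff the two periods.

```python
from datetime import datetime
from statistics import mean

def step_durations(event_log):
    """Average minutes spent in each step, from (case, step, start, end) events."""
    per_step = {}
    for case, step, start, end in event_log:
        per_step.setdefault(step, []).append((end - start).total_seconds() / 60)
    return {step: mean(times) for step, times in per_step.items()}

def compare(before, after):
    """Change in average duration per step (negative = improvement)."""
    b, a = step_durations(before), step_durations(after)
    return {step: round(a[step] - b[step], 1) for step in b if step in a}

# Invented logs: the 'approve' step was the bottleneck before automation.
t = lambda h, m: datetime(2023, 5, 1, h, m)
before = [("c1", "intake", t(9, 0), t(9, 10)), ("c1", "approve", t(9, 10), t(11, 10))]
after  = [("c2", "intake", t(9, 0), t(9, 10)), ("c2", "approve", t(9, 10), t(9, 25))]
print(compare(before, after))  # approve drops by 105 minutes
```

Real process mining works over thousands of cases and many more statistics, but the core move is the same: measure, change, re-measure.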

Earlier in my career, the process documentation tools I used (not the advanced process automation tools of today) were quite limited in what they could do. In fact, most were static documentation tools. Process HQ’s power comes from harnessing the data within and generated by several Appian technologies to rapidly focus process experts on potential improvements and complete those improvements in short order.

Process Automation/Robotic Process Automation generally includes a number of tools to identify process workflows, exception logic/rules, approvals, etc. An RPA outcome can be a highly automated process where routing, processing, decisions and other actions occur automatically. Done well, these tools can dramatically reduce human effort and errors.

There were a number of breakouts where customers, partner firms and/or Appian team members stepped attendees through the effort required to light up their government procurement, insurance underwriting or other complex process. But complexity was only one factor common to many of these presentations. Some processes also have a significant amount of regulation, lots of changes over time, rapidly evolving products, etc. 

Where one customer might put dealing with frequently changing regulations as the key driver for using these tools, another customer might list improving customer experience as the top goal. The variability in the kinds of processes being automated was quite noticeable but you could see how each was a critical issue for the company to improve. 

In the end, though, judging by the sessions offered and executive comments, Appian has obviously had a lot of success in the government sector. This makes sense, as US federal agencies are large, highly regulated entities that would benefit from these tools. Insurance is another sector with similar challenges. These market and process realities shape Appian’s go-to-market efforts and reflect the kinds of organizations it targets for new deals.

Private AI is not a military person but was a term that was used frequently by Appian executives. Appian has delineated all of the new AI/ML/LLM capabilities into two camps (i.e., public and private) based on where the training data and processing logic for these tools lives. For example, if you want to translate all of your English-language support documentation to Castilian Spanish via a large language model, you might use one of the externally available AI tools to do so. Doing so would mean using a public AI tool and exposing your data to that third party. The third-party tool will get smarter because of its intake of your firm’s proprietary data.

If the proprietary data is something of a competitive advantage for your firm, is something that should be held to a high degree of confidentiality, etc., a firm would be better off using its own Private AI tool. There may be other reasons to use a Private AI solution. For example, a planning tool might better understand some financial results if it only uses your own firm’s sales data. Your firm’s sales may be countercyclical to those of some of your competitors. Since these tools look for patterns within the chosen datasets, getting great data will help ensure better results. Alternatively, poor, confusing data will simply generate low-value results (i.e., garbage in, garbage out).   

Appian executives stated their intent to back Private AI solutions for the foreseeable future. That stance, while conservative, will help protect their customers’ data, confidences, etc. While some public AI use cases were mentioned, it was always with the caveat that data security would have to be ensured, the risks to people/companies were minimal, and Appian had time to thoroughly vet the solution.

New AI Use Cases were popping up throughout the show. Here’s a taste of what was being discussed:

  • One Appian speaker showed how an AI tool could take some Appian code and create a natural-language version of it. This could help developers turn a process or program’s functionality into usable, readable documentation that non-technical people could understand.
  • We also saw the reverse of the above. English-language copy was sent to the AI tool and programming code was generated.
  • We saw how an AI tool could read a scanned document and create an online form from it in seconds. An added feature in that demo involved the user asking the AI tool to change the color of headings, with almost no effort required. This new online form was ready to use in a matter of seconds and looked materially better than the image it had to work with.
  • AI can be used to identify fields you probably should put into reports/dashboards based on what the tool has learned from your employees’ prior usage of other reports, data downloads, etc. and what kinds of filters people use in their searches. 
  • And, we were even teased that an AI Powered RFI Generation tool was being developed to help government procurement personnel more quickly and accurately develop these documents. (see image below)
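The report-field suggestion in the list above is, at its core, a recommendation from usage history. A toy illustration with made-up usage data (plain Python; this is not how Appian actually implements it): count how often each field appeared in colleagues' past reports and surface the most common ones.

```python
from collections import Counter

def suggest_fields(past_reports, top_n=3):
    """Rank candidate fields by how often they were used in prior reports."""
    counts = Counter(field for report in past_reports for field in report)
    return [field for field, _ in counts.most_common(top_n)]

# Invented history of fields colleagues put in past reports.
usage = [
    ["region", "revenue", "quarter"],
    ["region", "revenue", "rep"],
    ["region", "margin"],
]
print(suggest_fields(usage))  # 'region' and 'revenue' lead the list
```

Production systems would also weight filters used in searches, recency, role, and so on, but frequency of past use is the intuition.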

An Image of an Appian graph and its offer of private AI
(Image sourced via Appian)

Integrator/Partner interest was keen at this show. Most major global service firms had some presence. Accenture was noted for its buildout of innovation factories using Appian technology. RSM was acknowledged for doubling its Appian staff complement over the last year. Many customer presenters were either introduced by or shared speaking duties with their implementation/development partner.

Unique Solutions were the stars of this show.

My take

Shows like this, especially when tech is undergoing a shockwave of introspection and change, are fascinating to attend. All kinds of new ideas and concepts are flying about with varying degrees of stickiness. Some attendees will be real short-term thinkers and regardless of the new AI buzz are simply looking for an incremental tool to take home with them. Others are looking for the long-range structural changes on the horizon and how these will affect their industry (and not just their firm or a single process). Both kinds of users/buyers were in attendance.

This show also highlighted for me how much both Appian and the process technology space have changed in the last few years. A few years ago, the focus was largely on workflow technology and low-code solutions. This show was about process mining, generative AI and more. Evolution is an interesting animal to study.

Finally, the mood at the show was notable. Energy/enthusiasm within the customers, Appian team members, partners, etc. was high. It’ll be interesting to see if Appian can maintain this at next year’s event in Washington D.C.

See also: Appian Platform for Process Automation - Low-Code - Process Mining from May 2021
