Looking ahead - the future of application software
Summary:
It’s time to re-imagine the application software space again. This once-a-decade occurrence has the potential to upend the leaderboards in this space, but what, specifically, will this new generation of tech look like? Here’s a strategy starter for developers and buyers of application software to ponder.
Three times in almost as many weeks, a software CEO I know has had an epiphany. These executives have seen the power behind large language models, ChatGPT and other AI/ML tools. They’ve left briefings on these new technologies more than a bit shaken. Why? The new technology will significantly change the application software space – and that’s their livelihood.
Every ten years or so, there’s a sea change in application software. We’ve seen bespoke solutions give way to packages. Mainframe and mid-range solutions gave way to client server applications. Client server apps gave way to web-based apps which in turn gave way to cloud applications. Hosted cloud applications were (mostly) displaced by multi-tenant or smartphone apps. These shifts create and destroy entire application software firms. Anyone remember: Walker, McCormack & Dodge, MSA, Software International, et al.? (Check out the ERP Graveyard site!)
We’re at another 10-year point and application software companies have to reimagine their offerings once again. The question du jour is: What will define the newest generation of application software? Here are some of the top strategy questions software buyers and sellers will need to answer:
Is code relevant and how?
When I think about the application systems I used to build and repair decades ago, I spent massive amounts of time documenting the logic of a system and incorporating much of that into program specifications, pseudo-code or software code. At that time, programming was a time-consuming exercise in documenting all manner of “IF-THEN” statements that covered every possible kind of transaction, every possible combination of data, every everything. Developing an app took lots of time because programs had to anticipate every possible way a user might want to interact with the system and the almost endless variety of transactions the application would need to process.
Over time, we, thankfully, could use more powerful coding languages and could say goodbye to tools like Assembler. And while COBOL, PL1, RPG and others were a marked improvement over Assembler, still more innovation in programming tools was to emerge. New tools for developing web-based and mobile applications were especially notable.
Today, it’s worth noting the progress of low/no-code development tools and process mining capabilities. Low/no-code tools are useful for rapidly creating smart workflows while handling large quantities of routine transactions, and they are a productivity godsend. They represent a quantum improvement in allowing developers to create a few main transaction workflows with options to quickly add more approvals, special handling, routing, etc. as new transaction types or business needs emerge. The net effect is all-new applications in record time. Process mining tools provide insights into processes and can identify where and how those processes could be optimized.
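The appeal of those declarative workflow tools can be sketched in a few lines of plain Python. This is a toy illustration, not any particular product’s API; the step names and the approval threshold are invented:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Workflow:
    """A minimal declarative workflow: an ordered list of named steps."""
    steps: list = field(default_factory=list)

    def add_step(self, name: str, handler: Callable[[dict], dict]) -> "Workflow":
        self.steps.append((name, handler))
        return self

    def run(self, transaction: dict) -> dict:
        # Each step receives the transaction and returns an updated copy.
        for _name, handler in self.steps:
            transaction = handler(transaction)
        return transaction

# A base invoice workflow: validate, then post.
wf = Workflow()
wf.add_step("validate", lambda t: {**t, "valid": t["amount"] > 0})
wf.add_step("post", lambda t: {**t, "posted": t["valid"]})

# A new business rule arrives: invoices over 10,000 need approval.
# Adding it is one inserted step, not a rewrite of the application.
wf.steps.insert(1, ("approve", lambda t: {**t, "approved": t["amount"] <= 10_000}))

result = wf.run({"amount": 250})
# result: {'amount': 250, 'valid': True, 'approved': True, 'posted': True}
```

The point of the sketch: when a workflow is data rather than hard-wired code, a new approval or routing rule is an edit to a list, which is roughly what the low/no-code products productize.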
Bottom Line: New apps will have less code, take less time to develop and be tuned by end-users. While hardcore coders (e.g., the ones who kill it at hackathons) will always remain valuable and essential, the more workaday coders will be in far less demand.
Where will apps be built?
The future for application development may look more and more like Zoho. Zoho’s an application software company that builds its applications in India and in other countries where there’s an abundance of highly skilled (or trainable) workers and a low cost structure. Zoho is huge. It has over 13,000 employees and offers its applications to customers for about $7/user/month for one or all 42 of its applications. It has used its low-cost development environment and a low-friction sales process to make its apps very affordable to firms globally. Old-school ERP solutions are a costly artifact of a different time.
The new generation of applications may be created beyond Silicon Valley and the most successful vendors may be those that focus on other attributes that Zoho brings to the marketplace. The newer applications must be developed, sold and implemented at a major cost reduction relative to current solutions in the market.
This cost reduction is overdue, as the incremental value a customer gets from a rehashed, warmed-over ERP product today is slight (and I’m being generous with that description) while the incremental cost is outrageous. These incremental products are still way too expensive to buy and implement (or upgrade) and too disruptive to the business.
Bottom Line: Old applications and the business methods/models that supported them are obsolete. Application vendors need to reimagine their own business model (not just the applications they sell). Vendors should expect to dramatically reconsider where they build their applications, how they sell them, and how they can do so at a 10-100X reduction in sales price.
Will AI come to the rescue?
Programming could be in for a real change as AI tools get better and better at developing major chunks of code or functionality. It’s already apparent that AI tools can find and repair broken code. AI tools can also generate much of the code that a specific program or utility might need. These tools can automatically convert software documentation, relatively error-free, into other languages at virtually no cost (via an LLM). They can also find patterns and correlations in seconds thus enhancing the accuracy, frequency and power of forecasts and plans – all of which can improve a company’s financial results.
AI tools can also serve up a better quality of help. Many online help systems or chatbots simply follow a script and present a canned response to a limited number of requests. An AI chat capability can answer most any request and even structure the response in a logical workflow for the recipient, if so desired.
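The scripted-chatbot limitation is easy to see in a toy sketch (the keywords and canned answers below are invented for illustration): anything off-script falls through to a generic shrug, which is precisely what an LLM-backed assistant avoids.

```python
# Canned responses keyed by keyword (keys/answers invented for illustration).
CANNED = {
    "password": "Visit the account page and click 'Forgot password'.",
    "invoice": "You can download invoices from the billing tab.",
}

FALLBACK = "Sorry, I don't understand. Please rephrase."

def scripted_bot(message: str) -> str:
    """Return a canned answer only when the message contains a known keyword."""
    for keyword, answer in CANNED.items():
        if keyword in message.lower():
            return answer
    return FALLBACK

on_script = scripted_bot("How do I reset my password?")        # matches the script
off_script = scripted_bot("Why was I charged twice last month?")  # falls through
```

A legitimate billing question that doesn’t happen to contain a scripted keyword gets the fallback; the script can only ever answer what its authors anticipated.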
The work of a programmer will change. Programmers (if that name will even remain) will:
- Craft great work statements for the AI to use as the basis for new code
- Review, tune and/or rework aspects of the AI-generated code
- Submit some human created code products to an AI tool for debugging
- Etc.
This opportunity also presents a means for software companies to reduce the cost of their development efforts and possibly reduce their overall pricing to customers.
What is also getting attention is that AI, when paired with big data stores, can actually eliminate the need for some code development. For example, recruiting software programs have needed a taxonomy of job titles to effectively search a jobseeker’s application or resume. This is not an insignificant matter. One person’s “Executive Vice-President of Manufacturing” could also be noted as:
- VP – Manufacturing
- EVP – Mfging
- Sr. VP of Manufacturing
- Operations Lead
- Chief Operating Officer
- Etc.
Recruiting software needed a taxonomy to parse resumes and applications to identify candidates with similar backgrounds. Today, an AI-tool can peruse a big data store and build its own taxonomy. The time and cost saved in designing, coding and populating a taxonomy is huge.
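To see why the taxonomy mattered, here is a toy sketch using plain surface-string similarity on the titles listed above. Spelling variants like “Sr. VP of Manufacturing” score high, but a semantically equivalent title like “Operations Lead” scores poorly; that gap is exactly what a hand-built taxonomy, or now a model trained on a big data store, has to close.

```python
from difflib import SequenceMatcher

def title_similarity(a: str, b: str) -> float:
    """Crude surface similarity between two job titles (0.0 to 1.0)."""
    norm = lambda s: s.lower().replace("-", " ").replace(".", "")
    return SequenceMatcher(None, norm(a), norm(b)).ratio()

canonical = "Executive Vice-President of Manufacturing"
variants = ["VP - Manufacturing", "Sr. VP of Manufacturing", "Operations Lead"]

scores = {v: title_similarity(canonical, v) for v in variants}
# Spelling variants score high; "Operations Lead" scores low despite
# describing a similar role -- the gap a taxonomy (or model) must close.
```

Surface matching alone cannot know that an operations lead and a manufacturing EVP may be the same kind of candidate; that semantic knowledge previously had to be designed, coded and populated by hand.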
AI can also help in identifying potentially fraudulent transactions, which employees might be leaving the firm, whether key suppliers are likely to experience financial risk soon, and many more items. This doesn’t require old-school logic, tables, programs, etc. The AI-tools can do in minutes what a team of developers and users needed months to complete. And, the AI-tool can be rerun with updated data anytime while a program may only get updated once every couple of years.
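As a toy illustration of the fraud-flagging idea, here is a simple statistical outlier check on payment amounts (the numbers are invented; real AI tools learn far richer patterns than a fixed z-score, but the "rerun anytime with fresh data" property is the same):

```python
from statistics import mean, stdev

def flag_outliers(amounts: list, z: float = 2.0) -> list:
    """Flag amounts more than z standard deviations from the mean."""
    mu, sigma = mean(amounts), stdev(amounts)
    return [a for a in amounts if abs(a - mu) > z * sigma]

# Routine payments with one suspicious spike (invented numbers).
payments = [120, 135, 110, 128, 142, 9_500]
suspects = flag_outliers(payments)  # -> [9500]
```

The old-school alternative was months of requirements work encoding fraud rules as logic and tables; a statistical or learned model simply recomputes from the latest data on every run.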
Bottom Line: Application software firms need to take an AI-first approach to new app development to see how much of the application functionality can be satisfied via these smart tools. That may lead to a surprising reduction in the amount of code that a human being must develop.
What’s beyond transaction processing?
The world doesn’t need another old-school Fixed Assets application. We have plenty of these. In fact, we have plenty of all kinds of transaction processing applications. What businesses need, instead, are radically rethought, net-new applications (not old apps on a new ‘platform’). More specifically, businesses need applications:
- That capture and make better sense of ESG data. This requires a new approach to cost accounting, managerial reporting, ERP and more. This is an IoT, big data, analytics, etc. smorgasbord that needs great leaders to sort out.
- That are focused more on providing business insights than accounting journal entries
- That make the back office a mostly lights-out environment
- That come with dozens of pre-provided workflows and exception handling protocols
- That easily and cheaply integrate with 100X more external applications than just those that come from the application vendor itself
- That use EXTERNAL data to help inform decision makers. There is a practical limit to what internal transaction data can tell you (e.g., it really is tough to comprehensively understand your competitors from your own accounting data)
- That ethically and responsibly use AI tools. No AI-based tool should ever be released into production if it hasn’t been thoroughly checked for bias, discriminatory effects, etc. And these tools should be periodically rechecked to ensure that new bias isn’t creeping into the system.
- Etc.
Bottom Line: It’s time, actually it’s overdue, for a radical reimagining of what application software should be. One thing is certain: it’s not what it used to be.
The shifting power landscape
Access to transaction data and the availability of highly useful AI/ML tools will be the new vendor lock-in. We should expect old-school vendors (e.g., ERP vendors) to make the exporting of a customer’s own transaction data very expensive. These toll charges, akin to what we saw with programs like Indirect Access several years ago, will be designed to make it more likely that an existing software customer will use the old-school vendor’s less capable tools and not some AI-first vendor’s solution(s). To retain account control, vendors will use economic leverage to make it costly to use great AI tools from third parties.
Software buyers need to fight this extortion/greedy wallet grab. It is your data, after all, and shame on any vendor that holds a customer’s data hostage or threatens a customer with a massive financial penalty for external usage. You can’t get competitive advantage from a sub-optimal, old or late-to-market tool.
Software buyers and vendors need to realize that value will be derived from powerful AI tools that use all kinds of internal and external data. The value will be in the insights these tools deliver, while transaction processing functionality gets relegated to a commodity-like cost item. Buyers will find more value in brilliant analytics and insights and see transaction processing as a ho-hum cost of doing business.
Bottom Line: Greed is a virtual certainty in the apps world and the greediest vendors never miss an opportunity to stick it to their own customers. History tells us that the most punitive and greedy vendors will be the ones with the poorest new generation solutions. All vendors have to decide what kind of firm they’ll be in a post-AI application software world. Those vendors better be careful as history has been especially unkind to vendors who botch these 10-year market shifts.
Will applications run differently?
Yes.
For decades, we saw people initiate a transaction on a desktop device, which triggered a program to run on a server or other computer using one’s own internal transaction data. The data, programs, etc. were all co-located in the user’s data center. And while that will still occur going forward, it’s the sheer computing power and data storage needed that will make the newer AI- and big-data-hungry apps mostly a thing of the hyperscaler world.
In fact, the idea that a business is going to acquire, load and maintain in its own data center the vast amounts of data that new applications will use is just not practical for all but some of the largest firms. The new applications will want to access vast amounts of big data. These data stores will be orders of magnitude greater than what companies currently possess. Some of these data stores will contain externally sourced data (e.g., geolocational, weather, global worker data, etc.) that cannot, and should not, be stored on a local computing device (e.g., a smartphone or your firm’s own servers). Moreover, the computing power needed to parse this data in a timely and cost-effective manner may be best found in a hyperscaler’s technology offering and not in one’s own data center.
To illustrate, one CIO recently shared with me his desire to keep all applications and application data on local machines that his IT group can maintain and monitor. This mid-size company has a modest technology stack and even more modest technology budget. They won’t be able to use many advanced technologies as they simply lack the computing power and disk storage to get the most out of new recruiting, scheduling, planning, forecasting and other applications. The applications of today are already outstripping what this CIO has and, in this case, he’s unaware of how this position will make his firm uncompetitive long-term.
Will application software customer/user needs continue to evolve?
Absolutely.
Nothing, especially in technology, is permanent.
What the application software buyer/customer wants today are applications that:
- Are inexpensive to acquire, implement, maintain, refresh and renew
- Deliver outsized value vis-à-vis the cost of the product
- Provide some kind of competitive parity or differentiation
- Deliver an outstanding user experience
- Are low friction products to acquire, renew and extend
- Can be implemented inexpensively and quickly with a minimum of problems
- Offer painless/fast upgrades
- Remain current
- Possess best-in-class security
- Etc.
We all recognize those items and it’s unfortunate that so many vendor offerings are devoid of many of these characteristics today.
Tomorrow’s application offerings will need to deliver all of the above plus:
- An ability to quickly replace one application with something newer/better. Vendors have gone the opposite direction (i.e., with lock-in strategies, punitive contract terms, unfathomable internal integration issues, etc.) for decades and users are tired of this. They want a vendor to offer a great application and if it fails to meet the customer’s ever changing/growing needs, then replacing it should be as simple as acquiring a new app on one’s smartphone. Reread this bullet point and think of an app as an AI utility and you’ll really see why this matters.
- Applications that are designed for ZERO constraints. Applications should be designed to handle unlimited kinds of data/data types and in unlimited volumes. How can a product use Big Data, graphical images, sensor readings, etc. if its architecture was designed to work in a constrained manner? New applications should assume that they can access mind-boggling amounts of data in sub-second timeframes and perform all kinds of intense computations on this data.
- A more expansive and super simple collection of pre-built integrations to other firms’ solutions. If you’re a vendor and you find out you have customers still using spreadsheets, sneaker-net or manual entry methods because of your missing, broken or cost-prohibitive integrations, you’ve failed.
- Really simple, short contracts. The new AI apps should be as easy to acquire and switch out as a smartphone application.
Bottom Line: Relevant products are the ones that succeed in the market long-term. Just because a vendor wants things their way (i.e., customer needs or sentiment be damned), doesn’t necessarily make this a good business strategy. Vendors would be smart to really understand today’s and tomorrow’s evolving customer needs and wants if they want to be relevant long-term.
What should vendors do with laggard technology users?
Historically, vendors have chosen from a couple of strategies to deal with their more laggardly customers. These are the customers that choose not to upgrade in a timely manner. Sometimes vendors will offer monetary incentives (or announce future punitive measures) to these customers.
Vendors, generally, want to keep all of the customers they have. As a result, they often agree to do things like support old applications for a specified timeframe (although some have even promised lifetime technical support) for these obsolete apps. Sometimes vendors will do unnatural acts to find ways to upgrade customers via many small upgrades. That strategy might work, technically, but the users will feel like they’ve experienced a death by a thousand cuts waiting to eventually, someday, get to this new promised land. Ripping off the bandage and getting with the future may be a better approach.
Laggard technology customers and buyers are often the last ones to board the bus. These late-to-the-party types can also be stingy/tight-fisted and will stay with old tech until it’s about to fall apart. Frankly, given the psychographics of Laggard apps buyers, I would never recommend a vendor pursue them for new sales nor for upgrades.
A vendor knows if it has Laggard customers if the customer:
- has not, after many, many years, completed the integrations needed for the application
- still relies on spreadsheets in lieu of automated integrations
- has amassed a material amount of technical debt
- is waiting for the vendor to offer them material financial incentives to upgrade
- is way behind their competitors technically
- is still fixated on on-premises applications
- etc.
Bottom Line: Smart vendors build for the customers to come, NOT for the customers they already have. Software vendors must decide if they will play defense or offense. The defense vendors dangle all manner of stopgap, interim or other stay-in-place technologies and incentives to keep customers loyal without those customers getting too advanced in their acquisition and use of new applications. Offense vendors look forward, not backward. They challenge their customers (and prospects) to move into the future before competitors do. No application vendor can deliver competitive advantage to its customers if it is still backward looking and enabling Laggards.
What should a software vendor do now?
A great strategic planning exercise might be to create two teams and compare the outputs from each. One is to map out the future application strategy for the firm assuming the company must provide a pathway to the future for its customers. The other team is to proceed as if they are to create a greenfield vendor app company whose mission is to not be tethered to the past and existing customers. The greenfield strategy would examine topics like:
- What kind of apps would we build and how would these vary from existing applications the company offers today?
- How/where would the new applications be built?
- How can we deliver new applications an order of magnitude cheaper and faster than is the case today?
- Similarly, how can we price new applications to encourage massive market uptake and enhance company profitability?
- Can we lock-up certain external databases for the exclusive use of our customers?
- What kind of ethical policies will we embrace (e.g., just because we can do something technically doesn’t necessarily mean we should, especially if it creates potential litigation risk or brand damage)?
- Should we conduct a lot of primary research (e.g., focus groups) to really understand what radically reengineered or reimagined processes could look like?
Bottom Line: Get planning NOW!
My take
Re-invention is common in the software world. I’ve known several serial entrepreneurs who launch new companies every time they see a new generation of application software looming on the horizon. Great vendors reinvent themselves while others linger too long in the past and erode their value. Those slow vendors often get acquired at some fraction of their heyday valuation.
Re-invention is not without risks but there is definitely a major downside if a vendor doesn’t do so in a timely manner.
This new generation of application software may well be less incremental in look and feel than prior generations. For example, the move to client-server was a move away from monolithic systems although the application functionality was eerily similar to prior solutions. This next wave of change will likely contain some very different ideas about applications and how applications are constructed. Vendors and buyers alike will definitely want to ponder these moves.
Let’s see what’s possible now….