
The art of super-opting - before you monetize data, ‘contractize’ it

Martin Banks, May 16, 2024
‘Super-opt’ will come in a million varieties, but don’t forget to ‘contractize’ your data first...


One of the first new application groups to appear that can be laid at the feet of generative AI is a set of tools that help users get more – and hopefully much more – out of the applications they have been using for a while. If they are to be given an over-arching characterisation, it would probably be as ‘super-opt’ tools: those that take application optimization to a new level of business value.

There is, of course, a potential downside to this development, especially if these tools are just one step on from the ‘we’ve added an AI chatbot’ approach. This is the point where AI add-ons start dealing heavily with both current production applications and, perhaps more importantly, live data, so managing this interface will need to be handled with care and circumspection.

The potential for new operational problems when combining AI-based processes with real, confidential data is now raising its head, especially in a key grey area: working with business-confidential data. The issue first emerged when multi-tenanted cloud services became available and sharing virtual service space with actual competitors became a genuine concern. It remains a worry for some users, who prefer to run applications handling sensitive data in other, more obviously isolated ways.

Poorly managed gen AI offers a similar potential, this time adding the risk that one customer’s confidential data could end up helping to train a direct competitor’s AI implementation. And in the world of gen AI, cloud services, and specialist service providers, it is not always clear who owns what, and what rights are attached to any data while it is in a provider’s domain.

‘Super-opt’ – a working example

One vendor in this emerging ‘super-opt’ sector is Quantum Metric, so I took the opportunity to have a chat with CEO Mario Ciabarra to try to pin down some of these issues. Not least because an opportunity to optimize the operation of a trusted application – to really squeeze the very best value out of it – might save a good deal of investment cash by extending the life of a trusted application suite, rather than taking the risk of making what could prove to be unnecessary changes.

Quantum Metric positions itself as delivering a customer-centered digital analytics platform that provides a simplified approach to monitoring, diagnosing and optimizing the digital journeys its customers take, providing the insights that build a clearer understanding of what drives customers and what they need in order to meet their own business goals and objectives.

It has recently introduced some significant updates based on Google’s Gemini Pro LLM services, aimed at building a suite designed to enhance digital understanding across the customer lifecycle. To help digital organizations hear, understand and respond to the needs of their customers, it has now introduced Felix AI to provide AI-powered session summarisation.

A key target for Felix AI is replacing the traditional ways of handling session replays. These are widely used by businesses looking to gain a deeper and richer understanding of what is motivating their customers, so they can better align their product and/or service offerings to meet those needs. The downside is that reviewing replays is inherently complex and time-consuming. Indeed, the company states that during 2023 its customers spent over 320,000 hours reviewing session data.

By contrast, Felix AI summarizes sessions in seconds. In practice, the exact visitor experience is captured and summarized in a way that quantifies the scale of each issue and its impact on key business metrics. It also provides the tools to deep dive into the detail of each summary as necessary, in order to obtain clarification. This can be of particular value for businesses geared to heavy use of marketing and sales campaigns, so that, for example, the right campaign can be identified.

It can also utilize flexible APIs to work with a range of common channels and services, such as text, email, and Slack. It can be set up to provide role-based summaries, including for call centre staff, so that they can be given a concise and accurate summary of a customer’s issue while talking to them.
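That role-based routing pattern can be sketched in a few lines. This is purely illustrative Python under assumed names – the fields, templates, and functions below are my stand-ins, not Quantum Metric’s actual API:

```python
from dataclasses import dataclass

# Illustrative only: the fields, templates, and function names here are
# assumptions for the sketch, not Quantum Metric's real API.

@dataclass
class SessionSummary:
    customer_id: str
    issue: str
    impact: str  # e.g. "checkout abandoned"

ROLE_TEMPLATES = {
    # Call-centre agents get a short, customer-facing brief.
    "call_centre": "Customer {customer_id}: {issue} ({impact})",
    # Product teams get the issue framed around business impact.
    "product": "Issue '{issue}' - impact: {impact}",
}

def role_based_summary(summary: SessionSummary, role: str) -> str:
    """Render one summary differently depending on the recipient's role."""
    return ROLE_TEMPLATES[role].format(
        customer_id=summary.customer_id,
        issue=summary.issue,
        impact=summary.impact,
    )

def dispatch(message: str, channel: str) -> dict:
    """Package a rendered summary for a delivery channel (email, Slack, SMS)."""
    return {"channel": channel, "body": message}
```

A call-centre message would then be `dispatch(role_based_summary(s, "call_centre"), "slack")`, with the same underlying summary reused, differently rendered, by each team.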

Spinning off from Felix AI, Quantum Metric has also added three new tools to support the full customer lifecycle and democratise the use of the same data across different teams within a business, so that all parts of the business are working to the same customer ‘story’. These are Interactions, which now includes heatmaps and zoning to improve the optimisation of critical web pages; User Analytics, which provides new data visualizations to help teams understand issues such as customer retention and churn; and Lightning Analytics, which allows users to monitor, diagnose, and optimize critical workflows and operational apps built on Salesforce Lightning.

One of the key issues for the end users – both Quantum Metric’s own customers and, potentially, their customers and those customers’ customers in turn – is that, for the first time, their live, confidential data can be at risk simply because it is being used by, and passing through, two levels of third-party service provider as part of the optimisation and summarising processes. This, as Ciabarra indicated, includes both Quantum Metric and Google, which has access to the data as part of the summary processing work undertaken by its Gemini LLM. This is particularly the case with the enhanced session replay capabilities, he said:

Consider a time when you've visited a website or opened a brand's mobile app, attempting to purchase a pair of shoes or a plane ticket, but encountered an issue at checkout. Our technology allows organizations to quickly identify these moments, understand where and why users struggle, quantify how many are experiencing the same issue, and connect that information across the organization to promptly rectify the problem to prevent recurrence. However, watching a replay of a customer's experience can be time-consuming, ranging from five minutes to an hour each time.

He stated that all the data used by Felix AI is sourced from non-identifying behaviors observed in specific session replays, and compiled into a concise summary that teams can review, share, and act upon. He also pointed to the fact that the company lays no claim to ownership of the data during the process:

Quantum Metric is a data processor. As such, all data Quantum Metric collects on behalf of our customer is owned and retained solely by the company or brand using the Quantum Metric platform. This includes all the data used for Felix AI. Additionally, the data is not used to train models outside of fine-tuning uses for that specific company.

As a Google partner the company has specifically addressed the potential for data ‘slippage’ to occur as it is moved. Ciabarra explained that each customer's contract uniquely addresses each brand's specific requirements, including provisions that each customer's data is owned by that customer and is not used by Quantum Metric and/or shared with any third parties other than to provide the service. He said:

 Felix AI is under specific terms of use included in every contract that defines how Quantum Metric can access and use data coming from the individual company using the platform. With Felix AI - the terms of use include our ability to access and send data via a fully encrypted tunnel to Google's Gemini Pro to generate a response, and have the output streamed back to the Quantum Metric platform, again fully encrypted end-to-end. Gemini Pro will then purge any prompts and generated outputs, by wiping its memory clean. Input prompts, as well as outputs processed, are considered 'customer data' by Google. Any activity involving this is not logged by Google outside of Quantum's Virtual Private Cloud instance within the Google Cloud Platform.
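The data-handling contract Ciabarra describes – prompt out over an encrypted channel, summary back, nothing retained – can be sketched as follows. Everything here is a stand-in: the `llm_call` parameter models the encrypted Gemini Pro round trip, and none of these names reflect Google's or Quantum Metric's actual API:

```python
def summarize_session(session_events: list[str], llm_call) -> str:
    """Send session events to an LLM endpoint and return only the summary.

    `llm_call` is a stand-in for the fully encrypted request/response to
    the model. Mirroring the purge-after-response terms described above,
    nothing but the returned summary survives this function: the prompt
    is built locally, sent, and never logged or stored.
    """
    prompt = "Summarise this session:\n" + "\n".join(session_events)
    return llm_call(prompt)

# A dummy model standing in for the real, encrypted endpoint:
def fake_llm(prompt: str) -> str:
    n_events = len(prompt.splitlines()) - 1  # lines after the instruction
    return f"Summary of {n_events} events"
```

The point of the shape is that the caller never sees, stores, or replays the prompt once the summary comes back, which is the structural analogue of the purge commitment in the contract terms quoted above.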

My take

It doesn’t seem long ago that ‘optimising an application’ meant using the black arts of just getting it to work at all. Now the real money is going to be in which business can squeeze the most performance, or the greatest ease of use, out of a customer-facing application – one customers actually interact with. The more that can be achieved, the more likely customers are to make it their first choice, and the ‘stickier’ the bond with the business behind it becomes. This need for super-optimisation is now starting to be fulfilled: ‘super-opt’ is now a definite thing, and gen AI will be its making. If you think about it, there could be a million variations on this theme.
