Ansys – democratizing simulation is key to reducing uncertainty

George Lawton, May 31, 2023
Summary:
Ansys’s efforts to democratize physics simulation have broad implications for the future of the composable enterprise and for building more trustworthy AI.

One of the biggest problems with modeling anything, whether a business, a factory, or a new product, is connecting simulations across domains. Multiple kinds of experts need to come together to understand how small changes in product design might improve performance and lower procurement, manufacturing, or tax costs. 

But these tools typically require deep domain expertise. Chris Harrold, Developer Experience Program Director at Ansys, has been leading efforts to democratize physics simulations, and his experience has broad implications for the future of the composable enterprise and for building more trustworthy AI. 
Harrold says: 

The concept behind democratizing simulation is about making it so that you can take a single domain expert and sort of codify their knowledge into your workflow, so that more of your designers and more of your engineers have access to that knowledge, without them being a bottleneck.

Simulating across domains is a generic business problem

One of the big challenges lies in reducing the uncertainty across different domains. In product development, this could include structural, electrical, fluid, and heat simulations. The hard part is that reorganizing the data across disciplines takes effort.  Harrold explains:

Every time you change physics domains, there’s a process that happens. And a lot of times that process is terrible. This isn’t just engineering-specific, it is generally specific. A lot of times, business processes are just terrible. I give this guy a spreadsheet, and then he sits for three hours and remaps the spreadsheet so that the data matches his data. That’s a process, but it’s a terrible process.

This speaks to the need for better tools to map data across domains programmatically, allowing managers and other leaders to stitch together better workflows. So Ansys has been exploring ways to automate the data prep and transformation processes across domains. 
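
As a minimal illustration of what that automation can look like, the sketch below uses pandas with entirely hypothetical column names, file names, and unit conventions to remap one team’s spreadsheet so its fields and units match what a downstream thermal analysis expects - the hours of manual remapping Harrold describes, reduced to a repeatable script.

```python
import pandas as pd

# Hypothetical mapping from the structural team's column names (and units)
# to the names and metric units the thermal workflow expects.
COLUMN_MAP = {
    "Part ID": "part_id",
    "Wall Thickness (mm)": "thickness_m",
    "Max Temp (F)": "max_temp_c",
}

UNIT_CONVERSIONS = {
    "thickness_m": lambda mm: mm / 1000.0,           # millimetres -> metres
    "max_temp_c": lambda f: (f - 32.0) * 5.0 / 9.0,  # Fahrenheit -> Celsius
}

def remap_for_thermal(structural_df: pd.DataFrame) -> pd.DataFrame:
    """Rename columns and convert units so the data matches the thermal domain."""
    df = structural_df.rename(columns=COLUMN_MAP)
    for column, convert in UNIT_CONVERSIONS.items():
        df[column] = df[column].apply(convert)
    return df

if __name__ == "__main__":
    structural = pd.read_excel("structural_results.xlsx")  # hypothetical input file
    remap_for_thermal(structural).to_csv("thermal_inputs.csv", index=False)
```

Once the mapping lives in code rather than in someone’s head, it can be versioned, reviewed, and rerun every time the upstream spreadsheet changes.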

Reducing the man in the middle

Physics, simulation, and engineering are all tough disciplines to get into. It takes a lot of time and practice to acquire the domain knowledge. As a result, many companies have a small team of domain experts, which can delay projects. 

Democratizing simulation could allow more people to access simulation directly without waiting for the team in the middle. The key is not just doing a one-off simulation. A designer might want to explore thousands of variations. Programmatic access to simulations allows teams to explore a wide range of variations to find the sweet spot between performance and cost. 

For example, gold is an excellent electrical conductor with great thermal properties. It is also very expensive, so designers look for the best combination of gold, nickel, and copper. Opening up programmatic access means someone can simulate a range of variables and build up a graph to find the sweet spot. It also produces a larger volume of data, which helps reduce uncertainty. Harrold explains: 

It’s really about being a force multiplier. If you could do more because you could do it programmatically, then why wouldn’t you? When it effectively becomes free, you might as well just run as many simulations as you want.
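
To make that force-multiplier idea concrete, here is a minimal sketch of a programmatic sweep. The simulate_contact function, the cost figures, and the thickness ranges are placeholders rather than real solver calls or market prices; in a real workflow the inner function would call a simulation API. The point is the shape of the loop: thousands of design variations collapse into one ranked answer.

```python
import itertools

# Illustrative per-gram material costs and densities (placeholders, not market data).
COST_PER_GRAM = {"gold": 60.0, "nickel": 0.02, "copper": 0.01}
DENSITY_G_CM3 = {"gold": 19.3, "nickel": 8.9, "copper": 8.96}

def simulate_contact(gold_um: float, nickel_um: float, copper_um: float) -> float:
    """Placeholder for a real solver call; returns a made-up performance score."""
    # Stand-in behaviour: thicker gold improves the score with diminishing
    # returns, nickel and copper contribute less. A real sweep would run a
    # physics simulation here instead.
    return 1.0 - 0.5 / (1.0 + 2.0 * gold_um) - 0.2 / (1.0 + nickel_um + copper_um)

def estimate_cost(gold_um: float, nickel_um: float, copper_um: float,
                  area_cm2: float = 1.0) -> float:
    """Rough plating cost from layer thickness, density, and price per gram."""
    thicknesses = {"gold": gold_um, "nickel": nickel_um, "copper": copper_um}
    return sum(
        COST_PER_GRAM[m] * (t_um * 1e-4 * area_cm2 * DENSITY_G_CM3[m])
        for m, t_um in thicknesses.items()
    )

# Sweep a grid of plating thicknesses and keep the best performance per dollar.
gold_range = [0.1 * i for i in range(1, 21)]    # 0.1 to 2.0 microns
nickel_range = [0.5 * i for i in range(1, 11)]  # 0.5 to 5.0 microns
copper_range = [5.0, 10.0, 20.0]                # microns

best = max(
    itertools.product(gold_range, nickel_range, copper_range),
    key=lambda combo: simulate_contact(*combo) / estimate_cost(*combo),
)
print("Best gold/nickel/copper thicknesses (microns):", best)
```

Swap the placeholder function for a real solver call and the same loop scales from a handful of runs to the thousands of variations Harrold describes.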

Building trust in AI simulations 

Now here is where it gets interesting. One of the most impressive improvements in simulation in recent years has been using AI and machine learning to train new simulation models. These are called surrogate models because they learn statistical relationships that stand in for the actual physical or chemical models. 

These surrogate models can be thousands or even millions of times faster than physics-based models. They are great for finding the sweet spot in tradeoffs, but you still want an expert to confirm the results are reasonable and to run a classic physics-based model to double-check and refine them. 
Harrold observes:

There’s always got to be a human involved in that process at some point to make sure that the values that are coming back from that are accurate. The ability to run thousands of simulations programmatically really is important, just like the ability to have large language models that are easy to train is really important. But none of this is perfect. You still need to actually look and make sure the values actually line up with expectations. And I think there is a bias or desire to just, you know, just sort of hand it off and let the computer do it.

Harrold does not believe these new AI-powered models will soon replace humans—quite the opposite. Humans will need to set up physics-based simulations to explore the variations that train better AI models. He says: 

We trust math and physics. But when you introduce a new element, that’s where you have to be diligent about checking. We’re trying to close that uncertainty box down. But there will always be a small area you’ll have to double-check.
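
To make the surrogate workflow concrete, here is a rough sketch in which a toy function stands in for the expensive physics solver (this is not an Ansys API): sample the physics model, fit a fast statistical surrogate on those samples, let the surrogate scan a much larger design space, and then re-run the trusted physics model on the surrogate’s top picks - the double-checking step Harrold insists on.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

def physics_model(x: np.ndarray) -> np.ndarray:
    """Toy stand-in for an expensive physics-based solver (not a real solver call)."""
    return np.sin(3.0 * x[:, 0]) + 0.5 * x[:, 1] ** 2

# 1. Run a modest number of "expensive" physics simulations to build training data.
X_train = rng.uniform(-1.0, 1.0, size=(200, 2))
y_train = physics_model(X_train)

# 2. Train a fast surrogate on those results.
surrogate = GradientBoostingRegressor().fit(X_train, y_train)

# 3. Use the cheap surrogate to scan a much larger design space.
X_candidates = rng.uniform(-1.0, 1.0, size=(100_000, 2))
predicted = surrogate.predict(X_candidates)
top = np.argsort(predicted)[-5:]  # top candidates by predicted performance

# 4. Verify: re-run the trusted physics model on the surrogate's picks and
#    check that the numbers line up before anyone acts on them.
for i in top:
    actual = physics_model(X_candidates[i : i + 1])[0]
    print(f"candidate {X_candidates[i]}: surrogate={predicted[i]:.3f}, physics={actual:.3f}")
```

The surrogate does the broad exploration cheaply; the physics model and a human reviewer stay in the loop to confirm that the values coming back are accurate.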

Embracing co-opetition

The leading engineering simulation vendors have each adopted specialized file formats that improve simulation performance but can also - accidentally or deliberately - build a competitive moat. One of the big promises of ideas like the industrial metaverse and digital twins is to make it easier to take advantage of the best tool for the job. Emerging platforms like NVIDIA’s Omniverse and format translation tools from companies like Tech Soft 3D are making it easier to move across engineering, design, and management tools. 

Harrold believes future growth requires striking the right balance between proprietary advantage and easy access. That’s one of the reasons Ansys has added support for the popular Python programming language across its simulation platform. However, this idea of openness is relatively new to the industry. The rise of cloud and everything-as-a-service business models is challenging this legacy attitude. He acknowledges: 

We’ve been very protective. We’ve been sort of a walled garden. Once you’re a customer, you can come in, but we don’t publish a lot of stuff publicly. And that is sort of a detriment. We’ve got to start exposing some of these tools to a bigger audience, to more people, to make them more accessible. The more accessible the tooling is, the more people are going to use it.

Although the future points toward openness, there is also a risk that customers may gravitate toward the competition. Every vendor will have to strike the right balance between improving openness and building a competitive advantage. Harrold explains:  

There is a risk when you open access when you make it easier to use your tools because you make it easier to integrate with other things. I mean, we compete with those guys. We don’t necessarily want you to integrate with them. We’d prefer that you use our stuff. But that’s not reality.

In the long run, simplicity and ease of access may be the most important factors in selling into this emerging digital space. He says: 

When we make it easier to use the tools, people will want to use more. We want to build this ecosystem out and help standardize across all of this for customers because they will always have the choice. The movement to the cloud, SaaS and API-driven connectivity forced standardization across all these competing companies. Once you get to that point, you know the competition is no longer about who you know, who knows what or how to use it. Then the competition really boils down to that uncertainty boxing. Bringing that box down as small as you can. And the one with the smallest amount of uncertainty wins.

My take

Generative AI is all the rage today, particularly as developers find ways to connect AI across multiple modalities like text and images. But these AIs can also generate a lot of hallucinations, which might be considered a feature when generating a surrealistic image from a text prompt. 

In other domains, it will be important to ensure that new ideas are technically sound. This will require greater cooperation between engineering, business process modeling, and enterprise app tools. Each vendor in this community must strike the right balance between improving the ability to reduce uncertainty and improving access to new tools. 

This road will be difficult, requiring business, technical, and infrastructure changes. As Harrold observes: 

It’s a disservice to not mention the difficulty with which cultural change happens alongside all of this technical and ecosystem and platform change, and that takes investment.
