
DNEG - the team behind Oppenheimer’s special effects - adopts Red Hat OpenShift to boost productivity of CGI artists

Gary Flood, February 1, 2024
Summary:
The studio responsible for the digital mastering of many Oscar-nominated films - DNEG - now has a consistent global platform to speed up software innovation and improve artist productivity for competitive differentiation

An image of the Oppenheimer movie poster, starring Cillian Murphy
(Image sourced via Universal Pictures)

One of the world’s top digital effects studios is moving to Red Hat’s OpenShift Container Platform to boost both the efficiency and productivity of its film and TV CGI artists. With the studio riding a wave of success this year, having worked on the box-office smash Oppenheimer, the new platform could result in more enjoyable movies for audiences - as the studio’s Software Infrastructure Architect, Ollie Harding, points out:

If a filmmaker says it's all live action and there's been no post-production, that's not true. Even if it's the very most minimal thing you can do, which arguably is to grade it, i.e. ensure the finished film has a consistent color, that color grading process is a digital asset.

Harding works at London-based DNEG, one of the world's leading visual effects and animation studios. With over 20 years of industry experience, the organization has won seven Academy Awards for 'Best VFX' (visual effects), as well as numerous BAFTAs and other specialist industry prizes.

On the big screen, the company has been involved in everything from ‘Oppenheimer’ to ‘Dune,’ and on TV its roster includes hits such as ‘Chernobyl,’ ‘The Last Of Us’ and ‘Stranger Things.’

One of its most notable achievements was to create the iconic images of black holes and wormholes used in Christopher Nolan’s ‘Interstellar.’

It is also increasingly a player in animation and is even producing visual effects for theme park rides.

Now, the company says it wants to get set for its next phase of success by simplifying the way it produces these effects.

A growing metadata headache

Harding explains that the context for his team’s move to containerization is the ever-increasing demand from the entertainment industry for digital effects.

He says:

When you look at any movie, you might think that a lot of it is kind of shot in camera, but a huge amount of the content in a lot of movies is now added after the event.

We’re one of the handful of vendors working to produce all that additional content. So, we get sent the frames of the film, which have been shot on set, and it's our job to then turn it into the final frames which get shown at the cinema.

Unsurprisingly, doing so is very computationally expensive.

Harding confirms that the operation - which now has studios in North America, Europe, Asia, and Australia - is almost exclusively a Linux facility running mainstream but very high-end hardware.

However, on the software side, for various historical reasons, most effects are still built with desktop applications.

Keeping that work on-premises is mainly down to the very high security and confidentiality requirements of the industry.

He says:

The vast majority of what we do is on-prem, as the security constraints of the film and TV industry are very, very strict.

We're not strictly constrained to being on-prem, but when you start looking at the constraints and the costs of doing parts of the work in the cloud and then the costs of bouncing data, with the cost in time and latencies involved, there's not much which motivates you to use the cloud.

I do think the industry is moving in that way, and certifications for platforms and stuff are evolving so that more can be done in the cloud, so significant portions may well end up there. But now, DNEG is far from being unusual in the industry in being almost exclusively on-prem.

That means most of the studio’s work is focused on loading 100GB of data into memory and doing everything locally, he says.

The problem, he says, is that this is fine when you have three developers working for 20 digital effects artists - less so when you have 100 serving more like 9,000.

Another issue was metadata handling, with so many clients competing for access to the same database services.

Because of DNEG’s success, that meant managing metadata for no fewer than 100 million digital assets, he says.

Helping boost artist productivity

To deal with this “classic monolithic architecture scaling problem,” Harding has led the adoption of a new way of working, centered on containers.

Containers are now a standard software development technique for packaging software components together so they run consistently in different environments.
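As a purely illustrative sketch (the image name, port, and Containerfile here are hypothetical, not DNEG's actual setup), packaging and running a service this way might look roughly like the following, using the Docker SDK for Python:

```python
# Hypothetical sketch: build an image from a Containerfile/Dockerfile in the
# current directory, then run it; the tag and port mapping are invented
# examples, not DNEG's real configuration.
import docker

client = docker.from_env()

# Package the application and its dependencies into a single image.
image, build_logs = client.images.build(path=".", tag="asset-service:dev")

# Run that same image anywhere a container runtime is available:
# a developer workstation, an on-prem cluster node, or a cloud VM.
container = client.containers.run(
    "asset-service:dev",
    ports={"8000/tcp": 8000},  # expose the service on port 8000
    detach=True,
)
print(container.status)
```

In production, an orchestrator such as Kubernetes (the engine underneath OpenShift) takes over scheduling and scaling those containers, rather than anyone running them by hand.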

He says:

When you end up with 15,000 computers all connected directly to the same database, then how do you do a software release?

It's really very hard, so we took a bold step and moved one of the most intractable parts of our old monolithic software development process out - still on-prem, but into a well-decoupled, service-oriented architecture using cloud-native architectural patterns.

The aim, he says: optimize the way that DNEG’s work is delivered to its main customer base - the creative teams working on hit Hollywood and streaming programming.

This meant a big upgrade to the company’s internal global asset tracking system, which now has a new high-volume metadata interrogation API.

This provides, he says, an abstraction and orchestration layer between artists and the database that is speeding up asset delivery by radically cutting down the time artists spend waiting for data to be fetched.
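DNEG has not published details of this API, but in broad strokes such an abstraction layer can be pictured as a thin service that fields artists' metadata queries, answers from a cache where it can, and only reaches the database when it must. The sketch below is hypothetical; the endpoint, fields, and in-process cache are illustrative assumptions, not DNEG's design:

```python
# Hypothetical sketch of a metadata-interrogation service sitting between
# artist tools and the asset database; names and fields are invented.
from fastapi import FastAPI, HTTPException

app = FastAPI()

# Stand-in for the shared asset database.
ASSET_DB = {
    "env_city_v017": {"status": "approved", "frames": 240},
}

# Simple in-process cache so repeated lookups avoid hitting the database.
_cache: dict[str, dict] = {}

@app.get("/assets/{asset_id}/metadata")
def get_asset_metadata(asset_id: str) -> dict:
    """Return metadata for one asset, serving from cache when possible."""
    if asset_id in _cache:
        return _cache[asset_id]
    record = ASSET_DB.get(asset_id)
    if record is None:
        raise HTTPException(status_code=404, detail="unknown asset")
    _cache[asset_id] = record
    return record
```

The point of such a layer is that artist-facing tools make one HTTP call rather than each opening its own database connection, so releases, caching, and scaling can all happen behind the API instead of on thousands of desktops.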

Harding says that, after evaluation, he decided the Red Hat approach was the best way to do this.

That’s because it would give the company access to Kubernetes-powered hybrid cloud functionality.

He says:

We had quite an open brief when we were looking for a supplier, and we looked at many suppliers. We also did our own containerization research and could have opted to do it in-house as we knew what we wanted to do.

However, a key requirement was truly worldwide availability. He adds:

We needed to have a global on-prem presence; we have eight clusters around the world, and while the asset tracking system is moderate in size, it’s not a simple architecture.

We found that the licensing/subscription model for Red Hat meant we would be able to start small and scale up - start with one cluster in London, do our proof of concept, then have two, and then four, and ultimately the full eight without any massive outlay at the beginning.

Best practice IT—but also commercially useful

In terms of benefits from the move so far, Harding says it is not yet possible to make a full ‘before’ and ‘after’ comparison in productivity terms.

However, he states that having a decoupled architecture has already given DNEG an unexpected leg-up: easy adjustment to a recent significant change in the industry.

That is the need to move to a new industry-standard, collaboration-based framework for composing disparate elements into animated 3D scenes - Universal Scene Description (USD).
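To give a flavour of what that standard involves, here is a minimal sketch using USD's open-source Python bindings (pxr); the file and prim names are invented for illustration. The key idea is that different departments author separate layers which compose into one scene without anyone editing anyone else's file:

```python
# Minimal USD sketch: compose a shot from independently authored layers.
# File names and prim paths are invented for illustration.
from pxr import Usd, UsdGeom

# A layout department defines the scene structure in its own layer.
layout = Usd.Stage.CreateNew("layout.usda")
UsdGeom.Xform.Define(layout, "/World")
UsdGeom.Sphere.Define(layout, "/World/Ball")
layout.Save()

# Another department overrides the same prim in a separate layer,
# without touching the layout file.
tweak = Usd.Stage.CreateNew("tweaks.usda")
over = tweak.OverridePrim("/World/Ball")
UsdGeom.Sphere(over).CreateRadiusAttr(2.0)
tweak.Save()

# The shot stage composes both layers; earlier sublayers are stronger.
shot = Usd.Stage.CreateNew("shot.usda")
shot.GetRootLayer().subLayerPaths.append("tweaks.usda")
shot.GetRootLayer().subLayerPaths.append("layout.usda")
shot.Save()

# The composed result picks up the override from the stronger layer.
print(shot.GetPrimAtPath("/World/Ball").GetAttribute("radius").Get())  # 2.0
```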

He says:

Two years ago, we would not have been in a good place to respond to that change. Now, we were able to say, sure, we're ready for that.

That’s because we now have a platform which is flexible enough, where if we need to scale for a customer, we can. Whereas, in the past, it would have been a major problem.

He concludes:

This is a story about an IT project based on best practice and good architecture having unexpected benefits that end up being really significant to the business.
