This interview was originally published in DP 01 : 2022.
Good things come to those who wait – and fans of the “Dune” series had to wait for a very long time. But Denis Villeneuve took on the gargantuan source material and made one of the most highly anticipated films of the last few years.
A confession first: I have been a fan of the books for my whole adult life, and I even slogged through all of the extended universe. So when I heard that the dude who made “Arrival” – Denis Villeneuve – was giving the first book a movie treatment, I was quite excited. But with the extended waiting time (the C that shall not be named) and an unhealthy amount of discussion on certain platforms (without any information – welcome to social media), expectations were exceptionally high. So after we saw it in the cinema (twice), we jumped on a call with VFX Supervisor Paul Lambert, and with Tristan Myles and Brian Connor from DNEG.
Paul Lambert, VFX Supervisor, was involved in Denis Villeneuve’s previous project, “Blade Runner 2049”, as well as some of the biggest VFX movies of the last two decades, including “Tron”, “I, Robot”, “Harry Potter”, “Benjamin Button”, “Tomb Raider” and 30 more. For “Blade Runner 2049” and “First Man” he received Academy Awards.
Brian Connor, Visual Effects Supervisor at DNEG, is no newbie either. His filmography includes everything from “Star Wars” and “Star Trek” to “Transformers” as well as Marvel and DC movies, “Jurassic Park” and the Godzilla Monsterverse (including the 1998 version by Roland Emmerich). The third supervisor on the call was DNEG’s Tristan Myles, who (along with Paul Lambert) won the Oscar for Best Achievement in Visual Effects for “First Man”. Besides that, he was a supervisor on “Fantastic Beasts”, “Interstellar” and many more, including favourites like “Hellboy 2”, “Kingdom of Heaven”, “Children of Men” and “The Dark Knight Rises”.
Another note: Between the interview and going to print, it was announced that the second part has been greenlit and should hopefully be released in 2023. We didn’t know that at the time of the conversation, as the film had not yet been released.
DP: When we first got to see “Dune”, I was amazed by the set extensions. How did you bring Arrakis, Caladan and the sets to life?
Paul Lambert: We built a fair amount of them (laughs). Denis Villeneuve and Patrice Vermette, the Production Designer, spent a year prior designing the worlds of “Dune”, the spaceships and the sets.
Usually, what concept art does for VFX is serve as a springboard into different ideas. But on this, Denis was so happy with the concept art that it became a solid reference. We built the sets in Budapest, and the 3D assets extended them (“Dune” was partially shot at Origo Studios in Budapest; origostudios.com).
In VFX, there is often some deviation from the concept art with new ideas or things that don’t match exactly. But Denis felt that in previous movies some things had gotten away, and when it goes down the wrong path, it takes a lot of money and energy to drag it back. This time, he was adamant, and – Brian can attest to that – the assets were as close as possible to the concepts. Any changes were marked and got approved. With the ships, we would A/B-test against the designs.
In a way, that helped a lot with the look – we knew what everything would look like. Instead of putting everything against a blue- or greenscreen and then figuring it out, we never had the “We’ll fix it in post” attitude. The phrase wasn’t even muttered, as far as I can tell.
DP: Could you give us an example for that?
Paul Lambert: The interior of the ornithopters, for one. Traditionally, you’d shoot that against a greenscreen in a studio, on a gimbal, and after shooting, you’d replace everything. But together with Greig Fraser, the Director of Photography, we decided that we would not try to replicate daylight in a studio. Arrakis is this hot desert environment, so everything that would happen outside, we would shoot outside. You can’t replicate the strength of the sun.

DP: But a virtual production environment with LED screens?

Paul Lambert: I had experience with LED screens, and Greig Fraser worked on the first season of “The Mandalorian” (as DOP for episodes 1, 3 and 7). So, with that extensive experience with virtual production, we agreed that you can’t get enough light from LEDs to get the desert feel. If the movie had played during sunset, it would not have been a problem – then it would have been perfect. But the noon sun of Arrakis needs the actual sun. Because of that, we didn’t try to light it; we found the highest hill in Budapest, put our gimbal on top so we could get a nice horizon, and surrounded it with an eight-meter screen, colored like sand. On a hot day, the sun would bounce off the screen and enter the cockpit. Even when we looked at the dailies, with Greig’s camerawork, it already felt like you were in the desert.
And then the compositors from DNEG added their magic. We shot hours of footage flying through deserts in the United Arab Emirates, with a six-camera rig under a helicopter flying through the dunes. With that, the compositors could blend the plates with that footage. Rather than a full extraction – the classic “foreground and completely different background” – our foreground already had very similar tones, so we could mix. And honestly, it felt immediately real, and we didn’t have the usual problems with edges and a lack of believability.
For example, the glass dome of the ornithopter had reflections, and reflections of reflections, on the inside. Shooting that on a greenscreen would have been problematic. Obviously, there are times when you have to rebuild that, but I think the more you hit a plate, the less credible it becomes. With our approach, we could be seamless.
Also, we had a lot of time in preproduction to find ideas and to think about what we wanted in the end. Visual effects goes hand in hand with the on-set experience and demands, if you do it properly. And when you come up with a good basis, the VFX artists have something they can succeed with.
We all know: if you have a foreground that doesn’t correspond to the background in terms of lighting, there is not much you can do about it. The more you pull and push and grade it, the less believable it becomes. Yes, you can get perfect seams, but it still doesn’t look natural. So, with the time we were given in preproduction, we could avoid bluescreen.
An example: When the story moves to Arrakis, whether in the desert or the city of Arrakeen, we shot a lot of it against a sand screen – a sand-colored background. Which is funny, because if you invert the colors, you get a bluescreen. Obviously, there are issues with skin tones and the like, but it gives you a very good basis. And let’s be honest: at this point, we can come up with a process for extracting parts of the image for any color, as long as it isn’t a complex background. I was having this conversation with the DOP: we can remove any background or foreground, but if the lighting between the two doesn’t correspond, there isn’t much good we can do. LEDs and virtual production help with this particular challenge, but we had already decided not to go down that path.
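The inversion point can be illustrated numerically. Below is a toy sketch, not DNEG’s actual keyer: the sand tone and the distance-based matte function are made-up illustrations of the general idea that any sufficiently uniform backing color can be keyed.

```python
import numpy as np

# An illustrative sand tone in 8-bit RGB (not a production value).
sand = np.array([194, 154, 108], dtype=np.uint8)

# Inverting the sand tone lands in blue territory -- the "inverted bluescreen" idea.
inverted = 255 - sand  # roughly [61, 101, 147], a desaturated blue

def distance_key(image, key_color, tolerance=60.0, softness=40.0):
    """Return a soft alpha matte: pixels close to key_color become transparent."""
    diff = image.astype(np.float32) - key_color.astype(np.float32)
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    return np.clip((dist - tolerance) / softness, 0.0, 1.0)

# Tiny synthetic frame: left column is the sand screen, right column a dark costume.
frame = np.zeros((2, 2, 3), dtype=np.uint8)
frame[:, 0] = sand           # background to be replaced
frame[:, 1] = [40, 35, 30]   # foreground subject
matte = distance_key(frame, sand)  # background pixels -> 0.0, foreground -> 1.0
```

A real keyer works in a more perceptual color space and handles spill, but the principle is the same for sand as for blue or green.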
DP: When did you get involved in the project?
Paul Lambert: Concept work, especially on the interiors, was pretty much finalized, except for a few changes. We came in basically when the storyboard phase of the preparations started. We also had to previs a couple of scenes, which is not a thing Denis likes. But for some scenes we needed every department to know what we were about to do. For example, the sandworm attack on the crawler needed extensive preparation.
Also, from day one we knew that one of the most computationally expensive things for “Dune” was going to be the sand. And the sand around the sandworm in particular would be extremely important. When Tristan joined about a month later, one of the questions was: “How the heck are we going to displace all of that sand?”
The key to a good effect is having a visual reference which an artist can use to make informed creative decisions and even copy from. But naturally, we couldn’t find a sandworm – or anything that works in this way – anywhere. I asked production if we could get at least some explosions in the desert for reference while we were filming deserts in Jordan and the United Arab Emirates, but I was told that would not go down well in the Middle East.
DP: So how did you do it?
Paul Lambert: Tristan and the guys at DNEG Vancouver went through iteration after iteration. In an ideal world, you would just simulate every grain of sand by itself, but who has the processing power for that? So, you clump things together and hope the render goes through. But with that, things sometimes appear to have the wrong scale or speed. So we did the R&D during preproduction, so that in the edit we could deliver the shots quickly. At the same time, Brian had a similar problem with scale: the ships coming out of the water on Caladan – a massive structure. Nobody has ships like that.
Brian Connor: Well, thankfully Paul found footage of icebergs tipping over which were about the same size. When they melt, they roll over, and that helped us to understand how such a massive structure behaves and displaces water on that scale. It was one of those shots that you work on pretty much until the end (laughs). You have to give it the love and the time and the disk space it needs for the simulations.
DP: You mean the scene where the Atreides flagship surfaces?
Brian Connor: Yeah, we had a couple of iterations on that. One of the first ones had a person in a boat next to it, for scale. But we ran the simulations over and over, even though some looked odd. Remember: if it looks odd, it’s because you don’t know what it would actually do – for example, the huge amount of water piling up on top of the ship.
If you get in there and change too much of what the dynamics of the simulation are telling you, you run into the same problem of over-processing that Paul talked about earlier. We just put all the distributed rendering power of the DNEG farm to use on this. Strategically, of course – when things slowed down, we took all the resources. It takes a lot of time to figure out the iterations. Same thing with the sandstorm. That was also a computationally heavy piece, but we were lucky that we had massive sandstorms from Africa as reference. So that was a bit easier.
DP: Can you talk a bit about your simulation pipeline?
Tristan Myles: We used Houdini and pushed it beyond its limits, I think. We came aboard early and tried to figure out how to show sand behaving at this massive scale – same as the water. In the beginning we put Fremen in the sand for scale, but we had to make scaled-down versions for the edit so it wouldn’t distract from the story. It couldn’t look distracting, like a visual effects scene – it is the environment for the story.
DP: So you managed to keep the scene files reasonable?
Tristan Myles: I can’t remember the exact file sizes, but Vancouver reserved three servers – we were in the petabyte range. And 60 percent of that was caches and geometry. But with the types of things we were doing, that was acceptable. The destruction, the heavy explosions, the sand simulation and the worm itself were all beasts to wrangle through the farm.
DP: So, the worm – how did you bring him to the screen?
Tristan Myles: We did model the whole worm, including all the plates along the sides of its body. Those are all moveable, and the skin in between had a little bit of ‘give’, so it isn’t fully rigid and feels organic.
Paul Lambert: We wanted something alive but prehistoric. One reference we had for that was elephant skin. Rigid plates over areas with soft membranes in between, folding like an accordion. Obviously not super agile at this scale – the turning circle of a being like that would not be small. And a beast like that affects the whole environment it moves through. Robyn Luckham, the Animation Director, spent a long time figuring out how it moves. It’s more about the sand displacement – the dunes ripple and rise, almost like water. And when it goes faster, it becomes almost like an explosion as it travels towards its target. And keeping with that idea of water: the worm’s mouth actually has baleen, as you would see in a large whale. Because like a whale sifting water and catching krill, the worm would sift through sand.
The thing is: On that scale, it is a force of nature, affecting the whole environment. When it appears, we show the scale, adding things like camera shaking and rumbling and little explosions when it approaches. But you don’t get a lot of screen time with the worm. This is not “Jaws”.
DP: The bubbling of the sand in the sandcrawler scene was like a whale coming up from below?
Paul Lambert: Yes, everything around them is influenced by the worm. Funnily enough, when they sink into the sand, that was done in camera. Gerd Nefzer, the Special Effects Supervisor, built a vibrating plate, which we buried under the sand. And when you dialled in the vibration just right, the sand looked like bubbling water and you would sink into it, just like you see in the movie. Tristan was able to replicate that on the larger environment.
DP: For the next part of “Dune”, the worm is ready and roaring to go? (At the time of the interview, no information about Part Two was available.)
Paul Lambert: Well, I assume for the bidding procedure for “Dune: Part Two”, having a sandworm on hand will be relevant (laughs). But so far, there hasn’t been any prep for “Dune: Part Two”. If so, I’d love to know!
DP: You mentioned that you had a lot of concept art. How detailed was it?
Paul Lambert: Extremely! But there are always some things you need to actually see. For example, Denis wasn’t really sure about the shape and texture of the Guild Heighliners (the massive ships that transport other ships, for example from Caladan to Arrakis). And Brian had a lot of variations and iterations of the main docking port and its shape. When that was final, the texture was also important. These ships are old and have been around for a long time – still working, but with accumulated bumps and scratches.
Brian Connor: I would love to use those in “Dune: Part Two”. The ships are so detailed, and with the structured insides and their scale, you could do wonderful things with that, interesting camera angles, composition and showing all that in relation to each other. I hope we can show it off!
DP: Another story beat that needed a lot of CGI was the shields. How was that done?
Paul Lambert: That was surprisingly straightforward. And it came from having artists involved in preproduction and on set, so you can do tests and inform the shoot. We had a list of things that we would have to figure out – among those, the shields. For example, would the shields add additional light? If so, that would mean additional lights on set.
But we came up with a “past and future frames” approach, which works really well when there is a lot of movement – which there usually is in fight scenes. When there wasn’t a lot of it, we had to fake a bit.
What was very important was that we didn’t just procedurally grab frames – two from the back, two from the front, and be done with it. We needed an artist’s trained eye, actual people who painted the frames out or in, to get a look approved very early on. It shouldn’t feel digital, and DNEG has a few artists who are good at that kind of work.
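The naive procedural baseline – the thing the artists improved on – can be sketched in a few lines. This is a toy illustration under my own assumptions (synthetic frames, a simple weighted average), not DNEG’s process:

```python
import numpy as np

def temporal_echo(frames, t, offsets=(-2, 2), weight=0.5):
    """Blend the frame at time t with frames offset into the past and future.
    Where there is motion, the offset frames differ from the current one and
    a ghosting shimmer appears; in a static shot the result is identical to
    the input, which is why static moments had to be faked by hand."""
    current = frames[t].astype(np.float32)
    echoes = [frames[int(np.clip(t + o, 0, len(frames) - 1))].astype(np.float32)
              for o in offsets]
    echo = sum(echoes) / len(echoes)
    return ((1 - weight) * current + weight * echo).astype(np.uint8)

# Synthetic "fight": a bright blade sweeping one pixel per frame across a dark row.
frames = np.zeros((8, 1, 8), dtype=np.uint8)
for t in range(8):
    frames[t, 0, t] = 255  # the moving element

out = temporal_echo(frames, t=4)
# The current position stays brightest; the past/future positions leave fainter trails.
```

The procedural version produces uniform, mechanical trails; the hand-painted pass is what keeps the effect from feeling digital, as described above.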
We tested it with fight scenes from other movies, and it worked for everybody. It was just in DI and the edit that we saw that it became confusing – especially the fight between Paul and Gurney with its quick angles and cuts. There the idea of color came in. Blue for the normal state, and red for penetration.
Also, it was a bit of an homage to the first film, where people scratched the shields into the frame. And once we had it down for the fights, we had to recreate it digitally for the ships – you’ll see it prominently in the attack on Arrakeen.
DP: Which techniques from “Dune” will you be carrying over to the next show?
Tristan Myles: Well, some of the tools we wrote to manage the large amounts of data and bring them together at rendertime will be useful in the future. It’s part of visual effects that you always learn without trying to reinvent the wheel – although you generally end up reinventing the wheel anyway (laughs).
On “Dune”, we learned about large-scale effects simulation and what impact that has on the renderfarm, how to mitigate that and different setups to display what the final image is going to look like.
The real trick there is to work in lower resolution without making it look like low resolution. We had a setup for that called Ultra Res – once the simulation was signed off by Paul and Denis, it went through the farm and we could wrangle every grain of sand. The backend of it all – it’s boring to write about, but it’s an essential part of heavier VFX.
DP: When we say “computationally heavy”, how did you plan for that during production?
Brian Connor: Usually, you can break it up a bit – for example only the front portion of the flagship. But the way it was shot – and you see all of it going into the distance – we couldn’t do only the front part of it. We ran the simulation for the whole ship, which added complexity to the background in addition to the stuff in the foreground. We had to strategize. Everything interacts with everything else. It took our supervisors a lot of work just to distribute it everywhere and to give us a way to iterate in a reasonable amount of time. We had that running on the side the whole time, but we knew that going in and planned for it.
What was also quite challenging was that we had different formats. For example, in Imax you see the whole frame, in 2.39:1 you miss some of the top and the bottom.
Paul Lambert: We framed for Imax on the set. We discussed 2.39, because there will always be something missing from the frame. But some shots couldn’t be done that way. So if you see it in Imax, there are a fair amount of shots which are different. One in particular: when Paul is standing in front of the worm, and it fills the frame with worm texture. We had to redesign that shot, because it fills the whole frame, even in Imax, from Paul at the bottom to the towering top of the worm’s mouth.
And on about 30 shots, we couldn’t go from Imax to 2.39. Usually you animate that visible area, and that’s that. But it didn’t work with the narrative, so we extended the Imax frame to the ratio of 2.39.
Funnily enough, when I saw the finished movie for the first time in Imax, I did not remember it like that. “Did we really shoot it like that?” So, I encourage everybody to see the film twice – once in Imax and once in the usual theatrical aspect ratio.
Brian Connor: We called it the “megaframe” – the resolution is just massive, around 7K, because you’re widening the Imax frame, which is already large. You could cheat and buffer on the sides without going to Imax height, but then the quality and fidelity would have suffered. We delivered the 7K frames to DI, so they could shrink them into the format.
DP: Couldn’t you have scaled it up?
Paul Lambert: Not yet. AI enhancements are getting scary good, but not quite good enough for a scene this vital. I have been keeping an eye on these technologies, and they could influence every aspect of VFX. It feels almost like we are pre-Newtonian. One example: while you are doing on-set capture of textures and so on, you could run AI passes trained on actual footage to help with production – for example, capturing actors doing certain things as reference to train a machine to do extractions.
In some ways we are already doing that. On “Dune” I always had a couple of witness cameras on set. You might not always use them, but it is so beneficial to have the data. And in the near future, we can do all kinds of things with that.
Another thing I would highly recommend to everybody: attach a GoPro to your main camera. When your DOP uses a shallow depth of field, you basically cannot do background extractions. But with the GoPro’s sensor and lens, you can get the camera movement and reference for the backgrounds.
Brian Connor: Another thing that is coming is the saving of props. On a recent production we scanned a few period cars with a smartphone. The prop was just rented for the day, so we got as much of it as we could, and it worked surprisingly well, even with the reflections. And if it is not going to be up close, but seen from an aerial perspective or to populate the background, that guerrilla style of capturing data and assets can really help.
At DNEG, we have a pipeline for that, and you can get many things so much faster than building it from scratch and with lower hurdles in preparation. There can always be somebody with a phone taking pictures, you don’t need scheduling for that.
Paul Lambert: I did the spinner in “Blade Runner 2049” like that. Since the sensors are so small, everything is always in focus, and you get a decent solution for photogrammetry. You have the full range of depth.
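The “everything is always in focus” effect of small sensors falls out of the standard hyperfocal-distance formula, H = f²/(N·c) + f, where the acceptable circle of confusion c scales with sensor size. The numbers below are illustrative, not measured camera specs:

```python
def hyperfocal_mm(focal_mm, f_number, coc_mm):
    """Hyperfocal distance in mm: focusing here keeps everything from
    half that distance to infinity acceptably sharp."""
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

# Full-frame sensor: circle of confusion ~0.03 mm with a 35 mm lens.
full_frame = hyperfocal_mm(focal_mm=35.0, f_number=2.8, coc_mm=0.030)  # ~14.6 m

# Tiny action-cam sensor: c ~0.006 mm, and the equivalent field of view
# needs a much shorter focal length (~3 mm), which dominates the result.
action_cam = hyperfocal_mm(focal_mm=3.0, f_number=2.8, coc_mm=0.006)   # ~0.54 m
```

With a hyperfocal distance of roughly half a meter, everything from about 27 cm to infinity is sharp on the small sensor, which is exactly the full-depth reference photogrammetry wants.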
DP: If we switch the direction of scanning: did you use Lidar scanners on “Dune”?
Paul Lambert: Yes, we had a small one running whenever we were shooting, and sometimes during scenes we captured particular setups. Obviously, scenes move around and props are all over the place. Of course, we had scheduled a proper capture of every room before it got taken down, but we had a special person doing scans and Lidar on the go for anything that we requested and whatever came up. Yes, it produces a massive amount of data, but that is easier to handle than wasting a lot of time in post trying to figure stuff out.
DP: With a movie made from a book: Did you read the novel (or novels) in preparation?
Paul Lambert: I had read the book when I was about 14 and had seen the David Lynch movie first. At that point I was fascinated, but in preparation for “Dune” I was torn whether I should read it again or stick with the script. I knew the story, but I stuck with the script and Denis’ vision of it. I was afraid it might create tension. In hindsight: It wouldn’t have.
Tristan Myles: I read the book at a similar age. My dad got me into it, and I read it again in my twenties. And when the script came, it was closer to the book than the movie Paul mentioned, so I stuck with that.
DP: So, with “Dune: Part One” finished, what sticks with you? Which scenes will you put in your showreel?
Paul Lambert: It’s been a while since I updated my showreel (laughs). But what stuck to memory is that I am really happy with what we achieved and the experience of making the movie. It will be hard to replicate the collaboration, having the guys come out to the set and experiencing this whole world. Sometimes it doesn’t happen this way. You want things to be shot for VFX in a certain way, and you don’t get it. This time, having this collaboration was fundamental to getting the movie onto the screen. And having had this experience, I know what I want for future movies (laughs).
And particular scenes? There are ones which were a challenging shoot for me, like Salusa Secundus, where we see the Sardaukar legions. That was challenging because suddenly we had rain and sunlight at the same time. The team did a fantastic job – we were worried that it wouldn’t be believable, because we had to adapt to the weather on the morning of the shoot. It was a challenge, but it came out really well. The one thing I liked about “Dune” was that we had time. Originally, the movie was planned to come out in October 2020, which was pushed to December, and then Covid hit. We finished the following January and went into DI.
Also, we did additional shooting. Denis felt we needed more connection between Paul and his parents, so we did some additional scenes. And we had a really quick turnaround to put that together. But it worked out well. For example, at the spaceport in Arrakeen, where Duncan lands and comes out and hugs Paul, that was a backlot in Budapest, and we had a sand screen going all the way around the backlot. Brian put in the spaceport.
DP: Doesn’t that make it harder?
Paul Lambert: I’m a huge believer in having a harder composite (laughs) – rather than breaking things down into layers and shooting those to be put together, I prefer to get everything in one go, even if that makes the composite harder. Which meant that a lot of our scenes on Arrakis had us blowing sand and throwing dust from the ornithopters. You see that in the historical scene with the Fremen fighting the Harkonnen soldiers. We were throwing sand like crazy, and it was just texture and swirls. The guys did an amazing job at replacing the background – again, the idea of not doing full extractions but blending. The compositors might say it’s really hard, but the result looks more believable. I’m super proud of that approach and the way the artists brought it to life.
The same with the city of Arrakeen: we had a helicopter and flew around Jordan, and basically Denis was like: “I want Arrakeen to be there”, and we did Lidar scans of that whole area and imported them. So even when there are full CG scenes, the environment is real, and that adds a lot of believability.
Brian Connor: The scale of that movie looks really good – the massiveness. When I came to the set, Paul took me on a tour, and standing in all of these massive sets – we pretty much took over the whole studio, and the backlot itself is just gigantic. The sets – especially for the interiors – were amazingly detailed. Walking around in them felt like being completely surrounded by the world. That was a luxury to have – a lot of it is really there. And that was something that I’ll take away from this: we never settled or went for just “good enough”. We didn’t cut corners but went straight through – even if that meant a lot of work, even if the servers went down. The Tech department is probably not our friend anymore, but we came through with a result we could be proud of.