Gods of Mars Movie Comes Alive with NVIDIA RTX Real-Time Rendering


The movie, currently in production, blends cinematic visual effects with live-action elements. The film crew had originally planned to build the movie primarily around physical miniature models, but switched gears once they experienced the power of real-time NVIDIA RTX graphics and Unreal Engine.



Director Peter Hyoguchi and producer Joan Webb used an Epic MegaGrant from Epic Games to bring together VFX professionals and game developers to create the film. The virtual production started with scanning the miniature models and animating them in Unreal Engine.

“I’ve been working as a CGI and VFX supervisor for 20 years, and I never wanna go back to older workflows,” said Hyoguchi. “This is a total pivot point for the next 100 years of cinema — everyone is going to use this technology for their effects.”

Using a combination of NVIDIA Quadro RTX 6000 GPU-powered Lenovo ThinkStation P920 workstations, ASUS ProArt Display PA32UCX-P monitors, Blackmagic Design cameras and DaVinci Resolve, and the Wacom Cintiq Pro 24, Hyoguchi and his team produce rich, photorealistic intergalactic worlds in 4K.

Stepping Outside the Ozone: Technology Makes Way for More Creativity
Gods of Mars tells the tale of a fighter pilot who leads a team against rebels in a battle on Mars. The live-action elements of the film are shot against LED walls displaying graphics rendered in real time by Unreal Engine: actors are filmed on set with the virtual background projected behind them.

To keep the physical set minimal, the team builds only what actors will physically interact with, and relies on the projected Unreal Engine environment for the rest of the scene.

One big advantage of working with digital environments and assets is real-time lighting. Previously, when working with traditional CGI pipelines, Hyoguchi and his team would pre-visualize everything in a grayscale environment, then wait hours for a single frame to render before they could preview what an image or scene would look like.

With Unreal Engine, Hyoguchi can have scenes rendered with ray tracing instantly, with lights, shadows and colors in place. He can move around the environment and see how everything looks in the scene, saving weeks of pre-planning.
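For readers curious how that real-time ray-traced preview is switched on in practice, the settings below are a minimal sketch of a typical UE4-era Unreal Engine project configuration. They are illustrative assumptions about a standard setup, not the Gods of Mars production's actual configuration.

    ; DefaultEngine.ini -- minimal sketch of enabling hardware ray tracing in an Unreal Engine project
    ; (illustrative only; not taken from the Gods of Mars production)

    [/Script/WindowsTargetPlatform.WindowsTargetSettings]
    ; Hardware ray tracing requires the DirectX 12 RHI
    DefaultGraphicsRHI=DefaultGraphicsRHI_DX12

    [/Script/Engine.RendererSettings]
    ; Compile GPU skin cache shaders so skinned meshes can be ray traced
    r.SkinCache.CompileShaders=True
    ; Enable support for ray-traced shadows, reflections and global illumination
    r.RayTracing=True

With these settings in place, lighting changes made in the editor show up in the ray-traced viewport as soon as a light or object is moved, which is the immediacy Hyoguchi describes above.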


