Gods of Mars Movie Comes Alive with NVIDIA RTX Real-Time Rendering
The movie, currently in production, blends cinematic visual effects with live-action elements. The film crew had originally planned to make the movie primarily with physical miniature models, but they switched gears once they experienced the power of real-time NVIDIA RTX graphics and Unreal Engine.
Director Peter Hyoguchi and producer Joan Webb used an Epic MegaGrant from Epic Games to bring together VFX professionals and game developers to create the film. The virtual production started with scanning the miniature models and animating them in Unreal Engine.
“I’ve been working as a CGI and VFX supervisor for 20 years, and I never wanna go back to older workflows,” said Hyoguchi. “This is a total pivot point for the next 100 years of cinema — everyone is going to use this technology for their effects.”
Hyoguchi and his team produced photorealistic, intergalactic worlds in 4K using a combination of NVIDIA Quadro RTX 6000 GPU-powered Lenovo ThinkStation P920 workstations, ASUS ProArt Display PA32UCX-P monitors, Blackmagic Design cameras and DaVinci Resolve, and the Wacom Cintiq Pro 24.
Stepping Outside the Ozone: Technology Makes Way for More Creativity
Gods of Mars tells the tale of a fighter pilot who leads a team against rebels in a battle on Mars. The live-action elements of the film are supported by LED walls displaying real-time graphics rendered in Unreal Engine. Actors are filmed on set, with a virtual background projected behind them.
To keep the set minimal, the team only builds what actors will physically interact with, and then uses the projected environment from Unreal Engine for the rest of the scenes.
One big advantage of working with digital environments and assets is real-time lighting. Previously, when working with CGI, Hyoguchi and his team would previsualize everything in a grayscale environment, then wait hours for a single frame to render before they could preview what an image or scene would look like.
With Unreal Engine, Hyoguchi can see scenes ray traced immediately, with lights, shadows and colors. He can move around the environment and see how everything will look in the scene, saving weeks of pre-planning.
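The article doesn't describe how the team tuned this on set, but as a rough illustration of the kind of control real-time ray tracing gives, here is a minimal Unreal Engine C++ sketch. It assumes a UE4 project with ray tracing already enabled (DX12 RHI, r.RayTracing=1); the SetRayTracingQuality helper and its specific values are hypothetical, while r.RayTracing.Shadows and r.RayTracing.Reflections.MaxBounces are standard engine console variables for ray-traced shadows and reflection bounces.

    // Hypothetical helper, not taken from this production: a sketch of toggling
    // Unreal Engine ray tracing console variables at runtime, e.g. to trade
    // image quality against frame rate between shots.
    #include "HAL/IConsoleManager.h"

    static void SetRayTracingQuality(bool bHighQuality)
    {
        // Ray-traced shadows for the whole scene (0 = off, 1 = on).
        if (IConsoleVariable* Shadows =
                IConsoleManager::Get().FindConsoleVariable(TEXT("r.RayTracing.Shadows")))
        {
            Shadows->Set(bHighQuality ? 1 : 0);
        }

        // Maximum ray-traced reflection bounces; more bounces cost more per frame.
        if (IConsoleVariable* Bounces =
                IConsoleManager::Get().FindConsoleVariable(TEXT("r.RayTracing.Reflections.MaxBounces")))
        {
            Bounces->Set(bHighQuality ? 2 : 1);
        }
    }

Driving settings through console variables like this is one plausible way a crew could dial quality up for a lit preview and back down when the frame rate on an LED wall matters more.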
Senior Member
Posts: 425
Joined: 2017-12-11
So more CGI. If I was an actor, I would worry about the future of acting. These special effects have a place in movies and can save loads of money, but then, for example, the first Alien movie was made on a very small budget and it's one of the best movies ever made IMO, because people put loads of hard work and their hearts into making it, and God, it shows...
Senior Member
Posts: 566
Joined: 2017-03-01
Half of these shots do not have full ray tracing in them: "police lights" that light up the scene more than the missile explosions do.
There are also a lot of DLSS or conversion artifacts in one scene, even at 2160p.
And the framerate is at something like 30 fps or less.
The jet flames in the first scene look like something from the EVE Online release in the early 2000s, or the flames from Star Wars Podracer from 1999.
Hopefully the original video is better.
Sorry for being Mr. Grumpy, but we are far away from real-time rendering at movie quality if the spaceship clips are an indication of what is possible.
Senior Member
Posts: 404
Joined: 2010-03-07
I'm a little confused: this footage barely passes as a modern game, definitely not a movie...
Senior Member
Posts: 4718
Joined: 2014-06-17
I seriously doubt they have the same budget as The Mandalorian and can deliver equally good results. Got to say, the production footage from the video did look like a video game cinematic for the most part.
But The Mandalorian is film quality, so clearly it can be done. Insane.
Senior Member
Posts: 3408
Joined: 2013-03-10
So, it's the same technology they used in The Mandalorian? Although I suppose a bit more developed. It looked surprisingly good already in The Mandalorian, so it's no wonder if studios decide to rely on it more and more.