Video: Disney shows off what real-time ray tracing of Star Wars looks like with 8 Nvidia GPUs
YouTube channel 'VRMA Virtual Reality Media' has posted a video of Disney's upcoming Star Wars flight simulator, which is powered by Unreal Engine 4. The simulator makes use of real-time ray tracing; the demo shown ran on eight Nvidia Quadro P6000 24GB GDDR5X cards, which sell for roughly $6,000 each.
This flight simulator will be part of Star Wars: Galaxy's Edge, an upcoming Star Wars-themed area being developed at Disneyland Park at the Disneyland Resort in Anaheim, California.
It will bring players into the cockpit of the Millennium Falcon, where they can fire the ship's blasters and prepare it for the jump to hyperspace. To show that the scene renders in real time, Disney paused the demo and moved the camera around.
Update: this demo is not based on DirectX Raytracing; it uses the GPUs with Mosaic for the presentation. It is an in-progress animated sequence from the Millennium Falcon attraction. Produced by ILMxLAB and running in real time, it gives fans the first-ever glimpse of the incredible detail and immersion the attraction will offer. When it launches, riders will enter a cockpit powered by a single BOXX chassis packed with eight high-end Nvidia Quadro P6000 GPUs connected via Quadro SLI. Quadro Sync synchronizes five projectors to create dazzling, ultra-high-resolution, perfectly timed displays that fully immerse riders in the world of the planet Batuu.
Be warned, though: unfortunately, the video is of poor quality.
Shaky video of the last minute or so of a session at GTC 2018 where Disney showed off what real-time ray tracing of Star Wars looks like with 8 Nvidia GPUs. Only 8 GPUs, which is possible thanks to AI denoising technology. Note this is not a movie; this is live ray tracing, or live rendering of a light field, the holy grail of computing.

The session was called: S8414 - Walt Disney Imagineering Technology Preview: Real-time Rendering of a Galaxy Far, Far Away. From the session description: "Walt Disney Imagineering strives to create amazing guest experiences at Disney Parks worldwide. Partnering with Nvidia and Epic Games, Imagineering has developed new technology to drive one of the key attractions at the upcoming Star Wars: Galaxy's Edge, opening in Disneyland Resort, CA and Disney's Hollywood Studios, FL. Come learn more about how we took advantage of the newest Nvidia hardware and the technical modifications we made to the Unreal Engine, which allow 8 GPUs to render at unprecedented quality and speed."
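Why does AI denoising let only eight GPUs suffice? Path-traced images shot with very few rays per pixel are unbiased but noisy, and a denoiser reconstructs a clean frame from them, so far fewer rays need to be traced per frame. The following is a hypothetical illustration of that principle only (it is not Nvidia's actual AI denoiser, which uses a trained neural network; a simple box filter stands in for it here):

```python
# Hypothetical sketch: why denoising makes few-sample ray tracing viable.
# A 1-sample-per-pixel Monte Carlo estimate is noisy; even a crude
# neighborhood average cuts the error, and a trained neural denoiser
# (as used in the real demo) does far better.
import random

random.seed(0)
TRUE_VALUE = 0.5          # the radiance every pixel should converge to
SIZE = 64                 # 64x64 test image

# 1 sample per pixel: unbiased but very noisy estimates of TRUE_VALUE.
noisy = [[random.uniform(0.0, 1.0) for _ in range(SIZE)] for _ in range(SIZE)]

def box_denoise(img, radius=2):
    """Average each pixel with its neighbors (a stand-in for a real denoiser)."""
    size = len(img)
    out = [[0.0] * size for _ in range(size)]
    for y in range(size):
        for x in range(size):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < size and 0 <= nx < size:
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out

def mean_abs_error(img):
    """Average distance of each pixel from the true radiance value."""
    return sum(abs(p - TRUE_VALUE) for row in img for p in row) / SIZE**2

denoised = box_denoise(noisy)
# mean_abs_error(denoised) comes out far lower than mean_abs_error(noisy):
# the denoised image is much closer to the converged result.
```

The same trade-off drives the real system: trace a handful of rays per pixel per frame, then spend a fixed, predictable amount of time denoising instead of tracing thousands more rays.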
Senior Member
Posts: 226
Joined: 2015-01-28
First, these developers need to learn how to make games with multi-GPU support.
Second, they need to learn how to optimize games.
And then they need to learn how to make games look attractive without brute-force technology.
BTW, why are they even bothering at such an early stage..?
Senior Member
Posts: 11808
Joined: 2012-07-20
Not impressed. If they spent the time to set everything up for ray tracing, they should have picked environmental settings where it shines.
Here, it is just dull. Regular rendering with well-selected ambient lighting/occlusion would look almost the same.
Secondly, if they used the raytraced scene to bake shadow textures for some objects, it would look 99% the same, as there are really no significant lighting/shadow changes.
Senior Member
Posts: 14091
Joined: 2004-05-16
Epic, how about you get a 2-GPU non-SLI/non-CrossFire configuration to work for all Unreal Engine games first?
Because they're probably making a lot more money off of Disney than they are from the general public?
It's because the effort required to get non-SFR/AFR rendering to work is crazy; it's not compatible with various rasterized rendering techniques, and the number of people that run multi-GPU systems is like 7.
For example, their implementation of volumetric lightmaps dramatically increases lighting quality for things like fog and god rays, but copying that much data between GPUs would basically cancel out any performance you gain from adding the second GPU, because the copy takes so long. So what can the second GPU render at that point? The same thing applies to their recently added distance fields for GI/AO that are being utilized in Fortnite. They'd have to develop entirely new techniques for mGPU setups and maintain them through multiple iterations of the engine, only to serve the 7-or-so customers who run mGPU. And then the entire problem is inevitably solved by ray tracing, where the performance increase is 1:1 with GPU count, which is why mGPU was used here. So why bother?
Not impressed. If they spent the time to set everything up for ray tracing, they should have picked environmental settings where it shines.
Here, it is just dull. Regular rendering with well-selected ambient lighting/occlusion would look almost the same.
Secondly, if they used the raytraced scene to bake shadow textures for some objects, it would look 99% the same, as there are really no significant lighting/shadow changes.
This is just a continuation of the work that Disney/Epic put in for Rogue One (K-2SO was rendered in real time in all of his scenes while they were shooting the film). I don't think they care how people feel about this particular scene or whether it best shows off ray tracing... it's going into a ride. Disney wants to use the tech for all future pre-production CGI in movies, which is why Epic is pushing it so hard: they foresee all of Hollywood using it.
Senior Member
Posts: 2502
Joined: 2014-01-21
No offence, but... that video did NOTHING for me.
Moderator
Posts: 15142
Joined: 2006-07-04
Because they're probably making a lot more money off of Disney than they are from the general public?