Epic Games Demonstrates Real-Time Ray Tracing With Star Wars

Looks cool
Looks very cool indeed.
So what is it going to cost us to use this new fancy tech? Let me guess, a new video card by any chance, in the region of say £1000. lol Yes, I have become somewhat cynical of it all. lol
Phasma is a pretty perfect example of a Star Wars character to test ray tracing on hahaha... I can't help but take this as shots fired at EA.
I find the special effects and the story much better than in the latest movies.
Today's video cards can't even handle 4K 60 fps ultra yet (at least not in all titles). So I'd say we're definitely quite a few years away from experiencing 4K 60 fps ultra settings with ray tracing.
tensai28:

Today's video cards can't even handle 4K 60 fps ultra yet (at least not in all titles). So I'd say we're definitely quite a few years away from experiencing 4K 60 fps ultra settings with ray tracing.
4K was and still is overrated. People should understand that from all those CGI movies where 10% of the content is real and the rest is CGI: you cannot tell it is not the real world, and at 1080p it looks better than any game ever did or does at 4K. So if the choice is 4K with old rendering vs 1080p with high-quality ray tracing, I'll keep advocating better-quality 1080p to push the technology forward. What do you need 4K for anyway? Because you run TAA/TXAA, which blurs a great many adjacent pixels, and 4K minimizes its sharpness-crippling effect? Because you prefer high-fidelity textures over higher complexity and density of objects on screen? Why? You can have 8K resolution and it brings nothing new to the table. If you had a supercomputer at hand, you would be playing 1080p games with movie-class photo-realistic image quality.
Fox2232:

4K was and still is overrated. People should understand that from all those CGI movies where 10% of the content is real and the rest is CGI: you cannot tell it is not the real world, and at 1080p it looks better than any game ever did or does at 4K. So if the choice is 4K with old rendering vs 1080p with high-quality ray tracing, I'll keep advocating better-quality 1080p to push the technology forward. What do you need 4K for anyway? Because you run TAA/TXAA, which blurs a great many adjacent pixels, and 4K minimizes its sharpness-crippling effect? Because you prefer high-fidelity textures over higher complexity and density of objects on screen? Why? You can have 8K resolution and it brings nothing new to the table. If you had a supercomputer at hand, you would be playing 1080p games with movie-class photo-realistic image quality.
To watch 4K movies that have photo-realistic-quality CGI anyway. To be able to choose between more/higher-quality objects and/or resolution depending on the game. Productivity reasons. Etc. Lots of reasons for 4K to exist.
XP-200:

So what is it going to cost us to use this new fancy tech? Let me guess, a new video card by any chance, in the region of say £1000. lol Yes, I have become somewhat cynical of it all. lol
Nvidia said they used four NVLink-connected Volta V100s to run that at 1080p 24 fps. So yeah, no way this is getting into games until they find some cheaper solution, or until GPUs have 10x or more the power they have now.
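For what it's worth, here is a quick back-of-envelope sketch of that "10x or more" figure. Only the 4x V100 / 1080p / 24 fps numbers come from the post above; the ~15.7 TFLOPS FP32 per V100 and the assumption that cost scales roughly linearly with pixels per second are mine, so treat it as illustrative arithmetic rather than a benchmark.

[code]
# Back-of-envelope scaling sketch (illustrative only): how much single-GPU
# throughput a hypothetical 4K 60 fps target would need if cost scaled
# linearly with pixels per second from the demo's 4x V100 @ 1080p 24 fps.
demo_gpus = 4
v100_fp32_tflops = 15.7                      # assumed per-V100 FP32 throughput

demo_pixels, demo_fps = 1920 * 1080, 24      # figures quoted for the demo
target_pixels, target_fps = 3840 * 2160, 60  # hypothetical 4K 60 fps target

demo_budget = demo_gpus * v100_fp32_tflops                       # ~63 TFLOPS
scale = (target_pixels * target_fps) / (demo_pixels * demo_fps)  # = 10.0
needed_on_one_gpu = demo_budget * scale

print(f"1080p24 -> 4K60 pixel-rate scale: {scale:.1f}x")
print(f"Rough single-GPU budget: {needed_on_one_gpu:.0f} TFLOPS, "
      f"i.e. ~{needed_on_one_gpu / v100_fp32_tflops:.0f}x one V100")
[/code]

Real workloads never scale that cleanly, but it does show why "10x or more" is not an exaggeration once you fold the four-GPU budget into a single card.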
Fox2232:

4K was and still is overrated. People should understand that from all those CGI movies where 10% of the content is real and the rest is CGI: you cannot tell it is not the real world, and at 1080p it looks better than any game ever did or does at 4K. So if the choice is 4K with old rendering vs 1080p with high-quality ray tracing, I'll keep advocating better-quality 1080p to push the technology forward. What do you need 4K for anyway? Because you run TAA/TXAA, which blurs a great many adjacent pixels, and 4K minimizes its sharpness-crippling effect? Because you prefer high-fidelity textures over higher complexity and density of objects on screen? Why? You can have 8K resolution and it brings nothing new to the table. If you had a supercomputer at hand, you would be playing 1080p games with movie-class photo-realistic image quality.
No, I'm sorry, but I enjoy 4K. That's all your opinion. I'm also guessing you don't have a 4K screen.
tensai28:

No, I'm sorry, but I enjoy 4K. That's all your opinion. I'm also guessing you don't have a 4K screen.
No, I do not. They "fix" artificially created problems and bring other disadvantages. I would rather have 3x 1080p. @Denial: 4K vs multiple screens for productivity... multi-screen always wins on flexibility and comfort. A screen used for coding can be turned 90 degrees, and each screen has its own window arrangement. I started using multi-screen 15+ years ago and there is nothing better. And mind that I did not write about watching movies, but about playing games. Movies are mentioned so everyone understands that they are the product of a computer. I wrote about playing a CGI-quality 1080p game vs a 4K game with traditional rendering.
Fox2232:

No, I do not. They "fix" artificially created problems and bring other disadvantages. I would rather have 3x 1080p. @Denial: 4K vs multiple screens for productivity... multi-screen always wins on flexibility and comfort. A screen used for coding can be turned 90 degrees, and each screen has its own window arrangement. I started using multi-screen 15+ years ago and there is nothing better. And mind that I did not write about watching movies, but about playing games. Movies are mentioned so everyone understands that they are the product of a computer. I wrote about playing a CGI-quality 1080p game vs a 4K game with traditional rendering.
I know what you meant, I'm just saying people have different use cases. I used to have three 27" monitors; I'm down to two now and will honestly probably go down to one in the near future for aesthetic purposes (my computer is in my living room now, where I used to have it hidden away). I'd rather have a single 4K, ~32/34" monitor. Most of the games I play, like SC2, League of Legends and Overwatch, I can play at 4K 60+ FPS with no issue. For newer games I just lower the resolution; the scaling is good enough, and when I get a new GPU I can drop it in and suddenly bump the resolution back up in a bunch of older games. I also occasionally watch movies at my desk while I'm working, so having 4K movie support is nice. I don't think 4K is overrated, and I don't think it's underrated either; it's just rated lol. If I had to choose between 4K and other features, like adaptive sync/HDR, then yeah, I'd probably drop 4K... but I don't - I can get everything in one package.
tensai28:

No, I'm sorry, but I enjoy 4K. That's all your opinion. I'm also guessing you don't have a 4K screen.
I was a 4K skeptic until I tried it, and I was blown away; I still am. I just bought Titanfall 2 on the XBX and the game is just breathtaking in 4K, just stunning. Now that I have let the 4K genie out of the bottle I cannot put it back in, and it is painfully obvious when I go back to 1080p games. 4K is just night and day, well for me anyway.
I'm not a fan of 4K until you get a monitor of at least 32 inches. I do like 2K on 27 to 30 inch screens, as the difference between 2K and 4K at this size and at normal desk viewing distances is minimal, plus there's no tweaking resolutions etc. to get games to play. The higher resolution does make text nice and crisp for everyday use as well. This is how I plan to go until GPUs easily do 4K at 60 fps and higher, which will be a few more years. Hopefully by then we will have some MicroLED monitors that do proper HDR without burn-in issues (one can dream).
Yeah, I'm gaming on a 43 inch 4K TV as a monitor. Setting my games to 1080p is painful to my eyes after getting used to 4K.
I'll believe the hype when these guys start releasing *demos* instead of pre-rendered video clips. Every couple of years "real-time ray tracing" is resurrected and hits the publicity circuit, along with a cache of proof-of-concept video clips released to demonstrate the kinds of effects being hyped--and then it all dies and returns to hibernation, waiting for the next brief wake cycle! It is all rather long in the tooth. There is something they sense about the public's gullibility for the notion of "real-time ray tracing", I suppose (maybe the "cool" and ubiquitous reflective chrome sphere?), that causes them to beat this old dead horse one more time. They (the marketers) know how well the public likes shiny objects.

Remember Larrabee? The irony was that Intel *never* pushed that soon-to-be-defunct x86-based design (Larrabee was cancelled before it ever shipped as a product) as "real-time ray tracing." In fact, there were many interviews with Intel employees giving demos of *simulated* Larrabee concepts (because there was no shipping Larrabee hardware) who plainly stated the concept was *not* real-time ray tracing. It made no difference to well-known personalities at certain sites (not HH!) who kept insisting over and over again that Larrabee was indeed a real-time ray-tracing GPU--these personalities had written numerous articles on the glowing future of "RTRT"--and they *would not back down* on their assertions no matter what Intel said. I almost felt sorry for Intel at that point--and so I was not surprised when, without fanfare or apology, Intel cancelled the Larrabee project before it got beyond the concept stage. Of course they did--they had to cancel it--as the ludicrous expectations manufactured by the so-called "pundits" could never have been reached by anyone in the GPU business--and Intel, especially, has never been known for its GPU acumen and capability *cough*...;) Larrabee, had it ever shipped, would have been a massive disappointment to all of these people putting impossible expectations on it! Well, as we can see, Larrabee is dead--but the idea of "RTRT" is anything but...;)

(The entire idea behind rasterization since the V1 is that it *simulates* ray tracing, and in "real time!" Why? Because ray tracing is so incredibly heavy, computationally speaking, that it can *never* be done in "real time"--so rasterization was born to give us the visual benefits of ray tracing without the computational overhead--and every year rasterization inches closer to that laudable goal. But rasterization is not ray tracing.)
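To make the "computationally heavy" point concrete, here is a minimal, illustrative primary-ray sketch in Python. The tiny resolution, pinhole camera and single hard-coded sphere are all made up for the example; even this toy traces one ray per pixel against one object, before any of the shadow, reflection or bounce rays a real ray tracer adds on top.

[code]
# Minimal primary-ray sketch: one ray per pixel tested against one sphere.
# Illustrative only; scene and resolution are arbitrary.
import math

WIDTH, HEIGHT = 320, 180           # tiny frame; 1080p is ~36x more rays
sphere_center = (0.0, 0.0, -3.0)   # hypothetical scene: one sphere ahead
sphere_radius = 1.0

def hit_sphere(origin, direction):
    """Return the nearest positive ray parameter t of a ray/sphere hit, or None."""
    ox, oy, oz = (origin[i] - sphere_center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - sphere_radius * sphere_radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0.0 else None

hits = 0
for y in range(HEIGHT):
    for x in range(WIDTH):
        # One primary ray per pixel through a simple pinhole camera.
        u = (x + 0.5) / WIDTH * 2.0 - 1.0
        v = 1.0 - (y + 0.5) / HEIGHT * 2.0
        direction = (u * WIDTH / HEIGHT, v, -1.0)
        if hit_sphere((0.0, 0.0, 0.0), direction) is not None:
            hits += 1

print(f"{WIDTH * HEIGHT} primary rays traced, {hits} hit the sphere")
[/code]

Scale that to 1920x1080, add several rays per pixel and thousands of objects per scene, and the per-frame ray budget explodes, which is exactly the gap rasterization was invented to sidestep.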
I used to have three 24" 1080p screens; now I have a single 28" 4K screen. I would not go back for anything entertainment-related, personally. For productivity there are some benefits to having a secondary screen, although many 4K screens have a split-screen capability that is very nice on a larger 30/32" monitor.
So Star Wars named this new technology after the Ray character in the films. That's really cool. I hope other games can use this technology too.
[youtube=jwKeLsTG12A] Another cool demo from Epic -- not using ray tracing, so kind of off-topic, but idk if it deserves its own thread, so I figured I'd tag it onto here.
JamesSneed:

I'm not a fan of 4K until you get a monitor of at least 32 inches. I do like 2K on 27 to 30 inch screens, as the difference between 2K and 4K at this size and at normal desk viewing distances is minimal, plus there's no tweaking resolutions etc. to get games to play. The higher resolution does make text nice and crisp for everyday use as well. This is how I plan to go until GPUs easily do 4K at 60 fps and higher, which will be a few more years. Hopefully by then we will have some MicroLED monitors that do proper HDR without burn-in issues (one can dream).
While I agree that 32" is too small for a TV at this point in time, I believe 4K on 32" looks better than on 40". At the same time, things on a 32" 4K screen are hard to read from 7+ feet away compared with 40" 4K, but 32" 4K looks better than 40" 4K at 3 feet or less; there is just no arguing that with me. For me a denser PPI will always win out, imo. My dad has a 65" 4K TV, and while 4K on it looks amazing, the 40" 4K wins out, and 32" would win vs 40".
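For the pixel-density side of that argument, here is a quick PPI comparison. It assumes 16:9 3840x2160 panels at the diagonal sizes mentioned in the thread; viewing distance, which is the other half of the argument, is not modelled here.

[code]
# Pixels per inch of a 16:9 4K panel at a few diagonal sizes from the thread.
import math

def ppi(diagonal_inches, width_px=3840, height_px=2160):
    """PPI from panel diagonal (inches) and native resolution."""
    return math.hypot(width_px, height_px) / diagonal_inches

for size in (32, 40, 43, 65):
    print(f'{size}" 4K panel: {ppi(size):.0f} PPI')
[/code]

That works out to roughly 138, 110, 102 and 68 PPI respectively, which is the density gap the post above is describing at close viewing distances.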