Metro Exodus - Official gameplay video

If you stop bashing each other for a moment, it's clear what is going on. AMD has nothing on the plate. If Nvidia wanted to give you another top card for 700 dollars, they would make the equivalent of a 1080 Ti with no RTX and no Turing. There are two new technologies here and they are putting a premium price on them. At $1,300 this is still the fastest card you can buy, and I believe it can still do more FPS than a 1080 Ti at 4K (we will find out shortly from the reviews). Whether it is worth the price depends entirely on your economic situation and willingness to spend.
Well, AMD doesn't exactly have tensor-core-like performance, but it can do half precision no problem, and I think INT8 too; Nvidia goes as low as INT4 if I'm not mistaken. That said, AMD can do hardware ray tracing as well, just not as fast as NV, or maybe it is just as fast. We will see when we actually have a working RTX game.
http://dl.wavetrex.eu/2018/rtx-fail.jpg

GOTCHA! Typical artifacts of partial raytracing, visible in any RT software that exists, until the ray density reaches a certain point. Nvidia is not doing "Realtime Raytracing"; it's only doing a small sampling of rays per frame, then probably blurring the output with "DLSS" and merging it with the rest of the frame. However, in very high contrast areas this process fails, resulting in the pixelated visual artifacts (marked by my orange arrows).

What's also visible in this screenshot is that the raytracing calculations are at a lower resolution than the rendering calculations, so not even 1080p! Those points in that light/shadow are about twice the size of a pixel on the screen and have a soft blur, from which I deduce that the raytracing is done at half the resolution of normal rendering, in this case at 960x540, with the result upscaled.

(Edit: And I think it's only doing raytracing once per two frames, from what my eyes can tell after watching that part of the video a few times. The raytraced light pixels "linger" a bit behind. So half the resolution, half the framerate... of an already low resolution of 1080p at only 60 fps. This tech is definitely not ready for prime time; even the highest-end GPU, the 2080 Ti, is still WAY too slow for proper realtime raytracing. Jensen was correct about one thing in his hour-long spewing of lies: realtime raytracing is still 10 years away. YES, CEO dude, it is. And all you're selling us now is bulls**t fakes.)

For those that want to see with their own eyes how raytracing actually builds a frame in a scene, get this demo (from Guru3D): http://downloads.guru3d.com/Frybench-download-2709.html Frybench draws every rendered pixel live on the framebuffer, meaning you can see the initial pixelated result and then watch it get better as more and more rays add to the final image. To slow down the process and make it more obvious, restrict it to one thread only.
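To put a rough number on the "half resolution, half framerate" claim above (purely back-of-the-envelope, assuming one primary ray per ray-traced pixel; the game's actual RT internals have not been published):

```python
# Primary-ray workload at full rate versus the claimed half-resolution, half-rate scheme.
full_rate = 1920 * 1080 * 60   # rays/s to trace every 1080p pixel every frame at 60 fps
claimed   =  960 *  540 * 30   # rays/s at 960x540, refreshed only every other frame

print(f"full rate : {full_rate:,} rays/s")        # ~124.4 million
print(f"claimed   : {claimed:,} rays/s")          # ~15.6 million
print(f"reduction : {full_rate / claimed:.0f}x")  # 8x less ray-tracing work
```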
wavetrex: GOTCHA! Typical artifacts of partial raytracing, visible in any RT software that exists until the ray density reaches a certain point. [...]
I'll start by saying that jumping to conclusions from a single developer's early attempt at using this tech makes little sense. "Nvidia is not doing "Realtime Raytracing", it's only doing a small sampling of rays per frame" — they are doing real-time raytracing using bounding volumes; everything is in the name, really. 1080p? About 2 million pixels, and thanks to the bounding volumes you don't have to update them all every frame.

*edited millions for billions* With the RTX 2080 you have 8,000 million rays per second. We can do the math, really: roughly 133,333,333 rays per frame at 60 FPS. If you want to fire 2 million rays 60 times a second, that's 120 million rays (with no bounces). The thing is, you need bounces to make a ray relevant, otherwise it won't fetch the lighting information; at 240 million rays you get one reflection for each ray, and so on. That's an upper bound; with optimization you don't need that many rays. Three bounces per pixel and we'll have CGI-level quality in real time at 1080p. Just incredible. In 5-6 years, maybe, individuals will be able to produce top CGI-quality visuals from home. I can't begin to grasp the repercussions for the movie/TV industry; this is incredible.

It's still early days, early products, and early implementations. If you look at how developers managed to keep improving their games' visuals over a single console generation's lifetime, you understand that everything is about optimization. Again, people buying these GPUs are early adopters, like the people who bought first-generation VR headsets.
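For readers who want to follow the arithmetic in the post above, here is the same back-of-the-envelope calculation written out; it simply takes the advertised 8-gigarays-per-second figure at face value and rounds 1080p down to roughly 2 million pixels, as the post does:

```python
# Back-of-the-envelope ray budget using the figures quoted in the post above.
rays_per_second = 8_000_000_000   # the advertised "8 gigarays/s", taken at face value
fps = 60
pixels_1080p = 2_000_000          # 1920*1080 = 2,073,600, rounded down as in the post

budget_per_frame = rays_per_second / fps   # ~133,333,333 rays available each frame
no_bounce  = pixels_1080p * fps            # 120 million rays/s: one primary ray per pixel
one_bounce = no_bounce * 2                 # 240 million rays/s: one bounce per pixel

print(f"budget per frame        : {budget_per_frame:,.0f}")
print(f"1 ray/pixel, no bounces : {no_bounce:,} rays/s")
print(f"1 ray/pixel, one bounce : {one_bounce:,} rays/s")
```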
Gigarays = BILLION rays/second. You said "8 million", so your calculations are totally off. If it were indeed capable of shooting 8 billion rays/second, that would be 64 rays per pixel at 1080p and 60 fps, more than enough to get several reflections and refractions for EVERY pixel in the frame. But the 6 or 8 or 10 "gigarays", i.e. billions of rays per second, are a total lie, and I'm 100% sure it will be proven a blatant lie by the people who actually understand how raytracing works, once they get their hands on the Turing cards.
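For reference, the 64-rays-per-pixel figure follows directly from that (still unverified) 8 billion rays/second number:

```python
# ~64 rays per pixel if 8 gigarays/s were fully available at 1920x1080, 60 fps.
print(8_000_000_000 / (1920 * 1080 * 60))   # ≈ 64.3
```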
Wavetrex - I think most enthusiasts here understand that DX12 RT is merely a "slice" (that is how I describe it) of RT. No one is under any illusion that what we're getting in real-time games is anywhere near the quality of the Star Wars demo or of movies. However, I still feel that this "slice" is better than all the other lighting methods produced by rasterization.
wavetrex: Gigarays = BILLION rays/second. You said "8 million", so your calculations are totally off. [...]
My bad, updated the post. So with 8,000 million rays you have 133 million rays per frame at 60 frames per second. The thing is, you need ray bounces to start getting information: you trace from the screen into a bounding volume, that bounding volume traces toward multiple directions, and if a ray finds emissive information it keeps tracing new rays, otherwise it stops tracing, and so on. We can already do incredible things; the Star Wars demo Hilbert ran for his Turing review, at 4K no less, speaks for itself.
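For anyone unfamiliar with the bounce logic being described, here is a minimal, purely conceptual sketch in Python. It follows the usual convention that a path stops when it reaches an emissive surface or runs out of bounces; the Ray/Hit types and the intersect callback are hypothetical stand-ins, not any real DXR/RTX API:

```python
from dataclasses import dataclass
from typing import Callable, Optional, Tuple

# Conceptual sketch only: Ray, Hit and 'intersect' are hypothetical placeholders,
# not a real ray tracing API. It illustrates "trace, then keep bouncing until you
# find light or exhaust the bounce budget".

Color = Tuple[float, float, float]

@dataclass
class Ray:
    origin: Tuple[float, float, float]
    direction: Tuple[float, float, float]

@dataclass
class Hit:
    emission: Color      # non-zero if the surface hit is a light source
    reflectance: Color   # how much of the bounced light this surface passes on
    bounce_ray: Ray      # the secondary ray to follow from the hit point

def trace(intersect: Callable[[Ray], Optional[Hit]], ray: Ray,
          depth: int = 0, max_bounces: int = 3) -> Color:
    hit = intersect(ray)            # in a real engine: a BVH / bounding-volume lookup
    if hit is None:
        return (0.0, 0.0, 0.0)      # ray escaped the scene, no light gathered
    if any(hit.emission) or depth >= max_bounces:
        return hit.emission         # found a light source, or out of bounces
    bounced = trace(intersect, hit.bounce_ray, depth + 1, max_bounces)
    return tuple(r * b for r, b in zip(hit.reflectance, bounced))
```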
Ray tracing looked really good at some points, and I'm excited to see how it pans out over the coming years. Overall, though, I'm not all that impressed; it looks more like cranking up the contrast than anything game-changing. The improvements to shadows and reflections, while cool, are the type of thing that's easy to miss, especially in a fast-paced game.
The lighting looks nice with RTX, but it also looks overlit. That might not actually be the fault of RTX, though; my monitor isn't the greatest, and developers seem to default to washed-out contrast/gamma, so I always have to use ReShade to correct it.
Ray tracing looks good, but I see performance drop significantly when RTX is enabled in the video; it may take a few more years to create far more powerful GPUs than we have now.
AMDfan:

I don't see what all the buzz is about; I honestly think it looked better with RTX off. It will take years before it MIGHT become a thing. It's for sure not worth the price NVIDIA is asking: $600 for the mainstream 2070, $800 for the high-end 2080, and a whopping $1,200 for the 2080 Ti. IMHO you must be blind, stupid, or a paid shill if you truly believe it's worth NVIDIA's pricing.
The 2080 Ti starts at $999, the 2080 at $699, and the 2070 at $499. Get your facts straight before calling people stupid in the future. Your post is a perfect example of projection: if you can't see the difference, you're clearly blind, stupid, or, as your name implies, a paid shill. Also, the RTX 2080 Ti is taking the place that the GTX Titan has held, and the Titan is now in its own category of non-gaming cards with their own Titan-family drivers. So in actuality, the price hasn't really gone up; they just shifted the product lines, so instead of getting a cut-down card first and the Ti a year later, they now release them all together, including the full TU102 (2080 Ti). It's kind of amazing how so many of you criticized Nvidia for that stupid and confusing way of releasing cards, yet when they simplify the lineup to basically what people wanted, the s**t storm continues to be flung.
Damn! Games are getting pretty realistic. 😯
Andrew LB: The 2080 Ti starts at $999, the 2080 at $699, and the 2070 at $499. Get your facts straight before calling people stupid in the future. [...]
It is pretty stupid to pre-order cards and CPUs before the NDA lifts... your post is the perfect example of a biased idiot. You are yourself blind and stupid to overpay for hardware; I just try to warn people... And I am not alone: never, ever pre-order before the independent reviews are out!!! The 2080 Ti turned out to be only 25% faster than a 1080 Ti, and the prices here in Holland START at €1,400!!!