Advancing Real Time Graphics

Pretty awesome, but I'd like to see it under more varied lighting conditions - the biggest problem with photoscan/photogrammetry is adapting it to conditions other than the ones the objects were scanned under. (Basically, they scan the texture of a real-life rock under specific lighting, then set the scene up in-engine exactly like the conditions the object was scanned in.) Removing the baked-in natural lighting is usually where the quality gets lost, not re-adding it. DICE has been developing some cool solutions around this in its latest games, but it's nowhere near the quality of the original lighting - still vastly better than traditional methods, though.
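To illustrate the delighting problem mentioned above, here's a minimal sketch, assuming the scan is a single grayscale luminance image and that a crude box blur stands in for the illumination estimate (real pipelines use far more sophisticated capture setups and solvers; the function names here are made up for illustration):

```cpp
#include <algorithm>
#include <vector>

// Minimal "delighting" sketch: estimate the baked-in illumination as the
// low-frequency component of the scanned texture, then divide it out so the
// remaining albedo can be re-lit by the engine under any conditions.
// Images are row-major float luminance in [0,1].

// Crude low-pass filter: box blur with the given radius (a stand-in for a
// proper illumination estimate).
std::vector<float> boxBlur(const std::vector<float>& img, int w, int h, int radius) {
    std::vector<float> out(img.size());
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            float sum = 0.0f; int n = 0;
            for (int dy = -radius; dy <= radius; ++dy)
                for (int dx = -radius; dx <= radius; ++dx) {
                    int sx = std::clamp(x + dx, 0, w - 1);
                    int sy = std::clamp(y + dy, 0, h - 1);
                    sum += img[sy * w + sx]; ++n;
                }
            out[y * w + x] = sum / n;
        }
    }
    return out;
}

// Divide the capture by the illumination estimate to recover a flat albedo.
std::vector<float> delight(const std::vector<float>& scan, int w, int h) {
    std::vector<float> illum = boxBlur(scan, w, h, /*radius=*/32);
    std::vector<float> albedo(scan.size());
    for (size_t i = 0; i < scan.size(); ++i)
        albedo[i] = std::clamp(scan[i] / std::max(illum[i], 1e-4f), 0.0f, 1.0f);
    return albedo;
}
```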
Over the years, whenever such demos come out that showcase "what's currently possible", it takes about 4-6 years for games to start looking like that. So in 2022, we'll probably see games looking like that. (If GPUs continue to get faster at the rate they do now.)
Guess we're at the stone age of realistic graphics in real time 🙂
Over the years, whenever such demos come out that showcase "what's currently possible", it takes about 4-6 years for games to start looking like that. So in 2022, we'll probably see games looking like that. (If GPUs continue to get faster at the rate they do now.)
And when it gets out, we'll need quad-SLI next-next-next-gen Titan X (because it has to be X) for 60 fps at 1080p, if nV brings back more than 2-way SLI. AMD fanboys would need a mini power plant to make it possible at 1080p, especially when we add the fact that nV tech was used...
And to think that a few games already bring these cards to their knees at 1080p, "somehow".
Nice work with HDR and Tone Mapping. Would be nice to know what kind of AO/GI he used and if it was precomputed by Lightmass, coz then it wouldn't be completely real-time, right?
It states in the article he used Nvidia's VXGI.
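VXGI is Nvidia's implementation of voxel cone tracing, so the GI here is computed at runtime rather than baked by Lightmass. The sketch below is not VXGI's actual API - just a generic cone march over a prefiltered voxel mip chain, with `sampleVoxels` as an assumed stand-in for the 3D texture fetch:

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };
struct Vec4 { float r, g, b, a; };  // rgb = prefiltered radiance, a = occlusion

// Assumed stand-in for a trilinear fetch from a mipmapped 3D voxel texture;
// stubbed so the sketch compiles. In a real renderer this is a shader
// texture sample, not a C++ call.
Vec4 sampleVoxels(Vec3 /*worldPos*/, float /*lod*/) { return {0.1f, 0.1f, 0.1f, 0.05f}; }

// March one cone through the voxel mip chain: as the cone widens with
// distance, sample coarser mips so each step covers the cone's footprint.
Vec4 traceCone(Vec3 origin, Vec3 dir, float halfAngle, float maxDist, float voxelSize) {
    Vec3 radiance{0.0f, 0.0f, 0.0f};
    float occlusion = 0.0f;
    float dist = voxelSize;  // start one voxel out to avoid self-intersection
    while (dist < maxDist && occlusion < 0.99f) {
        float diameter = std::max(2.0f * dist * std::tan(halfAngle), voxelSize);
        float lod = std::log2(diameter / voxelSize);
        Vec3 p{origin.x + dir.x * dist, origin.y + dir.y * dist, origin.z + dir.z * dist};
        Vec4 s = sampleVoxels(p, lod);
        // Front-to-back compositing: nearer voxels shadow farther ones.
        float w = (1.0f - occlusion) * s.a;
        radiance.x += w * s.r; radiance.y += w * s.g; radiance.z += w * s.b;
        occlusion += w;
        dist += diameter * 0.5f;  // step size proportional to cone width
    }
    return {radiance.x, radiance.y, radiance.z, occlusion};
}
```

In practice several such cones (one tight specular cone plus a few wide diffuse ones per pixel) approximate a full hemisphere integral, which is why the technique is real-time at all.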
I can already foresee my 980 Ti suing me for abuse >.< Kidding aside, I can't wait for the playable demo =)
I would love to have this in the form of a benchmark tool.
This is pretty damn cool! Shadows can't keep up though lol. Other than that, this is still pretty amazing.
And when it gets out, we'll need quad-SLI next-next-next-gen Titan X (because it has to be X) for 60 fps at 1080p, if nV brings back more than 2-way SLI. AMD fanboys would need a mini power plant to make it possible at 1080p, especially when we add the fact that nV tech was used...
It'd probably perform like junk on Nvidia hardware too until another series rolls out 🤓. Isn't SLI support mostly or entirely on Nvidia to provide? I don't imagine they'll be bringing back 3/4X SLI if that's the case.
This is pretty damn cool! Shadows can't keep up though lol. Other than that, this is still pretty amazing.
It seemed to be chugging pretty hard in the sequence before the end.
Pretty awesome, but I'd like to see it under more varied lighting conditions - the biggest problem with photoscan/photogrammetry is adapting it to conditions other than the ones the objects were scanned under. (Basically, they scan the texture of a real-life rock under specific lighting, then set the scene up in-engine exactly like the conditions the object was scanned in.) Removing the baked-in natural lighting is usually where the quality gets lost, not re-adding it. DICE has been developing some cool solutions around this in its latest games, but it's nowhere near the quality of the original lighting - still vastly better than traditional methods, though.
Traditional lighting methods?
Until I saw the water and shadows, I thought it looked real. Pretty impressive, especially the 4K part.
I just wonder about the storage tax 😀 How many dozens or hundreds of MB of data per unique m²? Stuff like this needs to be at least pseudorandomly generated.
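As a rough back-of-the-envelope (the texel density and map set here are assumptions, not figures from the demo):

```cpp
#include <cstdio>

// Back-of-the-envelope storage cost for unique (non-tiling) photoscanned
// texture data. Texel density and map count are assumptions for illustration.
int main() {
    const double texelsPerMeter = 1024.0;  // ~1 mm per texel
    const int    mapCount       = 3;       // albedo, normal, roughness
    const double bytesPerTexel  = 1.0;     // BC7/BC5-class block compression
    double bytesPerSqm = texelsPerMeter * texelsPerMeter * mapCount * bytesPerTexel;
    std::printf("unique data: %.1f MB per m^2\n", bytesPerSqm / (1024.0 * 1024.0));
    // -> 3.0 MB/m^2; a fully unique 100 m x 100 m area would already be
    // ~30 GB, which is why tiling, blending, and procedural detail layers
    // are unavoidable for scanned content.
    return 0;
}
```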
Imagine having to make real-time destructible environments at this level of quality lol.
Until I saw the water and shadows, I thought it looked real. Pretty impressive, especially the 4K part.
Yeah the water was definitely in the "uncanny valley".
It's a fail when DICE has SLI working in a demo but can't make it work in games. Battlefield 1 multi-GPU is pretty crappy in DX11 and nonexistent in DX12. Multi-GPU is a waste of money right now when most big games released this year don't support more than one card. DX12, the supposed savior of multi-GPU, has been a fail, with only a couple of games supporting dual GPUs.
Wow, that was amazing. I started to have that "is this real or fake" thought. That was only because I knew, though; if I didn't know, I'd probably think it's real. It'd be icing on the cake if TES 6 looked like that, lol.
Wow, that was amazing. I started to have that "is this real or fake" thought. That was only because I knew, though; if I didn't know, I'd probably think it's real. It'd be icing on the cake if TES 6 looked like that, lol.
Is TES 6 announced? I didn't even beat Skyrim yet. Then again, I probably never will - every time I try, I get sidetracked messing with mods >>
It's a fail when DICE has SLI working in a demo but can't make it work in games. Battlefield 1 multi-GPU is pretty crappy in DX11 and nonexistent in DX12. Multi-GPU is a waste of money right now when most big games released this year don't support more than one card. DX12, the supposed savior of multi-GPU, has been a fail, with only a couple of games supporting dual GPUs.
You can add the fabled vendor-agnostic (different makers) mGPU, only ever seen in action in AOTS, to the DX12 gimmicks list. ROTTR's DX12 mGPU support was a mirage in the desert.
It's a fail when DICE has SLI working in a demo but can't make it work in games. Battlefield 1 multi-GPU is pretty crappy in DX11 and nonexistent in DX12. Multi-GPU is a waste of money right now when most big games released this year don't support more than one card. DX12, the supposed savior of multi-GPU, has been a fail, with only a couple of games supporting dual GPUs.
Multi-GPU had its fate written on the wall when MS decided it should be implemented by the game developers instead of by AMD and Nvidia, as it had been until then. Of course, companies looked at the adoption rate of multi-GPU setups (~1%) and immediately decided not to waste time and money implementing a feature very few actually use... The same happened with other features companies were supposed to add with the advent of DX12... And this is why DX12 is a failure: MS gave game companies too much "power" over it, and they don't want it, because it's cheaper and quicker to implement the smallest number of features possible...
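For context on what "implemented by the game developers" means in practice: under DX12's explicit multi-adapter model, the application itself enumerates the GPUs and creates a device per adapter, as in this minimal sketch (enumeration only; distributing work and copying results across adapters is the genuinely hard part that drivers used to handle implicitly):

```cpp
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#include <vector>

using Microsoft::WRL::ComPtr;

int main() {
    // The app, not the driver, walks the adapter list and owns every device.
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip WARP

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            std::wprintf(L"GPU %u: %s\n", i, desc.Description);
            // Each device needs its own command queues, heaps, and fences.
            devices.push_back(device);
        }
    }
    // From here the engine must choose a split (AFR, SFR, ...) and move data
    // between GPUs with cross-adapter resources - the work IHV drivers did
    // behind the scenes for DX11 SLI/CrossFire.
    return 0;
}
```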
Wow, that is really, really impressive - looks just like real life. How long till this translates into gameplay? :infinity: