Exploring the limits of real time rendering
Ex-DICE artist Ren has shared a video of his forest project, showing off the graphical capabilities of Unreal Engine 4 and exploring the limits of real-time rendering. Ren aims to release a tech demo in the near future that combines sci-fi, AI, robotics, nature and a story, as a first introduction to the universe he is shaping. It will be a playable experience built on the idea that high-end 4K graphics are the future; the demo will ignore performance, memory and technical limitations and focus purely on getting closer to photo-realistic visuals.
Denial
Senior Member
Posts: 13996
Joined: 2004-05-16
#5518351 Posted on: 02/08/2018 09:12 PM
Yeah, but how long will you have to wait for technology to bring those 550 watts' worth of graphics computing power down to a console-feasible 150 watts (so we can get mainstream games with that kind of detail)? 8nm won't cut it at all, and by 7nm they will be using new materials, so another 20 years?
I doubt the scaling is 100%, so it's more like 450 W total, and honestly, knowing SLI scaling in Unreal, it's probably less than that. Nvidia claims Volta's architectural improvements increase FP32 perf/W by 50% on TSMC's 12nm 7.5T SCL, and on 6.5T we know the TSMC node itself brings additional perf/W. Further, all these techniques will improve significantly once they actually get utilized. Again, the issue with photogrammetry is the laborious task of photographing all the assets, which means terabytes of data at high resolutions, bringing them back and processing the de-lighting on them so they can be used in environments other than where the photographs were taken. Once that problem is taken care of, that's when Nvidia, AMD and innovative graphics companies like DICE will come in and find clever tricks and techniques to improve performance where it's being utilized.
DICE already uses photogrammetry on consoles:
http://www.eurogamer.net/articles/digitalfoundry-2017-star-wars-battlefront-2-console-tech-analysis
I don't think it will be 20 years; probably the next generation of consoles, ~6 years.
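As a rough sanity check on the timelines in this exchange, here is a back-of-the-envelope sketch (not from the thread itself): the 550 W and 150 W figures are the quoted post's, the +50% per step is the Volta-style perf/W claim above, and the ~2 years per generation is purely an assumption.

```python
import math

# How many ~50% perf/W improvements would it take to deliver "550 W worth"
# of rendering inside a ~150 W console power budget, at constant performance?
target_perf_per_watt_gain = 550 / 150   # ~3.67x better perf/W needed
gain_per_generation = 1.5               # assumed +50% perf/W per arch/node step

generations = math.log(target_perf_per_watt_gain) / math.log(gain_per_generation)
print(f"perf/W gap to close: {target_perf_per_watt_gain:.2f}x")
print(f"~{generations:.1f} generations at +50% perf/W each")
# -> roughly 3-4 generations; at an assumed ~2 years per generation that is
#    closer to the "next console generation, ~6 years" guess than to 20 years.
```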
tsunami231
Senior Member
Posts: 13374
Joined: 2003-05-24
#5518353 Posted on: 02/08/2018 09:14 PM
If I didn't know better, I would still think that's CG, but that's pretty damn impressive. We're still going to need GPUs that can push 8K+ on displays with such high PPI density that we can't see jaggies, plus the power to push the textures.
15 years ago people said that in 10 years we would have realistic graphics. People say the same now, and in 15 years I still don't think we will have it. And by realistic graphics I mean you can't tell the difference between CG and real life even if you tried.
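On the "8K with PPI so high you can't see jaggies" point, a quick sketch of the usual ~1 arcminute visual-acuity rule of thumb; the 32-inch panel size and 24-inch viewing distance are assumptions, not numbers from the post.

```python
import math

h_px, v_px = 7680, 4320   # 8K UHD
diag_in = 32.0            # assumed monitor diagonal (inches)
view_in = 24.0            # assumed viewing distance (inches)

# Panel width from diagonal and aspect ratio, then pixels per inch.
aspect = h_px / v_px
width_in = diag_in * aspect / math.hypot(aspect, 1.0)
ppi = h_px / width_in

# ~1 arcminute per pixel (20/20 vision): pixels blend together above
# roughly 3438 / viewing-distance PPI (1 arcmin ~ 1/3438 radian).
acuity_ppi = 3438 / view_in

print(f'8K at {diag_in:.0f}": {ppi:.0f} PPI')
print(f'~1 arcmin threshold at {view_in:.0f}": {acuity_ppi:.0f} PPI')
print("jaggies resolvable" if ppi < acuity_ppi else "beyond typical acuity")
# -> ~275 PPI vs ~143 PPI, i.e. an 8K 32" panel at desk distance already
#    exceeds the 1-arcmin rule of thumb; the harder problem is the GPU power.
```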
HeavyHemi
Senior Member
Posts: 6952
Joined: 2008-10-27
#5518432 Posted on: 02/09/2018 06:30 AM
If I didn't know better, I would still think that's CG, but that's pretty damn impressive. We're still going to need GPUs that can push 8K+ on displays with such high PPI density that we can't see jaggies, plus the power to push the textures.
15 years ago people said that in 10 years we would have realistic graphics. People say the same now, and in 15 years I still don't think we will have it. And by realistic graphics I mean you can't tell the difference between CG and real life even if you tried.
I think your expectations are unrealistic. :p As long as you're looking at an image, your brain knows it is an image. In my view we already have 'realistic' graphics; in other words, things are represented as they appear, with realistic fidelity. I don't think we can reach your level of perfection while looking at, or engaging with, the technology that enables it. In other words, your nirvana is a holodeck... as I suspect we all dream of some day.

Elder III
Senior Member
Posts: 3735
Joined: 2010-05-16
#5518441 Posted on: 02/09/2018 07:58 AM
That's some gorgeous scenery for sure. I'd like to have it as a tech demo to drool over, even if we can't expect that level of graphics in games anytime soon. Benchmarks are always fun, even when they make your PC beg for mercy.
