The Three New GeForce GTX 1070 and 1080 Related Technologies

So official support for rear-view mirrors in FPS games is coming? That is cool, but isn't it kind of like cheating?
Very nice and interesting stuff. Looking forward to it. P.S. Any news on which companies will release the GTX 1080 on day 1? I am very interested in purchasing a hybrid-cooled GTX 1080.
Very nice and interesting stuff. Looking forward to it. P.S. Any news on which companies will release the GTX 1080 on day 1? I am very interested in purchasing a hybrid-cooled GTX 1080.
Last year the hybrid 980 Ti took a month to become available. And to be honest, with the new card drawing far less power than the 980 Ti (180 W vs. 250 W TDP), I'm not sure it will be worth it this time.
Zotac GeForce GTX 1080, 8GB GDDR5X, DVI, HDMI, 3x DisplayPort (ZT-P10800A-10P)
• Chip: GP104-400-A1 "Pascal"
• Chip clock: 1607 MHz, Boost: 1733 MHz
• Memory: 8 GB GDDR5X, 2500 MHz, 256-bit, 320 GB/s
• Shader units/TMUs/ROPs: 2560/80/40
• Compute power: 8228 GFLOPS (single precision), not specified (double)
• Manufacturing process: 16 nm
• Power consumption: 180 W (TDP), not specified (idle)
• DirectX: 12.0 • OpenGL: 4.5 • OpenCL: 1.2 • Vulkan: 1.0 • Shader model: 5.0
• Interface: PCIe 3.0 x16
• Height: dual-slot
• Cooling: 1x radial fan (65 mm), reference design
• Connectors: DVI, HDMI 2.0b, 3x DisplayPort 1.4
• External power supply: 1x 8-pin PCIe
• Dimensions: 267 x 111 x 38 mm
• Special features: NVIDIA G-Sync, NVIDIA VR-Ready, 4-Way SLI, HDCP 2.2, backplate
• Warranty: five years with warranty extension (with registration within 28 days of purchase)
Link with many pics: http://geizhals.eu/zotac-geforce-gtx-1080-zt-p10800a-10p-a1439612.html
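For what it's worth, the compute and bandwidth numbers in that listing follow directly from the quoted clocks; here is a quick back-of-the-envelope check using the standard FP32 and GDDR5X quad-data-rate formulas, nothing vendor-specific:

```python
# Sanity-check the compute and bandwidth figures from the quoted listing.
shaders = 2560            # CUDA cores on GP104-400
base_clock_ghz = 1.607    # base clock from the listing

# FP32 throughput: one FMA (2 FLOPs) per core per clock.
gflops = shaders * 2 * base_clock_ghz
print(f"FP32 compute:     {gflops:.0f} GFLOPS")        # ~8228, matches the listing

mem_clock_mhz = 2500      # GDDR5X command clock from the listing
bus_width_bits = 256
data_rate_gbps = mem_clock_mhz * 4 / 1000              # GDDR5X moves 4 bits/pin/clock -> 10 Gbps

bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
print(f"Memory bandwidth: {bandwidth_gb_s:.0f} GB/s")  # 320 GB/s, matches the listing
```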
Only GDDR5X memory for GTX 1080?
Only GDDR5X memory for GTX 1080?
For now, yes. Looks like HBM2 won't be available in usable quantities until next year, and it will probably be expensive then.
These are very promising. It will depend on actual benchmarks done by Hilbert and others I think I can trust. We should have learned by now not to believe Nvidia or AMD hype when it comes to performance. CPU, motherboard, and memory are the next upgrades for me. If I have enough left, I will look at a new video card as well. The GTX 1070 might be a contender.
I guess the most effective upgrade/feature is SMP (Simultaneous Multi-Projection); that one is beautiful and the most impressive of them all 😯 The others feel more like stepping stones to more impressive tech later down the line, imo.
For now, yes. Looks like HBM2 won't be available in usable quantities until next year, and it will probably be expensive then.
It would have been fairer to compare the new 1080 with a 980 Ti.
Too bad the enhanced audio stuff is VR-only; I imagine this will fall by the wayside like TrueAudio. It would be nice if AMD decided to improve TrueAudio and get it implemented in more games, but I doubt it.
The audio feature is really nice, almost a necessity. But I don't like that nvidia is doing it, because that means it is going to be proprietary and GPU-bound. Even though this is very complex audio processing, a discrete sound card (or even the CPU) should be handling it, not the GPU. In a simplified perspective, think of it like this:
1. The CPU is the first to know about the layout of the 3D area.
2. The CPU then tells the GPU what the area looks like.
3. The GPU renders everything.
4. The GPU adds filters to make up for lens correction, and then outputs the display.
5. The GPU processes the sound effects based on the environment.
6. The GPU sends this info back to the CPU.
7. The CPU tells the sound card what to play and how to play it.
8. The sound card plays the audio (add another step if it does its own layer of processing).
We want to be LOWERING latency on the GPU, not increasing it. As stated in step 1, the CPU already knows what the 3D area looks like. For the most part, the CPU is twiddling its thumbs waiting for the GPU to finish rendering the scene, let alone render the audio too. Anyone meddling with VR should be smart enough to have a hefty CPU, which should have plenty of spare resources to process this info. So not only could step 6 be eliminated entirely, steps 5, 7, and 8 could be done in parallel with everything else the GPU is doing.
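A minimal sketch of the alternative being argued for here, assuming a game loop where the geometry is built on the CPU anyway. Every function below is a hypothetical stand-in (not a real engine or VRWorks call); the only point is that audio propagation can run on a spare CPU thread and overlap the GPU's rendering work:

```python
# Illustration of the argument above: audio propagation runs on a spare CPU
# thread while the GPU renders, instead of being added to the GPU's workload.
# All functions are hypothetical stand-ins, not a real engine or VRWorks API.
import time
from concurrent.futures import ThreadPoolExecutor

def build_scene():
    # Step 1: the CPU already knows the 3D layout.
    return {"walls": 42, "materials": ["brick", "glass"]}

def render_on_gpu(geometry):
    # Steps 2-4: rendering plus lens correction; simulated as a blocking wait.
    time.sleep(0.011)            # pretend the GPU needs ~11 ms for the frame
    return "frame"

def trace_audio_paths(geometry, listener_pos):
    # Steps 5-7 moved to the CPU: occlusion/reverb filters from the same geometry.
    time.sleep(0.004)            # pretend propagation takes ~4 ms of CPU time
    return {"reverb": 0.3, "occlusion": 0.7}

def main():
    with ThreadPoolExecutor(max_workers=1) as pool:
        geometry = build_scene()
        audio_job = pool.submit(trace_audio_paths, geometry, (0, 0, 0))
        frame = render_on_gpu(geometry)   # GPU is busy while audio runs on the CPU
        filters = audio_job.result()      # usually ready by the time the frame is done
        print(frame, filters)             # Step 8: hand the filters to the sound card

if __name__ == "__main__":
    main()
```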
Only GDDR5X memory for GTX 1080?
Yes, but there will be a Ti version too (if it is based on the pro Pascal chip).
It would have been fairer to compare the new 1080 with a 980 Ti.
Not really. I find comparing the GM204 to the GP104 a fairer comparison.
These are very promising. It will depend on actual benchmarks done by Hilbert and others I think I can trust. We should have learned by now not to believe Nvidia or AMD hype when it comes to performance. CPU, motherboard, and memory are the next upgrades for me. If I have enough left, I will look at a new video card as well. The GTX 1070 might be a contender.
It will be very interesting to see what kind of memory configuration AMD's cards will have. HBM is fast, which is good for 4K, but on the other hand it is limited to 4 GB, which is very bad for 4K. HBM2 is too far away, so either they offer something awesome at smaller resolutions with HBM, or they admit they were wrong in the first place and switch back to GDDR5/X.
It will be very interesting to see what kind of memory configuration AMD's cards will have. HBM is fast, which is good for 4K, but on the other hand it is limited to 4 GB, which is very bad for 4K. HBM2 is too far away, so either they offer something awesome at smaller resolutions with HBM, or they admit they were wrong in the first place and switch back to GDDR5/X.
A 512-bit bus (like the 290/390) plus GDDR5X could do the trick.
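For rough context on that suggestion, peak bandwidth is just bus width times effective data rate per pin. The rates below are the commonly quoted ones (10 Gbps launch GDDR5X, 6 Gbps GDDR5 on the R9 390, 128 GB/s per first-generation HBM stack) and are used purely for illustration:

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate per pin.
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(512, 6))    # 384 GB/s - R9 390, 512-bit GDDR5 @ 6 Gbps
print(bandwidth_gb_s(256, 10))   # 320 GB/s - GTX 1080, 256-bit GDDR5X @ 10 Gbps
print(bandwidth_gb_s(512, 10))   # 640 GB/s - hypothetical 512-bit GDDR5X card
print(4 * 128)                   # 512 GB/s - Fury X, 4 stacks of first-gen HBM
```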
Ansel, how lovely. The only positive is that the game has to support it; in other words, nVidia is not forcibly pulling that geometry information. But I am pretty sure that soon enough there will be tons of cheaters using this to take a snapshot of the scene and render it on a 2nd screen with transparency on.
Ansel, how lovely. The only positive is that the game has to support it; in other words, nVidia is not forcibly pulling that geometry information. But I am pretty sure that soon enough there will be tons of cheaters using this to take a snapshot of the scene and render it on a 2nd screen with transparency on.
How is that any different from what they do now? CS:GO counters wallhacks by delaying client data on player model positioning until it's absolutely necessary. If there is no client data, it doesn't matter whether you have transparent walls. I mean, the thing is basically noclip with fancy effects and output options, and I'm sure the "noclipping" part of it will be turned off in some titles. CS:GO basically already has this.
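A rough sketch of that server-side idea in general terms: positions are only replicated to a client once they are (or are about to become) visible, so a transparent-walls view has nothing withheld to reveal. This is a toy illustration with made-up 2D geometry, not Valve's actual implementation:

```python
# Generic server-side anti-wallhack culling: the server only replicates an
# enemy's position to a client when the client could see it (or is close
# enough to see it very soon). Toy 2D geometry, for illustration only.
import math

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def blocked(a, b, walls):
    # Toy occlusion test: a wall point "blocks" if it lies near the segment a->b.
    for w in walls:
        direct = distance(a, b)
        via_wall = distance(a, w) + distance(w, b)
        if abs(via_wall - direct) < 0.5:
            return True
    return False

def positions_to_replicate(client_pos, enemies, walls, lookahead=5.0):
    sent = []
    for pos in enemies:
        if not blocked(client_pos, pos, walls):
            sent.append(pos)                      # visible: replicate it
        elif distance(client_pos, pos) < lookahead:
            sent.append(pos)                      # about to peek: replicate early
        # else: withheld, so the client never receives this position at all
    return sent

print(positions_to_replicate((0, 0), [(10, 0), (3, 1)], walls=[(5, 0)]))
# -> only (3, 1); (10, 0) is behind the wall and beyond the lookahead range
```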
How is that any different from what they do now? CS:GO counters wallhacks by delaying client data on player model positioning until it's absolutely necessary. If there is no client data, it doesn't matter whether you have transparent walls. I mean, the thing is basically noclip with fancy effects and output options, and I'm sure the "noclipping" part of it will be turned off in some titles. CS:GO basically already has this.
1st, it would be undetectable and you would have no proof it was used for cheating. 2nd, would you prevent everyone with an nVidia GPU from playing competitive online? And the data is there, extractable from the sound. The moment people get a way to dump the scene at will, there will be scripts doing the rest.
Triple 1070 > Dual 1080 w/ HB SLI bridge? WTF NVIDIA? LOL The Zotac listing quoted above even lists 4-Way SLI among the special features. Just confusing... 🙄
1st, it would be undetectable and you would have no proof it was used for cheating. 2nd, would you prevent everyone with an nVidia GPU from playing competitive online? And the data is there, extractable from the sound. The moment people get a way to dump the scene at will, there will be scripts doing the rest.
If I have understood correctly how Ansel works, a game would disable Ansel in MP and enable it in SP.
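That is roughly how a title-side gate could look. The class and method names below are purely hypothetical and are not taken from the actual Ansel SDK; the point is only that the game, not the driver, decides when the free camera may start:

```python
# Hypothetical engine-side gate for a free-camera screenshot mode.
# Names are illustrative only and do not come from the real Ansel SDK.
class GameSession:
    def __init__(self, multiplayer: bool, paused: bool = False):
        self.multiplayer = multiplayer
        self.paused = paused

    def allow_free_camera(self) -> bool:
        # Single-player only, and only while the simulation is paused.
        return not self.multiplayer and self.paused

print(GameSession(multiplayer=True).allow_free_camera())                # False
print(GameSession(multiplayer=False, paused=True).allow_free_camera())  # True
```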