GeForce GTX 2070 and 2080 Could Launch Summer 2018

https://forums.guru3d.com/data/avatars/m/29/29917.jpg
Moderator
Hm. If it's Pascal, then that means no Tensor cores, so perhaps no raytracing capabilities? Unless that DX12 feature doesn't need Tensor or Tensor-like units. I would expect them to have a card with that feature enabled, since they showed it working and engines are getting ready for it. AMD said they are also working on that, if I remember correctly.
https://forums.guru3d.com/data/avatars/m/16/16662.jpg
Administrator
DX12 Raytracing is supported, and so is the RTX library. However, some features could be accelerated by Tensor cores. So no, Tensor cores are not mandatory for DX raytracing.
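For what it's worth, the check an engine does is a plain API-level feature query; nothing in it knows or cares whether Tensor cores exist. A minimal sketch, assuming a later SDK that ships the DXR headers and an already-created ID3D12Device (both assumptions, not something covered in this news post):

#include <windows.h>
#include <d3d12.h>

bool SupportsDXR(ID3D12Device* device)
{
    // Query the OPTIONS5 feature block; RaytracingTier reports whether the
    // driver exposes DXR at all, be it hardware-accelerated or a fallback.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    HRESULT hr = device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                             &options5, sizeof(options5));
    return SUCCEEDED(hr) && options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}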
data/avatar/default/avatar25.webp
I was under the impression that the ability to use AI to approximate a large chunk of the raytracing was the only thing that would make it possible to even do it in real time. Without the Tensor cores and FP64, I didn't think Volta was really very different from Pascal, apart from being on a newer fab process.
data/avatar/default/avatar19.webp
Volta's SMs are ~50% more efficient than Pascal's. And Volta itself is almost a year old(!). So even if we assume they have been doing nothing but twiddling their thumbs, the new GTX series will be at least as efficient as Volta. And that (50-60%) is pretty much the amount of efficiency improvement necessary for the little-big core (GTX 1180) to win against the last-gen BIG core (1080 Ti).
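Rough back-of-envelope, purely hypothetical numbers for the unannounced card (the Pascal SM counts are public; the 20-SM "1180" and the 12% "win" margin are my assumptions):

#include <cstdio>

int main()
{
    const double sm_1080ti = 28.0;  // GP102 as shipped on the 1080 Ti
    const double sm_1180   = 20.0;  // assumed GP104-sized successor

    // Per-SM throughput gain needed just to tie the bigger die at equal clocks
    const double tie = sm_1080ti / sm_1180;   // 1.40x -> +40%
    // Add ~12% margin to win clearly and absorb clock/power differences
    const double win = tie * 1.12;            // ~1.57x -> +57%

    std::printf("tie: +%.0f%%, win: +%.0f%%\n",
                (tie - 1.0) * 100.0, (win - 1.0) * 100.0);
    return 0;
}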
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
1180Ti / 2080Ti / 1185Ti - 70-80% performance gain over a 1080Ti? Where's my wallet...
data/avatar/default/avatar22.webp
Missed the /s there on Gb/s
data/avatar/default/avatar06.webp
Noisiv:

Volta's SMs are ~50% more efficient than Pascal's. And Volta itself is almost a year old(!). So even if we assume they have been doing nothing but twiddling their thumbs, the new GTX series will be at least as efficient as Volta. And that (50-60%) is pretty much the amount of efficiency improvement necessary for the little-big core (GTX 1180) to win against the last-gen BIG core (1080 Ti).
I'm pretty sure that the majority of Volta's efficiency gains were due to the newer fab process and lower clock speeds.
https://forums.guru3d.com/data/avatars/m/250/250418.jpg
I wish they would stop making bigger dies and make GPUs more affordable. I know it won't happen unless someone releases a more powerful card for less. What about AMD? They should be working on their own GDDR6 controller, maybe another Polaris refresh?
data/avatar/default/avatar40.webp
ttnuagmada:

I'm pretty sure that the majority of Volta's efficiency gains were due to the newer fab process and lower clock speeds.
Nah... 12nm is built on the same node as 16FF+, and TSMC calls it 16/12; 12nm is just an improvement of 16FF+. You won't get anywhere near 50% better efficiency from improvements made on the same node, ~20-25% tops. Same goes for the slightly lower clocks (and that's only Titan vs Titan; the mezzanine cards are similarly clocked). And even then, who cares about clocks if the performance and efficiency are there? It could run at 1 MHz for all I care. Oh, and... almost forgot... scratch that clock advantage completely, because Titan V comes with 1/2-rate FP64 and with Tensor cores, which certainly weigh more in terms of efficiency than a measly 120 MHz.
data/avatar/default/avatar15.webp
Hilbert Hagedoorn:

DX12 Raytracing is supported, the RTX Library as well. However, some features could be accelerated by Tensor cores. So no, Tensor cores are not mandatory for DX RT.
Hilbert, I think it is time for you to know: for the longest time I was sure your name was Hilbert ANGERDOOM!!! I should stop playing WoW, or at least buy a new pair of eyeglasses LOL
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
Nvidia could probably afford to charge more for the new architecture. Ethereum ASICs are apparently appearing, so miner GPU demand should be losing steam somewhat. If the market is flooded with cheap second-hand cards, it seems like selling new stuff would be harder. So, if it's harder, they might as well sell less for more profit. It should be possible if the new generation beats the old handily in raw power, at least towards the upper end. Plus, a new architecture is always more exciting. Who knows what's going to happen to AMD. If they haven't got anything new to offer, it's hard to see how they'd sell much of anything. HBM2 probably still isn't making things easy for them. Perhaps they will just weather a near extinction of their GPU side until they are ready to make a sudden comeback, like they did with Ryzen in the CPU market.
https://forums.guru3d.com/data/avatars/m/260/260826.jpg
Silva:

What about AMD? They should be working on their own GDDR6 controller, maybe another Polaris refresh?
AMD? HBM1 and 2 were not enough of a failure for them in the consumer market. They are probably working hard to use HBM3 in their next mining GPU, I mean "gaming" GPU. LOL
data/avatar/default/avatar04.webp
Nvidia naming - stop getting it wrong. 580 680 780 880 980 1080... why would you think 2080 would follow? Series 20? What happened to Series 11...?
data/avatar/default/avatar19.webp
sammarbella:

AMD? HBM1 and 2 were not enough of a failure for them in the consumer market. They are probably working hard to use HBM3 in their next mining GPU, I mean "gaming" GPU. LOL
It's going to be super tough for AMD. More so because Vega and Pascal are not even contemporaries; it's Vega and Volta. Uff... I dunno. Find a way to differentiate themselves from Nvidia. Work with monitor/panel manufacturers to bring a range of high-quality FreeSync 2 products. Bring back Avivo 🙂 Put 2x 200W APUs on the same motherboard. fuk do I know... 😀
data/avatar/default/avatar14.webp
Brit90:

Nvidia naming - stop getting it wrong. 580 680 780 880 980 1080... why would you think 2080 would follow? Series 20? What happened to Series 11...?
It's marketing, not a computer program. To be honest, 20x0 looks a bit "sexier" than 11x0. Besides, the same thing has happened countless times. The GeForce "hops": 256 -> 2/3/4 -> FX 5xxx; FX 5xxx -> 6xxx (no longer FX for some reason); 9xxx -> 100; the 100 and 300 series only had low-end cards; no idea why the 800 series was entirely skipped (only mobile products)... and the same goes for AMD, and I'm pretty sure for other industries.
data/avatar/default/avatar26.webp
Brit90:

Nvidia naming - stop getting it wrong. 580 680 780 880 980 1080... why would you think 2080 would follow? Series 20? What happened to Series 11...?
Why? Commercially speaking, "GTX 2080" sounds much better than "GTX 1180", so it is not absurd. Video card naming is just marketing and doesn't have to follow any logical sequence. By the way, the desktop GTX 8 series doesn't exist; they jumped from 7xx to 9xx.
data/avatar/default/avatar08.webp
Hilbert AngerDoom keeps us all up to date with PC info; the least we can do is spell his name correctly. Is that Irish? Scottish?
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
Idk, everyone said 'GTX 1080' wouldn't exist either because consumers might get confused with 1080p resolution, but it does, so...
https://forums.guru3d.com/data/avatars/m/262/262995.jpg
Umm, sorry, but "Ampere" is where we get the word amps; it comes from André-Marie Ampère, the father of electrodynamics, not some obscure website. I suppose you think Tesla comes from the car name? LOL
https://forums.guru3d.com/data/avatars/m/260/260826.jpg
Noisiv:

It's going to be super tough for AMD. More so because Vega and Pascal are not even contemporaries; it's Vega and Volta. Uff... I dunno. Find a way to differentiate themselves from Nvidia. Work with monitor/panel manufacturers to bring a range of high-quality FreeSync 2 products. Bring back Avivo 🙂 Put 2x 200W APUs on the same motherboard. fuk do I know... 😀
AMD's reply to the next Nvidia gaming GPUs and raytracing tech was already announced. It's just another GPUOpen/Vulkan gimmick no gaming dev will use for Windows (98.5% of the PC market): https://gpuopen.com/announcing-real-time-ray-tracing/ It looks like AMD "rebadged" an (already) revamped gimmick from two years ago: https://gpuopen.com/firerays-2-0-open-sourcing-and-customizing-ray-tracing/