NVIDIA CEO Makes Interesting Remark, New Gamer GeForce is a Long Time Away

Good. I don't want my 980 Ti to be made obsolete yet. Let's stick to Pascal as the latest GeForce gen for a couple more years. /s
RealNC:

Good. I don't want my 980 Ti to be made obsolete yet. Let's stick to Pascal as the latest GeForce gen for a couple more years. /s
"Competing with oneself" - Credo put to the test. Sounds like a draw this time around 😉
If there's a new process node available, I'm sure Nvidia will take advantage of it to refresh the existing lineup. More efficiency and/or performance is always welcome, and they'd be capitalizing on already existing tech. That said, to see something new we'll have to wait for AMD to make the 1080 obsolete with something affordable.
Silva:

If there's a new process node available, I'm sure Nvidia will take advantage of it to refresh the existing lineup. More efficiency and/or performance is always welcome, and they'd be capitalizing on already existing tech. That said, to see something new we'll have to wait for AMD to make the 1080 obsolete with something affordable.
It's definitely the process node. 7nm is shipping in volume for phone SoCs and such, but it's going to take time to get yields acceptable for launching the typical ~300mm2 cards Nvidia does on a new release. Probably end of this year, maybe even early next year. But it would be stupid for them to launch 12nm cards now with 7nm so close to being ready. 12nm is barely an improvement over 16nm unless you're using the 6.5T variant, and even then you only get a density improvement, which Nvidia doesn't even need. The 1080Ti is only 470mm2.
Looks like JayzTwoCents was right. Despite all the rumors and speculation, there was really no solid evidence that a new consumer GPU was coming. As long as Pascal continues to dominate, Nvidia has no reason to release anything new (it's what inevitably happens when there is a lack of competition). As I said before, you should always treat such rumors with a mountain of salt. Of course Nvidia fans will continue to believe that a new GPU is right around the corner 😛 (what is a "long time"? A few months? Weeks? Days?)
D3M1G0D:

Looks like JayzTwoCents was right. Despite all the rumors and speculation, there was really no solid evidence that a new consumer GPU was coming. As long as Pascal continues to dominate, Nvidia has no reason to release anything new (it's what inevitably happens when there is a lack of competition). As I said before, you should always treat such rumors with a mountain of salt. Of course Nvidia fans will continue to believe that a new GPU is right around the corner 😛 (what is a "long time"? A few months? Weeks? Days?)
Idk, you say it's a lack of competition - if Nvidia had competition right now, what would they build? What would a GPU Nvidia ships right now even look like? 12nm barely provides any improvement over 16nm, and the 1080Ti is limited by its 250W/300W TDP. So they'd release what, a slightly bigger 1080Ti? People would just complain it's too expensive and not enough of a performance improvement. Notice how AMD, who has competition, hasn't released anything either? It's not because of a lack of competition... it's because they are in the same boat.
Denial:

Idk, you say it's a lack of competition - if Nvidia had competition right now, what would they build? What would a GPU Nvidia ships right now even look like? 12nm barely provides any improvement over 16nm, and the 1080Ti is limited by its 250W/300W TDP. So they'd release what, a slightly bigger 1080Ti? People would just complain it's too expensive and not enough of a performance improvement. Notice how AMD, who has competition, hasn't released anything either? It's not because of a lack of competition... it's because they are in the same boat.
They could have done the Volta chip without HBM2, and instead used GDDR5X until GDDR6 is ready - the full Volta chip without HBM2 should be like 550-600mm2.
As lovely as a new card would be... why would Nvidia bother? From a business standpoint the 10xx series is still selling really well... and they have no competition right now, especially in the top-end market, so from a purely business standpoint it'd make no sense to make a new card. And I'm guessing they feel, or know, that AMD currently doesn't have anything to bring out any time soon.
I hope AMD will be able to release something in the meantime, because I don't want to spend a fortune on a 1080 today and later learn that a 1160 has just been released.
Dragam1337:

They could have done the Volta chip without HBM2, and instead used GDDR5X until GDDR6 is ready - the full Volta chip without HBM2 should be like 550-600mm2.
The 815mm2 die size of GV100 doesn't include the HBM2 stacks. You'd strip the tensor cores/FP64 and get down to 550-600mm2, but then you run into what I said... it's like 20% more performance, arguably limited by clock speed due to power constraints... but then what? Your new generation maxes out at 20% more than the previous one and costs more. No one is going to go for that, and the same people saying "Nvidia is holding back due to lack of competition" will just say "Nvidia is milking the same architecture under a new name" or whatever. http://theconfab.com/wp-content/uploads/P-6-2018-ConFab-Talk-Andy-Wei.pdf - the GV100 interposer is ~1380mm2.
Denial:

Idk, you say it's a lack of competition - if Nvidia had competition right now, what would they build? What would a GPU Nvidia ships right now even look like? 12nm barely provides any improvement over 16nm, and the 1080Ti is limited by its 250W/300W TDP. So they'd release what, a slightly bigger 1080Ti? People would just complain it's too expensive and not enough of a performance improvement. Notice how AMD, who has competition, hasn't released anything either? It's not because of a lack of competition... it's because they are in the same boat.
It's a complicated situation. Looking at the CPU side, Intel responded to Ryzen very quickly with 6C mainstream chips on the same process they already used for Sky and Kaby. 12nm would still be an improvement over 16nm, even if not by much. It's no doubt partially the lack of pressure that made them save money by not hurrying forward in the consumer market. The other factor is how complicated things are getting. For Intel it was nothing to make the 6C they could have released many years ago already, if they weren't such scumbags. But it's far harder to make better high-end GPUs anymore, at least until AMD and Nvidia figure out a way to build working GPUs out of smaller pieces, the way it goes with Ryzen. However, it has to take more time to make that work than to simply follow the old path. Or to come up with something else revolutionary. AMD is further held back because it has little money for R&D (or anything else). It needs to think exceptionally carefully about how to spend what money it has. Intel is already using its bigger bucks to buy brains away from AMD. But yes, I agree on both being in the same boat, aside from the money issue.
Denial:

The 815mm2 die size of GV100 doesn't include the HBM2 stacks. You'd strip the tensor cores/FP64 and get down to 550-600mm2, but then you run into what I said... it's like 20% more performance, arguably limited by clock speed due to power constraints... but then what? Your new generation maxes out at 20% more than the previous one.
I would have taken 20% more performance half a year ago, rather than going one more year without any performance improvement at all 🙂
Q4 2018 or maybe Q1 2019? A long time to wait indeed.
No surprises... end of the year, 12nm refined Pascal with GDDR6/5X would be my guess. No competition or pressure from AMD.
Well, for me as a 1080Ti buyer at release this is good news, I guess, since it makes my choice from last year look even better. But then again, for 1440p 140Hz gaming this card is more than enough. Same for the 1070 and 1080. It IS weird though to have a generation going on for so long and still be up there. So yeah, out of habit, even though this works in my favor money-wise, I'm used to looking for upgrades by now. Oh well... 😛 I can see the 4K guys being more pissed than me though, but considering how many are still on 1080p (or below), it's clear to me that 4K is and will remain a niche gaming resolution for years to come (just like 1440p was ~8 years ago).
This is fine as long as the game developers don't make games that require better tech than what we're stuck with for the next 12 months.
For 1080p, even the old 780Ti still provides good performance in cases where the CPU isn't the limiting factor. The issue is not only the lackluster competitive pressure from AMD, but also the general stagnation in graphics engine progress and development, while most of the effort is being spent on "gamblifying" micro-transactions into loot boxes. To me, it looks like everybody is digging for the next big quantitative jump, ray tracing, since neither DX12 nor Vulkan managed to motivate studios to invest enough time and resources.
Netherwind:

This is fine as long as the game developers don't make games that require better tech than what we're stuck with for the next 12 months.
Developers always follow hardware; you can't make a game that no one or very few people can run, it makes no financial sense. Developers heavily rely on hardware advancements to push their games' requirements up.
HardwareCaps:

Developers always follow hardware; you can't make a game that no one or very few people can run, it makes no financial sense. Developers heavily rely on hardware advancements to push their games' requirements up.
Not sure I agree there, since it's not possible to run all games at maximum settings. I remember Ryse: Son of Rome - that game was crazy tough on hardware. And AC: Origins isn't playable at 4K on max settings. There are many more examples like that.
Netherwind:

Not sure I agree there, since it's not possible to run all games at maximum settings. I remember Ryse: Son of Rome - that game was crazy tough on hardware. And AC: Origins isn't playable at 4K on max settings. There are many more examples like that.
AC was always badly optimized but still runs fairly well. 4K is a minority in the market; you should look at mainstream hardware. Games that are difficult to run are not very successful; sometimes developers aim too high or assume hardware will improve a lot faster (remember that developing a game can take a few years). Anyway, it is always hardware that calls the tune.