NVIDIA RTX 40-Series 'Ada' AD102 GPU Enters Phase of Testing

Undying:

Yes, but that's 9% slower than your 480W OC. 😛
Sure, but then they can leave that extra performance for people who want to OC, and ship it at stock with trimmed voltage: nearly the same performance at much reduced power consumption 🙂
alanm:

I have a feeling the extra power draw for these cards is Nvidia OC'ing the crap out of them in order to beat the competition.
What competition?
TheDeeGee:

What competition?
RDNA 3. Even with Ampere they OC'd the cards to beat AMD. In a lot of titles the 6800 XT is very close to the 3080, but undervolt the 3080 and you can shave off roughly 100 watts while retaining 95% of the performance. Somehow they figured that 5% extra perf was worth 100W of added power draw.
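Back-of-the-envelope, that undervolt claim implies a big efficiency win. A minimal sketch, assuming a 320 W stock power limit for the 3080 (that wattage is my assumption, not from the post above):

```python
# Undervolt claim above: shave ~100 W off an assumed 320 W stock 3080
# while keeping 95% of the performance.
stock_w, stock_perf = 320, 100.0   # assumed stock power limit, baseline perf
uv_w, uv_perf = 220, 95.0          # undervolted: -100 W, 95% perf

# Relative perf-per-watt improvement of the undervolted card vs. stock.
improvement = (uv_perf / uv_w) / (stock_perf / stock_w) - 1
print(f"perf/W improvement from undervolting: {improvement:.0%}")
```

If the numbers hold, that last 100 W buys only 5% performance while costing nearly 40% in efficiency.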
The problem is when Nvidia starts testing the price of it...
H83:

The problem is when Nvidia starts testing the price of it...
They already tested how much people are ready to spend.
Undying:

They already tested how much people are ready to spend.
I could be wrong, but I think the test is still going on...
DannyD:

We'll see. ;)
I'm certain of it. Both AMD and Nvidia are moving to 5nm which is a full node shrink. Expect a 40%+ increase in perf as that is what historically happens on full node shrinks.
... and then they cut back on power restrictions in BIOS (because omg efficiency!) and we get 10% real performance increases again. And another group of people is complaining, just wait for it.
fantaskarsef:

... and then they cut back on power restrictions in BIOS (because omg efficiency!) and we get 10% real performance increases again. And another group of people is complaining, just wait for it.
Ehh... what? A 3090 running at 250 watts is still about 30% faster than a 2080 Ti.
Dragam1337:

Ehh... what? A 3090 running at 250 watts is still about 30% faster than a 2080 Ti.
Actually I was not talking about you, but yeah, I know what you mean, and I don't doubt it. What I tried to convey is that even if a certain new GPU can hit any set of numbers at a given power, Nvidia could just cut down the power budget via BIOS and artificially restrict performance (LHR, anyone?). And I could very well imagine this happening. Also, with the silicon lottery, close matches are down the drain under any circumstances.
Hopefully we see a 160-watt card again, or at least under 200 watts. I will not buy another card that pulls more power than a 1070 Ti; that thing gets loud once it starts running in the high 60s to 70°C. Would be nice if prices came back down to earth too.
fantaskarsef:

Nvidia could just cut down power budget via BIOS and artificially restrict performance (LHR anyone?).
It would be divine justice if the next gen consumed 1000W of power to mine Ethereum at GTX 1060 level of performance.
Kaarme:

It would be divine justice if the next gen consumed 1000W of power to mine Ethereum at GTX 1060 level of performance.
And there I thought Intel wanted to tackle that market segment.
JamesSneed:

I'm certain of it. Both AMD and Nvidia are moving to 5nm which is a full node shrink. Expect a 40%+ increase in perf as that is what historically happens on full node shrinks.
You don't think they're gonna milk the crap outta it over years? Ain't gonna be like the jump from 20 supers > 30 series.
DannyD:

You don't think they're gonna milk the crap outta it over years? Ain't gonna be like the jump from 20 supers > 30 series.
Milking the customers is the main objective of any business 😀; hell, even charities these days are milking their donors. RTX 3000 has good price/perf because Samsung 8nm was dirt cheap: Nvidia negotiated a cheaper wafer price on Samsung 8nm than even TSMC 12nm. TSMC 5nm must bring enough cost and performance benefits that Nvidia chose TSMC over Samsung, so RTX 4000 should offer at least 25% higher efficiency and 25% better price/perf than RTX 3000.
alanm:

Even with Ampere they OC'd the cards to beat AMD
no they didn't.
Astyanax:

no they didn't.
Is it normal for cards of all generations to draw 100W for that last 5% of performance?
alanm:

Is it normal for all generations cards to draw 100w for that last 5% of performance?
More like 100W for the last 11% of performance according to my testing (from 250W to 350W); from 350W to 450W it's maybe a 6% gain.
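Taking those figures at face value, the diminishing returns show up clearly as performance per watt. A quick sketch using only the numbers quoted above (illustrative, not new measurements):

```python
# Perf-per-watt at each power limit, using the figures above:
# 100% perf at 250 W, +11% at 350 W, a further +6% on top at 450 W.
points = {250: 100.0, 350: 111.0, 450: 111.0 * 1.06}

for watts, perf in sorted(points.items()):
    # Efficiency = relative performance divided by power draw.
    print(f"{watts} W -> {perf:5.1f}% perf, {perf / watts:.3f} %/W")
```

From 250 W to 450 W, perf per watt falls by roughly a third for under 18% more performance.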
Krizby:

More like 100W for the last 11% of performance according to my testing (from 250W to 350W)
I was referring to the 3080 in particular. I don't believe any previous gen consumed as much power for that last 5% of performance, which also leaves it with little headroom for OC'ing beyond stock. Let's be clear, the days of the 980 Ti, where you got a 25% perf gain from OC'ing, are long past us. GPU makers are leaving less and less headroom with each new gen because they are using that headroom to get ahead of the competition. It's not rocket science; it's business when you have to use every trick in the book to get atop the bench charts. Same with CPUs.