AMD on the Road: takes Radeon RX Vega to the Gamers
Noisiv
I am sure you'll agree that Ryzen is an oddball when it comes to OC.
For the sake of argument, imagine custom RX Vega cards coming in at ~375 W of real in-game consumption, and let's assume that equals a 1080 Ti FE at 250 W.
Which is more likely to be the faster card when overclocked?
Noisiv
I've lost you completely, tbh...
There is no such thing as an exact theoretical power equation for an IC.
power ~ clocks × voltage² is an approximate empirical relation that is usable only under a very limited range of circumstances; it's merely a good starting point.
You keep repeating "wattage does not scale linearly with clock rate," as if I had claimed that is always the case. As a matter of fact, power DOES scale linearly with clocks - AT BEST.
In practice it often scales worse, sometimes much worse, especially past the clock/power sweet spot (which AMD lately has no trouble passing), and especially closing in on max OC. How this, or your claim of power not scaling linearly, helps our RX Vega... I have no idea.
I WISH VEGA'S POWER CONSUMPTION SCALED LINEARLY WITH CLOCKS PAST 1600 MHz! There, I said it.
And how the **** did you draw me into this discussion, when all I said was: let's imagine a 375 W custom AIB Vega?
That was a simple for-the-sake-of-argument assumption, yet not an outlandish one. Now I need to provide a whitepaper on this, or else I'm very pessimistic and biased?
And what about equating it with the 1080 Ti, is that biased too?
But let's see you try:
Knowing that Vega at 1440 MHz draws 280 W, how much would you assume a custom, overclocked 1700 MHz Vega might draw?
Negative zero?
Sample variance affects the wattage... yes, and?
You may want to talk about some specific golden chip; I'm interested in volume averages.
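If it helps, here's the kind of back-of-the-envelope estimate I mean, in a few lines of Python. The 1440 MHz / 280 W figures are the ones above; the 1.05 V baseline and 1.2 V overclocked voltages are my assumptions for illustration, not measurements:

```python
# Rough dynamic-power scaling sketch: P ~ f * V^2 (approximate, empirical).
# Baseline: Vega at 1440 MHz drawing 280 W (from the post above).
# The 1.05 V baseline and 1.2 V OC voltages are assumed, for illustration only.

def scaled_power(p_base, f_base, f_target, v_base=1.0, v_target=1.0):
    """Estimate power at a new clock/voltage using P ~ f * V^2."""
    return p_base * (f_target / f_base) * (v_target / v_base) ** 2

# Linear-in-clock estimate at unchanged voltage -- the "AT BEST" case:
best_case = scaled_power(280, 1440, 1700)              # ~331 W

# Same clock bump, plus an assumed voltage bump from 1.05 V to 1.2 V:
with_vbump = scaled_power(280, 1440, 1700, 1.05, 1.2)  # ~432 W

print(f"linear: {best_case:.0f} W, with voltage bump: {with_vbump:.0f} W")
```

Even the optimistic linear case lands well above 330 W, which is why 375 W for a custom OC card isn't an out-of-this-world assumption.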
Noisiv
https://www.youtube.com/watch?v=IfSGboBX1QE
OK, now you're conflating unknowns with the approximations within the model. Like... you really, really need to know the initial velocity(!) to have any clue about the final velocity, whereas not having the GPU temperature merely makes the model rougher, an approximation.
And since you're being pedantic: you forgot the height above sea level, and a dozen other initial conditions 🙂 And even if you had all of them, you still wouldn't be able to solve this "simple" problem analytically, because as far as I know there is no general, exact equation of motion that accounts for air resistance.
So again you're back to approximations and some kind of experimental model. But OK, so far we agree.
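To make that tangent concrete: since there's no general closed-form trajectory once quadratic air drag is in play, you fall back on numerical integration. A rough sketch, where the launch conditions and drag constant are made-up illustrative values:

```python
# Projectile with quadratic air drag: no general closed-form solution,
# so integrate numerically (simple Euler steps; k and dt chosen for illustration).
import math

g = 9.81      # gravity, m/s^2
k = 0.02      # drag coefficient per unit mass, 1/m (assumed value)
dt = 0.001    # time step, s

# Assumed initial conditions: 50 m/s launch at 45 degrees from ground level.
vx = 50 * math.cos(math.radians(45))
vy = 50 * math.sin(math.radians(45))
x = y = 0.0

while y >= 0.0:
    speed = math.hypot(vx, vy)
    ax = -k * speed * vx          # drag opposes velocity, magnitude ~ k * speed^2
    ay = -g - k * speed * vy
    x += vx * dt
    y += vy * dt
    vx += ax * dt
    vy += ay * dt

print(f"range with drag: {x:.1f} m")  # well short of the drag-free ~254.8 m
```

Same story as GPU power: no exact analytic answer, just a model whose accuracy depends on how well you picked its constants.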
Would you have been any happier if you had temperature, fan speed, and voltage?
Would this attempt at a power calculation have been any different?
So after chastising me for being overly simplistic in my pessimistic approximation,
you yourself went with the most basic, linear approximation (which you yourself said is wrong), and the one everyone should know is impossible in the real world.
What happened to common sense, why not add a few percent?
Anyone with a clue should know that power going linear past max boost clock all the way to 1700 MHz is VEEEERY optimistic. <-- DON'T YOU AGREE?
Take a look:
Vega FE
1650 MHz, 1.2 V
375 watts drawn through the 2x 8-pin connectors alone
overclocking is kinda broken, because once you OC, the GPU jumps to 1.2 V
Noisiv
BTW, why do you think that performance-wise Vega is more a 1080 competitor than a 1080 Ti competitor?
Wouldn't Vega's lowish clocks, relative to Pascal, by any chance have something to do with Vega's performance?
I.e., wouldn't Vega being only at 1080 level have something to do with its power consumption, i.e. with being TDP-limited?
So there you go -> efficiency = performance 😉
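Put numbers on it and the point is plain arithmetic: with the power budget pinned, performance = (perf per watt) × watts, so an efficiency gain is a performance gain one-for-one. All figures below are made up purely to show the proportionality:

```python
# At a fixed TDP, performance is proportional to efficiency (perf/W).
# Hypothetical numbers, chosen only to illustrate the proportionality.
tdp = 300                   # watts, fixed board power limit
eff_a, eff_b = 0.10, 0.12   # hypothetical frames-per-second per watt

perf_a = eff_a * tdp        # ~30 fps
perf_b = eff_b * tdp        # ~36 fps

# A 20% efficiency gain becomes a 20% performance gain at the same TDP.
print(f"{perf_a:.0f} fps vs {perf_b:.0f} fps")
```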
Elder III
At this point we just do not know anything for sure, one way or the other. Until Hilbert or some other reliable review site releases benchmarked reviews, nothing is certain.
With that said, my very rough guess is that it will land between a GTX 1080 and a 1080 Ti. That would make AMD unfortunately late to the game, but it would also have the much-needed benefit of lowering prices in the high-end gaming GPU market. So far Vega (the Frontier edition) doesn't look like much of a game changer for mining, so hopefully the stock will not get swallowed up immediately by mining farms in China running hundreds of GPUs on $0.01 electricity. 😛
yasamoka
Why would AMD release a card to compete with the 1080, which is ~20% faster than the previous-generation 980 Ti, when they already had the Fury X to compete with that (and it already competed well, disregarding VRAM limitations, and now competes even better)?
Is Vega only going to be 20-30% faster than a Fury X? They might as well not release a card at all.
It doesn't make sense to me for AMD to be targeting this card against the 1080. Not at all.