MSI: Radeon RX Vega needs a lot of power

If it's anything like Polaris we can expect poor perf/watt compared to Pascal. For example, the RX 580 uses 40% more power than the 1060 6GB for the same performance. Also, despite the higher theoretical TFLOPS the 580 has over the 1060, it still performs about the same averaged across many titles. Vega really needs to destroy the Ti to have any saving grace if rumors that it's a power hog are true.
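To put a number on that perf/watt framing: the same fps at 40% more power works out to roughly 1/1.4 ≈ 0.71x the efficiency. A minimal Python sketch of the arithmetic, assuming the 40% figure claimed here (which is disputed just below) and ballpark 70 fps / 120 W numbers for the 1060 that are illustrative assumptions, not measurements:
[code]
# Perf/watt when two cards hit the same fps but one draws more power.
# The 40% gap is the claim above; the 70 fps and 120 W figures are
# ballpark assumptions for illustration, not measured values.
fps_1060 = fps_580 = 70.0          # assumed equal performance
power_1060 = 120.0                 # watts, ballpark for a 1060 6GB
power_580 = power_1060 * 1.40      # "40% more power"

eff_1060 = fps_1060 / power_1060   # fps per watt
eff_580 = fps_580 / power_580
print(f"RX 580 perf/watt = {eff_580 / eff_1060:.2f}x the 1060's")  # ~0.71x
[/code]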
Last time I checked, the RX580 was using a max of 22% more power, and depending on the load it was actually less than that. http://images.anandtech.com/graphs/graph11278/86529.png http://images.anandtech.com/graphs/graph11278/86530.png Less than ideal, but not 40%.
@Only Intruder The same for you. Where's your proof that it is like you said? Based on the fact that the last couple of AMD GPU releases were slower than Nvidia's while consuming more power, I assume it's gonna be the same. Because if the cards are "so fast", why not have the card perform at Nvidia's level, and thus reduce power consumption? This to me means the card is only able to compete with Pascal when cranked up to level 11, hence the huge amount of power needed.
Last time I checked, the RX580 was using a max of 22% more power, and depending on the load it was actually less than that. http://images.anandtech.com/graphs/graph11278/86529.png http://images.anandtech.com/graphs/graph11278/86530.png Less than ideal, but not 40%.
Well, the 1070 destroys a 580 while consuming less power. It's not very hard to conclude that Polaris is not very efficient and Vega will be the same. Even if it has the performance of two 580s, that's not enough to beat a 1080 Ti.
Yeah, but we also know the reasons why AMD's TFLOPS don't translate into game performance the way Nvidia's do, and Vega's changes were supposed to remedy most of those issues (architecture balance/drivers/etc).
Most. The gap in TFLOPS is narrower now (Vega vs 1080s) than it was with Fury X vs 980 Ti. At least this time the stupid power limitation error AMD made with Polaris will not be a problem. Maybe we will even see some kind of OC in Vega gaming GPUs. LOL
I still personally think the top end Vega will trade blows in various games with the Ti, especially at higher resolutions. I just don't know if the year and 3+ months of architecture design that Vega has over Pascal justifies what would essentially be a tie.
I agree, AMD seems to be late to the party... since the 290X. A tie in price and performance is not enough for the underdog; it wasn't for Fury X... or Polaris. The 1070's performance and price make it THE dedicated GPU of choice in the medium gaming range. In the high-end range AMD has been missing since the 290X. Vega 56 should be able to battle it, Vega 64 should battle the 1080, and gaming Vega should battle the 1080 Ti. I simply don't see it.
I'm kind of on the fence about when Volta will launch. I assumed it wouldn't be until 2018 because I thought GV100 was going to be delayed... but it's pretty clear that GV100 is going to be shipping ~Q3. https://pbs.twimg.com/media/DBJY_f8XoAAbaZA.jpg - they already have them shipping to select partners in test servers. So now I'm thinking Nvidia might just move the launch of Volta up to sometime this year. I don't think they'll put out a 600mm2 card, but if they did put out a ~400mm2 GTX 1180 @ 180 W with 10-15% more performance than the 1080 Ti (kind of like the 1080 vs the 980 Ti), it would definitely hurt Vega sales.
Do you have any doubt Nvidia has Volta ready to release and kill AMD gaming GPU options if Vega can deliver? I don't have any doubt.
Well, the 1070 destroys a 580 while consuming less power. It's not very hard to conclude that Polaris is not very efficient and Vega will be the same. Even if it has the performance of two 580s, that's not enough to beat a 1080 Ti.
Why do you conclude Vega will be the same? Half the architecture changes that have been outlined are designed to save power. You can't just look at the past few generations and say "this is how it was, this is how it will be" - not when we have pretty detailed information on the changes going on under the hood. Will it have the same efficiency as the 1080 Ti? I don't know, maybe in some games when it outperforms it - but I'm fairly confident its perf/W is going to be 15%+ better than Polaris's on average.
TFLOPS are not equal to gaming performance; that's the AMD problem. We didn't see an FPS counter in the hype videos lately for a good reason... As we know, Nvidia TFLOPS give more gaming performance than AMD TFLOPS: the GTX 980 Ti has 5.63 TFLOPS and the Fury X has 8.6 TFLOPS. The GTX 1080 Ti delivers 11.4 TFLOPS, Vega Frontier 13 TFLOPS, gaming Vega in July/August (?), Vega 56 11 TFLOPS, Vega 64 13 TFLOPS. Let's see AMD's drivers; I expect the worst.
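For reference, those theoretical numbers fall straight out of shader count × clock × 2 (a fused multiply-add counts as two floating-point ops per cycle). A quick Python sketch that reproduces the quoted figures, assuming the commonly cited shader counts and boost clocks (the specs below are my assumption, not from the thread):
[code]
# Theoretical FP32 throughput: shaders * clock * 2 ops per cycle (FMA).
# Shader counts and clocks are the commonly cited reference specs,
# assumed here for illustration.
cards = {
    "GTX 980 Ti":  (2816, 1.000e9),   # shaders, boost clock in Hz
    "Fury X":      (4096, 1.050e9),
    "GTX 1080 Ti": (3584, 1.582e9),
    "Vega FE":     (4096, 1.600e9),   # approximate peak clock
}
for name, (shaders, clock) in cards.items():
    tflops = shaders * clock * 2 / 1e12
    print(f"{name}: {tflops:.2f} TFLOPS")
# -> ~5.63, ~8.60, ~11.34, ~13.11, matching the numbers quoted above.
[/code]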
I took a quick look at Boss' latest 580 review, throughout which the tested 580 averages ~77 fps (100%) vs. ~70 fps (90%) for the 1060. So at least in this case "the 580 still performs the same as the 1060 averaged across many titles" is a bit off. I think AMD just has to go for the one-size-fits-all approach of reusing the same chips for graphics and compute solutions, resulting in a bit more compute power than it can actually translate into fps.
Last time I checked, the RX580 was using a max of 22% more power, and depending on the load it was actually less than that. http://images.anandtech.com/graphs/graph11278/86529.png http://images.anandtech.com/graphs/graph11278/86530.png Less than ideal, but not 40%.
Hate to burst your bubble, but that's total system power draw in those tests. The system in that test looks to be drawing about 120-125 W on its own, which puts the 1060 FE at about 120 W and the 580 at about 200 W in the gaming bench - yep, looks like 40% there. FurMark is not ideal, but you can't not count it.
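In other words, the correction is to subtract the rest-of-system draw from the wall reading before comparing cards. A minimal Python sketch of that, where the two wall totals are hypothetical placeholders; only the ~122 W baseline and the resulting ~120 W / ~200 W card estimates come from the post above:
[code]
# Estimating card-only power from total-system (wall) readings.
# The two wall totals are hypothetical placeholders; only the baseline
# and the resulting card estimates reflect the post above.
baseline = 122.0           # W, approx. rest-of-system draw
wall_1060 = 242.0          # W, hypothetical wall reading while gaming
wall_580 = 322.0           # W, hypothetical wall reading while gaming

card_1060 = wall_1060 - baseline   # ~120 W
card_580 = wall_580 - baseline     # ~200 W
gap = (card_580 - card_1060) / card_1060 * 100
print(f"card-level gap: {gap:.0f}%")   # ~67% with these placeholders,
# i.e. comfortably past the 40% figure once the baseline is removed.
[/code]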
Hate to burst your bubble, but that's total system power draw in those tests. The system in that test looks to be drawing about 120-125 W on its own, which puts the 1060 FE at about 120 W and the 580 at about 200 W in the gaming bench - yep, looks like 40% there. FurMark is not ideal, but you can't not count it.
I honestly didn't notice it was total system power draw. You're both correct, guys.
Hasn't it already been established by AMD themselves that its performance sits between the 1080 and 1080 Ti? Why is everyone expecting and hoping it will beat the Ti? As for power consumption, I've noticed AMD GPUs heat up tremendously when you tell them to do relatively simple tasks (like FurMark) at full force, but their wattage becomes adequate when they're doing something more complex (like Unigine Valley). I get the impression AMD's GPU pipelines are much shorter than Nvidia's, but they have more of them. This is why they're good for mining, and why they can build up heat so quickly doing the same task.
Ok, if this really requires a significant increase in the power envelope, would it be fair to say that this is AMD's answer to help get the cards into gamers' hands instead of miners'? Otherwise it is game as usual... AMD/ATI were always known to run hotter than the competition; nothing new there. The only exception was Fermi, which tried to take that crown away. The title is still held by AMD though. What were the old adages:
DX-based games - ATI
OpenGL-based games - Nvidia
More accurate colors - ATI
Close enough colors - Nvidia
Game performance - Nvidia
Close enough game performance - AMD
More heat - ATI/AMD
Less heat - Nvidia
Now the whole picture is just a mixed bag, but the last four have been more consistent.
I honestly didn't notice it was total system power draw. You're both correct, guys.
Honestly, it's easy to miss in that chart.
@Only Intruder The same for you. Where's your proof that it is like you said?
What? Didn't you read what I said? I said we can't draw conclusions yet; we need to wait until, you know... we actually have the product and the reviews are out. All we're doing is speculating.
What? Didn't you read what I said? I said we can't draw conclusions yet; we need to wait until, you know... we actually have the product and the reviews are out. All we're doing is speculating.
Funny how it is always "speculating", but when it comes out and it's true, then it's like "aah well, it's not that bad"?
Vega really needs to destroy the Ti to have any saving grace if rumors that it's a power hog are true.
I think that's coming on quite strong. Objectively, it just needs to perform well & be priced competitively. As an aside, I personally don't care much about team red's wattage efficiency issues with recent gens' cards. I don't fold or crunch coins.
I think that's coming on quite strong. Objectively, it just needs to perform well & be priced competitively. As an aside, I personally don't care much about team red's wattage efficiency issues with recent gens' cards. I don't fold or crunch coins.
Mostly I agree with you, but the fact it's a year late to the party might be a problem, as Nvidia can just do a quick refresh and kick AMD in the face again. Then they will take another year to catch up?
Ok, if this really requires a significant increase in the power envelope, would it be fair to say that this is AMD's answer to help get the cards into gamers' hands instead of miners'? Otherwise it is game as usual... AMD/ATI were always known to run hotter than the competition; nothing new there. The only exception was Fermi, which tried to take that crown away. The title is still held by AMD though. What were the old adages:
DX-based games - ATI
OpenGL-based games - Nvidia
More accurate colors - ATI
Close enough colors - Nvidia
Game performance - Nvidia
Close enough game performance - AMD
More heat - ATI/AMD
Less heat - Nvidia
Now the whole picture is just a mixed bag, but the last four have been more consistent.
Ha ha... where did you get that? From Nvidia's HDMI bug that was resolved long ago? Or from NV's FX series 15 years ago?
Ha ha... where did you get that? From Nvidia's HDMI bug that was resolved long ago? Or from NV's FX series 15 years ago?
It's one of those "ATI superior image" myths. If there was any truth at all in these myths, one would think that AMD would have used this to demo their superiority in color representation. But we've never ever seen any such demo, have we?
I wonder if the power consumption is that high due to a weird voltage cutoff the chip has, or if it's a general trend with Vega. I remember that most Furies could be undervolted for significant thermal and power gains, but overclocking and overvolting them would send consumption through the roof. Perhaps Vega is similar, and the lower-clocked/lower-voltage cards actually do compete well with NVIDIA in perf/watt (see the Nano vs the GTX 980), but the big boys, which will most likely go over the optimal voltage/clock thresholds, need a lot of work to get there. I would say it's that way just from the fact that AMD seems to provide a Vega model with a single fan on it, and no huge (initial) differences between it and the watercooled model.
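That pattern is consistent with dynamic power scaling roughly as P ∝ C·V²·f, so small voltage drops pay off quadratically while overvolting blows the budget fast. A toy Python sketch with made-up voltage/clock points, just to show the shape of the curve:
[code]
# Dynamic switching power scales roughly as P ~ C * V^2 * f
# (ignoring static leakage). Voltages/clocks are illustrative only,
# not real Fury or Vega operating points.
def rel_power(v, f, v0=1.00, f0=1.4e9):
    """Power relative to a (v0, f0) baseline."""
    return (v / v0) ** 2 * (f / f0)

print(rel_power(0.95, 1.35e9))  # mild undervolt + downclock -> ~0.87x
print(rel_power(1.10, 1.60e9))  # overvolt + overclock       -> ~1.38x
[/code]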
Ok, if this really requires a significant increase in the power envelope, would it be fair to say that this is AMD's answer to help get the cards into gamers' hands instead of miners'? Otherwise it is game as usual... AMD/ATI were always known to run hotter than the competition; nothing new there. The only exception was Fermi, which tried to take that crown away. The title is still held by AMD though. What were the old adages:
DX-based games - ATI
OpenGL-based games - Nvidia
More accurate colors - ATI
Close enough colors - Nvidia
Game performance - Nvidia
Close enough game performance - AMD
More heat - ATI/AMD
Less heat - Nvidia
Now the whole picture is just a mixed bag, but the last four have been more consistent.
Let's be real here: why in the world would AMD want to do anything to REDUCE the sales of their GPUs??? lol man, they are a business. If they could sell 5 million GPUs at a profitable price to someone who just wanted to grind them up to make play sand, do you not think they would do it??! [spoiler] They would. [/spoiler]