AMD RX Vega Shown Against GTX 1080 at Budapest Event

It certainly does. OTOH, in the middle of winter, with a 400W GPU you are no better or worse off than with a 400W heater. Let's do it one more time:
Q: What is the difference, heat-production wise, between a GPU which consumes 400W and a 400W heater?
A: None. 100% of the electrical energy is converted into heat in both cases, very slightly diminished by noise, LED lights and fan movement.
Q: If your GPU produces 400W, how big does your cooler have to be?
A: At least 400W TDP.
Everyone agrees? Right. Let us proceed then:
=== $100 QUESTION BELOW === JOIN NOW, AND WIN BIG === $$$ ===
Q: What is the difference between a GPU which consumes 400W and a 400W TDP GPU?
A: ??
The GPU does useful computation in addition to its waste heat. The heater consumes 400W and only produces heat. Waste heat from datacenters in cold countries is often actually used for heating buildings and even entire towns. What's your point?
Here is the link to a video of the event. It may explain some question marks I've seen in this thread. https://www.youtube.com/watch?v=f-mRJ2Y3110 But man... those babes... 🤓
Good idea. 400W of power (which is not the GPU in the article) can consume up to 9.6 kWh in one day. A daily commute by car to work and back is in the range of 100 kWh burned (2 people, 1 hour). One is sourced by nuclear fusion, the other by fossil fuel.
Sourced by nuclear fusion? How generous of you... How about something that is in actual use, like a fossil fuel power station? And suddenly your 9.6 kWh GPU becomes 32 kWh of actual fuel consumed. Or 1/3 of the ****ing car LOL
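For the record, the arithmetic is simple enough to check. A quick Python sketch; the ~30% plant efficiency is an assumption (roughly typical of an older fossil-fuel station) and the 100 kWh commute is just the figure quoted above, so treat the output as illustrative:

```python
# Back-of-the-envelope energy arithmetic behind the numbers above.
# All values are assumptions for illustration, not measurements.
gpu_power_w = 400        # the hypothetical "400W GPU" from the thread, not the card in the article
hours_per_day = 24
plant_efficiency = 0.30  # assumed thermal efficiency of a fossil-fuel power station
commute_kwh = 100        # daily car commute figure quoted above

electricity_kwh = gpu_power_w * hours_per_day / 1000    # 9.6 kWh at the wall
primary_fuel_kwh = electricity_kwh / plant_efficiency   # ~32 kWh of fuel burned at the plant

print(f"Electricity used:    {electricity_kwh:.1f} kWh/day")
print(f"Primary fuel burned: {primary_fuel_kwh:.0f} kWh/day")
print(f"Fraction of the commute: {primary_fuel_kwh / commute_kwh:.2f}")
```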
Doesn't look good at all.
1) No FPS counter.
2) The monitor could be using FreeSync, thus making the image more fluid against Nvidia.
3) AMD is just buying time here. Is it trying to magically boost fps with drivers or a BIOS?
This is a plain stupid statement. What are "systems"? Nvidia running with an Intel i9 CPU? I would love to see Vega succeed, but AMD makes it very hard to believe. Very unprofessional.
One does this kind of event when:
- They can't compete with the competitor's products.
- "Our product is just as good in practice. Even though it's slower, you don't notice the difference."
- Adaptive sync makes the difference much harder to spot.
- If it had been faster, they would have shown real numbers.
- They used a GTX 1080, which means it's nowhere near a GTX 1080 Ti. If they did use a GTX 1080 Ti, it doesn't really change much.
Vega might be a decent architecture for APUs, but it seems to have serious scalability issues. Either AMD should rethink their approach of keeping GCN alive forever, or they should just ditch GloFo until the 7nm process is ready. On top of that, RTG's marketing team does just ridiculous things. "Poor Volta" :puke2: Seems like that's going to bite them in the ass.
Someone mentioned that other site, the one that starts with***. Just like to clarify one thing: yes, it is the "National Enquirer" of hardware news. BUTT-- the comments section is so over-the-top ridiculous it's what keeps me going back, like it's something you feel really guilty about doing but just can't quit. :stewpid: Seriously, you could just grab some beer, go straight to the comments and laugh for an hour over *most* of the comments there...
Vega is turning out to be more and more disappointing.
The GPU does useful computation in addition to its waste heat. The heater consumes 400W and only produces heat. Waste heat from datacenters in cold countries is often actually used for heating buildings and even entire towns. What's your point?
For once stick to the question plx, instead of stumbling all over the place 🙂 Me, I am the quiz master. You provide the answer, OK?
Vega is turning out to be more and more disappointing.
RX is not even out yet, but these geniuses over at AMD marketing are doing their best to **** it up. In the end it doesn't matter too much. Once reviews hit the web, all that we know so far will be (insert Blade Runner quote) ...lost in time. Like tears in the rain.
For once stick to the question plx, instead of stumbling all over the place 🙂 Me, I am the quiz master. You provide the answer, OK? RX is not even out yet, but these geniuses over at AMD marketing are doing their best to **** it up. In the end it doesn't matter too much. Once reviews hit the web, all that we know so far will be (insert Blade Runner quote) ...lost in time. Like tears in the rain.
You're not the master of **** dude. Get a clue plz
Plx... don't queue if you're not gonna play the quiz. Thx
Plx... don't queue if you're not gonna play the quiz. Thx
Wattage is wattage. If it's 400W, it's obviously the same as a 400W heater in terms of heating... It even comes with a blower to blow the heat around the room!
I thought I'd wait for Vega until I saw the power consumption and performance, and then I didn't care any longer. For the price, I went and bought a 1080 before prices spiked because of mining, and couldn't be happier.
Wattage is wattage. If it's 400W, it's obviously the same as a 400W heater in terms of heating... It even comes with a blower to blow the heat around the room!
Good answer. So true. All of it. Sadly, that was not the question
I feel like Vega's launch has been marred by a number of different factors. I wonder how different it would have looked if this thing launched in Q1 before the 1080Ti, before Volta's announcement. I also wonder what it would look like if it was manufactured by TSMC. Last year we had so many rumors that it would be launching around Q4/Q1 - then the HBM2 delay rumors started coming. Then the 1080Ti launched at a price no one was expecting at 35% increased performance over the 1080. And now Volta specs are in the wild casting a shadow on Vega's launch as well. Just seems like a lot of stuff went wrong and I wonder how much the decision to use HBM2 factored into any potential delay.
Good answer. So true. All of it. Sadly, that was not the question
TDP isn't usually the absolute maximum wattage of a chip, though. It depends on how the manufacturer calculates TDP for their chips, so the actual wattage in a worst-case scenario and the TDP of a given processor can be different numbers. It varies from vendor to vendor, and even from chip line to chip line from the same vendor.
TDP isn't usually the absolute maximum wattage of a chip, though. It depends on how the manufacturer calculates TDP for their chips, so the actual wattage in a worst-case scenario and the TDP of a given processor can be different numbers. It varies from vendor to vendor, and even from chip line to chip line from the same vendor.
Why are you bringing up the absolute maximum wattage of a chip? I never mentioned it. No one cares that the Fury X spikes to 430 watts. 1 ms power readings are thermally insignificant, and power- and thermal-wise they cancel out with 1 ms power drops. What's important is the maximum sustained power usage.
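To illustrate why millisecond spikes don't matter thermally, here's a minimal Python sketch with an invented power trace: average the samples over a window and a single 1 ms excursion barely moves the sustained figure, and a matching dip cancels it entirely. The numbers are made up for illustration, not measurements of any real card.

```python
# Invented 1-second power trace for a hypothetical card: 1000 samples, 1 ms each.
samples_w = [275.0] * 1000   # steady gaming load
samples_w[500] = 430.0       # a single 1 ms spike, like the Fury X excursion mentioned above
samples_w[501] = 120.0       # ...followed by a matching 1 ms dip

sustained_w = sum(samples_w) / len(samples_w)   # what the cooler actually has to handle
peak_w = max(samples_w)

print(f"Peak 1 ms sample:  {peak_w:.0f} W")        # 430 W
print(f"Sustained average: {sustained_w:.1f} W")   # 275.0 W - the spike and dip cancel out
```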
Why are you bringing up the absolute maximum wattage of a chip? I never mentioned it. No one cares that the Fury X spikes to 430 watts. 1 ms power readings are thermally insignificant, and power- and thermal-wise they cancel out with 1 ms power drops. What's important is the maximum sustained power usage.
If someone calls it a "400W GPU" then they're saying it has a 400W TDP, right? If so, that's an issue of semantics, not power consumption measurement.
Who cares how much wattage it takes to power these ones; no one will ever see one if these things do OK in crypto mining, just like the illustrious RX 580.
Not that you can base an opinion on AIDA64 GFX results, but if two RX 580s in CrossFire are faster than an AIB 1080 Ti, that would disappoint how, precisely?
LOL, AIDA64 is from Budapest. And that 580 CrossFire result is what RX Vega should be capable of with the right drivers... I don't know if AMD are sandbagging or struggling with drivers atm, but that level of performance should be inside that Vega chip somewhere, or better: somewhen.
If someone calls it a "400W GPU" then they're saying it has a 400W TDP, right?
Yes, because it's basically the same thing. If an IHV is telling us that their product's TDP is 400W, that means the cooler can dissipate 400W, which means it can accommodate a GPU with 400W of power consumption. So unless the IHV has purposely decided to shoot themselves in the foot by overestimating the TDP(*), TDP = power consumption. <- my point
(*) This does happen in some cut-down lower chip versions, but it's just a simplification. Overestimating TDP does not happen with major products, when the eyes of the press and everyone else are glued to the TDP. Or sometimes an IHV can even cheat and communicate a lower TDP to the press, but that's dishonesty, and I am talking about a proper, best-effort TDP estimate.
In-game power consumption (averaged across 6-7 reviews) vs TDP. See the trend? https://abload.de/img/screenshot2017-07-191xgkza.png
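The point boils down to this: a properly stated TDP is the sustained heat the cooler must handle, so sustained board power should land at or just under it for a flagship part. A minimal sanity-check sketch in Python, using placeholder numbers (not the review averages from the linked chart):

```python
# Hypothetical (TDP, sustained in-game power) pairs in watts -- placeholder values,
# NOT the figures from the linked chart.
cards = {
    "flagship_a": (250, 243),
    "flagship_b": (180, 176),
    "cut_down_c": (150, 121),  # a cut-down part with a deliberately padded TDP
}

for name, (tdp_w, power_w) in cards.items():
    ratio = power_w / tdp_w
    # For major products the ratio tends to sit near 1.0; a noticeably lower
    # ratio hints at an overestimated TDP.
    print(f"{name}: sustained {power_w} W vs TDP {tdp_w} W -> {ratio:.0%} of TDP")
```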