Radeon RX Vega Benchmark Shows 3DMark 11 Performance just above GTX 1080

https://forums.guru3d.com/data/avatars/m/237/237771.jpg
Thank you "Master Guru"
There are always ways you can limit or eliminate screen tearing. The good thing is the 1080 Ti can almost always deliver 90+ FPS at QHD, so you could use the custom resolution feature in the Nvidia control panel to set a static refresh rate at, say, 90 or 75 Hz, then use frame limiting. The higher the refresh rate, the less likely you are to "see" the tear. Edit: sorry for the double post. I wasn't paying attention.
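For what it's worth, the frame-limiting idea above boils down to sleeping off whatever is left of each frame's time budget. A minimal Python sketch of that logic, where `render_frame` and the FPS target are placeholders standing in for the game's draw call and the monitor's (custom) refresh rate, not any specific tool:

```python
import time

def frame_limited_loop(target_fps, render_frame, num_frames):
    """Cap a render loop at target_fps by sleeping off leftover frame time.

    render_frame is a stand-in for the game's draw call; target_fps would
    match the custom refresh rate (e.g. 75 or 90 Hz) suggested above.
    """
    frame_budget = 1.0 / target_fps
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        # Sleep away the rest of this frame's time slice, if any remains.
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)
```

In-game limiters and tools like RTSS do this far more precisely (busy-waiting near the deadline), but the principle is the same.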
data/avatar/default/avatar33.webp
AMD's quoted TDP = maximum possible draw (6/8-pin connectors + board draw), meaning the card's actual TDP is in fact unknown. If history is any indicator, however, expect a ~250W behemoth which (hopefully) has some decent undervolting capabilities again.
This myth of "here comes overclocking or undervolting to save the turd" needs to stop. I am not saying that Vega will be a turd, but if OC/UV is needed to make it competitive, then yes, it most certainly is one. Undervolting is nothing but a roll of the dice made possible by sample variation, not by the architecture, and GCN does not have a monopoly on luck or sample variation. My 290 can BSOD at a mere -30mV, but my custom GTX 460 was a terrific undervolter. Pascal, for example, can be undervolted just as well: https://forum.beyond3d.com/posts/1979000/ The fact that undervolting has become much more popular on AMD GPUs is nothing but a consequence of their high power consumption coupled with the throttling issues on reference designs, made even more popular by their mining usage. And if anyone thinks that a GPU that's already breathing fire has been intentionally over-volted by AMD across the entire shipping volume, further increasing power consumption, just so you can undervolt it... yeah, right.
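The "maximum possible draw" arithmetic from the TDP post above is just the sum of what the PCIe slot and the auxiliary connectors are rated for: 75 W from the slot, 75 W per 6-pin, 150 W per 8-pin. A quick sketch (the connector combinations shown are illustrative, not a confirmed Vega configuration):

```python
# Power delivery limits per the PCIe spec, in watts.
PCIE_LIMITS_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def max_board_power(connectors):
    """Upper bound on draw for a card with the given aux power connectors."""
    return PCIE_LIMITS_W["slot"] + sum(PCIE_LIMITS_W[c] for c in connectors)

# A hypothetical 8-pin + 6-pin board tops out at 300 W;
# dual 8-pin would allow up to 375 W.
```

That ceiling is why "TDP = connectors + board" tells you almost nothing about typical gaming draw.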
https://forums.guru3d.com/data/avatars/m/228/228512.jpg
14 months late, and all they can muster is a vanilla 1080 at double the power usage and a 50% larger die. It's a design failure from the start. Even if they give it away...
https://forums.guru3d.com/data/avatars/m/255/255573.jpg
Glad I didn't wait, in hindsight. I wanted the best performance, which Vega is not. I hope AMD prices it right, as that's the only way it will succeed. On paper HBM2 looks good, but in reality it's not at the moment. AMD should have stuck with GDDR5X.
Well, it doesn't have to be the best, just the best price/performance ratio. I agree, HBM2 will be the bottleneck. Perhaps GDDR5X would have been the better choice this year. Perhaps that is the reason for the release delay? We'll see...
https://forums.guru3d.com/data/avatars/m/239/239175.jpg
Thank you "Master Guru"
Not sure if sarcasm, but that's exactly how you find out how games will play without freesync. You disable it. No? :3eyes:
https://forums.guru3d.com/data/avatars/m/63/63215.jpg
We need to see the price. Considering a GTX 1080 is around £510-550 right now, I don't think they could go higher, especially if they can't win the majority of benchmarks against the GTX 1080. However, if their gaming numbers are higher, then £550 or more would be no problem. If they could go lower, like £480 or less, then they might have a winner. With Volta coming within the next 8 months, though, it's going to be a tough market. Maybe the mining craze is doing them a favour and bringing in much needed capital.
data/avatar/default/avatar38.webp
No problem there: they could make mining-only GPU versions, priced lower than the gaming counterparts, so that miners can't sell them on to people who want to play but can still sell them for scrap. Both sides would be happy, but time will tell what happens.
I bet not many miners would buy a mining version just to save a bit of money, because they would lose much more by not being able to resell the cards.
https://forums.guru3d.com/data/avatars/m/241/241896.jpg
Well, I was hoping for a little more performance, to be honest. I have a 1440p, 144 Hz FreeSync monitor and have been using Nvidia. I don't know yet if this would be a sensible upgrade from my current 980 Ti SLI. I would like to go back to a single-card setup and be able to use FreeSync. Will RX Vega have the power I need? I guess we will have to wait and see gaming benchmarks before I make a decision.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
Well, I was hoping for a little more performance, to be honest. I have a 1440p, 144 Hz FreeSync monitor and have been using Nvidia. I don't know yet if this would be a sensible upgrade from my current 980 Ti SLI. I would like to go back to a single-card setup and be able to use FreeSync. Will RX Vega have the power I need? I guess we will have to wait and see gaming benchmarks before I make a decision.
If RX Vega performs similarly to a 1080 then it should be fine at QHD with FreeSync. I had a 1080 and I have an Acer XB270HU - there were maybe 1-2 games where I had to turn down a setting or two to get a good experience (I consider 75+ fps a good experience). Honestly, in most games coming out nowadays, the difference in visual quality between "Ultra" and "High" or whatever names they use is negligible compared to the performance impact.
https://forums.guru3d.com/data/avatars/m/270/270008.jpg
Ryzen wins more than it loses against Intel's equivalents. Vega, so far, does not. It's a bummer, really.
Also, the Ryzen R7 has half the power draw of Intel's 7820X, so 8-core vs 8-core, Ryzen is much more efficient at stock clocks. I don't think this will be the case for Vega.
data/avatar/default/avatar35.webp
We need to see the price. Considering a GTX 1080 is around £510-550 right now, I don't think they could go higher, especially if they can't win the majority of benchmarks against the GTX 1080. However, if their gaming numbers are higher, then £550 or more would be no problem. If they could go lower, like £480 or less, then they might have a winner. With Volta coming within the next 8 months, though, it's going to be a tough market. Maybe the mining craze is doing them a favour and bringing in much needed capital.
I feel like I'm being harsher on AMD than others, but if it came out at £500+ and at GTX 1080 level, it's a flop. I'm not price checking the 1080s these days, but they have regularly been on sale for less than £450 (I nabbed mine for £425). If they are over £500 these days, it's because of mining (not RRP), and all cards are subject to this inflation (£550 RX 580, come on). We can say with some confidence that when Volta comes out, 1080/1080 Ti performance will be offered at the xx70 level, and all stock of the 10xx generation will drop in price correspondingly. Hell, Nvidia's release of the 1080 Ti will probably have done more to drop Nvidia prices than AMD's competition this gen. I'm very bummed out by this. I'm looking for high-end (not quite enthusiast - close to 60fps/4K/ultra is good enough for me) performance, and AMD hasn't accommodated this. Delivering the same performance as previous cards for prices in line with gradual price reductions seems pointless to me in this segment. I'm sure others are happy and excited by more quality options in the £200 ballpark, but there I'd just see equivalent performance from older cards offered for those same prices. I'm just left frustrated and deflated by AMD over the last few years. For what it's worth, I think we'll see another RX 480-style release, and we should be prepared to see the initial (non-synthetic) gaming benchmarks for Vega below 1080 level, though over time they'll be finewined to on par with or above the 1080. I'm sure there are some optimisations that'll lift RX Vega above the performance numbers we've seen for Vega FE, but I'd put money on them not being substantial enough at launch to meet the hyped expectations. I'm saying this because I think people hyping "to 1080 Ti and beyond!" are going to further hurt the launch.
https://forums.guru3d.com/data/avatars/m/241/241896.jpg
If RX Vega performs similarly to a 1080 then it should be fine at QHD with FreeSync. I had a 1080 and I have an Acer XB270HU - there were maybe 1-2 games where I had to turn down a setting or two to get a good experience (I consider 75+ fps a good experience). Honestly, in most games coming out nowadays, the difference in visual quality between "Ultra" and "High" or whatever names they use is negligible compared to the performance impact.
Thanks for the info 🙂
data/avatar/default/avatar15.webp
Thank you "Master Guru"
You have 7 posts... He's right. Turn off FreeSync and test. Don't be an idiot. It's not rocket science.
https://forums.guru3d.com/data/avatars/m/237/237771.jpg
Not sure if sarcasm, but that's exactly how you find out how games will play without freesync. You disable it. No? :3eyes:
You have 7 posts... He's right. Turn off FreeSync and test. Don't be an idiot. It's not rocket science.
290X = 1080 Ti??? I think both of you missed the point of his question. He was inquiring as to how bad the screen tearing would be if he upgraded to a 1080 Ti. While I agree he could somewhat simulate the FPS of a 1080 Ti, he was asking for assistance and got a smartass answer.
data/avatar/default/avatar37.webp
So much powah! Not for f-+_'s sake! Poor Volta my *#$. A 2070 will trample any Vega GPU. RIP RTG. RIP our wallets.
https://forums.guru3d.com/data/avatars/m/196/196426.jpg
RIP our wallet.
That's what hurts most... not that RTG is failing, but the fact that without any serious competition, nVidia will charge whatever they want on high-end cards. I for one paid through my bleeding nose on this GTX1080 when it was brand new, but that's not going to happen again for sure. Even today it makes me sick thinking how much money it was...
https://forums.guru3d.com/data/avatars/m/258/258688.jpg
Glad I didn't wait, in hindsight. I wanted the best performance, which Vega is not. I hope AMD prices it right, as that's the only way it will succeed. On paper HBM2 looks good, but in reality it's not at the moment. AMD should have stuck with GDDR5X.
You haven't seen Vega yet. Try and keep that in mind. I realize you want to "feel better" about your purchase by jumping to conclusions, but technology is always like that--no matter when you buy, now or later, there is always something better coming right around the corner. I never buy for the illusion that what I am buying is "the best," even if it is--because I know in a matter of months at the most it will no longer be "the best" anymore...;) A good purchase for me is one that I am content with--I *never* buy based on benchmarks--price/performance is king for me. For instance, I'd never be content with paying 50% more for a GPU that garnered 10% higher benchmark scores, etc. No way, Jose'...;)
data/avatar/default/avatar15.webp
That's what hurts most... not that RTG is failing, but the fact that without any serious competition, nVidia will charge whatever they want on high-end cards. I for one paid through my bleeding nose on this GTX1080 when it was brand new, but that's not going to happen again for sure. Even today it makes me sick thinking how much money it was...
What's funny to me is consumers thinking they're buying Nvidia's true high-end chip, after the way the 780 Ti was replaced by GM200.
https://forums.guru3d.com/data/avatars/m/270/270233.jpg
Not even pitched against the 1080 Ti? And Volta coming? No Vega for me, and I'm sorry about it. I was expecting a kind of Ryzen in GPUs.
In order for Vega to be like Ryzen it would have to have comparable performance to a 1080 but at half or third the price. Simply put, it's not going to happen (large die + HBM2 makes it impossible). As for mining, it all depends on the hash rate and power draw. If it consumes more power than a 1080 then I doubt miners would touch it. Also, Nvidia cards hold the advantage in ETH mining over time, which might work in favor of Vega for gamers (here's hoping).
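The hash-rate-versus-power-draw tradeoff mentioned above reduces to a one-line profitability formula: daily revenue from hash rate minus daily electricity cost. A hedged sketch with made-up inputs (none of these figures are real Vega or 1080 measurements):

```python
def daily_mining_profit(hashrate_mh, reward_per_mh_day, power_w, price_per_kwh):
    """Rough daily mining profit.

    hashrate_mh: hash rate in MH/s; reward_per_mh_day: currency earned
    per MH/s per day (an illustrative placeholder, not a live rate);
    power_w: wall draw in watts; price_per_kwh: electricity price.
    """
    revenue = hashrate_mh * reward_per_mh_day
    power_cost = (power_w / 1000.0) * 24 * price_per_kwh  # kWh per day * price
    return revenue - power_cost

# e.g. 30 MH/s earning 0.10/day per MH/s, 150 W draw, 0.12 per kWh:
# daily_mining_profit(30, 0.10, 150, 0.12) -> 2.568
```

This is why a card that hashes like a 1080 but draws more power is a harder sell to miners: the revenue term is fixed while the cost term grows.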
https://forums.guru3d.com/data/avatars/m/101/101279.jpg
R9 290s are being sold for close to $300 for mining. Miners have to use their equipment for as long as it's profitable. So why won't mining last? This isn't Bitcoin mining that's going on, you know.
Wouldn't the obvious solution be to sell two lines of GPUs, one both gaming- and mining-compatible and one only gaming-compatible?