AMD takes the lead in the new Forza Horizon 4 DX12 benchmark

My Vega can't overclock whatsoever on the core clock slider. I can push the power limit up, and I can overclock the memory massively, from 945 to 1200, though my everyday setting is 1100 MHz for stability reasons. Whoever said HBM doesn't overclock was talking a load of codswallop. You can shave over 150 watts off the Vega by dropping the clock speed 5% and letting the card undervolt itself; that's what I found works best, because setting the voltage manually affects stability. I actually like that I can't overclock the card: it's already at the maximum it can perform, which saves me the bother Nvidia users have while they search for their maximum. Mine is given to me out of the box, and to me that's very convenient, yet people see it as a negative. I love reading the haters hate and make excuses to cover their loyalty with ignorance; it gives me such a warm fuzzy feeling.
Moderator
RzrTrek:

AMD takes the lead at twice the wattage, but I get it, we want to have some good GPU news once in a while, heck I would be the 1st to ask for it. Let's just hope they can come back and compete with Nvidia's GTX 1000 & RTX 2000 series at the same wattage, because my RX580 is not aging well. Also my new favorite UE4 based racing sim is very demanding (also CPU intensive) and I reckon a GTX 1080 or faster should do the trick.
To be fair, there is Radeon Chill to help regulate power usage and even bring it down. One thing people are forgetting, though, is that Vega 64 and the 1080 have been going back and forth for the past few months now. You can just find a 1080 a lot easier, and cheaper still.
OnnA:

Here's real 4K (4096×2304) on a Vega 64 XTX Liquid, 4K Ultra: http://i65.tinypic.com/osgw8g.jpg
Wow, nice, but it makes life hard for others to compare results. What monitor are you using to get this true 4K result, please?
Vega's biggest problem was its necessity to tackle multiple markets at once. Nvidia is in the same boat now with Turing: they don't want to split their architecture between big data/AI/automotive and gaming, so instead of splitting it they are pivoting the technologies useful for professional markets (things like the Tensor cores, FP/INT separation, BVH performance improvements for automotive, RT cores for their visualization/OptiX customers) into value-add gaming features. The problem is that all of that costs a ton of money to support on the software development side, something AMD couldn't afford to do but Nvidia can. I'd like to see what an alternate reality would look like if Vega were fully supported: FP16 (RPM) utilized in more games, Primitive Shaders working and supported, the ACEs properly tuned for, compute performance utilized, etc. It probably would have easily bested the 1080 Ti. In the end AMD chose to devote their resources to Zen, which I think was the better choice given Intel's stagnation. A lot of people gave Raja flak, but honestly I think he overperformed given the circumstances the company was under. I don't know if anyone could have done better.
Moderator
Denial:

Vega's biggest problem was its necessity to tackle multiple markets at once. Le Snip
I remember reading somewhere that Raja was basically given a small team for Vega, while other members of his team were working on different projects. I'll have to find the article; if true, it makes sense.
As someone said before, the timing is off; if this had come in early 2017, fine, but now, with the new 2000 series arriving, it's like they say: finally, we are finishing the race... Hoping the next gen from AMD improves more. As for prices and availability, in my country the cheapest AMD Vega 64 costs 150 euros more than the cheapest Nvidia 1080, so for me at least it's too much. This is also because Nvidia has more budget partners. See AMD Vega 64 vs Nvidia 1080 vs Nvidia 1080 Ti.
Pimpiklem:

I actually like that I can't overclock the card: it's already at the maximum it can perform, which saves me the bother Nvidia users have while they search for their maximum. Mine is given to me out of the box, and to me that's very convenient, yet people see it as a negative.
Of course it's a negative thing; I think everyone would like a bit more performance for free if they can get it! If one GPU has no OC headroom and the equivalent competitor does, I don't see how that's a positive for the first one.
Moderator
kings:

Of course it's a negative thing; I think everyone would like a bit more performance for free if they can get it! If one GPU has no OC headroom and the competitor does, I don't see how that's a positive for the first one.
Most people don't overclock their GPUs, though; if there's any overclock at all, it's a factory overclock that people want.
I don't see overclock headroom as a positive; I see it as a negative, because it tells me the power figure plastered on the side of the box is a lie. You can't advertise a TDP and overclocking headroom in the same sentence, because one eradicates the other. My Vega says 300 watts, but I usually use 250, whereas an Nvidia card that says 250 ends up using 320. Do you see what's happening? I have seen a 1080 Ti pull 600 watts at the wall with a 2050 MHz overclock and liquid metal.
wow so many panties in a bunch here, lol.
vbetts:

Most people don't overclock their GPUs, though; if there's any overclock at all, it's a factory overclock that people want.
"overclocking" is different on Vega then Pascal, its all about the undervolt, you can get 15% gains tuning the voltage floor and use less power: [youtube=9zKhf_ddUJg]
Vega is a strange animal. You oddly get more performance by lowering the core clock 5% under stock, without hitting any thermal throttling or causing any issue. Something happens here that is honestly outside my understanding: less power, more speed. 😡
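The "less power, more speed" behaviour is less mysterious if you treat the card as power-limited. A rough sketch of the idea, with all numbers being illustrative assumptions rather than measured Vega figures: dynamic power scales roughly with V² × f, so at a fixed board power budget a lower voltage can sustain a higher average clock.

```python
# Why undervolting can *raise* a power-limited card's performance:
# dynamic power ~ C * V^2 * f, so for a fixed power budget,
# lowering voltage leaves headroom for a higher sustained clock.
# All constants below are illustrative assumptions, not measured values.
TDP = 295.0   # watts, assumed board power budget
C = 1.0e-6    # lumped switching-capacitance constant (arbitrary scale)

def sustained_clock(voltage_mv: float) -> float:
    """Highest average clock the power budget allows at this voltage."""
    return TDP / (C * voltage_mv ** 2)

stock = sustained_clock(1200)        # assumed stock ~1.20 V
undervolted = sustained_clock(1050)  # assumed undervolt ~1.05 V
headroom = undervolted / stock
print(f"{headroom:.2f}x clock headroom")  # ~1.31x
```

The absolute clock numbers here are meaningless (the scale constant is arbitrary); only the ratio matters: dropping from ~1.20 V to ~1.05 V frees roughly 30% more clock inside the same budget, which is consistent with undervolted Vegas boosting higher than stock.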
Moderator
Pimpiklem:

I don't see overclock headroom as a positive; I see it as a negative, because it tells me the power figure plastered on the side of the box is a lie. You can't advertise a TDP and overclocking headroom in the same sentence, because one eradicates the other. My Vega says 300 watts, but I usually use 250, whereas an Nvidia card that says 250 ends up using 320. Do you see what's happening? I have seen a 1080 Ti pull 600 watts at the wall with a 2050 MHz overclock and liquid metal.
Vega is a little different now with Radeon Chill, though, which is a newer feature, so typical power usage is going to come in a bit under the advertised TDP. But where has Nvidia promoted a power figure for Pascal that ended up higher in practice?
holler:

"overclocking" is different on Vega then Pascal, its all about the undervolt, you can get 15% gains tuning the voltage floor and use less power: Le Snip
It's still the same thing: most people aren't going to buy a card to undervolt it themselves; they'll buy one that's already been tuned. And even now, like I said, with Radeon Chill the card can do it itself without being a special edition or factory-tuned.
korn87:

Tested with friends in 4K at maximum settings. Vega pulls ahead only at lower resolutions. http://funkyimg.com/p/2Lk49.jpg https://s8.hostingkartinok.com/uploads/images/2018/09/1af3234ce346cebcb8bf48f3acbe4aaf.jpg
What? "RX Vega 56 ran 13-22% faster than the GTX 1070" The 1080 is only 6% faster than the Vega56 on your chart and it's 22% faster on average than the 1070. Given these numbers and your chart, the 1070 would score around 46 fps, a 15-16% lead for Vega56.
Archvile82:

Considering the lead platform is the AMD-based Xbox, this isn't really surprising. If anything, it's impressive that Nvidia is this close. Anyway, compare again when Nvidia releases game-ready drivers.
Actually, AMD cards beat NV cards in Destiny 2, were very close in AC: Origins (5% behind at most, not the 15-20% NV is behind AMD here), and in the latest NV-sponsored title (namely Shadow of the Tomb Raider) AMD cards are again on par with their NV counterparts. Given these AMD results in NV-supported titles, this 15-20% difference is not close; it is far.
anxious_f0x:

Good to see AMD can finally compete with a two-and-a-half-year-old GPU that's literally being replaced in a matter of days. Navi needs to be something special; they're slipping too far back at this point.
Just to remind you, Vega is also more than a year old and had the same performance as the 1080 at launch, so I have no idea what you are talking about. Only a fraction of gamers buy 1080 Ti-class cards; most are stuck with 1050 Ti and 1060 level cards. BTW, the RTX cards' performance growth will be overwhelmingly bad compared to last (this) gen, especially when comparing prices.
You and @nicugoalkeper: check Techpowerup's poll, "Are you getting a new GeForce RTX 2000 card?":
Skipping this generation: 44%
Will wait for reviews: 28%
Doubt it: 23%
Already preordered: 5%
Total: 6,945 votes
So we have roughly 7,000 votes: 44% of the voters will skip the RTX series, 23% probably will, and 28% will wait for reviews and then decide. If the early leaks are true, a not-too-small chunk of the review-waiters will skip it as well, so it's not impossible that 80 or even 85% of the voters will not buy an RTX card until prices have at least normalized.
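For what it's worth, the "80 or even 85%" figure works out if you assume that roughly half of the review-waiters also end up skipping. That half is my assumption for illustration, not poll data:

```python
# Tally of the Techpowerup poll percentages cited above (6,945 votes).
poll = {
    "Skipping this generation": 44,
    "Will wait for reviews": 28,
    "Doubt it": 23,
    "Already preordered": 5,
}
firm_skip = poll["Skipping this generation"] + poll["Doubt it"]
# Assumption (not from the poll): half of the review-waiters skip too.
likely_skip = firm_skip + poll["Will wait for reviews"] / 2
print(firm_skip, likely_skip)  # 67 and 81.0 percent
```

At 67% firm and ~81% under that assumption, the post's 80-85% estimate is plausible but hinges entirely on how the review-waiters actually decide.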
nicugoalkeper:

As for prices and availability, in my country the cheapest AMD Vega 64 costs 150 euros more than the cheapest Nvidia 1080, so for me at least it's too much. This is also because Nvidia has more budget partners. See AMD Vega 64 vs Nvidia 1080 vs Nvidia 1080 Ti.
This is not AMD's fault; in Germany, the UK, and the USA, prices are similar or very close to each other.
Pimpiklem:

I don't see overclock headroom as a positive; I see it as a negative, because it tells me the power figure plastered on the side of the box is a lie. You can't advertise a TDP and overclocking headroom in the same sentence, because one eradicates the other. My Vega says 300 watts, but I usually use 250, whereas an Nvidia card that says 250 ends up using 320. Do you see what's happening? I have seen a 1080 Ti pull 600 watts at the wall with a 2050 MHz overclock and liquid metal. You won't see that in the marketing.
Overclocking has always been an extra! Of course, if you overclock a GPU or CPU it will consume more power; TDP is usually measured at default clocks by most brands. If you overclock, power consumption naturally rises, but most people are fine with that because they get extra performance. No one is forced to overclock the cards; we can simply run them at default clocks with the original TDP. But if we have some headroom to gain, say, 10% or 15%, I don't see that as a negative aspect.
Pimpiklem:

Le Snip ... I have seen a 1080 Ti pull 600 watts at the wall with a 2050 MHz overclock and liquid metal. You won't see that in the marketing.
Why would you see that in marketing? It's a user overclock. Should they market that someone could shunt mod the card and drop liquid nitrogen on it?
vbetts:

Where has Nvidia at least promoted power usage with Pascal that ended up higher?
Did they? Nvidia's listed TDP for the 1080 Ti FE is 250 W. Guru3D's maximum measured power was 279 W in its review, but TechPowerUp has average gaming draw at 231 W. 250 W seems like a fine average for "Typical Design Power". You can also undervolt Pascal cards and retain the same clock speed, or even overclock them with less-than-stock voltage, something people conveniently ignore when making these comparisons (this isn't aimed at you, vbetts, but at all the posts I've seen in the last few days comparing the extremes of each vendor's cards in order to make really weird points).