AMD Radeon Shows Strong performance with Battlefield V Closed Alpha

by Hilbert Hagedoorn on: 07/05/2018 09:22 AM | source: PCGamesN | 75 comment(s)

From the looks of things, AMD seems to be doing pretty well in the closed alpha release of Battlefield V. In a quick test, a GeForce GTX 1060 6GB was put up against a Radeon RX 580 8GB at both 1080p and 1440p, and the results are interesting.

Overall, the Radeon RX 580 performed over 30% faster. Interesting. Then again, where the previous Battlefield was an NVIDIA-sponsored title, we're not sure who holds the partnership this round for Battlefield V (though we're pretty confident it's NVIDIA). In the end, I'd like to mention that this is an ALPHA release: drivers have not been optimized and likely nothing has been tuned. Hey, for all we know it's image-quality related. Have a peek at the results that PCGamesN reported, including DX11/DX12 measurements. Allow me to be realistic here, though: with an alpha release this early in development, the numbers mean absolutely nothing.
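As a quick aside on where a headline figure like "over 30% faster" comes from: it is just the ratio of the two cards' average frame rates. A minimal sketch below; the fps values are hypothetical placeholders, not PCGamesN's actual data.

    // How an "X% faster" figure falls out of two average frame rates.
    // The fps values are hypothetical placeholders, not PCGamesN's numbers.
    #include <iostream>

    int main() {
        double fps_rx580   = 78.0;  // hypothetical RX 580 average fps
        double fps_gtx1060 = 59.0;  // hypothetical GTX 1060 average fps

        // Relative advantage of the RX 580 over the GTX 1060, in percent.
        double pct_faster = (fps_rx580 / fps_gtx1060 - 1.0) * 100.0;
        std::cout << "RX 580 is " << pct_faster << "% faster\n";  // ~32.2%
        return 0;
    }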

 

Source: PCGamesN








Related Stories

AMD Radeon RX rumor: 12nm Polaris 30 in the 4th quarter - 07/02/2018 05:43 PM
A fairly credible source in the past at the Chiphell forums is now mentioning that AMD might be releasing a new (another) version of Polaris. Polaris 30 would be manufactured at 12 nm and would be ...

AMD Radeon Vega 12 and Vega 20 Wave Hello From Ashes Of The Singularity Database - 06/19/2018 05:43 PM
It is getting a little too obvious who is planting small Vega leaks into the Ashes Of The Singularity database, I mean repetitive pattern each and every year. And what company is all over who is all ...

Resident Evil 2 Remake to be optimized for AMD Radeon - 06/15/2018 03:26 PM
AMD announced some new partnerships with Rebellion, Ubisoft and Capcom at E3. According to Scott Herkelman, VP and GM at AMD Radeon Gaming, The Division 2, Resident Evil 2 Remake and Strange Brigade w...

AMD Radeon Pro V340 with two Vega 10 GPUs surfaces online - 06/15/2018 08:49 AM
Some specs have been spotted on the AMD Radeon Pro V340, a graphics card intended for the professional segment of the market, interesting fact: it has 32GB HBM2 and has been fitted with two Vega GPU'...

AMD Radeon Pro Software Enterprise Edition 18.Q2 - 05/10/2018 08:36 AM
AMD released the new Radeon Pro Software Enterprise Driver 18.Q2, continuing to protect the investment of IT professionals and buyers by providing significant performance improvements in leading profe...




Fox2232
Senior Member



Posts: 11809
Joined: 2012-07-20

#5563657 Posted on: 07/10/2018 06:58 AM
> Ok, I see what you're saying. Lower settings, cap fps, turn Nvidia stuff off.

That's a general thing you do with every game: if there are options, you check them and tune them. When Witcher 3 came out, I had the nVidia options tuned down, because they did not add much to the game and degraded performance greatly.
Today, on the same GPU, those nVidia options can be maxed. That's how it works.

> Here are my thoughts. If a game has both DX11 and DX12 options, and the DX12 option is buggy and causes crashes, then isn't it reasonable for a user to choose DX11? Reasonable enough, in fact, that making the conscious choice to use DX11 because it's working "better" for him/her doesn't make that person any kind of noob. This scenario is one of the most common we've seen, after all.

Bit of an overstatement, isn't it? Or do you want to claim that it is common for DX12 games to crash and stutter? Because it is not!

> Because that's what this comes down to. IMHO, if someone chooses DX11 over DX12, he's in general choosing a more stable experience (regardless of performance).

The "in general" part is stupid, because it generalizes and again puts all DX12 games into a bag marked "worse than DX11".

If someone has trouble with DX12 for whatever reason and can't fix it by other means, then it is rational to switch to another available API. But cases where everyone has that problem and the API is to blame... that's another story.
You've seen it here 1000 times: a guy comes along with a problem, a few others with the same configuration say they do not have it, the guy reinstalls Windows as a last resort after a few failed fixes, and magic happens.

fantaskarsef
Senior Member



Posts: 12969
Joined: 2014-07-21

#5563682 Posted on: 07/10/2018 09:36 AM
> Digital Foundry's comment sections on YouTube.

> i am staying out of this

Chill out guys, I was just joking. If it weren't for the sad fact that PC gaming and console generations are unluckily locked together, I couldn't really be bothered to have a problem with them.

Stormyandcold
Senior Member



Posts: 5771
Joined: 2003-09-15

#5563693 Posted on: 07/10/2018 10:43 AM
> That's a general thing you do with every game: if there are options, you check them and tune them. When Witcher 3 came out, I had the nVidia options tuned down, because they did not add much to the game and degraded performance greatly. Today, on the same GPU, those nVidia options can be maxed. That's how it works.

> Bit of an overstatement, isn't it? Or do you want to claim that it is common for DX12 games to crash and stutter? Because it is not!

> The "in general" part is stupid, because it generalizes and again puts all DX12 games into a bag marked "worse than DX11".

> If someone has trouble with DX12 for whatever reason and can't fix it by other means, then it is rational to switch to another available API. But cases where everyone has that problem and the API is to blame... that's another story. You've seen it here 1000 times: a guy comes along with a problem, a few others with the same configuration say they do not have it, the guy reinstalls Windows as a last resort after a few failed fixes, and magic happens.

Changing settings is indeed a part of PC gaming. However, you're having to assume here that people (like in the reddit post I quoted) don't know how to do this. IMHO, if a user can list his parts and actually describe what he's tried and the performance he gets, then that user is far from the noob you're trying to imply he is.

As for choosing DX11 over DX12: I've actually seen the exact scenario I hypothetically described fix issues far more often than reinstalling Win10, though that sometimes helps as well and is common. However, it's just as common for a Win10 reinstall to have no effect. It's not an overstatement at all. I think you're actually trying your best to avoid directly answering my question, or even to directly credit DX11 as a viable choice over DX12.

After all we've seen and all the tests done, it's my opinion that DX12 should theoretically be better than DX11. In the real world, however, it's just as common for DX11 to be as good or better in "some" games.

To avoid going too far off topic, I'd like to bring this back to the topic at hand: the alpha benchmark numbers for this thread.

If you look at the Battlefield V alpha benchmarks for the RX 580, it's clear and undeniable that DX11 runs better than DX12 at 1440p, with both better minimum and better maximum fps. The same is true for the GTX 1060, despite obviously lower numbers. We know that certain features on the AMD side, like async compute, have been back-ported to DX11 to further close the gap between the two APIs. It's not out of the question (and this is hypothetical, of course) that DX11 could end up running better than DX12 for AMD. For this particular game, I'd say that if an AMD user with the same or a similar GPU and a 1440p (FreeSync) monitor chose to run it in DX11, that would be the smart choice.

Fox2232
Senior Member



Posts: 11809
Joined: 2012-07-20

#5563710 Posted on: 07/10/2018 12:19 PM
> Changing settings is indeed a part of PC gaming. However, you're having to assume here that people (like in the reddit post I quoted) don't know how to do this. IMHO, if a user can list his parts and actually describe what he's tried and the performance he gets, then that user is far from the noob you're trying to imply he is.

> As for choosing DX11 over DX12: I've actually seen the exact scenario I hypothetically described fix issues far more often than reinstalling Win10, though that sometimes helps as well and is common. However, it's just as common for a Win10 reinstall to have no effect. It's not an overstatement at all. I think you're actually trying your best to avoid directly answering my question, or even to directly credit DX11 as a viable choice over DX12.

> After all we've seen and all the tests done, it's my opinion that DX12 should theoretically be better than DX11. In the real world, however, it's just as common for DX11 to be as good or better in "some" games.

> If you look at the Battlefield V alpha benchmarks for the RX 580, it's clear and undeniable that DX11 runs better than DX12 at 1440p, with both better minimum and better maximum fps. The same is true for the GTX 1060, despite obviously lower numbers. We know that certain features on the AMD side, like async compute, have been back-ported to DX11 to further close the gap between the two APIs. It's not out of the question (and this is hypothetical, of course) that DX11 could end up running better than DX12 for AMD. For this particular game, I'd say that if an AMD user with the same or a similar GPU and a 1440p (FreeSync) monitor chose to run it in DX11, that would be the smart choice.

nVidia having worse performance under DX12 is not surprising. Most DX12 titles perform worse on nV cards than DX11 does.
I do not want to start a flame war or anything, but I thought everyone knew about the way nVidia uses SW optimizations for DX11, and saved costs by making some parts of the GPU less robust, which allowed their cards to run faster than AMD's counterparts.
It was a good move for DX11. But when it comes to every game developer having to apply the same specific optimizations to help nVidia... simply not feasible. That's why nVidia loses 30% performance in some of those DX12 games in comparison to DX11.

AMD's GCN architecture lives and dies with its HW design; nVidia's lives and dies with its SW implementation. And in the case of DX12, that is somewhat out of nVidia's sphere of influence. I would not blame DX12 for showing a thing or two people knew already.

As for the theory that DX12 should be better than DX11: it is. It has shader model 5.1, which adds a single thing: lists, or whatever they are called.
Then SM 6.0, and threaded draw-call handling. Which brings you to the part where at least two of the comparisons being cited (Quantum Break, Vermintide 2) were done on a quad-core CPU. DX12 uses up to 6 threads for pushing rendering work plus 2-3 for the game engine itself. In this scenario, I think it is a bit out of place to blame DX12 while running a quad-core CPU.
And I can tell you that I had a poor experience with Vermintide 2 on an i5 @ 4.5GHz and then a wonderful one on a Ryzen 2700X. (I have already written that here about 5 times over the course of the last 2 months.) And the same goes for BF1.
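To make the threading model described above concrete, here is a minimal sketch of multithreaded draw-call recording as DX12 allows it: each worker thread records its own command list in parallel, with a single submission point at the end. The types are portable stand-ins, not the real ID3D12GraphicsCommandList API, and the thread count mirrors the 6 render threads plus 2-3 engine threads mentioned in the post.

    // Sketch: DX12-style threaded draw-call recording. CommandList is a
    // hypothetical stand-in for a D3D12 command list, kept portable on purpose.
    #include <cstdio>
    #include <thread>
    #include <vector>

    struct CommandList {                    // stand-in for a D3D12 command list
        std::vector<int> draws;
        void record_draw(int id) { draws.push_back(id); }
    };

    int main() {
        const int total_draws = 12000;
        const int num_threads = 6;          // DX12 render threads; a quad-core
                                            // CPU cannot run these plus 2-3
                                            // engine threads without contention
        std::vector<CommandList> lists(num_threads);
        std::vector<std::thread> workers;

        for (int t = 0; t < num_threads; ++t)
            workers.emplace_back([&, t] {
                // each thread records a disjoint slice of the frame's draws
                for (int d = t; d < total_draws; d += num_threads)
                    lists[t].record_draw(d);
            });
        for (auto& w : workers) w.join();

        // single submission point, analogous to ExecuteCommandLists()
        std::size_t total = 0;
        for (const auto& cl : lists) total += cl.draws.size();
        std::printf("recorded %zu draws across %d threads\n", total, num_threads);
        return 0;
    }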

Stormyandcold
Senior Member



Posts: 5771
Joined: 2003-09-15

#5563762 Posted on: 07/10/2018 03:12 PM
> nVidia having worse performance under DX12 is not surprising. Most DX12 titles perform worse on nV cards than DX11 does. I do not want to start a flame war or anything, but I thought everyone knew about the way nVidia uses SW optimizations for DX11, and saved costs by making some parts of the GPU less robust, which allowed their cards to run faster than AMD's counterparts. It was a good move for DX11. But when it comes to every game developer having to apply the same specific optimizations to help nVidia... simply not feasible. That's why nVidia loses 30% performance in some of those DX12 games in comparison to DX11.

> AMD's GCN architecture lives and dies with its HW design; nVidia's lives and dies with its SW implementation. And in the case of DX12, that is somewhat out of nVidia's sphere of influence. I would not blame DX12 for showing a thing or two people knew already.

> As for the theory that DX12 should be better than DX11: it is. It has shader model 5.1, which adds a single thing: lists, or whatever they are called. Then SM 6.0, and threaded draw-call handling. Which brings you to the part where at least two of the comparisons being cited (Quantum Break, Vermintide 2) were done on a quad-core CPU. DX12 uses up to 6 threads for pushing rendering work plus 2-3 for the game engine itself. In this scenario, I think it is a bit out of place to blame DX12 while running a quad-core CPU. And I can tell you that I had a poor experience with Vermintide 2 on an i5 @ 4.5GHz and then a wonderful one on a Ryzen 2700X. (I have already written that here about 5 times over the course of the last 2 months.) And the same goes for BF1.

You're ignoring the RX 580 results showing DX11 to be faster than DX12 in the Battlefield V alpha. You don't need to regurgitate stuff we already know; you're basically going round in circles without addressing what I'm saying.

As for having to use a higher-end CPU with more cores to show DX12 in the best light: well, if we're going to go there, then it's obvious to also pair it with a GTX 1080 Ti. No problem for such a combo, and it gives you the best performance in nearly every game. This is nothing new, though; more processing power offers a better experience, and the GTX 1080 Ti just brute-forces its way past the competition, regardless of any architectural deficits.

Here's an article about DX12 by Ubisoft dev Tiago Rodrigues: https://www.pcgamesn.com/microsoft/ubisoft-dx12-performance

Quote: "If you take the narrow view that you only care about raw performance you probably won't be that satisfied with amount of resources and effort it takes to even get to performance parity with DX11," explained Rodrigues. "I think you should look at it from a broader perspective and see it as a gateway to unlock access to new exposed features like async compute, multi GPU, shader model 6, etc."

Ubisoft have only seen a GPU performance boost of around 5% at best, and that's mostly down to the better initial async-compute prowess of AMD's latest graphics tech. The CPU side scales better, in their experience delivering a 15-30% improvement in processor performance, though that's still not necessarily going to translate into hefty FPS gains in-game.

For a cross-platform publisher, though, especially one using their own engine, the newly exposed features are the most attractive parts of DX12 because they essentially give developers feature parity with the APIs used on consoles. (lol, ever heard the saying "pot calling the kettle black"?)

"For feature set, we are now pretty much at feature-parity on PC. Which is great from an engine design perspective," said Rodrigues. "You can, more or less, unify the feature set with the consoles and you have the opportunity to do some positive architectural changes to your engine."
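A toy model helps explain why a 15-30% CPU-side gain need not show up as fps: a frame is paced by whichever of the CPU or GPU finishes last. The millisecond figures below are illustrative assumptions, not measurements from the article.

    // Toy frame-pacing model: frame time = max(CPU time, GPU time).
    // Numbers are illustrative assumptions, not measured data.
    #include <algorithm>
    #include <cstdio>

    int main() {
        double cpu_ms = 8.0, gpu_ms = 14.0;               // a GPU-bound frame
        double before = std::max(cpu_ms, gpu_ms);         // 14.0 ms -> ~71 fps
        double after  = std::max(cpu_ms * 0.70, gpu_ms);  // 30% faster CPU side
        std::printf("before: %.1f ms, after: %.1f ms\n", before, after);
        // Still 14.0 ms: the CPU gain is invisible until the CPU becomes the
        // bottleneck (e.g. heavy draw-call loads on a slower/quad-core CPU).
        return 0;
    }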




