AMD Gives Pointers on How to Improve Ryzen 1080p Game Performance

So are the 6800K and 6900K, which cost up to 3 times more, and I don't see anyone bashing them.
The 6800K costs the same as the 1700X, not 3 times as much, so please get your facts right. And yes, you're right, the 7700K beats it at gaming. That's why I went with the 7700K.
Looking at the one you have now, I'd say you shouldn't. Get a cheap 1080 or, if you can afford it, go all in for a 1080 Ti. I doubt Vega will bring anything new and revolutionary. That said, I maintain Ryzen is the best CPU since Sandy Bridge.
And that's exactly my point. AMD have nothing to offer me in the GPU or CPU department atm. When they were king, I was rocking their hardware, including Winchester and Manchester CPUs and various GPUs. But ever since the 79xx series, I don't see anything AMD offer me in the GPU department. I was actually very close to buying a 290X at launch, but we all know how hot and loud those cards ran, so I bought a 970 instead. I'm just calling it how I see it, from a gamer's perspective. And I would agree, these Ryzens are the best CPU releases since Sandy. But not quite on par with Sandy. When the 2500K/2600K came out, they swept the floor with their competition.
Most sensible people will take a slightly slower but vastly better general-purpose CPU over an inflated 4-core any day.
You have a dual-core Intel in your specs.
What AMD is talking about with the observer effect isn't isolated to their platform, it's universal. If you are trying to establish a performance baseline for comparison, you don't need monitoring tools skewing your results. For the average user (as opposed to review sites), once you've established your overclock and done all of your worst-case max-temp tests, there is actually little reason to have the monitor running constantly. It's not like Crysis 4 is gonna come along and push the system any more than running SuperPi and Furmark simultaneously.
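To make that concrete, here's a minimal sketch of the observer effect: it times the same CPU-bound loop with and without a background poller running at 10 samples/sec, well past the >1 sample/sec resolution AMD flags. The workload and the polling interval are made up for illustration, and it assumes Python with the third-party psutil package installed.

```python
# Sketch: measure how a high-frequency monitoring thread skews a benchmark.
import threading
import time

import psutil  # third-party: pip install psutil

def workload():
    # Arbitrary CPU-bound loop standing in for a benchmark pass.
    total = 0
    for i in range(20_000_000):
        total += i * i
    return total

def monitor(stop_event, interval=0.1):
    # Poll CPU frequency and per-core utilisation the way a hardware
    # monitor would, 10 times per second.
    while not stop_event.is_set():
        psutil.cpu_freq()
        psutil.cpu_percent(percpu=True)
        time.sleep(interval)

def timed_run(with_monitor):
    stop = threading.Event()
    if with_monitor:
        threading.Thread(target=monitor, args=(stop,), daemon=True).start()
    start = time.perf_counter()
    workload()
    elapsed = time.perf_counter() - start
    stop.set()
    return elapsed

print(f"bare run:      {timed_run(False):.3f}s")
print(f"monitored run: {timed_run(True):.3f}s")
```

The delta will vary (on a lightly loaded many-core box it can be small), but any gap between the two runs is exactly the skew AMD is warning benchmarkers about.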
AMD are shooting themselves in the foot with PR attempts like this! Asking their users to go through all this trouble for the CPU to deliver proper performance is beyond stupid!!!:bang: Who the hell is in charge of AMD's PR department? And to make things better, we get amazing tips like: "Ensure there are no background CPU temperature or frequency monitoring tools running when performance is essential. Real-time performance measurement tools can have an observer effect that impacts performance, especially if the monitoring resolution (>1 sample/sec) is increased." What's the point of having an 8-core CPU if AMD tells us that using some programs is detrimental to performance!... Then we have a long and complicated list of how to set up RAM properly. And editing game files also sounds like lots of fun... AMD is ruining what should be a turning point for them with this kind of stuff... Someone make them stop!
I'm a bit confused that AMD said nothing there about the forthcoming Windows 10 updates to the scheduler that are supposed to fix a lot of the 1080p gaming performance - maybe they don't know when it's gonna happen, & maybe it's not gonna happen, who knows!
I suspect AMD isn't in a position to speak for Microsoft, just as Microsoft isn't for AMD. On the firmware/driver front:
- AMD can and does officially announce chipset drivers and GPU drivers/firmware.
- Motherboard manufacturers can and do officially announce BIOS updates.
- Microsoft can and does officially announce Windows OS updates.
No company interferes with announcements in another company's field of responsibility. So if we are waiting for an OS update, we wait for an announcement from MS. If we are waiting for a chipset update, we wait for an announcement from AMD. If we are waiting for a BIOS update, we wait for an announcement from ASUS/MSI/GB/Asrock/etc. The fact is, we don't know what it will be until it gets announced, so we can't know WHO will announce it. Possible outcomes (multiple answers, or no answer at all, can be correct at the same time):
- MS announces: "Windows 10 KB123456 -> includes kernel optimizations for the new generation of AMD CPUs"
- AMD announces: "AM4 chipset driver update -> includes optimizations for the new generation of AMD CPUs"
- Mobo company announces: "BIOS update -> includes optimizations for the new generation of AMD CPUs & AM4 memory compatibility"
What will happen first? And when? Nobody not working closely with those big players knows.
What AMD is talking about with the observer effect isn't isolated to their platform, it's universal. If you are trying to establish a performance baseline for comparison, you don't need monitoring tools skewing your results. For the average user (as opposed to review sites), once you've established your overclock and done all of your worst-case max-temp tests, there is actually little reason to have the monitor running constantly. It's not like Crysis 4 is gonna come along and push the system any more than running SuperPi and Furmark simultaneously.
classic Heisenberg & Schrödinger :infinity:
The 6800K costs the same as the 1700X, not 3 times as much, so please get your facts right. And yes, you're right, the 7700K beats it at gaming. That's why I went with the 7700K. And that's exactly my point. AMD have nothing to offer me in the GPU or CPU department atm. When they were king, I was rocking their hardware, including Winchester and Manchester CPUs and various GPUs. But ever since the 79xx series, I don't see anything AMD offer me in the GPU department. I was actually very close to buying a 290X at launch, but we all know how hot and loud those cards ran, so I bought a 970 instead. I'm just calling it how I see it, from a gamer's perspective. And I would agree, these Ryzens are the best CPU releases since Sandy. But not quite on par with Sandy. When the 2500K/2600K came out, they swept the floor with their competition. You have a dual-core Intel in your specs.
I'll give you most of that but GPUs? Nvidia wasn't truly ahead until the 1000 series, aside from power usage and temps.
You have a dual-core Intel in your specs.
Yes, I'm using an HP Pavilion 15 laptop, and sadly AMD's offerings in this area are not so great, especially considering how capable Intel integrated graphics have become. The i5 5157u is a fine CPU, especially with 16GB of RAM and an SSD. Considering Ryzen's efficiency, their GPU IP, and a node shrink, AMD should be able to offer something up soon enough.
Buy an i5? Just kidding.
It would be a good joke. But since BF1 multiplayer, we have finally entered the age where i5s (4C/4T) are dead. I am happy that mine lasted this many years.
I'll give you most of that but GPUs? Nvidia wasn't truly ahead until the 1000 series, aside from power usage and temps.
Coming from 7950 crossfire, I had 2 options... either a 290/X or a 970/980. I went with the 970; it almost matched my 7950 but was literally silent. And don't worry, I've had plenty of AMD cards before that: 7950, then 7950 CFX, then 4850, X1800X, X1800XTX, X800 Pro, X850 XT and a few more.
It would be a good joke. But since BF1 multiplayer, we have finally entered the age where i5s (4C/4T) are dead. I am happy that mine lasted this many years.
To be fair, BF1 was perfectly playable on my 2500K @ 4.5GHz. I'm not saying it was ripping a solid 80 fps on my 980 Ti, but a 60 fps minimum was pretty much achievable with some settings dropped. That is at 1440p.
I'll give you most of that but GPUs? Nvidia wasn't truly ahead until the 1000 series, aside from power usage and temps.
The 980 Ti still kicks a** all over AMD's new stuff, and it's almost a 2-year-old product.
... You do realise that "small budget" still nets you as much as a midrange Intel system?
1800X ($500) + X370 MB ($200) = $700
7700K ($350) + Z270 MB ($250-300) = $650
Honestly? This does not look like a budget build to get. And the 1800X is compared to the 7700K in so many discussions that I just use it here too. The average desktop user gets a few more cores, agreed, but does the average desktop user make good use of them? ....
Sorry, but your statement contradicts itself. If you don't care about the "few extra cores", then either you wait for the R5/R3 series or you go for the 1700. They all max out around 4.0/4.1GHz, so for that reason you don't have to go for the 1800X. This would change your picture:
1700 ($330) + X370 MB ($200) = $530
So, even with only the high-end parts on the market, it is already cheaper to get an AMD system, let alone the R5 & R3 systems...
To be fair, BF1 was perfectly playable on my 2500K @ 4.5GHz. I'm not saying it was ripping a solid 80 fps on my 980 Ti, but a 60 fps minimum was pretty much achievable with some settings dropped. That is at 1440p.
Yes, singleplayer is quite fine (CPU utilization 4x 100%). My GPU utilization was bad in DX11/DX12, but I had no freezes and no stutter fest. But the moment I went to multiplayer, with all the stuff going on there... DX12: an unplayable nightmare (CPU utilization 4x 100%). DX11: playable (CPU utilization 4x 100%), still good FPS (70+), but GPU utilization so low (~25-35%) that I could simply get much more from the GPU if I paired it with a CPU capable of crunching more threads. So, those benchmarks - I saw a few of them on YT - like to run that SP tank mission as it delivers good consistency. But if they just recorded multiplayer, they would have to show huge differences between those CPUs.
Interesting that AMD is unable to manipulate the Steam Cloud in 2017. Well, this killed my hope that AMD would manage to discover the UVD bug on 1st-gen GCN and then fix it... The community really needs to start working on community patches.
Coming from 7950 crossfire, I had 2 options... either a 290/X or a 970/980. I went with the 970; it almost matched my 7950 but was literally silent. And don't worry, I've had plenty of AMD cards before that: 7950, then 7950 CFX, then 4850, X1800X, X1800XTX, X800 Pro, X850 XT and a few more. To be fair, BF1 was perfectly playable on my 2500K @ 4.5GHz. I'm not saying it was ripping a solid 80 fps on my 980 Ti, but a 60 fps minimum was pretty much achievable with some settings dropped. That is at 1440p.
Nvidia has had AMD bested on reference coolers for years now. I think the 680 was when they introduced their much-improved blower? AMD definitely needs to do well with Vega, but I don't think the situation was terribly mismatched until more recently.
The 980 Ti still kicks a** all over AMD's new stuff, and it's almost a 2-year-old product.
You've got me there, I forgot about the 980 Ti :P. Still, though, I think my point stands. Most of their line holds up against what it was aimed at.
1800X ($500) + X370 MB ($200) = $700
7700K ($350) + Z270 MB ($250-300) = $650
I especially love how you decided to make the Z270 board a full $100 more expensive for no reason. Good post.
Core parking has been off by default in both power plans - High Performance and Balanced - since Win8.
I would love to say you are correct, except that even on the High Performance power plan on Windows 10, my CPU's cores consistently got parked. I had to get ParkControl to disable core parking on Windows 10.
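For what it's worth, what ParkControl does can usually be approximated with the stock powercfg tool from an elevated prompt. A rough sketch, assuming the documented CPMINCORES alias ("Processor performance core parking min cores") is available on your Windows build; behaviour can vary between builds:

```python
# Sketch: disable core parking via powercfg instead of ParkControl.
# Run from an elevated (administrator) prompt on Windows.
import subprocess

def run(cmd):
    print(">", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Unhide the setting so it appears in the Power Options UI (optional).
run(["powercfg", "-attributes", "SUB_PROCESSOR", "CPMINCORES", "-ATTRIB_HIDE"])

# Require 100% of cores to stay unparked, on AC and on battery,
# in the currently active power plan.
run(["powercfg", "-setacvalueindex", "scheme_current",
     "sub_processor", "CPMINCORES", "100"])
run(["powercfg", "-setdcvalueindex", "scheme_current",
     "sub_processor", "CPMINCORES", "100"])

# Re-apply the active scheme so the change takes effect immediately.
run(["powercfg", "-setactive", "scheme_current"])
```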
so, if you want to improve your Ryzen performance, overclock it. :P
Nvidia has had AMD bested on reference coolers for years now. I think the 680 was when they introduced their much-improved blower? AMD definitely needs to do well with Vega, but I don't think the situation was terribly mismatched until more recently. You've got me there, I forgot about the 980 Ti :P. Still, though, I think my point stands. Most of their line holds up against what it was aimed at.
Yea... I do love my $149 RX 480...
Knowing the limitations of my CPU, I have always just used supersampling between 1.25x-1.5x if the game had native support for it, or enabled VSR to do 1440p, in every game I have played for the past 2-3 years or so. I have always emphasized GPU over CPU and have always just played at a higher res. My CPU before my 8350 was a Phenom II X3 710, then a Phenom II X4 970.

Also, for the AMD GPU guys: in Wattman, change the GPU clock values for states 2-6 to the highest clock of the card. For example, on mine the voltages are all auto but the speed for states 2-6 is set to 1050. I get way more consistent performance in every game since the power states don't **** me by jumping all over, and overall power consumption and temps are virtually the same.

To me Ryzen looks fine. I have gamed on a 1700 system in person, and after doing so I will start replacing all my 8350 systems with 1700/1700X systems when prices drop a teeny bit. The small stutters that are present on my system are not there at all on the 1700 system. Also, I don't care about 120/144 Hz refresh rates, but if I did, I would still go for a 7700K. I am actually still debating getting a 7700K/Z270 combo for $350 and delaying replacing my encoding/streaming rigs a little further. Really curious to see what kind of clocks Intel will be working with on their 6-core i7 Cannonlake parts.
@eclap the 7700K is only really faster when it comes to ST and 1080p (and lower) gaming performance. I have yet to see someone spend around $300 on a CPU (and you don't slap it on a $50 board) and then start gaming at 720p with a 750 Ti. Everyone I personally know (friends/customers) who owns any i7 games at 1440p, where there is almost no difference between AMD and Intel. And the fact that its ST is better than Ryzen's? For how long? It's not like future software/games and OSes will get optimized for single-core performance, so why buy a CPU that already loses in MT and will lose even more over the next few years - unless of course you plan on swapping the rig every year or two...
What is that, an AMD wishlist for the best possible testing scenario? I'm missing the part about disabling half of Intel's cores... They are pathetic; it's like the async shader performance boost.