Gears of War 5: PC graphics performance benchmark review


https://forums.guru3d.com/data/avatars/m/53/53598.jpg
Well, I watched the benchmark video and I can only assume YouTube is impacting that video, because the game looks just terrible in it. The vibrant colours and lighting effects all look grey and dull for some reason, so what is going on there? Is YouTube messing it up in compression? Even picking 4K to watch the video in does not help. The game looks way better than anything in that benchmark on the XBX, so I would have assumed a PC with an RTX 2080 on ultra settings would look the bee's knees, not so grey and washed out as it does in that video. I mean, the snow levels are eye-popping in the game, so I have no idea what is going on with that benchmark video. 😕
https://forums.guru3d.com/data/avatars/m/250/250066.jpg
the 1080ti still rocking 😎
https://forums.guru3d.com/data/avatars/m/218/218363.jpg
vbetts:

While this is a title sponsored by AMD, it's still an Unreal Engine 4 game, and that engine definitely favors Nvidia. Either way, you aren't going to have an issue running it on either vendor's hardware.
Tried Gears 4 on my 2080 Ti, and although the benchmark showed a steady 60 fps result at 4K, playing the game was a different story: I couldn't maintain 60 fps at all times at Ultra settings. I guess some tweaking is necessary after all.
data/avatar/default/avatar04.webp
This game uses some sort of dynamic resolution scaling (DRS) when it drops below your minimum framerate setting. I have mine set to 60 fps. With my 1070 at 4K ultra in the benchmark, a little over 50% of the frames are rendered in full 4K; the rest at a lower resolution, which improves performance. Instead of 29 fps, I get 57 fps on ultra. I haven't even seen a 2080 Ti render every frame in 4K.
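The DRS behaviour described above can be sketched roughly like this. This is purely illustrative logic under assumed names (`update_render_scale`, a 60 fps target), not Gears 5's actual implementation:

```python
# Hypothetical sketch of a dynamic resolution scaler: if the last frame
# missed the frame-time budget, lower the render scale; if there is
# comfortable headroom, creep back toward native resolution.

def update_render_scale(scale, frame_ms, target_ms, lo=0.5, hi=1.0, step=0.05):
    """Adjust the render-resolution scale based on the last frame time."""
    if frame_ms > target_ms:
        # Missed the budget: drop resolution to recover fps.
        return max(lo, scale - step)
    if frame_ms < 0.9 * target_ms:
        # Plenty of headroom: restore quality gradually.
        return min(hi, scale + step)
    return scale

# Example: three slow frames at a 60 fps (~16.7 ms) target lower the
# scale, then two fast frames raise it again.
scale = 1.0
for ms in [20.0, 19.0, 18.5, 14.0, 14.0]:
    scale = update_render_scale(scale, ms, 1000 / 60)
```

The hysteresis band (only upscaling below 90% of the target frame time) is what keeps a real scaler from oscillating every frame.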
https://forums.guru3d.com/data/avatars/m/273/273678.jpg
XP-200:

Well, I watched the benchmark video and I can only assume YouTube is impacting that video, because the game looks just terrible in it. The vibrant colours and lighting effects all look grey and dull for some reason, so what is going on there? Is YouTube messing it up in compression? Even picking 4K to watch the video in does not help. The game looks way better than anything in that benchmark on the XBX, so I would have assumed a PC with an RTX 2080 on ultra settings would look the bee's knees, not so grey and washed out as it does in that video. I mean, the snow levels are eye-popping in the game, so I have no idea what is going on with that benchmark video. 😕
Change your driver video output to Full. Your screen is set to full range, but your driver is playing the video back in limited range with hardware acceleration.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
Only Intruder:

Oof, Fiji-based cards are doing crap in this title :'( VRAM limitation?
No, the issue is clock speed. Fiji is robust, but not all parts of the chip are equally capable. That's why the higher-clocked RX 580 pulls ahead at lower resolutions, where shader brute force matters less, while at higher resolutions Fiji is well ahead of the RX 580 even though the latter has double the VRAM. VRAM limitations would best be tested on cards that come in multiple memory configurations, like the RX 580 4/8 GB and GTX 1060 3/6 GB. In principle, any card with a large amount of VRAM could be used for the test, as long as you can edit and sign a vBIOS with reduced memory addressing, but I think both AMD and Nvidia have made vBIOS editing difficult on newer cards.
https://forums.guru3d.com/data/avatars/m/79/79740.jpg
Wow, the inconsistencies. TPU
XP-200:

Well, I watched the benchmark video and I can only assume YouTube is impacting that video, because the game looks just terrible in it. The vibrant colours and lighting effects all look grey and dull for some reason, so what is going on there? Is YouTube messing it up in compression? Even picking 4K to watch the video in does not help. The game looks way better than anything in that benchmark on the XBX, so I would have assumed a PC with an RTX 2080 on ultra settings would look the bee's knees, not so grey and washed out as it does in that video. I mean, the snow levels are eye-popping in the game, so I have no idea what is going on with that benchmark video. 😕
Sounds like a classic case of limited output range. It happens with some displays when you update the Nvidia driver. Make sure your driver settings are as follows: [spoiler]https://i.postimg.cc/h4kYchv1/rgb-full-NVIDIA-Control-Panel.jpg[/spoiler]
https://forums.guru3d.com/data/avatars/m/273/273678.jpg
alanm:

Wow, the inconsistencies. TPU. Sounds like a classic case of limited output range. It happens with some displays when you update the Nvidia driver. Make sure your driver settings are as follows: [spoiler]https://i.postimg.cc/h4kYchv1/rgb-full-NVIDIA-Control-Panel.jpg[/spoiler]
It's actually more likely not that setting at all, but this one: https://cdn.discordapp.com/attachments/247516623018131456/621271520936722432/unknown.png I've noticed YouTube will stream limited range with hardware-accelerated decoding.
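For background on why limited-range playback looks washed out: limited ("TV") range maps black to code 16 and white to 235 instead of 0-255, so if nothing in the chain expands it, blacks display as dark grey. A simplified 8-bit luma-only expansion looks like this (an illustrative sketch; real driver/display pipelines also handle chroma and dithering):

```python
def limited_to_full(y):
    """Expand an 8-bit limited-range (16-235) luma value to full range (0-255).
    Simplified illustration, not any specific driver's exact math."""
    y = min(max(y, 16), 235)            # clamp to the legal limited range
    return round((y - 16) * 255 / 219)  # 219 usable steps stretched over 255

# Limited-range "black" (16) maps to 0 and "white" (235) maps to 255;
# without this expansion, code 16 is displayed as a visibly grey pixel.
```

That compressed contrast is exactly the "grey and dull" look described above.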
https://forums.guru3d.com/data/avatars/m/53/53598.jpg
Astyanax:

change your driver video output to Full, your screen is full but your driver is playing the video back in limited w/Acceleration.
I watched this on my TV, not on the PC: 4K HDR YouTube via the TV app.
data/avatar/default/avatar24.webp
This is an insult: a $1300 graphics card can barely do 60 fps at 4K. I was expecting more from a 2080 Ti.
https://forums.guru3d.com/data/avatars/m/79/79740.jpg
samir72:

This is an insult: a $1300 graphics card can barely do 60 fps at 4K. I was expecting more from a 2080 Ti.
You don't need to put everything on ultra.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
alanm:

You don't need to put everything on ultra.
Yeah, there is a Titan RTX for that. 😉
https://forums.guru3d.com/data/avatars/m/242/242134.jpg
@samir72 And? What does the chip/GPU maker have to do with the game? If the next top card is $1500, do you expect it to do 8K@120 Hz? lol
https://forums.guru3d.com/data/avatars/m/273/273678.jpg
alanm:

You don't need to put everything on ultra.
It's been 13 years since Oblivion and people still haven't worked this out.
https://forums.guru3d.com/data/avatars/m/156/156133.jpg
Moderator
@Hilbert Hagedoorn When are we gonna squad up for some ranked matches? :P
https://forums.guru3d.com/data/avatars/m/267/267787.jpg
I'm installing it today. Can't wait to see how the new Gears are.
https://forums.guru3d.com/data/avatars/m/16/16662.jpg
Administrator
vbetts:

@Hilbert Hagedoorn When are we gonna squad up for some ranked matches? 😛
Oh, online gaming is so not my thing ... standalone games with a nice storyline, anytime, anywhere.
https://forums.guru3d.com/data/avatars/m/199/199386.jpg
Astyanax:

It's been 13 years since Oblivion and people still haven't worked this out.
Ah... happy days, when tessellation was just a wink in John Carmack's eye.
https://forums.guru3d.com/data/avatars/m/262/262995.jpg
asturur:

I did not expect the game to fail to hit 120 fps at 1080p with the top high-end cards.
Say what? My 1070 Ti gives me 120 fps+ on max settings at 1080p. What are you even talking about?

EDIT: OK, I've looked at the benchmarks, and I know Hilbert usually does a good job, but something is wrong here. I have an 8600K @ 4.8 GHz, 3200 MHz RAM and a non-OC'd 1070 Ti, and I never drop below 90 fps on max settings (yes, everything maxed). More often than not I'm sitting at 120-140 fps. On a 1070 Ti! The benchmark is giving me Titan Xp results; something is definitely not right with this article.

"Aside from a random occasional stutter and some FPS drops": something I've yet to see at all, even in the most intense scenes and even while playing co-op. Neither I nor my friend have experienced any fps drops (I monitor mine closely and am sensitive to such things), and stutters are non-existent in the game, period. I'm convinced something is inherently wrong with the system used to perform this benchmark.
https://forums.guru3d.com/data/avatars/m/262/262995.jpg
samir72:

This is an insult: a $1300 graphics card can barely do 60 fps at 4K. I was expecting more from a 2080 Ti.
An insult? This is one of the best-optimised games of the last few years. 4K/60 fps is simply unrealistic at this point in time and little more than a gimmick for bragging rights. Also, "one of the best PC ports" - Hilbert - this is not a port.