Resident Evil 3: PC graphics performance benchmark review


Yeah, it ran great, even on an oldie like my OC'd 980 Ti. I maxed it out except for the 6 GB VRAM setting (just in case), and I could play DSR'd 2880x1620 fine (above 60 fps the whole time). Performance was actually very similar to the RE2 remake or RE7.
How do you even code an engine so that added cores lower performance? Shouldn't the game just use a limited number of cores and perform just as well? It wasn't even a question of SMT. Unless it's somehow related to turbo clocks and the game stressing multiple cores for no reason whatsoever, so that with more cores active at 100%, they obviously can't clock as high. It would be historically bad programming. Years ago there was a game that artificially stressed cores so much that the game itself ended up suffering; I just don't remember which game it was. It was patched, fortunately.
Nice performance all around. 6 GB of VRAM usage at 1080p has become the norm.
Nice review, I liked the CPU utilisation. Something for me to look into when I'm playing. Hmmm 😎
At 1080p and 4K, changing graphics settings makes little to no difference. That's interesting.
It's cool that you can still have a Radeon RX 470 and be able to play this game. Sorry to see we can't compare with older GeForce cards. Is the RX 470 about as old as the 900 series? Would be nice to see which cards aged best.
Thanks. Can you do Doom Eternal?
With the Max (8 GB) setting at 4K, VRAM can hit between 10 GB and 11 GB, and system RAM 10.4 GB. DX11 is smoother than DX12 on my system.
Not gonna lie, Jill is freakin' gorgeous! 😀
asturur:

It's cool that you can still have a Radeon RX 470 and be able to play this game. Sorry to see we can't compare with older GeForce cards. Is the RX 470 about as old as the 900 series? Would be nice to see which cards aged best.
The 900 series actually came out first, by nearly two years: the GeForce 900 series launched in September 2014, and the RX 400 series launched in June 2016.
jbscotchman:

Not gonna lie, Jill is freakin' gorgeous! 😀
Rarely do I go "damn" over a video game character, but yes, Jill is fine as hell. I bought my new monitor at the right time, lol. The demo ran so damn good. I could not believe how well it ran at 3440x1440 with a reference Vega 56. And I just swapped my X370 and 2600X out for an X470 and 2700X.
When I get some time, I'm going to run my 290X CFX.
Agonist:

Rarely do I go "damn" over a video game character, but yes, Jill is fine as hell. I bought my new monitor at the right time, lol. The demo ran so damn good. I could not believe how well it ran at 3440x1440 with a reference Vega 56. And I just swapped my X370 and 2600X out for an X470 and 2700X.
HARDRESET:

When I get some time, I'm going to run my 290X CFX.
I do wonder if it supports mGPU in DX12, or CrossFire at all in DX11. I have two 570s I can play around with as well if it does; those are about 290X performance too.
jbscotchman:

Not gonna lie, Jill is freakin' gorgeous! 😀
Agonist:

Rarely do I go "damn" over a video game character, but yes, Jill is fine as hell. I bought my new monitor at the right time, lol. The demo ran so damn good. I could not believe how well it ran at 3440x1440 with a reference Vega 56. And I just swapped my X370 and 2600X out for an X470 and 2700X.
Being based on the international Russian model Sasha Zotova, she couldn't be anything but gorgeous. 🙂
Error8:

Being based on the international Russian model Sasha Zotova, she couldn't be anything but gorgeous. 🙂
Had no idea! That's awesome and explains a lot!
Even the Zombies want a part of her. Jump to 6:00 [youtube=0qHL4qZkcOo]
Too bad they reduced the gore effects in this one, and there are no dismemberments either. Oh well. "Jill even is a bit sexy looking in that Tomb Raider style" LOL, I will take Carlos, tvm 😀 I still like RE2R better 😀 even if Nemesis is crazier than Mr. X xD
Quick question: do you ever benchmark on an ultrawide monitor (3440x1440)? I'm thinking of buying one, and I'd gladly check some benchmarks at that resolution.
I'm impressed. Looks great and can be very playable on very outdated hardware.
Kaarme:

How do you even code an engine so that added cores lower performance? Shouldn't the game just use a limited number of cores and perform just as well? It wasn't even a question of SMT. Unless it's somehow related to turbo clocks and the game stressing multiple cores for no reason whatsoever, so that with more cores active at 100%, they obviously can't clock as high. It would be historically bad programming. Years ago there was a game that artificially stressed cores so much that the game itself ended up suffering; I just don't remember which game it was. It was patched, fortunately.
I imagine this has to do with the scheduler. Seeing as this was optimized for modern consoles, it'd likely perform better if all threads were running in parallel on physical cores (like on an FX, Ryzen, or non-HT Intel CPU). If the scheduler puts work on the HT'd sibling threads, that slows the game down since the parent thread falls out of sync. This is why HT was known to hurt game performance about 10 years ago. The only reason that hasn't remained especially relevant lately is that most games, for the longest time, were only optimized for Intel. This game could possibly be the first sign of that changing. I'm sure if this were tested on a 3700X or a 10980XE with HT off, there would be no performance difference between core counts until you reach 8 total threads.
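One cheap way to test that theory without a BIOS trip is to restrict the game's CPU affinity to one logical processor per physical core. Here's a minimal sketch of the mask calculation, assuming the common layout where SMT siblings are numbered adjacently (0/1 share a core, 2/3 share a core, and so on); that layout is typical on Windows but not guaranteed, so check CPU-Z or /proc/cpuinfo before trusting it:

```python
def physical_only_mask(logical_cpus, threads_per_core):
    """Return one logical CPU index per physical core, assuming SMT
    siblings are numbered adjacently (0,1 = core 0; 2,3 = core 1; ...)."""
    if threads_per_core <= 1:
        return list(range(logical_cpus))  # no SMT: keep every CPU
    # take every Nth logical CPU, skipping the SMT siblings in between
    return list(range(0, logical_cpus, threads_per_core))

# 8-core/16-thread CPU (e.g. a 3700X): keep one thread per core
print(physical_only_mask(16, 2))  # -> [0, 2, 4, 6, 8, 10, 12, 14]
```

You could apply a mask like that with Task Manager's "Set affinity" dialog or `start /affinity` on Windows, or `taskset` on Linux. If frame rates go up with the half mask, the scheduler/SMT explanation has legs.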