G.Skill 2x4GB CL7 1600 MHz Trident DDR3 review


Performance Gaming

 

Far Cry 2

Throw your memory back to the year 2004 and the release of the innovative Far Cry on PC. Developer Crytek managed to fashion one of the most convincing and striking locales in all of gaming, and gave players the freedom to roam the landscape and tackle enemies in almost any way they saw fit. You surely remember Jack Carver, and that things were about to get seriously messed up for you? Well, tough luck. The deserted tropical island is gone; this time you hop into a jeep and arrive in the sandy savannah of Africa. And that's a change, as you'll no longer run into any mutants, aliens, superpowers or psychic abilities. You are also no longer Jack Carver; instead you assume the role of one of nine different mercenaries embedded in the midst of a brutal civil war raging in a fictional African nation.
Everything goes down in a dirty little bush war in central Africa, and you'll have to make do with a rusty AK-47 and whatever scavenged bits of kit you can duct-tape together. Two factions struggle for supremacy: the United Front for Liberation and Labour and the Alliance for Popular Resistance, and both are willing to spill blood for control.

I like Far Cry 2 very much, not so much for the gameplay anymore, but for the rendered environment and how the game reacts to it. We test in high-quality DX10 mode with 4x AA (anti-aliasing) and 16x AF (anisotropic filtering).

Here you can see that the title becomes GPU bound above 1600x1200, which gives us a decent way of showing any memory-related performance differences at the resolutions below 16x12. The memory at 1600 MHz CL7 in fact leads over the very same setup with triple-channel memory at 1066 MHz on the 980X hexacore processor. It is, however, the only title where we can measure a significant enough difference. The graphics card used is a Radeon HD 5870, by the way.
 

Battlefield: Bad Company 2

The Battlefield series has been running for quite a while. The last big entry in the series, Bad Company, was a console exclusive, much to the disappointment of PC gamers everywhere. DICE broke the exclusivity with the sequel, thankfully, and now PC owners are treated to the best Battlefield since Battlefield 2.

The plot follows the four soldiers of Bad Company as they track down a "new" super weapon in development by Russian forces. You might not immediately realize that this game is about Bad Company, as the intro mission starts off with a World War II raid, but it all links together in the end.

A new title in the benchmark test suite: Battlefield: Bad Company 2. Next to being great fun to play, it's also an awesome title to test both graphics cards and processors with. The game has native support for DirectX 11, and on the processor-testing side of things its engine parallelizes across two to eight threads, which is great if you have a quad-core processor.

Here is a title currently on the market that utilizes multiple CPU cores and threads heavily. Battlefield: Bad Company 2 will happily use four or more cores. The result is that the CPU very quickly stops mattering, as the game spreads its load across an incredible amount of processor power. Consequently the GPU becomes the bottleneck very quickly; even the Radeon HD 5870 flat out runs at 100% while the processors have plenty of force left.

That results in very similar performance across a wide range of processors.

We opt to test DX11 exclusively for this title, as we want to look at the most modern performance and image quality. DX11-wise we get softened dynamic shadows as an extra, plus shader-based performance improvements. A great game to play, and a great game image-quality wise. We raise the bar with the image quality settings:

  • DirectX 11 enabled
  • 8x Multi-sample Anti-aliasing
  • 16x Anisotropic filtering
  • All image quality settings enabled at maximum

It is nearly shocking to see (and we did this deliberately) what happens if you make the test really GPU dependent. Memory doesn't matter in this scenario, as is abundantly clear.
