G.Skill Flare DDR3 2000 MHz C7 AMD kit review
G.Skill DDR3-2000 C7 - Game performance
Far Cry 2
Throw your memory back to 2004 and the release of the innovative Far Cry on PC. Developer Crytek fashioned one of the most convincing and striking locales in all of gaming and gave players the freedom to roam the landscape and tackle enemies in almost any way they saw fit. You surely remember Jack Carver and how things were about to get seriously messed up for you? Well, tough luck. The deserted tropical island is gone; this time you hop into a jeep and arrive in the sandy savannah of Africa. And that's quite a change, as you'll no longer run into mutants, aliens or psychic superpowers. You are no longer Jack Carver either; instead you assume the role of one of nine different mercenaries embedded in the midst of a brutal civil war raging in a fictional African nation.
Everything goes down in a dirty little bush war in central Africa, where you'll have to make do with a rusty AK-47 and whatever scavenged land mines you can duct-tape together. Two factions struggle for supremacy: the United Front for Liberation and Labour and the Alliance for Popular Resistance, both equally ruthless in their thirst for blood and control.
We like Far Cry 2 a lot, these days not so much for the gameplay as for the rendered environment and how the game reacts to it. We test in high-quality DX10 mode with 4x AA (anti-aliasing) and 16x AF (anisotropic filtering).
Here you can see that the title becomes GPU bound above 1600x1200, which gives us a decent way of showing any memory-related performance differences at resolutions below 16x12. The memory at 2000 MHz CAS 7 leads over the very same setup configured (through the memory multiplier and timings) at 1333 MHz CAS 9. It is, however, the only title where we can measure a significant enough difference. The graphics card used is a Radeon HD 5870, by the way.
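To make that "GPU bound above 16x12" reasoning concrete, below is a minimal sketch of the arithmetic behind such a comparison of two memory configurations at the same resolution. The FPS figures in it are placeholders purely for illustration, not measured results from this review.

```cpp
// Minimal sketch of the comparison arithmetic; the FPS values below are
// placeholders for illustration only, NOT measured results.
#include <cstdio>

// Relative gain of configuration A over configuration B, in percent.
double relative_gain_percent(double fps_a, double fps_b) {
    return (fps_a - fps_b) / fps_b * 100.0;
}

int main() {
    // Placeholder numbers: at a CPU/memory-sensitive resolution the faster
    // kit can pull ahead, while at a GPU-bound resolution both
    // configurations converge on the same frame rate.
    double low_res_2000c7 = 105.0, low_res_1333c9 = 100.0;  // placeholders
    double high_res_2000c7 = 62.0, high_res_1333c9 = 62.0;  // placeholders

    std::printf("Below 16x12: %+.1f%% for 2000 MHz C7\n",
                relative_gain_percent(low_res_2000c7, low_res_1333c9));
    std::printf("Above 16x12: %+.1f%% for 2000 MHz C7\n",
                relative_gain_percent(high_res_2000c7, high_res_1333c9));
    return 0;
}
```

The point is simply that once the GPU caps the frame rate, the numerator of that expression collapses to zero no matter what the memory is doing.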
Battlefield: Bad Company 2
The Battlefield series has been running for quite a while. The last big entry in the series, Bad Company, was a console exclusive, much to the disappointment of PC gamers everywhere. DICE broke the exclusivity with the sequel, thankfully, and now PC owners are treated to the best Battlefield since Battlefield 2.
The plot follows the four soldiers of Bad Company as they track down a "new" super weapon in development by Russian forces. You might not immediately get that this game is about Bad Company, as the intro mission starts off with a World War II raid, but it all links together in the end.
Battlefield: Bad Company 2 is a new title in our benchmark test suite. Next to being a great game to play, it is also an excellent title for testing both graphics cards and processors. The game has native DirectX 11 support and, on the processor side, parallelized processing that scales from two to eight threads, which is great if you have a quad-core processor.
Battlefield: Bad Company 2 is one of the few titles currently on the market that makes heavy use of multiple CPU cores; it will happily use four or more of them. The result is that the CPU very quickly stops mattering, as the game taps into a tremendous amount of processing power. Instead, the GPU rapidly becomes the bottleneck; even the Radeon HD 5870 runs flat out at 100% while the processor still has plenty of headroom left. That results in very similar performance across a wide range of processors.
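As a generic illustration of what that kind of threading can look like (this is not DICE's actual engine code), the sketch below splits per-frame entity updates across a pool of worker threads clamped to the two-to-eight range mentioned above; the entity count and the update function are made up for the example.

```cpp
// Generic illustration (not the game's real code) of spreading per-frame
// work over a fixed pool of worker threads, clamped to the 2-8 thread
// range the game reportedly scales across.
#include <algorithm>
#include <cstdio>
#include <thread>
#include <vector>

// Dummy per-entity workload standing in for AI, physics, animation, etc.
void update_entity(int /*entity_id*/) { /* ... simulate work ... */ }

void update_frame(int entity_count, unsigned worker_count) {
    worker_count = std::clamp(worker_count, 2u, 8u);
    std::vector<std::thread> workers;
    int chunk = (entity_count + static_cast<int>(worker_count) - 1)
                / static_cast<int>(worker_count);

    for (unsigned w = 0; w < worker_count; ++w) {
        int begin = static_cast<int>(w) * chunk;
        int end   = std::min(begin + chunk, entity_count);
        workers.emplace_back([begin, end] {
            for (int i = begin; i < end; ++i) update_entity(i);
        });
    }
    for (auto& t : workers) t.join();
}

int main() {
    unsigned hw = std::max(2u, std::thread::hardware_concurrency());
    update_frame(10'000, hw);  // e.g. four workers on a quad-core CPU
    std::printf("Frame updated on %u worker threads\n", std::min(hw, 8u));
    return 0;
}
```

Whether the chunks are split statically like this or handed to a job scheduler, the practical upshot is the same: with four or more hardware threads the per-frame CPU work finishes with room to spare, which is why the CPU stops being the limiting factor in this game.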
We opted to test this title in DX11 only, as we want to look at the most modern performance and image quality. DX11 mode adds softer dynamic shadows and shader-based performance improvements. A great game to play and a great game image-quality wise. We raise the bar with these image quality settings:
- DirectX 11 enabled
- 8x Multi-sample Anti-aliasing
- 16x Anisotropic filtering
- All image quality settings enabled at maximum
It is almost shocking to see (and we did this deliberately) what happens when you make the test really GPU dependent: memory doesn't matter in this scenario, as is abundantly clear.