Are these real AMD R9-290X Benchmarks?
An interesting set of benchmarks has been posted by a user called Grant Kim, who apparently has access to an AMD R9-290X. Before we begin, I should note that I have no clue about the validity of the specs and benchmarks.
The post originates from TPU:
The GPU core is clocked at 800 MHz. There is no dynamic-overclocking feature, but the chip can lower its clocks, taking load and temperatures into account. The memory is clocked at 1125 MHz (4.50 GHz GDDR5-effective). At that speed, the chip churns out 288 GB/s of memory bandwidth over its 512-bit wide memory interface. Those clock speeds were reported to us by the GPU-Z client, so we give it the benefit of the doubt, even if it goes against AMD's ">300 GB/s memory bandwidth" bullet-point in its presentation. The tests run on the card include frame-rate and frame-latency measurements for Aliens vs. Predator, Battlefield 3, Crysis 3, GRID 2, Tomb Raider (2013), RAGE, and TESV: Skyrim, in no-antialiasing, FXAA, and MSAA modes, at a resolution of 5760 x 1080 pixels. An NVIDIA GeForce GTX TITAN was pitted against it, running the latest WHQL driver. We must remind you that at that resolution, AMD and NVIDIA GPUs tend to behave a little differently due to the way they handle multi-display setups, so it may be an apples-to-coconuts comparison. In Tomb Raider (2013), the R9 290X romps ahead of the GTX TITAN, with higher average, maximum, and minimum frame rates in most tests.
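The quoted 288 GB/s figure is easy to sanity-check: GDDR5 transfers four bits per pin per clock, so peak bandwidth is the memory clock times four, times the bus width. A minimal sketch of that arithmetic (the variable names are mine, not from the leak):

```python
# Sanity check of the quoted 288 GB/s memory bandwidth figure.
# Assumes standard GDDR5 quad-pumping (4 transfers per clock per pin).

memory_clock_hz = 1125e6    # 1125 MHz, as reported by GPU-Z
transfers_per_clock = 4     # GDDR5-effective: 4.50 GT/s
bus_width_bits = 512        # the R9 290X's memory interface width

bandwidth_gbs = memory_clock_hz * transfers_per_clock * bus_width_bits / 8 / 1e9
print(f"Peak bandwidth: {bandwidth_gbs:.0f} GB/s")  # -> 288 GB/s
```

That matches the GPU-Z reading exactly, so the 288 GB/s number is at least internally consistent with the reported clocks.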
Senior Member
Posts: 9731
Joined: 2008-01-06
How the hell can a TITAN have a min frame rate of 35.0 but an avg of 29.1??
That does not make any sense, the avg should be higher than the min! I call fake on these!
Senior Member
Posts: 8878
Joined: 2007-06-17
Why the high res?
The percentage of people using a res like that must be a tiny fraction of one percent.
It's incredible.
When reviewers use very high resolutions to more accurately measure GPU performance, we hear how hardly anybody uses such high resolutions and so it's unrealistic. Then when reviewers choose to use relatively low gaming resolutions to more accurately measure how CPUs perform, it's the exact same argument.
You can't win.
I suppose if pharmaceutical drug producers started to test on general, random populations rather than using a control mechanism, that would also be more realistic.
Senior Member
Posts: 6640
Joined: 2010-08-27
How the hell can a TITAN have a min frame rate of 35.0 but an avg of 29.1??
That does not make any sense, the avg should be higher than the min! I call fake on these!
I'm not saying that they're fake or not, just that the graph labels are poorly chosen.

The graph is actually made up of two components, FXAA and MSAA 4x. FXAA is represented by the top three results, and MSAA 4x by the bottom three. Within each set, the minimum is less than the average.
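To make that concrete, here's a minimal sketch (the numbers are illustrative, not the leaked figures) showing that once the results are grouped by AA mode, each group's minimum sits below its own average; the apparent contradiction only appears if you compare one group's minimum against the other group's average:

```python
# Hypothetical frame-rate readings grouped by AA mode; the numbers are
# illustrative only, not taken from the leaked benchmarks.
results = {
    "FXAA":    {"min": 35.0, "avg": 48.2, "max": 61.5},
    "MSAA 4x": {"min": 22.1, "avg": 29.1, "max": 40.3},
}

for mode, r in results.items():
    # Within a single group the ordering min <= avg <= max must hold;
    # it only looks wrong if you mix values from different groups.
    assert r["min"] <= r["avg"] <= r["max"], f"{mode} is inconsistent"
    print(f"{mode}: min {r['min']} <= avg {r['avg']} <= max {r['max']}")
```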
Senior Member
Posts: 19558
Joined: 2010-04-21
You should be able to follow which line is which using this:
Screen 1: Green lowest
Screen 2: Red starts and finishes lowest
Screen 3: Red lowest
Screen 4: Green lowest
Screen 5: Red starts and finishes lowest
Screen 6: Red is highest in the middle
Screen 7: Red starts and finishes highest
Screen 8: Red finishes highest
Senior Member
Posts: 1692
Joined: 2012-10-07
Why the high res?
The percentage of people using a res like that must be a tiny fraction of one percent.
Haha, probably because a similar percentage of people will own said video cards! Those cards are more suited to uber-high screen resolutions anyway; that's what most people would buy them for, I would think.