Far Cry Primal: PC graphics performance benchmark review
rodrigoles
CRTFTW
I wonder if they made any upgrades to the Dunia engine, or if everything is identical, the way Blood Dragon was to Far Cry 3.
Artefact
GeniusPr0
FC:P doesn't use "Gimpworks" haha
Artefact
zzzaac
Still love the Crytek Far Cry
MMking
Great graphics review. I always look forward to these, as I consider Guru3D one of the most trustworthy sources, if not the most trustworthy, for this kind of information. That said, I am disappointed you did not include AMD's R9 390 (non-X) in the test. On various forums, the dispute between the 970 and the 390 is the most contested comparison I have found.
It's not a big deal, as the 290 is included and the 390 tends to score marginally, and I do mean marginally, better than the 290. So it's an easy calculation.
Edit: Since you have not yet included multi-GPU scaling results: AMD has released the 16.2.1 driver. I am not yet able to post links.
Quote:
Radeon Software Crimson Edition 16.2.1 Highlights
Crossfire Profile available for
Far Cry Primal
Endquote.
Kaarme
Hughesy
Ryu5uzaku
http://cdn.overclock.net/a/a8/a88f48c5_Far_Cry_4-nv-test-FarCry4_1920.jpeg
Just FYI: the difference between the 290X and the 780 Ti was negligible back when both released; it was never huge. If the 780 Ti really were around 20 fps faster, it would be way faster than the 980 and almost catching up to the 980 Ti.
The game seems to perform well on pretty much all platforms, Kepler excluded (and even then it isn't bad).
Ieldra
Undying
Denial
I have a question: is there a way to see what tessellation factor AMD runs at for this game?
Say you leave the tessellation slider in the AMD driver panel at auto, or whatever it defaults to, and then load the game. Can you see what tessellation level the game is actually defaulting to?
Ieldra
Denial
I don't know; to me the comparisons are dumb. Take the FC4 example. GameGPU shows the 290X 12% faster than a 780 Ti in FC4, but the review on the other page shows the 780 Ti 15% faster than the 290X. It can't be both at the same settings, so either one review has a different hardware configuration and/or settings, or the reviews were done at two different times, perhaps after AMD updated its drivers; I don't know. If you go by GameGPU in FC4 and Guru3D in FC:P, then the difference is only 2%. That's nothing; it's within the margin of error.
But imagine what a Witcher 3 comparison would look like against a theoretical Witcher 4. Release-day Witcher 3 benchmarks with GameWorks turned on all heavily favored Nvidia cards. A few weeks later, AMD updated its drivers, the game was patched, and suddenly AMD cards were much faster. How much faster? I don't know; no one ever really retested it. Now Witcher 4 comes out on the same engine or whatever, and people go "the 980 was winning by 15% in Witcher 3, now it's losing by 5% in Witcher 4, MAXWELL DOWNGRADED." But in reality the 290X was always faster after those updates and drivers, and that is just carrying over to the new game. That isn't even mentioning things like texture sizes increasing in more recent games (the Nvidia equivalents have less memory) and AMD having a natural architectural advantage going forward.
That's why I said in the other thread: unless someone does a super in-depth test that rules out memory limitations, changing design paradigms, driver updates, etc., saying that Nvidia is intentionally downgrading Kepler has no real basis. It can just as easily be that AMD is upgrading GCN, or that developers are getting better at optimizing for GCN, or that games simply use more memory now and most GCN-equivalent cards happened to have more of it.
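The arithmetic behind that cross-review comparison can be sketched in a few lines. The fps values below are made-up placeholders, not figures from GameGPU or Guru3D, and the 3% margin-of-error threshold is an assumption; only the method is the point:

```python
# Hypothetical sketch of comparing two cards across two games/reviews.
# The fps numbers are invented for illustration, NOT from any review;
# MARGIN is an assumed run-to-run variance threshold.

def pct_faster(a_fps: float, b_fps: float) -> float:
    """How many percent faster card A is than card B."""
    return (a_fps / b_fps - 1.0) * 100.0

# Game 1 (review A): card A ahead of card B
game1 = pct_faster(56.0, 50.0)   # +12.0%

# Game 2 (review B): card A still ahead
game2 = pct_faster(57.0, 50.0)   # +14.0%

# The "generational shift" people argue about is just the delta
shift = game2 - game1            # 2.0 percentage points
MARGIN = 3.0                     # assumed benchmark noise, in points

print(f"shift: {shift:.1f} pts, significant: {abs(shift) > MARGIN}")
```

A 2-point swing under a 3-point noise floor says nothing about one vendor "downgrading" anything, which is the post's point: without controlling settings, drivers, and test dates, the delta between two reviews is not evidence.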
Ieldra
Denial
GeniusPr0
Denial
GeniusPr0