Far Cry Primal: PC graphics performance benchmark review


https://forums.guru3d.com/data/avatars/m/250/250542.jpg
A 290 beating a 780 Ti at 1080p. I remember when the 780 Ti came out it was about 20-30 fps faster than a 290, lol. How things change.
Yep. And I expect the same will happen with the Fury X and the 980 Ti in the near future. 🙂
https://forums.guru3d.com/data/avatars/m/267/267008.jpg
I wonder if they made any upgrades to the Dunia engine, or if everything is identical like Blood Dragon was to Far Cry 3.
https://forums.guru3d.com/data/avatars/m/267/267015.jpg
Yes, it's the reason we nVIDIA users have enjoyed better performance in most games for a long while now. Now it's just REALLY showing. And with AMD's improvements in CPU overhead, the gap between Hawaii and Kepler will only keep closing. It's pretty obvious Maxwell's fate is sealed too, but those cards won't be widely used anymore by the time heavy use of DX12 features emerges anyway.
They are clearly doing something, for sure. Kepler keeps falling further and further behind. One well-known YouTuber, club4ghz, is complaining that he lost 10 fps in Far Cry Primal with the latest Game Ready driver compared to the previous driver on his single 780 Ti. In club4ghz's words: "Just installed game ready drivers and 10 fps less than I had yesterday with old drivers in benchmark. Gimpworks at its best."

I am also playing Far Cry Primal on my old PC with a single prehistoric GTS 450 at custom settings, 1600x900, and it's working fine at 25-31 fps without recording. I did not test the latest driver; I played it with the old iCafe 344.47 driver, which is the best performer on Fermi cards for me. It gives me a 4-5 fps advantage in almost all games compared to all other drivers. You can see my Far Cry Primal gameplay on YouTube if you search for "Far Cry Primal GTS 450 900p gameplay", so it is possible to play fine even on ancient cards.

So I don't know what to say, but it's fishy that Kepler performs so poorly in this game compared to GCN. I would not be surprised if NVIDIA has started to gimp Kepler again and removed previous improvements. 🙂
https://forums.guru3d.com/data/avatars/m/54/54823.jpg
FC: P doesn't use "Gimpworks" haha
https://forums.guru3d.com/data/avatars/m/267/267015.jpg
FC: P doesn't use "Gimpworks" haha
I know that FC Primal does not have GameWorks effects, but Kepler is still falling behind; it's suspicious somehow. Either the Dunia engine has been modified to heavily favor Maxwell only, or it's the drivers. Some people are saying the previous driver performs better than the FCP Game Ready driver; club4ghz says on YouTube that he gets 10 fps more with it on his 780 Ti :3eyes: NVIDIA, you are very shady :3eyes:
https://forums.guru3d.com/data/avatars/m/254/254800.jpg
Still love the Crytek Far Cry
data/avatar/default/avatar27.webp
Great graphics review; I always look forward to these, as I consider Guru3D one of the most trustworthy sources for such information, if not the most. Though I am disappointed you did not include the R9 390 (non-X) from AMD in the test. On various forums, I have found the dispute between the 970 and the 390 to be the most contested topic. It's not a big deal, as the 290 is included and the 390 tends to score marginally (and I do mean marginally) better than the 290, so it's an easy calculation.

Edit: Since you have not yet included multi-GPU scaling results: AMD has released the 16.2.1 driver. I am not yet able to post links. Quote: Radeon Software Crimson Edition 16.2.1 Highlights: CrossFire profile available for Far Cry Primal. Endquote.
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
It's not a big deal, as the 290 is included and the 390 tends to score marginally (and I do mean marginally) better than the 290, so it's an easy calculation.
The 390 has often performed about the same as the 290X. That's why I look at the 290X score when judging my card's prospects if the 390 is not included.
https://forums.guru3d.com/data/avatars/m/242/242371.jpg
I know that FC Primal does not have GameWorks effects, but Kepler is still falling behind; it's suspicious somehow. Either the Dunia engine has been modified to heavily favor Maxwell only, or it's the drivers. Some people are saying the previous driver performs better than the FCP Game Ready driver; club4ghz says on YouTube that he gets 10 fps more with it on his 780 Ti :3eyes: NVIDIA, you are very shady :3eyes:
You're focusing on Kepler, but it's not as if Maxwell is performing that great either...
https://forums.guru3d.com/data/avatars/m/164/164033.jpg
Is nVidia actively gimping their cards, or just not optimizing drivers for Kepler anymore? Or did Ubi not optimize for nVidia at all? The 780 Ti is a joke! Just look at this: Far Cry 4 uses the same engine as Primal, and in that game it outperforms the 290X Tri-X by almost 20 fps http://www.pcgameshardware.de/Far-Cry-4-Spiel-23145/Specials/Technik-Test-Benchmark-1143026/ But in Primal it's 1 fps slower, because the 290X actually GAINS fps in a more DEMANDING game :frown: http://www.pcgameshardware.de/Far-Cry-Primal-Spiel-56751/Specials/Benchmark-Test-1187476/ So 3 years later, in a more demanding game on the same engine, with tests done by the same site, the 290X gains roughly 4 fps while the 780 Ti loses 20 fps! What the hell is going on? Is nVidia so reliant on their drivers that as soon as they stop optimizing, your card's performance tanks? I have a 980 Ti, and going by this trend I guess it will be obsolete this time next year 🙁
http://cdn.overclock.net/a/a8/a88f48c5_Far_Cry_4-nv-test-FarCry4_1920.jpeg just fyi. The difference between the 290X and the 780 Ti was negligible back when both released; it was never huge. If the 780 Ti were around 20 fps faster, it would be way faster than the 980 and almost catching up to the 980 Ti. The game seems to perform well on pretty much all platforms, Kepler excepted (and even then it isn't bad).
https://forums.guru3d.com/data/avatars/m/169/169957.jpg
I know that FC Primal does not have gameworks effects,but kepler is still falling behind,its suspicious somehow...or game dunia engine is modified and heavily favors maxwell only, or drivers. some people are saying that previos driver performs better compared to FCP game ready driver,club4ghz is saying on you tube that he have 10 fps more with it on his 780 ti :3eyes: nvidia you are very shady :3eyes:
Oh god, not this Kepler thing again; it's been debunked so many times. If I had a penny for every time I heard this story, I'd have solved G3D's advertising/funding problem.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
Oh god, not this Kepler thing again; it's been debunked so many times. If I had a penny for every time I heard this story, I'd have solved G3D's advertising/funding problem.
Isn't it sad where the 780 Ti stands right now? It keeps surprising us game after game, so people will not stop addressing the issue.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
I have a question: is there a way to see what tessellation factor AMD runs this game at? Say you leave the tessellation slider in the AMD driver panel set to auto (or whatever it defaults to), and then go into the game; can you see what tessellation level the game is actually defaulting to?
https://forums.guru3d.com/data/avatars/m/169/169957.jpg
Isn't it sad where the 780 Ti stands right now? It keeps surprising us game after game, so people will not stop addressing the issue.
It's sad in the same way any older hardware losing performance over time is. I understand what you're basing your argument on; it's just a wrong premise. Say we have two GPUs, GK115 and GM115, with an equal number of cores. If GM115 is twice as efficient at tessellation, it can run DOUBLE the tessellation load GK115 can. So if a game is stressing both GPUs to the maximum, and tessellation is 50% of the load on GK115, that same tessellation work only fills 25% of GM115's capacity. And there are other factors to take into account besides tessellation efficiency. It's not hard to see how Kepler has been 'losing' performance slowly; it's silly to accuse nvidia of doing this on purpose. Hardware is underused, then fully taken advantage of, then made obsolete - c'est la vie.
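The argument above is essentially Amdahl's law applied to a single workload stage. A minimal numerical sketch, where GK115/GM115 and every percentage are the poster's hypotheticals rather than measured numbers:

```python
# Poster's hypothetical: GM115 has 2x GK115's tessellation throughput,
# equal core counts otherwise, and tessellation accounts for 50% of a
# fully loaded GK115 frame.
tess_share_gk = 0.50   # fraction of GK115 frame time spent tessellating
other_share = 0.50     # everything else, assumed identical on both chips

# The identical tessellation workload fills only 50% / 2 = 25% of
# GM115's capacity...
tess_load_gm = tess_share_gk / 2

# ...so a GM115 frame takes 0.25 + 0.50 = 0.75 of a GK115 frame,
# an overall ~1.33x speedup from the tessellation advantage alone.
frame_time_gm = tess_load_gm + other_share
speedup = 1.0 / frame_time_gm

print(f"GM115 frame time: {frame_time_gm:.2f}, speedup: {speedup:.2f}x")
```

The takeaway: the more a new game leans on tessellation, the wider this gap gets, even if neither vendor touches its drivers.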
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
I don't know; to me the comparisons are dumb. Take the FC4 example. GameGPU shows the 290X 12% faster than a 780 Ti in FC4, but the review on the other page shows the 780 Ti 15% faster than the 290X. It can't be both at the same settings, so either one review has a different hardware configuration and/or settings, or the reviews are from two different times, perhaps after AMD updated its drivers; I don't know. If you go by GameGPU in FC4 and Guru3D in FC:P, then the difference is only 2%. That's nothing; it's within the margin of error.

But imagine what a Witcher 3 comparison would look like against a theoretical Witcher 4. Release-day Witcher 3 benchmarks with GameWorks turned on all heavily favored Nvidia cards. A few weeks later, AMD updated its drivers, the game got patched, and suddenly AMD cards were much faster. How much faster? I don't know; no one ever really retested it. Now Witcher 4 comes out on the same engine or whatever, and people go "the 980 was winning by 15% in Witcher 3, now it's losing by 5% in Witcher 4, MAXWELL DOWNGRADED." But in reality the 290X was always faster after those updates/drivers, and that just carries over to the new game.

That isn't even to mention things like texture sizes increasing in more recent games (the equivalent Nvidia cards have less memory) and AMD having a natural architecture advantage going forward. That's why I said in the other thread: unless someone does a super in-depth test that rules out memory limitations, future design paradigms, driver updates, etc., saying that Nvidia is intentionally downgrading Kepler has no real basis. It can just as easily be AMD upgrading GCN, or developers getting better at optimizing for GCN, or games simply using more memory now while most GCN-equivalent cards happened to have more of it.
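The baseline-shift trap described above is easy to show with numbers. The 15%-win/5%-loss figures come from the post; the post-patch retest value is purely an assumed illustration, since (as the post says) nobody retested:

```python
# Hypothetical scenario from the post: card A "wins by 15%" in game 1
# at launch, then "loses by 5%" in game 2 on the same engine.
ratio_game1_launch = 1.15   # A/B fps ratio, game 1, launch reviews
ratio_game2 = 0.95          # A/B fps ratio, game 2

# Naive reading: card A lost ~17% of its relative standing.
apparent_regression = 1 - ratio_game2 / ratio_game1_launch

# But if drivers/patches had already pushed game 1 down to, say, 0.96
# (assumed value, never actually retested), game 2 is continuity,
# not a regression.
ratio_game1_patched = 0.96
real_change = 1 - ratio_game2 / ratio_game1_patched

print(f"apparent drop: {apparent_regression:.1%}, real drop: {real_change:.1%}")
```

Comparing launch-day numbers from one game against current numbers from another bakes the untracked driver progress into the "regression".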
https://forums.guru3d.com/data/avatars/m/169/169957.jpg
I don't know, to me the comparisons are lazy.
Corrected
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
Corrected
Fair enough lol
https://forums.guru3d.com/data/avatars/m/54/54823.jpg
I don't know; to me the comparisons are dumb. Take the FC4 example. GameGPU shows the 290X 12% faster than a 780 Ti in FC4, but the review on the other page shows the 780 Ti 15% faster than the 290X. It can't be both at the same settings, so either one review has a different hardware configuration and/or settings, or the reviews are from two different times, perhaps after AMD updated its drivers; I don't know. If you go by GameGPU in FC4 and Guru3D in FC:P, then the difference is only 2%. That's nothing; it's within the margin of error.

But imagine what a Witcher 3 comparison would look like against a theoretical Witcher 4. Release-day Witcher 3 benchmarks with GameWorks turned on all heavily favored Nvidia cards. A few weeks later, AMD updated its drivers, the game got patched, and suddenly AMD cards were much faster. How much faster? I don't know; no one ever really retested it. Now Witcher 4 comes out on the same engine or whatever, and people go "the 980 was winning by 15% in Witcher 3, now it's losing by 5% in Witcher 4, MAXWELL DOWNGRADED." But in reality the 290X was always faster after those updates/drivers, and that just carries over to the new game.

That isn't even to mention things like texture sizes increasing in more recent games (the equivalent Nvidia cards have less memory) and AMD having a natural architecture advantage going forward. That's why I said in the other thread: unless someone does a super in-depth test that rules out memory limitations, future design paradigms, driver updates, etc., saying that Nvidia is intentionally downgrading Kepler has no real basis. It can just as easily be AMD upgrading GCN, or developers getting better at optimizing for GCN, or games simply using more memory now while most GCN-equivalent cards happened to have more of it.
Yeah, GameGPU: the site that shows the 5960X leading in 90% of titles, results which are not in line with any other review site. Nevertheless, before you speculate, you should realize the tessellation in FC: P is low to nonexistent. Also, FC4 had GameWorks. This doesn't. Move on.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
Yeah, GameGPU: the site that shows the 5960X leading in 90% of titles, results which are not in line with any other review site. Nevertheless, before you speculate, you should realize the tessellation in FCP is low to nonexistent. Also, FC4 had GameWorks. This doesn't. Move on.
Which is basically my point. People are comparing FC4 to FCP in terms of card performance. "Move on"? No.
https://forums.guru3d.com/data/avatars/m/54/54823.jpg
Which is basically my point. People are comparing FC4 to FCP in terms of card performance. "Move on"? No.
It's well known that AMD cannot optimize for specific games that use GameWorks (like FC4) until the game has been released. FC4 was one of the biggest GW messes to date at release; it's decently stable now. I'm amused to hell and back to find that GW is not being used in Primal. Not to mention optimizations are specific to Kepler, as they are to Maxwell, and nVIDIA doesn't even take care of Kepler anymore. That was obvious when, at the release of W3, they shipped a "bugged" driver, which they admitted was degrading performance. 🙄

I'm pretty sure the tinfoil-hat theories were put to rest last year. Kepler isn't getting special care with optimizations like Maxwell is, while GCN gets constant updates. If there were heavy tessellation use in Primal, the Kepler cards would be even slower. In the Primal graph, the 980 is only 12% faster than the 780 Ti in average fps, and 15% faster in minimum fps. /tin foil hat Nothing to see here.