The Witcher 3 - GTX 780Ti Could Push 35-45 FPS At Max?
If Gametech's claims hold up, The Witcher 3: Wild Hunt will be a punishing title for the GPU, so much so that a single 550 USD GTX 780 Ti already struggles to keep up with the game engine once higher levels of anti-aliasing are enabled. The website reports that the PC version runs at 35-45 fps on a GTX 780 Ti at 1080p with high-quality settings and 8xMSAA.
Naturally, a single 780 Ti is able to hit the 60 fps sweet spot in the game's current form, provided MSAA is lowered. Moreover, the PS4 version is said to run at 900p/30fps while the Xbox One version will run at 720p/30fps. What's also interesting is that the console versions of the game will match the minimum graphics settings of the PC version. Gametech claims that The Witcher 3 on consoles will look similar to The Witcher 2.
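For a rough sense of why 8xMSAA is so demanding, here is a back-of-the-envelope estimate (assuming 32-bit color and 32-bit depth per sample): 1920 x 1080 is about 2.07 million pixels, times 8 samples per pixel is roughly 16.6 million samples, and at 8 bytes each that comes to around 133 MB of color and depth data the GPU has to write and resolve every frame, before any shading work is even counted.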
Senior Member
Posts: 6956
Joined: 2008-10-27
AA is typically the first thing I turn off. High AA can make a game look nearly photo-real, but it's so computationally demanding that it's usually not worth it.
I'd like to see a discrete graphics chip that does nothing but calculate AA, if such a thing is possible (and doesn't require a dual-GPU-ready motherboard). A discrete card was once used to calculate physics (PhysX, by AGEIA), and for various reasons that failed, but anti-aliasing is very different from physics simulation. AA just enhances what is already there, which means it could potentially enhance every game, not just ones built specifically for it, as was the case with the PhysX card.
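To illustrate that point: post-process AA needs nothing but the finished frame, so in principle any chip that can see the final image could apply it to any game. A minimal numpy sketch of the idea (a toy edge-detect-and-blend of my own, not FXAA or any actual driver feature):

```python
import numpy as np

def luminance(img):
    # Rec. 709 luma weights
    return img[..., 0] * 0.2126 + img[..., 1] * 0.7152 + img[..., 2] * 0.0722

def postprocess_aa(img, threshold=0.1):
    """Blend pixels toward a blurred copy wherever luma contrast is high.

    img: float32 array of shape (H, W, 3), values in [0, 1] -- the final frame.
    """
    h, w = img.shape[:2]
    luma = luminance(img)
    p = np.pad(luma, 1, mode="edge")
    # Contrast against the 4-neighborhood: up, down, left, right.
    contrast = np.maximum.reduce([
        np.abs(p[1:-1, 1:-1] - p[:-2, 1:-1]),
        np.abs(p[1:-1, 1:-1] - p[2:, 1:-1]),
        np.abs(p[1:-1, 1:-1] - p[1:-1, :-2]),
        np.abs(p[1:-1, 1:-1] - p[1:-1, 2:]),
    ])
    # 3x3 box blur of the frame -- the "smoothed" candidate pixels.
    pi = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    blurred = sum(pi[y:y + h, x:x + w] for y in range(3) for x in range(3)) / 9.0
    # Only blend where an edge was detected; flat areas stay sharp.
    edge = (contrast > threshold)[..., None]
    return np.where(edge, blurred, img)
```

Notice that nothing here depends on the game's geometry or renderer, which is the key difference from PhysX: physics has to be wired into a game's simulation, while this kind of AA can sit at the very end of the pipeline.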
You can run SLI AA for games...
http://developer.download.nvidia.com/whitepapers/2011/SLI_Best_Practices_2011_Feb.pdf
Senior Member
Posts: 5642
Joined: 2012-11-10
I too think AA makes things a bit blurry at times. If I enable it, I only keep it at 2x or 4x. The performance cost versus the visual payoff isn't worth AA to me, so I've been getting away with mid-range GPUs for a while. Besides, it's pretty easy to get used to no AA at all. There are plenty of old games you can play today and cringe at how ugly they are, but after a couple of hours you quickly learn to accept the look. It's almost as though your brain fills in the gaps in the details.
I agree with one of the earlier posters though - it would be nice to see a piece of hardware dedicated to processing AA, and maybe the smoothing of shadows too (as far as I'm aware, it's a similar calculation).
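For what it's worth, shadow smoothing via percentage-closer filtering (PCF) really is a similar neighborhood-averaging job: instead of blending color samples, you average binary depth-test results over a small window. A rough Python sketch (simplified, not any particular engine's implementation):

```python
import numpy as np

def pcf_shadow(shadow_map, u, v, depth, radius=1):
    """Fraction of the (2*radius+1)^2 window around (u, v) that is lit.

    shadow_map: 2D numpy array of depths as seen from the light.
    depth: the shaded point's depth in light space.
    Returns a value in [0, 1] -- a soft shadow factor instead of a hard 0/1.
    """
    h, w = shadow_map.shape
    lit = 0.0
    taps = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y = min(max(v + dy, 0), h - 1)  # clamp to the map's edges
            x = min(max(u + dx, 0), w - 1)
            lit += 1.0 if depth <= shadow_map[y, x] else 0.0
            taps += 1
    return lit / taps
```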
Senior Member
Posts: 305
Joined: 2013-09-05
No, thanks. Can't they create games that run at enjoyable performance instead? If I want slideshows, I'll look at Google Images.
Game devs seem to miss the point of high-end hardware. It's not there in order to raise a game's performance slightly above "barely acceptable." It's there to get enough performance for your 144Hz monitor, your multi-monitor "eyefinity" or whatever setup, etc.
And why are people saying "this game has the best graphics ever?" If the graphics run like crap, guess what, they're bad. Image quality is only one aspect of good graphics. The other is fluidity. If either of those is crap, then the overall result is crap too. Image quality without fluidity is only important for static images.
Unless they actually intend to make sales 4 years from now, since that's the timeframe for the hardware to catch up to this stuff. But by then, the game won't be profitable anymore to begin with.
The Witcher and Crysis - wasting otherwise good PC firepower on unoptimized crap.
Nobody is forcing you to run the game at max settings; you can mess with the options and get the visuals/performance however you want. It's not like the game will be an ugly mess if it's not maxed out!
Senior Member
Posts: 1045
Joined: 2006-02-06
(Quoting the "No, thanks..." post above in full.)
Well, the thing is, playing at the highest-end graphics settings is a luxury. And for luxury, my friend, you have to pay.

If one 780 Ti is not enough for max graphics, I am sure two will be. If not, I will still be happy that something is out there that justifies my investment in two 780 Tis.
Senior Member
Posts: 1773
Joined: 2008-09-10
(Quoting the "No, thanks..." post above in full.)
What?