The Witcher 3 - GTX 780Ti Could Push 35-45 FPS At Max?
If Gametech's claims hold up, The Witcher 3: Wild Hunt will be a demanding title for the GPU, so much so that a single $550 GTX 780 Ti already has a hard time keeping up with the game engine once higher levels of AA are enabled. The website reports that the PC version runs at 35-45 fps on a GTX 780 Ti at 1080p with high-quality settings and 8x MSAA.
Naturally, a single 780 Ti can hit the 60 fps sweet spot in the game's current form, provided MSAA is lowered. The PS4 version is said to run at 900p/30fps, while the Xbox One version will run at 720p/30fps. What's also interesting is that the console versions will roughly match the minimum graphics settings of the PC version; Gametech claims that The Witcher 3 on consoles will look similar to The Witcher 2.
Senior Member
Posts: 287
Joined: 2009-02-10
What amazes me is that they haven't even tried. Surely it's a matter of software? I remember when I had my GeForce 3 and tried AA for the first time: bang, performance crippled, and I've never used it since. I don't see the point of adding it when most people can't use it; it's pointless to hog hardware or software on a graphics card for it.
Senior Member
Posts: 5587
Joined: 2012-11-10
I agree - something should have been done about it a long time ago. They could at the very least have included an instruction set dedicated to AA. The stupid thing is, by the time there's a hardware solution to AA, UHD screens will have made AA obsolete: with enough pixel density, you don't really need to smooth the edges anymore. I get the impression that raising the screen resolution has a much smaller effect on performance than AA does. In other words, enabling AA at 720p could give roughly the same visual smoothness as 1080p with no AA, but it wouldn't surprise me if AA at 720p performed slower. At the very least, the easy fix for a higher resolution is more VRAM and better bandwidth, though I realize it isn't quite that simple. AA, to my knowledge, doesn't really have a straightforward hardware solution.
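The 720p-with-AA versus 1080p-without tradeoff can be illustrated with a rough back-of-the-envelope sample count (a sketch only; real MSAA cost also depends on memory bandwidth, framebuffer compression, and shading rate, and the function name here is just for illustration):

```python
# Rough illustration: compare the number of coverage samples rasterized
# at 720p with 4x MSAA versus 1080p with no AA. This is a simplification;
# actual GPU cost is not a pure function of sample count.

def samples(width, height, msaa=1):
    """Total coverage samples rasterized per frame."""
    return width * height * msaa

s_720_4x = samples(1280, 720, msaa=4)   # 3,686,400 samples
s_1080_no_aa = samples(1920, 1080)      # 2,073,600 samples

# 720p with 4x MSAA touches roughly 78% more samples than 1080p with
# no AA, which is consistent with the hunch that AA at a lower
# resolution can still end up being the slower option.
print(s_720_4x, s_1080_no_aa, round(s_720_4x / s_1080_no_aa, 2))
```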
Senior Member
Posts: 167
Joined: 2013-10-13
As many others have said, I also hardly ever go above 2x MSAA unless the game can easily handle higher (like CS:GO where I can go to 16xQCSAA no problem). There just doesn't seem to be much of a difference between 2x and higher settings to make the performance hit worth it.
It's pretty easy to claim a game will cripple GPUs when you turn on 8x MSAA or some form of SSAA.
Senior Member
Posts: 1045
Joined: 2006-02-06
I paid $500 for my 780 Ti, open box, from Craigslist. Beat that.

and oh yeah! +1 to boycott nvidia games lolol
Senior Member
Posts: 5587
Joined: 2012-11-10
What bugs me about AA is that people complain about it, yet in my opinion they still haven't allowed two cards to deal with it.
I have a 660 Ti; it's old and a bit slow, but games today are a joke in terms of graphical progression, and there's no need for me to upgrade for upscaled console games. So why can't I use my old 560 Ti to do the work of AA and let my 660 Ti do everything else? Or is that too easy? You can do it with a PhysX card alongside whatever you're using, so why not do the same with AA?
While I like the idea and personally think this should happen too, it's comparing apples to oranges. Physics is a completely separate set of calculations that has nothing to do with visuals; representing physics visually is an entirely separate process. AA is a visual process, and PROPER AA needs to know the geometry of the objects being displayed; otherwise there isn't much difference between AA and simply blurring the image. I figure dedicating a GPU to AA is like putting a motor in a trailer attached to your car: sure, the extra power might be handy, but getting it to operate at the same speed as the car (and knowing when to stop) isn't that easy.