The Witcher 3 - GTX 780Ti Could Push 35-45 FPS At Max?

lolwut
SHHHH! (Don't tell him that's AA.)
Bring it on, my GTX 780s in 3-way SLI @ 2560x1600 are going to love it .. :D
By that time Nvidia will have released their dual-GPU 790 card or the new Maxwell GTX 800 series. I refuse to play this game without AA, and I only want to use MSAA, the best AA in all ways 🙂 4x is still okay for me. Hopefully my 690 will pull that off at ultra detail @ 1920x1080 on a 120 Hz monitor; at 70-80 fps I won't complain. Otherwise I'll just buy a single dual-GPU card that runs this game maxed out, like a 790 😀
They have released a dual card, the Titan Z.
lolwut
Lmao. .
They have released a dual card, the Titan Z.
**** that 🙂 I'll just wait for a dual 780 Ti 🙂
The Titan Z is a dual 780 Ti. I dunno if we'll see a 790.
Well, when I got my first 780 Ti, I tried to enable ubersampling in Witcher 2. To be honest, my fps still sucked. Maybe it's the same ubersampling thing again 😏?
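For what it's worth, Witcher 2's ubersampling is essentially supersampling: the scene is rendered at a multiple of the output resolution and then downsampled, so the shading work scales with the square of the per-axis factor. A rough back-of-envelope sketch (the 2x-per-axis factor here is just an illustration, not the game's actual internal multiplier):

```python
def supersample_pixels(width, height, factor):
    """Pixels shaded per frame when rendering at `factor` times the
    output resolution in each dimension (naive supersampling)."""
    return (width * factor) * (height * factor)

native = supersample_pixels(1920, 1080, 1)
ubers = supersample_pixels(1920, 1080, 2)  # 2x per axis = 4x the pixels
print(ubers / native)  # 4.0
```

Roughly quadrupling the shaded pixels is why even a 780 Ti can choke on it.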
The Titan Z is a dual 780 Ti. I dunno if we'll see a 790.
True, but the Titan Z is more a workstation card than a gaming card; it's more like the god of the Nvidia Quadro family. I'll still wait for a proper dual-GPU card like the GTX 690 I have now. And not with a price tag of 3000 euro, that's just insane. Just 1000 or 1100 euro, and hopefully they name it the GTX 795 or 790 Ultra or whatever.
So the article is only concerned with Nvidia, or does the site only focus on the green team? Nice, then I'll avoid this Nvidia game and stick to games made with both cards in mind, or AMD only :P At least I have more VRAM than a 780 Ti :D
Just 1000 or 1100 euro, and hopefully they name it the GTX 795 or 790 Ultra or whatever.
Good luck with that! I think the 295x2 is the only thing that will fit that bill.
So the article is only concerned with Nvidia, or does the site only focus on the green team? Nice, then I'll avoid this Nvidia game and stick to games made with both cards in mind, or AMD only :P At least I have more VRAM than a 780 Ti :D
I think Nvidia spends additional cash on game development, which ends up in their products' cost of production, hence the more expensive hardware :P While you enjoy similar/same performance for $100-150 less, at least let us enjoy our $150 difference worth of propaganda 😀
Am I the only one around here who never uses AA? I hate how blurry it makes everything look. Same goes for motion blur, ugh! MSAA all the way 😀
I agree, MSAA (x4) and CSAA (x8) are the only ones I use. No blurriness to textures with those. FXAA and TXAA just blur everything and reduce the sharpness of the image and textures.
No, thanks. Can't they create games instead that run with an enjoyable performance? If I want slideshows, I'll look at Google Images. Game devs seem to miss the point of high-end hardware. It's not there in order to raise a game's performance slightly above "barely acceptable." It's there to get enough performance for your 144Hz monitor, your multi-monitor "eyefinity" or whatever setup, etc.

And why are people saying "this game has the best graphics ever"? If the graphics run like crap, guess what, they're bad. Image quality is only one aspect of good graphics. The other is fluidity. If either of those is crap, then the overall result is crap too. Image quality without fluidity is only important for static images.

Unless they actually intend to make sales 4 years from now, since that's the timeframe for the hardware to catch up to this stuff. But by then, the game won't be profitable anymore to begin with. The Witcher and Crysis - wasting otherwise good PC firepower on unoptimized crap.
No, thanks. Can't they create games instead that run with an enjoyable performance? [...]
:spam:
No, thanks. Can't they create games instead that run with an enjoyable performance? [...]
What? :wanker:
AA is typically the first thing I turn off. High AA can make a game look nearly photo-real; it's just so computationally demanding that it's usually not worth it. I'd like to see a discrete graphics chip that does nothing but calculate AA, if such a thing is possible (and does not require a dual-GPU-ready motherboard). A discrete card was once used to calculate physics (PhysX, by AGEIA), and for various reasons that failed, but anti-aliasing is very different from physics simulation. AA just enhances what is already there, which means it could potentially enhance every game, not just ones built specifically for it, as with the PhysX card.
You can run SLI AA for games... http://developer.download.nvidia.com/whitepapers/2011/SLI_Best_Practices_2011_Feb.pdf
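The "enhance every game" point holds because post-process AA (FXAA and friends) only needs the final frame buffer, not the game's geometry. A toy sketch of the idea on a 1-channel image; this is not the real FXAA algorithm, just the general "find a hard luminance edge, blend across it" shape:

```python
def fxaa_like(img):
    """Naive post-process AA sketch: blend pixels that sit on a strong
    horizontal luminance edge with their left/right neighbours."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(1, w - 1):
            left, right = img[y][x - 1], img[y][x + 1]
            if abs(left - right) > 0.5:  # hard edge detected
                out[y][x] = (left + img[y][x] + right) / 3
    return out

# A hard black-to-white edge gets softened into a gradient:
print(fxaa_like([[0.0, 0.0, 1.0, 1.0]])[0])
```

Since it works purely on pixels, it can be injected after any renderer, which is exactly why drivers can force it on games that never heard of it.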
I too think AA makes things a bit blurry at times. If I enable it, I only keep it at 2x or 4x. The performance vs visual result ratio is not worth it to me to use AA, so I've been getting away with using mid-range GPUs for a while. Besides, it's pretty easy to get used to no AA at all. There are plenty of old games you can play today and cringe at how ugly they are but after a couple hours you quickly learn to accept the look. It's almost as though your brain fills in the gaps in the details. I agree with one of the earlier posters though - it would be nice to see a piece of hardware dedicated to processing AA, and maybe the smoothing of shadows too (as far as I'm aware, it's a similar calculation).
No, thanks. Can't they create games instead that run with an enjoyable performance? [...]
Nobody is forcing you to run the game on max possible settings; you can mess with the options and get the visuals/performance however you want. It's not like the game will be an ugly mess if it's not maxed out!
No, thanks. Can't they create games instead that run with an enjoyable performance? [...]
Well, the thing is, playing at the highest-end graphics is a luxury. And for luxury, my friend, you have to pay 🙂 If one 780 Ti is not enough for max graphics, I'm sure two will be enough. If not, I'll still be happy that something is out there which justifies my investment in two 780 Tis.