NVIDIA GeForce GTX 880 and GTX 870 to Launch Q4 2014

Sounds interesting! So are we saying it would be fine to get a card at the end of the year, like an 870 or 880 that is 28nm and also has an on-board chip, instead of waiting for the 20nm-or-smaller cards?

> Sounds interesting! So are we saying it would be fine to get a card at the end of the year […]
For me it depends on game releases. I do notice my 680 is getting weak at 1200p.

The problem is you could buy a GTX 880 and then six months later a DX12 card appears.

GTX 880 not going to be DX12?
> GTX 880 not going to be DX12?
I mean, currently shipping Maxwell chips are not full DX12. It's possible that they can add some features, but who knows if Microsoft will have the spec finalized by this fall. The preview for Windows 9 isn't even supposed to arrive until Q3 2015 now. Honestly, the announced hardware-required portions of DX12 don't even seem that groundbreaking, so I doubt there will be a massive push for implementation in games. The best parts of DX12 will be supported on current hardware, so it shouldn't really be an issue.

> I mean, currently shipping Maxwell chips are not full DX12. […]
Yeah, but once DX12 comes out your card will feel inferior, and most will feel the need to have DX12 no matter what card they have... I'll be proceeding with my next GPU purchase with caution!
> The problem is you could buy a GTX 880 and then six months later a DX12 card appears.
95% of the games are still DX9 🙂 It's gonna take at least 10 years until 75% of the games are running DX11.

> 95% of the games are still DX9 🙂 It's gonna take at least 10 years until 75% of the games are running DX11.
Uh, what? You mean 95 percent of the games ever released? And if you do, who cares? It's not like buying a DX12 card prevents you from fully experiencing DX9 games. But not buying a DX12 card definitely keeps you from the DX12 experience, which isn't much, but whatever.
> Uh, what? You mean 95 percent of the games ever released? […]
By the time DX12 games come out we're probably five years from now, which means you'll have a new video card by then which DOES support DX12.

> By the time DX12 games come out we're probably five years from now […]
I don't think it's worth worrying about, because DX11 cards are capable of supporting the most significant DX12 features. That said, I definitely don't agree with five years. There were 23 AAA DX11 titles two years after DX11's release, and that was without the Xbox 360 supporting it. The Xbox One SDK will support DX12, so all titles developed on Xbox after that SDK update will already ship with DX12. Granted, those titles will most likely not ship with hardware-specific DX12 features immediately, but they will still take advantage of the multithreading improvements and low-level access. In my opinion there won't be an advantage to owning DX12-class hardware until at least 2016, but there will definitely be DX12 games. And I definitely think that if someone is on a 3-4 year upgrade plan/budget, they should probably wait until closer to mid-2015. If you upgrade every year anyway and/or spend $1000 on video cards, you can probably afford to upgrade whenever, regardless.
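For context on those "multithreading improvements and low-level access": based on how the API eventually shipped, D3D12 lets every worker thread record into its own command allocator/list pair and submit the results in one batch, instead of funnelling everything through D3D11's single immediate context. A minimal C++ sketch (error handling and actual draw calls omitted; needs the Windows 10 SDK and d3d12.lib):

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    D3D12_COMMAND_QUEUE_DESC qdesc = {};  // zeroed desc = a DIRECT queue
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&qdesc, IID_PPV_ARGS(&queue));

    const int kThreads = 4;
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocs(kThreads);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(kThreads);
    std::vector<std::thread> workers;

    for (int i = 0; i < kThreads; ++i) {
        // One allocator + list per thread, so recording needs no locks --
        // the part D3D11's single immediate context could not do.
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocs[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocs[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        workers.emplace_back([&lists, i] {
            // ... record this thread's slice of the frame here ...
            lists[i]->Close();
        });
    }
    for (auto& w : workers) w.join();

    // Submit everything that was recorded in parallel as one batch.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```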
5 TFLOPS is more than enough for UE4 and the like, for example Square Enix's Luminous. UE4 needs ~3 TFLOPS on PC, and the next-gen consoles have, what, a good 2 TFLOPS? Imo these GK110 cards will last for a while and run anything next-gen next year, I'm pretty sure about that.
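The arithmetic behind those figures is just shader cores × 2 FLOPs per clock (fused multiply-add) × clock speed. A quick sketch with stock boost clocks, all theoretical peaks rather than sustained numbers:

```cpp
#include <cstdio>

// Peak FP32 throughput in TFLOPS: cores * 2 FLOPs/cycle (FMA) * GHz / 1000.
static double tflops(int cores, double clock_ghz) {
    return cores * 2.0 * clock_ghz / 1000.0;
}

int main() {
    // Full GK110 (GTX 780 Ti): 2880 cores at ~0.93 GHz boost.
    std::printf("GK110    : %.2f TFLOPS\n", tflops(2880, 0.93));   // ~5.4
    // PS4 GPU: 1152 shaders at 0.8 GHz -- the "good 2 TFLOPS" class.
    std::printf("PS4      : %.2f TFLOPS\n", tflops(1152, 0.80));   // ~1.84
    // Xbox One GPU: 768 shaders at 0.853 GHz.
    std::printf("Xbox One : %.2f TFLOPS\n", tflops(768, 0.853));   // ~1.31
}
```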
I find the VRAM (3 GB) to be the problem, not the GK110 unit (the GPU).

I would say no, it's all down to texture streaming.

If these have the onboard CPU then I imagine they will support DX12. @-Tj-, texture streaming is fine while the data is in GPU RAM, but when it gets swapped to system RAM after VRAM fills up, then you've got a bottleneck.
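That bottleneck is easy to put numbers on: once textures spill past VRAM, each fetch crosses PCIe instead of the card's local memory bus, which is roughly an order of magnitude slower. Theoretical peaks for a 780 Ti-class card (7 Gbps GDDR5 on a 384-bit bus) versus PCIe 3.0 x16:

```cpp
#include <cstdio>

int main() {
    // On-card bandwidth: 7 Gbps per pin * 384-bit bus / 8 bits = 336 GB/s.
    const double gddr5_gbs = 7.0 * 384.0 / 8.0;
    // PCIe 3.0 x16: ~15.75 GB/s theoretical, per direction.
    const double pcie3_gbs = 15.75;

    std::printf("on-card GDDR5 : %6.2f GB/s\n", gddr5_gbs);
    std::printf("PCIe 3.0 x16  : %6.2f GB/s\n", pcie3_gbs);
    std::printf("ratio         : %6.1fx\n", gddr5_gbs / pcie3_gbs);  // ~21x
}
```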
Well, I played BF4 @ ultra textures with just 1.5 GB VRAM, no problem; it's all down to how the driver handles the data. Some games stream-stuttered, some didn't. It also depends on how a game handles texture streaming; WD can't do it properly yet, and no amount of RAM helps there, otherwise Titan users wouldn't be complaining with 6 GB VRAM. 😀 For example, "next-gen" UE4 is designed around ~2 GB VRAM; if it uses more, that's because it stores extra textures in advance, not because it needs to, and it's the same with other engines. It's all down to how efficient the streaming is; if it jitters/stutters, then it swaps too much or too little. There is a certain threshold for each setting, for sure, but that doesn't mean value A is suddenly not enough just because value B can use up to some amount. Check the UE3 texture streaming wiki:

> Specific textures or meshes can be streamed in ahead of time by forcing them into memory in the game code, by calling one of the PrestreamTextures() functions. For a specified time, all their mip-levels will be loaded even if nothing is using those textures yet. Once they start to get used, they can go back to being handled by the streaming system in the normal way again.

http://udn.epicgames.com/Three/TextureStreaming.html ^ See, this is why it can use more than 3 GB in something like Dying Light. I tested it maxed out with a 1.5 GB GTX 580 and it still played OK @ 35-45 fps, and it didn't stutter because of the low VRAM.
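For the curious, here is roughly what that prestreaming looks like in game code: a sketch against the later UE4-style C++ Actor API (UE3 itself exposed the same calls to UnrealScript), with the helper name and the 10-second window purely illustrative:

```cpp
#include "GameFramework/Actor.h"

// Hypothetical helper: warm up an actor's textures before a cutscene.
void WarmUpActorTextures(AActor* Target)
{
    if (Target == nullptr)
    {
        return;
    }
    // Per the UDN page quoted above: force every mip level of the textures
    // this actor uses to load and stay resident for the next 10 seconds,
    // after which they return to normal distance-based streaming.
    // (Signature per Epic's docs; treat the exact parameters as an assumption.)
    Target->PrestreamTextures(/*Seconds=*/10.0f, /*bEnableStreaming=*/true);
}
```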
So ~6-7 GB for the GPU, Titan and Titan Black then... And you see what happens in a console-like game with 6 GB VRAM; imo everyone judges WD way too harshly. It's a ****ty port no matter what. 😀 Nvidia ain't stupid enough to let 3 GB be the limit on their flagship; a 1.5 GB GTX 580 can still cope @ 1080p, with low or no AA of course, the rest almost maxed.

Watch Dogs is NOT a console port. It was developed with PC as the lead platform; even Ubisoft's CEO said so in an interview. Remember the E3 2012 version? That was on PC. Ubisoft didn't even have PS4 or Xbone dev kits until a year later. The game runs like crap on PC because once Ubisoft found out that the "next-gen" consoles were far weaker than they had expected, the remainder of the development time and the months of delay were spent dumbing everything down for consoles, causing the PC version to not get the final polishing it needed.

Hm, no idea why people think GTA5 is going to have masses of settings; the only thing they are really going to do is add AA and a few eye-candy things, unless modders end up making it look that nice. As for Watch Dogs, they pulled that nice bait-and-switch on people and laughed all the way to the bank. So what I say is: hey, if your GPU can run games just fine now, hold out until it can't max anymore, then upgrade.

4 GB of VRAM is far too limiting, especially for an SLI setup at higher resolutions. I hope they use at least 6 GB.
> 4 GB of VRAM is far too limiting, especially for an SLI setup at higher resolutions. I hope they use at least 6 GB.
It'll be 4 GB, with 8 GB versions imo. 6 GB is the sweet spot for the uber high end imo, and is reserved for the Titan.

Why 256-bit? Disappointed.