Nvidia GeForce GTX 1080 Ti Arrives Late March

10 GB of memory seems unlikely; it's too odd an amount for such a bus width.
I have seen other sites speculating on a 320-bit bus, so that could also be possible. -andy-
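For reference, here is a rough sketch of why bus width and capacity go together: each GDDR5X chip exposes a 32-bit interface, so the bus width fixes the chip count and hence the natural capacities. The one-chip-per-channel layout and 1 GB-per-chip density below are assumptions for illustration, not confirmed specs.

```python
# Rough sketch: VRAM capacity implied by a memory bus width, assuming one
# 32-bit GDDR5X chip per channel and 1 GB per chip (illustrative only).
def implied_capacity_gb(bus_width_bits, gb_per_chip=1):
    chips = bus_width_bits // 32  # each GDDR5X chip has a 32-bit interface
    return chips * gb_per_chip

for bus in (256, 320, 352, 384):
    print(f"{bus}-bit bus -> {implied_capacity_gb(bus)} GB")
# 256-bit -> 8 GB, 320-bit -> 10 GB, 352-bit -> 11 GB, 384-bit -> 12 GB
```

Which is why a 10 GB figure pairs naturally with a 320-bit bus rather than with 256 or 384 bits.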
Or they are just less competent than NVIDIA?
That generally goes hand in hand. It's not cheap to develop this kind of cutting-edge technology. It takes experts and material resources, both of which cost money. I imagine there are people who also love what they are doing and aren't only doing it for the money (talking about tech companies other than Intel), but they still require compensation commensurate with their skills. Nvidia has lots of money, so they can hire a lot of brains and allow them to experiment a lot. AMD is short on money, so they have to make do with less, meaning it takes longer. Even if the individual researchers were just as competent, it makes a difference if one company has 10 of them and the other has 100.
If it were a 1070 competitor, they wouldn't be selling it in flash sales for 200-something bucks 😉 To anyone buying a 4 GB card I say: GO WITH CHRIST! :D
It is 2017, quite some time after the "4 GB VRAM doom" people arrived, and I have yet to see stutter caused by full VRAM. When the "3 GB VRAM doom" came, I did my own testing: in fullscreen mode a game has "all" the VRAM to itself, as unnecessary stuff is unloaded, so I tested in windowed mode. I had to load 7 good-looking, maxed-out games before stutter appeared due to VRAM utilization.

Yes, yes, stupid 4K Skyrim textures happen. They cost an arm and a leg in VRAM, but deliver a visual improvement only at very close range. VRAM utilization is not going to be the end of my quite powerful GPU; it will be either complex shader code or high polygon count, same as always. If we looked at history and compiled a list of reasons why each GPU became obsolete, you would maybe, just maybe, find the GTX 680 2GB as a GPU that faced true VRAM-caused stutter before it ran out of horsepower. But the main historical reasons for high-end GPUs becoming obsolete: shader throughput, then polygon throughput, then DX support too low (mainly around the jumps from DX7 to 8, 8.1, and 9.0~9.0c).
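For anyone wanting to repeat that kind of test, a minimal sketch of one way to watch VRAM fill up while loading games in windowed mode; the query flags are standard nvidia-smi options, the 5-second interval is arbitrary, and only the first GPU is read:

```python
import subprocess
import time

# Minimal VRAM monitor: poll nvidia-smi while games load in windowed mode.
while True:
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # One line per GPU; take the first (single-GPU assumption).
    used, total = (int(v) for v in out.strip().splitlines()[0].split(", "))
    print(f"VRAM: {used} / {total} MiB ({100 * used / total:.0f}% used)")
    time.sleep(5)
```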
For AMD, as long as they can deliver value on performance/price, they should be fine. Say they release 1070 Ti-like performance but about 100 bucks cheaper; it will sell like hotcakes anyway. I don't think AMD is pursuing the "KING of GPUs" title, not for now at least; they seem to be focusing on the Ryzen launch, even though the GPU division is completely separate from the CPU one.
So no Pascal refresh? 3/4 of the price would be sweet, just like with the GTX 285. And Volta is scheduled for next year?
Yes, I was wondering about that Pascal refresh / Volta thing too... not sure how they'll handle it. Probably a few refresh models in H2/17, but Volta is supposed to launch eventually in 2018, AFAIK.
Probably not even big Vega for me, but it is close. AMD's Navi will have a fully working ability to discard invisible geometry at low computational (time) cost. Vega has just a taste of it (AMD stated double geometry throughput in the slides, and the final notes state 2.6x, but 11/4 = 2.75). To bring Navi to its knees, one will have to put as many visible polygons on screen as possible; invisible polygons will simply be culled at nearly no time cost. I expect that geometry which brings Navi to its knees will slaughter any current GPU, and as such I expect Navi to last until developers have reason to put extreme polygon complexity on screen.
Fiji: 4 geometry engines = 4 polygons per clock processed
Vega: 4 geometry engines = 11 polygons per clock processed
- - - -
Then there is shader/compute performance. I remember that slide for Vega and the talk Raja gave about the improvement. He was very optimistic, and the slide was either wrong or crazy in its potential: it not only showed double operations per tick, it showed double frequency in comparison to old GCN. (And I was like, are they pulling old Nvidia's double shader clock?) And then I put two and two together (Infinity Fabric): can this actually allow the shaders to tick at a different speed than the rest of the chip without additional latency/complexity? Like a 20% higher clock? [spoiler]https://image.slidesharecdn.com/vegafinalpresentation-170106034755/95/amd-vega-presentation-gpu-memory-architecture-29-1024.jpg?cb=1483674511[/spoiler]
- - - -
And then there is memory handling. Vega already comes with promises:
- greatly reduced VRAM utilization in comparison to current GPUs under the same gaming conditions, which as a consequence reduces stutter from caching of stuff that is not needed
- improved compression, and the ability to load from memory only the parts of textures which are needed to render a given polygon (if you see only 10% of some surface, then only ~10% of the texture covering it will be loaded)
= = = =
There are some other, not so big, things for gaming. But Vega has the 1st iteration of many of these, and Navi will have them fine-tuned, plus some new stuff (if we are lucky).
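To make the geometry arithmetic in that post explicit, here is the calculation spelled out; the per-clock figures are the ones quoted above, not confirmed specs:

```python
# Geometry-throughput ratio implied by the per-clock figures quoted above.
fiji_polys_per_clock = 4    # Fiji: 4 geometry engines, 4 polygons/clock
vega_polys_per_clock = 11   # Vega: 4 geometry engines, 11 polygons/clock (claimed)

ratio = vega_polys_per_clock / fiji_polys_per_clock
print(f"Implied geometry speedup: {ratio:.2f}x")  # 2.75x vs. the 2.6x in AMD's notes
```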
Ah, thanks for the insights!
It is 2017, quite some time after the "4 GB VRAM doom" people arrived, and I have yet to see stutter caused by full VRAM. [...]
I'm with you on this one. Still playing happily at 1440p with (2x) 4 GB VRAM, and I haven't found any stuttering due to a lack of VRAM. I experienced it with my trusty (by then) 580 Lightning 3GB and Skyrim's 4K textures at 1080p, but I've never tried that with 4 GB VRAM. I can safely say, for instance, that Shadow of Mordor's HD textures do not need 6 GB of VRAM; they run just fine on my system at 1440p with no issues.
For AMD, as long as they can deliver value on performance/price, they should be fine. [...]
They share the same IP and cooperate; that's why GPUs and CPUs using Infinity Fabric will be released at the same time. Before, they were not clearly separated, yet APUs took 4 years to finally move from VLIW4 to GCN iGPUs. Now they are separated, and each division uses the newest and greatest from the other.
Now this may be a worthy upgrade. The 1080 just didn't cut it.
So Nvidia's profits will now quadruple when all the FOMO sheep rush out to buy this when it's released 😀 The eye-candy whores and framerate freaks will no doubt be drooling over hardly noticeable visual improvements and a few extra FPS @ 4K, while proper gamers will just 🙄 at them 🙂
So Nvidia's profits will now quadruple when all the FOMO sheep rush out to buy this when it's released 😀 [...]
:roll: Somebody's salty.
Yeah!! Can't wait. Hope it's equal to or better than the Titan XP... I will change my 1080 SLI to 1080 Ti SLI. I considered Titan XP SLI before, but the Titan XP is not available in our country...
So Nvidia's profits will now quadruple when all the FOMO sheep rush out to buy this when it's released 😀 [...]
Jealous much? Why do you care?
Yes, but Vega will be that much sweeter when it does come out. And cheaper too! For having made me wait this long, dirt cheap is what I expect.
Don't bet on that. Remember the Fury X's launch price compared to the 980 Ti's... and the performance. :3eyes:
1080Ti 1 year too late πŸ˜›
Everything will depend on Vega's price/performance ratio, but for this launch cycle I get the feeling that it is NVIDIA who is late with their new stuff.
Great news! I will wait until Vega comes out to compare the two, since I'm in no rush to upgrade just yet; my GTX 1070 is doing fine for the games I play at 4K. Then, and only then, will I upgrade to one of these 2 video cards (GTX 1080 Ti or Vega). I need to see the 4K gaming performance of these 2 cards in reviews to make a well-informed decision.
Great news! I will wait until Vega comes out to compare the two, since I'm in no rush to upgrade just yet. [...]
Yup, that's the correct thing to do. For me it's not an option, due to my monitor. However, someone building a new rig might as well make it full AMD this year, as Ryzen seems a hell of a promising thing. Unless Intel SIGNIFICANTLY drops prices on their 8-core CPUs.
1080Ti 1 year too late πŸ˜›
Yeah, and with probably less than a year until a brand-new GPU comes out, buying a 1080 Ti would just be silly. Those who don't care about money and throw it at whatever new comes out probably have a Titan already; for the rest, unless someone needs a new GPU and can't wait... Well, anyway, maybe I'll get a second-hand 1080 now, with a few people here selling them off cheap-ish, hopefully. Profit for me, I guess 🤓 Maybe I'm the stupid one here, but releasing a 1080 Ti at this point in time just seems silly (or rather, buying one would be).
10.8 TFLOPS seems a bit generous for the next Ti, since the difference between the Maxwell cards was only 0.51 TFLOPS; 10.49 TFLOPS seems more accurate. Hopefully AMD will release a cheaper 12+ TFLOPS single GPU, though.
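For context, single-precision TFLOPS is conventionally estimated as 2 FLOPs (one fused multiply-add) per core per clock. A quick sketch: the GTX 1080 figures are its official shader count and boost clock, while the Ti core count and clock are the kind of rumored numbers circulating at the time, used here purely as an illustration:

```python
# FP32 throughput estimate: one FMA = 2 FLOPs per core per clock,
# so TFLOPS = 2 * cores * clock_hz / 1e12.
def tflops(cores, clock_mhz):
    return 2 * cores * clock_mhz * 1e6 / 1e12

print(f"GTX 1080   (2560 cores @ 1733 MHz): {tflops(2560, 1733):.2f} TFLOPS")
# Hypothetical cut-down GP102, per the rumors of the day:
print(f"Rumored Ti (3328 cores @ 1600 MHz): {tflops(3328, 1600):.2f} TFLOPS")
# -> roughly 8.87 and 10.65 TFLOPS; the final figure depends entirely on
#    the actual shader count and boost clock Nvidia ships.
```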