NVIDIA Very Likely To Reveal GeForce GTX 1180 on August 20

https://forums.guru3d.com/data/avatars/m/79/79740.jpg
Silva:

And for just 1999.99! Cuz you know, we have so much old stock we could not have lowered the price to get rid of it.
I don't think they want to see their market share shrivel from 70% to 1%.
data/avatar/default/avatar20.webp
Seems to me GPUs are being marketed a bit more towards the mining market, so it will be interesting to see efficiency. With coin mining being such an up-and-down cycle, it's a good market for the GPU industry. Mining down, they sell cards off. Mining up, they have new, more efficient cards for people to buy. Could be what's up with these. They know they have loyalty with gaming, and many will upgrade for minor gains in some cases.
https://forums.guru3d.com/data/avatars/m/270/270041.jpg
fantaskarsef:

Yeah... I mean I had CFX with BF3, it was a PITA (because back then AMD was very slow with offering profiles), then I went for a single card and got SLI afterwards (BF4 and BF1), and it was really cool... Sadly, with BF1 I already felt the first problems when game updates broke SLI support...
I don't mind SLI/CrossFire dying; it was rarely supported and often had many issues in many games, especially at launch - we even saw titles get worse performance than a single card! As for performance, just wait for the HH review; chances are it isn't a refresh this late into the market. They would have done that last year if so...
data/avatar/default/avatar29.webp
I wouldn't bother upgrading from a 1080 Ti to a Titan V even if it only cost me £300 to make the switch. The difference isn't big enough. If the Titan V could run at higher overclocks than a 1080 Ti, though... but it can't - you have to do all sorts of mods to a Titan V to get over 2000MHz.
https://forums.guru3d.com/data/avatars/m/216/216349.jpg
fantaskarsef:

I mean, seriously... where's the barrier for a "worthy" upgrade? For me personally, it wouldn't be worth shelling out that kind of money again for an 1180 Ti that is 15% faster than my current card.
Maybe Nvidia is going to market/sell those new cards to users who are still using the 9xx gen or older Nvidia cards, giving them the chance for a very significant upgrade. And let's face it, very few owners of Volta need new GPUs...
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
H83:

Maybe Nvidia is going to market/sell those new cards to users who are still using the 9xx gen or older Nvidia cards, giving them the chance for a very significant upgrade. And let's face it, very few owners of Volta need new GPUs...
While this may be true, users of Maxwell cards ("generation 900") could have switched over to Pascal ("generation 1000") if not for the still-high prices. There are still users here (talking about you, @-Tj- 😀) who are waiting for an upgrade, but not at MSRP. You are absolutely right, another 15% on top of Pascal could make an upgrade at MSRP more interesting for them; I didn't think about that. And Pascal upgrades to what comes next ("generation 1100") might not be that useful (especially if they already pair their card with a G-Sync monitor), that's also true, but above all 4K gamers will definitely want to upgrade their 1080s / 1080 Tis, just to keep those 60 fps without turning graphics settings down. That's basically what I was thinking about when typing my second post in this thread: I have a G-Sync monitor, and I don't care if I game at 142 fps or 130... That's why I was asking out loud if it really is a worthy upgrade to gain 15%, and for me it most likely won't be worth the money. I would just have one more reason to be happy with my monitor; it cost as much as a high-end card by itself, so now I could actually justify those investments 😀
https://forums.guru3d.com/data/avatars/m/216/216349.jpg
fantaskarsef:

And Pascal upgrades to what comes next ("generation 1100") might not be that useful (especially if they already pair their card with a G-Sync monitor), that's also true, but above all 4K gamers will definitely want to upgrade their 1080s / 1080 Tis, just to keep those 60 fps without turning graphics settings down. That's basically what I was thinking about when typing my second post in this thread: I have a G-Sync monitor, and I don't care if I game at 142 fps or 130... That's why I was asking out loud if it really is a worthy upgrade to gain 15%, and for me it most likely won't be worth the money. I would just have one more reason to be happy with my monitor; it cost as much as a high-end card by itself, so now I could actually justify those investments 😀
I understand perfectly that for your specific case a 15% improvement is not worth buying a new card, it's normal. But I think guys like you who want better cards to power 4K screens are the minority, so Nvidia is probably going to neglect them in this next generation... Especially because they have little to no competition... Personally, I'm going to skip this gen unless Nvidia releases something really great and unexpected.
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
H83:

I understand perfectly that for your specific case a 15% improvement is not worth buying a new card, it's normal. But I think guys like you who want better cards to power 4K screens are the minority, so Nvidia is probably going to neglect them in this next generation... Especially because they have little to no competition... Personally, I'm going to skip this gen unless Nvidia releases something really great and unexpected.
Yes I understand your reasoning. Although I just wanted to say, gaming at 1440p here, so I would have even more time to wait with an upgrade than at 4K. But... there's the itch... 😀
https://forums.guru3d.com/data/avatars/m/79/79740.jpg
If it's a weak performance gain, then maybe in another 6-12 months we'll see the real beef @ 7nm, timed in accordance with what AMD is doing in that time frame.
https://forums.guru3d.com/data/avatars/m/79/79740.jpg
Yeah... Duh... 😀.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
H83:

I understand perfectly that for your specific case a 15% improvement is not worth buying a new card, it's normal. But I think guys like you who want better cards to power 4K screens are the minority, so Nvidia is probably going to neglect them in this next generation... Especially because they have little to no competition... Personally, I'm going to skip this gen unless Nvidia releases something really great and unexpected.
Nvidia should stop pushing 4K TVs and monitors then. They basically removed my ability to buy an HDR 1440p monitor because the only ones available are FreeSync, which they don't support - but then, if the performance gain is only 15% on the new cards, they also removed my ability to play games at 4K 60Hz on their new monitors. Even with the Titan V's 30-35% gain, 4K 60 only just becomes playable in most newer titles; beyond 60Hz it is still relatively unplayable. So they'd basically need 50%+ more performance at minimum to cross that magic "threshold" where it becomes worth it. Honestly, the G-Sync lock-in is really starting to get frustrating... the selection of G-Sync monitors is pathetic compared to FreeSync, and the advantages G-Sync had are gone now unless you're willing to shell out $2000 for a monitor no card can drive in newer/high-graphics titles.
https://forums.guru3d.com/data/avatars/m/122/122801.jpg
Hence why Nvidia will be selling old cards as new: they have tons of stock and no one is buying... I won't hold my breath.
https://forums.guru3d.com/data/avatars/m/216/216349.jpg
Denial:

Nvidia should stop pushing 4K TVs and monitors then. They basically removed my ability to buy an HDR 1440p monitor because the only ones available are FreeSync, which they don't support - but then, if the performance gain is only 15% on the new cards, they also removed my ability to play games at 4K 60Hz on their new monitors. Even with the Titan V's 30-35% gain, 4K 60 only just becomes playable in most newer titles; beyond 60Hz it is still relatively unplayable. So they'd basically need 50%+ more performance at minimum to cross that magic "threshold" where it becomes worth it. Honestly, the G-Sync lock-in is really starting to get frustrating... the selection of G-Sync monitors is pathetic compared to FreeSync, and the advantages G-Sync had are gone now unless you're willing to shell out $2000 for a monitor no card can drive in newer/high-graphics titles.
Well, you can play games on a 4K monitor with a 1080 Ti as long as you make some compromises with the graphics/visual options; it's not like it's mandatory to max out every graphical setting, especially because lots of those settings offer insignificant improvements while demanding a big performance hit... As for G-Sync, let's just say I hate proprietary stuff in PC gaming and I'm waiting for it to die as soon as possible. Nvidia has to adopt FreeSync!
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
H83:

Well, you can play games on a 4K monitor with a 1080 Ti as long as you make some compromises with the graphics/visual options; it's not like it's mandatory to max out every graphical setting, especially because lots of those settings offer insignificant improvements while demanding a big performance hit...
True - I'd really like to see more reviewers try games at 4K with AA off. That often butchers performance and at 4K, the visual difference is likely too small to make the heavy performance impact worth it. I'd much rather have AA off than my frame rate dip below 45FPS.
As for G-Sync, let's just say I hate proprietary stuff in PC gaming and I'm waiting for it to die as soon as possible. Nvidia has to adopt FreeSync!
Probably never going to happen. Nvidia would rather a technology get abandoned than imply AMD (or Intel or Microsoft or VESA or Khronos, etc., it doesn't matter who) had a better way of going about something. There are only 2 reasons why I avoid Nvidia's products, and one of them is their arrogance (the other being they're a bit too expensive for my taste). To be fair, Nvidia has valid reasons for their arrogance, but it really gets in the way of progress. Just as a side note, Adaptive-Sync is the VESA standard that Nvidia should support; FreeSync is AMD's branding of it, so Nvidia wouldn't use that name no matter what.
https://forums.guru3d.com/data/avatars/m/232/232349.jpg
fantaskarsef:

Yeah... I mean I had CFX with BF3, it was a PITA (because back then AMD was very slow with offering profiles), then I went for a single card and got SLI afterwards (BF4 and BF1), and it was really cool... Sadly, with BF1 I already felt the first problems when game updates broke SLI support...
I hear you on that. Interesting how they would shoot themselves in the foot when I personally would have bought as many cards as were ever supported....
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
EL1TE:

Wrong: AMD is on 7nm and Intel is coming next year. If they don't make something good, people won't bother buying it because it won't be worth the upgrade; by doing this and delaying, they risk both AMD and Intel beating them, and this is just the tip of the iceberg. Investors need money, and board partners like ASUS and MSI have already complained (I saw it on Guru3D, in a phrase in an article about this same context) that they aren't making money because the usual cycle of GPU releases still hasn't happened. They don't care if you or I think it's not the time to release it; NVIDIA doesn't make good GPUs to make us happy, they make them to be the best so that people buy their products and investors are happy. Delaying it puts that at risk, because not only is AMD going for 7nm, which allows crazy improvements in clocks (that doesn't mean the architecture is good, obviously), we also have Intel GPUs next year. And like I said before, they can always make a better product than a Ti, as in two versions of an xx80 excluding the base one. I wonder if we will ever see dual-GPU cards like there used to be, since SLI was pretty much abandoned. Yes, this is the point; however, they can always make an 1180 on 12nm and then go for an 1190 on 7nm, but I'm not sure people would like this and they'd probably feel betrayed lol.
Intel's dGPU plans are more for 2020. And I still consider them to be oriented more towards the server market (with the possibility of mining sales). If they are good for gamers too, and Intel improves the options in their control panel... Hard to say. I like even nVidia's old-school control panel more than Intel's "I want to annoy every user" kind of control panel.
DW75:

Crypto is going to gain value like crazy in August. We, the gamers, are going to end up with the new video cards costing double what they should all over again. It is going to be the same situation as before. Retailers and 3rd party sellers will be trying to get 1200 bucks for a GTX 1180, 900 bucks for a GTX 1170, and 600 bucks for a GTX 1160 for the first 4 to 6 months after the cards launch.
Yesterday I mentioned that possibility because market depth looked very balanced. Moments later, Bitcoin started crashing... it has lost ~8% since then. You can make hundreds of jokes about mining. But then... sadly... you realize that they were no jokes. The best one is about how mining creates money for the miner: "Hey, I burn electricity and it creates some hashes. Gimme your money, and I'll give you some of those sequences. Then you can sell them to someone else to make money."
H83:

Maybe Nvidia is going to market/sell those new cards to users who are still using the 9xx gen or older Nvidia cards, giving them the chance for a very significant upgrade. And let's face it, very few owners of Volta need new GPUs...
Did you just say that internet cafés in China are going to do massive upgrades?
https://forums.guru3d.com/data/avatars/m/242/242471.jpg
fantaskarsef:

While this may be true, users of Maxwell cards ("generation 900") could have switched over to Pascal ("generation 1000") if not for the still-high prices. There are still users here (talking about you, @-Tj- 😀) who are waiting for an upgrade, but not at MSRP. You are absolutely right, another 15% on top of Pascal could make an upgrade at MSRP more interesting for them; I didn't think about that. And Pascal upgrades to what comes next ("generation 1100") might not be that useful (especially if they already pair their card with a G-Sync monitor), that's also true, but above all 4K gamers will definitely want to upgrade their 1080s / 1080 Tis, just to keep those 60 fps without turning graphics settings down. That's basically what I was thinking about when typing my second post in this thread: I have a G-Sync monitor, and I don't care if I game at 142 fps or 130... That's why I was asking out loud if it really is a worthy upgrade to gain 15%, and for me it most likely won't be worth the money. I would just have one more reason to be happy with my monitor; it cost as much as a high-end card by itself, so now I could actually justify those investments 😀
Well, I don't play as much lately, but if the price is right I might buy it. I'm still thinking it will be $699 like it was said at first, otherwise they won't make any real money out of it...
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
-Tj-:

I'm still thinking it will be $699 like it was said at first, otherwise they won't make any real money out of it...
I have a really hard time understanding how you came to that conclusion... The vast majority of the expense you're paying for is the engineering, not the physical product itself. Seeing as this is a rebranded product that is nearing 3 years old, their engineering costs have long since been paid for. Let's not forget Nvidia's huge success in the server market (where profit margins are even higher) and in cryptocurrency mining. For $700, Nvidia is making some crazy profits, and I think their multi-billion-dollar net revenue is sufficient proof of that. So yes, they will be making plenty of real money out of it, and the reason it's so expensive is the lack of competition.
https://forums.guru3d.com/data/avatars/m/242/242471.jpg
That was an initial leak saying the 1170 would be $399.