Rumor: Nvidia Ampere might launch as GeForce GTX 2070 and 2080 on April 12th
dr_rus
GP102 launched back in August 2016 in the form of Titan X (Pascal).
Silva
Solfaur
If the GPU crisis lasts until that launch date, this will be a massively underwhelming start regardless of the performance of the cards. Expect not just the performance of the 2070 to be at ~1080 Ti levels, but the price too, if not more... like they say in crypto - to the moooooooon! 😱
tensai28
Hopefully the 2080 can handle 4k 60fps ultra.
maize1951
Too bad they WILL all be bought up by miners, every last one of them, unless the manufacturers put a restriction on sellers that they can only sell one per household (sorry, SLI'ers).
GroinShooter
Yeah, my current 1080 now costs 720€ at the same store where I bought mine for ~600€ roughly a year ago. Them prices are insane; I didn't even pay that much for my old 980 Ti...
Denial
Unless they put something in the hardware of the cards to deter miners, or artificially inflate the upfront cost of the cards and offset it with game rebates, which I wouldn't mind as a way to combat them.
As for the "why not Volta" thing - I think it's more than just HBM2 (which Pascal also had) and Tensor cores (which can be stripped). The entire scheduler on Volta GV100 is oriented towards compute and away from gaming. TSMC's 12nm FF+ 7.5T process, which Nvidia dubs "FFN", is optimized for large reticle sizes, while the 12nm FF+ 6.5T process is optimized for lower power and/or higher clock speeds; gaming chips will most likely use the latter. GV100 also contains a number of FP64 cores that will most likely be stripped for a gaming variant. I'm also kind of curious whether Nvidia supports FP16v2 on "Ampere" gaming variants, similar to AMD's FP16 support in Vega - Pascal GP100 supported FP16v2 in its FP32 cores, but I don't know how those cores fare in gaming scenarios - optimization might have been done there.
I think for a while Nvidia could have kept similar architecture names for compute/gaming, but they're at the point now where both sectors need specific optimizations, and that's why they are splitting the names. There are going to be a ton of differences between Volta Compute and Volta Gaming (Ampere), or whatever it ends up being called.
fl2015
Picolete
Isn't Ampere an ARM CPU?
Edit: my bad, it's a company that makes ARM CPUs
Venix
mattm4
So about the miners... Didn't the board partners tease/show at one point that they were going to make miner-specific cards? Like a cheaper variant just for mining? I was wondering why that hasn't happened yet.
nevcairiel
Kaarme
coth
Table is unreadable. Gray text on gray background.
Shadowdane
When they showed the Titan V and that price tag... I pretty much knew the Volta chip wasn't going to make it to the consumer GeForce series.
It will be interesting to see if Ampere bears any resemblance to the design of Volta or if it will be something completely different.
sammarbella
Darren Hodgson
I made the mistake of upgrading from a GTX 980 Ti to a GTX 1080 instead of doing the sensible thing and waiting for the GTX 1080 Ti. I didn't need the GTX 1080; it was a case of me just wanting one.
Of course, I now own the GTX 1080 Ti and think it is a terrific card, but while the GTX 1080 was certainly faster than the GTX 980 Ti, on reflection it really didn't offer the massive leap in performance that the GTX 1080 Ti has. This time round I'm instead skipping the GTX 2080, no matter how tempting it might be, and waiting for the GTX 2080 Ti. I'm sure it will be easier this time round, as NVIDIA's pricing on high-end cards seems to increase by £80-£100 each year. I've no doubt that the GTX 2080 will be an expensive card, at least as much as the GTX 1080 Ti if not more, simply because NVIDIA will have no competition.
H83
Like all the others said before, this launch only matters if Nvidia can satisfy the miners' demand, or if virtual currencies crash and burn...
Good thing I already have enough GPU power for the next 2 or 3 years.
ingeon
With new GPUs and M.2 SSDs in mind, when are PCIe 4.0 motherboards expected?
D3M1G0D