New Rumors: GeForce GTX 1180, 2080
It has been a long time and a long road of rumors, but those rumors now slowly seem to be solidifying. Nvidia is reportedly briefing engineering employees on a new graphics card, and the series would be called Turing.
The name Turing has come up a couple of times already; Reuters at one point mentioned an NVIDIA GPU called Turing and tagged it as a gamer's card, which rather contradicts the name. Turing would be better suited to AI and HPC products (think of the Turing test for artificial intelligence). As to what it precisely is, nobody outside the ring of insiders really knows, though most of us expect a Pascal respin with GDDR6 memory. As to what it will be called, GeForce GTX 1180 or 2080, we can only guess.
The latest rumor is that Nvidia is training its development and engineering departments, which really is all the news there is today. The rumor comes from Tom's Hardware Germany, which has "unofficially learned from some board partners that Nvidia has already started training the relevant employees from the development departments."
The piece they wrote leans heavily on the word 'if'. Media also claim that a BoM (the bill of materials needed to produce these cards) has been released; however, if you read the article a bit more closely, that is merely an assumption.
"If you follow the 3-month rule, the first board partner cards should appear on the market in late August or early September. However, some of the partners are now expecting a shift of at least two weeks, so that September seems rather plausible"
Well, at least we know something is brewing in that kettle.
Stage | Description | Duration
---|---|---
BOM release | Bill of Materials release | begin
EVT | Engineering Validation Test | 1-2 weeks
DVT | Design Validation Test | 2 weeks
WS | Working Sample | 1-2 weeks
EMI Test | Electromagnetic Interference Test | less than a week
PVT | Production Validation Test | 2-3 weeks
PVT sorting | |
PPBIOS | Final BIOS | a few days
Ramp & MP | Mass production and shipping | a few days
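For context on the quoted "3-month rule": the stage durations in the table do roughly add up to three months. Here is a minimal sketch (in Python) of that arithmetic; the values for the blank "PVT sorting" row and the loosely specified rows are our own assumptions, not figures from the source.

```python
# Back-of-the-envelope sum of the stage durations in the table above.
# Vague entries ("a few days", "less than a week") and the blank
# "PVT sorting" row are filled with assumed values - they are not
# from the source. Durations are (min, max) in weeks.
stages = {
    "EVT":         (1.0, 2.0),
    "DVT":         (2.0, 2.0),
    "WS":          (1.0, 2.0),
    "EMI Test":    (0.5, 1.0),  # "less than a week"
    "PVT":         (2.0, 3.0),
    "PVT sorting": (0.5, 1.0),  # assumption: duration not given in the table
    "PPBIOS":      (0.5, 0.5),  # "a few days"
    "Ramp & MP":   (0.5, 0.5),  # "a few days"
}

total_min = sum(lo for lo, _ in stages.values())
total_max = sum(hi for _, hi in stages.values())
print(f"BOM release to shipping: {total_min:.0f}-{total_max:.0f} weeks")
# -> "BOM release to shipping: 8-12 weeks", i.e. roughly the 3-month rule
```

Add the two-week slippage some partners reportedly expect, and that window lands in September, which matches the quote above.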
Senior Member
Posts: 1664
Joined: 2017-02-14
> They are on the rise though. 4K displays are steadily becoming more affordable, and most modern applications are built to be compatible with such resolutions. Besides, that's a bit of a chicken and egg situation - are there not many 4K users because there's not enough good/playable 4K content, or is there not enough 4K content because there aren't enough users with 4K displays?
> Back around 2015, the latter was definitely true. As of today, I'd say the former is true.
> The point is: there is actually a demand for better GPUs.
Another explanation for 4K's low adoption rate is that, to get frame rates high enough for a very smooth experience, you have to spend $800+ on a GPU. I went with 2K (1440p) for that very reason.
Senior Member
Posts: 2068
Joined: 2017-03-10
> They are on the rise though. 4K displays are steadily becoming more affordable, and most modern applications are built to be compatible with such resolutions. Besides, that's a bit of a chicken and egg situation - are there not many 4K users because there's not enough good/playable 4K content, or is there not enough 4K content because there aren't enough users with 4K displays?
> Back around 2015, the latter was definitely true. As of today, I'd say the former is true.
> The point is: there is actually a demand for better GPUs.
I dunno, 4K monitors are still prohibitively expensive, especially the ones with G-Sync. Most of them are also limited to 60 Hz (this is the main reason why I did not buy a 4K monitor). I would only consider going 4K when 144 Hz models are out and prices come down on them (so not for a very long time). 4K content has very little to do with it.
Senior Member
Posts: 7233
Joined: 2012-11-10
From what I've noticed, 1080Tis have still been in somewhat high demand, despite their price. I personally wouldn't be willing to spend $800+ on a GPU (I find it hard to justify spending more than $300) but I'm not everyone else. I would also like to point out that I myself don't have a 4K display and don't intend to get one for a while.
For your specific needs, yes, 4K is ludicrously expensive. But 4K displays in general definitely are not prohibitively expensive. These are very reasonable prices:
https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100167585 600474748 4814
Sure, monitors are more expensive, but the pixel density is higher. That being said, these prices aren't that outrageous either, especially when compared to 1440p:
https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100160979 601305587 4814
As for anything beyond 60 Hz... why? A 1080 Ti often doesn't even reach 60 FPS in a lot (if not most) modern games. Getting a 4K display higher than 60 Hz with no hardware to take advantage of that is pointless. That's like buying a racecar and never taking it on the track. So before we can start demanding 4K @ 144 Hz displays, we first need GPUs that can reliably handle 4K @ 60 FPS. And that comes full-circle to my original point: there is in fact a demand for better hardware.
Senior Member
Posts: 2068
Joined: 2017-03-10
I've been using a 144 Hz monitor for a while and I don't want to go back to 60 Hz if I can help it. GPUs will eventually become fast enough to render 60+ FPS @ 4K, and I don't want to be stuck with a 60 Hz monitor when that happens. It'll be the same with 6K/8K monitors (I'll probably skip the 60 Hz ones). Just my personal preference.
Senior Member
Posts: 7233
Joined: 2012-11-10
> They are on the rise though. 4K displays are steadily becoming more affordable, and most modern applications are built to be compatible with such resolutions. Besides, that's a bit of a chicken and egg situation - are there not many 4K users because there's not enough good/playable 4K content, or is there not enough 4K content because there aren't enough users with 4K displays?
> Back around 2015, the latter was definitely true. As of today, I'd say the former is true.
> The point is: there is actually a demand for better GPUs.