New Rumors: GeForce GTX 1180, 2080
It has been a long time and a long road of rumors, but those rumors now slowly seem to be solidifying. Nvidia is said to be briefing engineering employees on a new graphics card, and the series would be called Turing.
The name Turing has come up a couple of times already. Reuters at one point mentioned an NVIDIA GPU called Turing and tagged it as a gamers' card, which rather contradicts the name: Turing would be better suited to AI and HPC products (think of the Turing test for artificial intelligence). As to what it precisely is, nobody outside the ring of insiders really knows, though most of us expect a Pascal respin with GDDR6 memory. As to what it will be called, GeForce GTX 1180 or 2080, we can only guess.
The latest rumor is that Nvidia is training its development and engineering departments, which really is all the news there is today. The rumor comes from Tom's Hardware Germany, which has "unofficially learned from some board partners that Nvidia has already started training the relevant employees from the development departments."
The piece they wrote leans heavily on the word 'if'. Media also claim that a BoM (bill of materials needed to produce these cards) has been released; however, if you read the article a bit more closely, that is merely an assumption.
"If you follow the 3-month rule, the first board partner cards should appear on the market in late August or early September. However, some of the partners are now expecting a shift of at least two weeks, so that September seems rather plausible"
Well, at least we know something is brewing in that kettle.
Stage | Meaning | Duration
---|---|---
BOM release | Bill of Materials release | begin
EVT | Engineering Validation Test | 1-2 weeks
DVT | Design Validation Test | 2 weeks
WS | Working Sample | 1-2 weeks
EMI Test | Electromagnetic Interference Test | less than a week
PVT | Production Validation Test | 2-3 weeks
PVT sorting | |
PPBIOS | Final BIOS | a few days
Ramp & MP | Mass production and shipping | a few days
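For a rough sanity check on the "3-month rule" quoted above, the stage durations from the table can simply be summed. A minimal sketch in Python, assuming the midpoint of each range (the BOM release date below is purely hypothetical, not a leaked date):

```python
from datetime import date, timedelta

# Stage durations in weeks, taken from the table above; where the table
# gives a range, the midpoint is used (an assumption, not a confirmed schedule).
STAGES = {
    "EVT (Engineering Validation Test)": 1.5,
    "DVT (Design Validation Test)": 2.0,
    "WS (Working Sample)": 1.5,
    "EMI Test": 0.5,
    "PVT (Production Validation Test)": 2.5,
    "PPBIOS (Final BIOS)": 0.5,
    "Ramp & MP (Mass production and shipping)": 0.5,
}

def estimate_launch(bom_release: date) -> date:
    """Walk the bring-up stages from BOM release to cards shipping."""
    current = bom_release
    for stage, weeks in STAGES.items():
        current += timedelta(weeks=weeks)
        print(f"{stage:<45} done around {current}")
    return current

# Hypothetical BOM release date, only to illustrate the arithmetic.
print("Cards on shelves around:", estimate_launch(date(2018, 6, 1)))
```

Taking the low or high end of each range instead gives roughly 7 to 11 weeks in total, which is how a BOM release translates into a launch about three months later once some slack is added.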
Senior Member
Posts: 11808
Joined: 2012-07-20
Games have quite poor per-pixel quality. While increased resolution allows for higher detail, the per-pixel quality stays just as poor; there are simply more pixels.
Native 1440p vs. downsampling from 1440p to 1080p does not make enough of a difference to pay extra or to sacrifice screen refresh rate. The same goes for native 4K vs. downsampling from 4K to 1440p.
Yes, downsampling and injecting extra shader effects take the user's time, and native is just a bit better. But one does not need to go to a higher resolution before games deliver all the features required.
It is like watching in-game video from an upcoming game. You open the news post and watch it in that small embedded YouTube window, and you think:
"Wow, such fidelity, such per-pixel quality, both in effects and geometry."
But then you switch it to fullscreen, and you see all the places where they had to take a step back on shaders, geometry, and so on. And it does not matter that I run a 1080p screen; even when the video was recorded in 4K and I play it back in 4K, it still does not reach high enough per-pixel quality to make me think about a 1440p screen.
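To make concrete what downsampling does mechanically, here is a minimal sketch with NumPy, assuming a clean integer 2:1 ratio (e.g. a 4K render shown on a 1080p screen) and a simple box filter; real scalers handle non-integer ratios like 1440p-to-1080p and use better filters:

```python
import numpy as np

def downsample_2x(frame: np.ndarray) -> np.ndarray:
    """Box-filter a rendered frame down by 2x per axis.

    Every output pixel is the average of a 2x2 block of input pixels,
    so the extra samples rendered at the higher resolution become
    anti-aliasing in the smaller image - more information per pixel.
    """
    h, w, c = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

# A stand-in 4K RGB frame, downsampled to 1080p.
frame_4k = np.random.rand(2160, 3840, 3).astype(np.float32)
print(downsample_2x(frame_4k).shape)  # (1080, 1920, 3)
```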
Senior Member
Posts: 11808
Joined: 2012-07-20
HDMI 2.1 does not officially support 240Hz:
https://www.hdmi.org/manufacturer/hdmi-2-1/
My monitor does 240Hz via HDMI, and FreeSync works with it too, in its full range.

Those "standards" are there for bandwidth. Resolution vs refresh rate are just consequence of what manufacturer of screen electronics decides to support.
Senior Member
Posts: 11545
Joined: 2004-05-10
Have you tried 3840x1620? That reduces the pixel count by 25% and gives you a 21:9 ultrawide while maintaining 1:1 scaling (since you are basically just cropping the height). Much easier to run that way, and it looks great on large screens.
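The arithmetic behind that suggestion, as a quick check (nothing here beyond the numbers already in the post):

```python
native_4k = 3840 * 2160  # 8,294,400 pixels
uw_crop = 3840 * 1620    # 6,220,800 pixels

print(1 - uw_crop / native_4k)  # 0.25 -> exactly 25% fewer pixels to render
print(f"{3840 / 1620:.2f}:1")   # 2.37:1, close to the 21:9 ultrawide shape
```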
Senior Member
Posts: 8201
Joined: 2010-11-16
The HardOCP site ran a test maxing out "newer" games (RotR, The Division, Watch Dogs 2, GTA5, etc.):
https://m.hardocp.com/article/2018/05/24/last-gen-games-max-iq-perf-on-todays-gpus
Let's look at The Witcher 3, for example. There is a 16% difference turning on HairWorks on the GeForce GTX 1080 Ti at 1440p, but only a 10% difference turning on HairWorks on the AMD Radeon RX Vega 64 at 1440p using the same settings; the Radeon RX Vega 64 takes less of a performance hit from enabling HairWorks. Another example is Watch Dogs 2: on the GTX 1080 Ti the difference with PCSS shadows is 27%, but on the Radeon RX Vega 64 it is 23%, so it is more efficient there. In The Division the difference is exactly the same.
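For clarity, a percentage hit like those is typically derived from the framerates with the effect off and on. A minimal sketch (the fps values below are hypothetical placeholders, not HardOCP's measurements):

```python
def performance_hit(fps_off: float, fps_on: float) -> float:
    """Percent of framerate lost by enabling an effect."""
    return (fps_off - fps_on) / fps_off * 100

# Hypothetical numbers, purely to show the arithmetic:
print(f"{performance_hit(100, 84):.0f}% hit")  # 16%, the 1080 Ti example
print(f"{performance_hit(80, 72):.0f}% hit")   # 10%, the Vega 64 example
```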


Senior Member
Posts: 6070
Joined: 2011-01-02
The HardOCP site ran a test maxing out "newer" games (RotR, The Division, Watch Dogs 2, GTA5, etc.), and even at 1440p a 1080 Ti was barely enough to get somewhat OK fps, i.e. near or above 60 fps minimums. So no, a 1080 Ti is not strong enough at max game settings, and that includes HFTS shadows, VXAO, supersampling, ...
https://m.hardocp.com/article/2018/05/24/last-gen-games-max-iq-perf-on-todays-gpus
Visually maxed-out games are a really bad benchmark, in my opinion. Games might be poorly optimized in the first place (look at Dishonored 2, for example).
Hell, I wouldn't be surprised if some recent game had fps drops at 1080p/60fps even with a 1080 Ti.
If a consumer must adjust their GPU so that it runs the latest games at max settings flawlessly, they would have to change GPUs whenever a new flagship becomes available... only to run at 1080p.
We shouldn't buy a better GPU just because devs might be too lazy to optimize their engine and instead tell consumers to get a better GPU to enjoy all the eye candy.
We have stable benchmarks like 3DMark and others to tell us how well a GPU performs.