New Signs of a GTX 1100 Series, GeForce GTX 1180 without RT cores?
It has been rumored for a long time now: would NVIDIA be ballsy enough to release an 1100 series that has no Tensor and RT cores? Fact is, they are missing out on a lot of sales, as the current pricing stack is just too much to swallow for many people.
Meanwhile, Andreas over at Hardwareluxx (a German website) noticed that an NVIDIA GeForce GTX 1180 has appeared in the online database of GFXBench 4.0. The device ID is already recognized, and the hardware information also indicates that the GeForce GTX 1180 bears a certain similarity to the Turing cards, because the entry is labeled "GeForce RTX 2080 / PCIe / SSE2".
The entry could be indicative of a complete GTX 11-series product line. Looking closer, the entry shown above indicates a GeForce GTX 1180 with specs similar to the GeForce RTX 2080 (2,944 shader units), and correspondingly a GeForce GTX 1160 matching the GeForce RTX 2060 (1,920 shader units).
I still don't believe that an 1100 series is inbound, for the simple reason that it is too expensive to design two architectures; it just does not make much sense. But evidence and leaks are slowly proving my ideas on this wrong.
Senior Member
Posts: 766
Joined: 2010-03-17
I don't think you need a complete redesign to have a card with no tensor cores. More likely, you just sell cards with defective tensor cores disabled. Surely cards come off the line where the tensor cores etc. don't work but the basic rasterization works just fine?
To me, it would be an obvious way to increase yields and satisfy gamers that (rightfully so) aren't interested in paying the premium for technology that comes with so many sacrifices.
Senior Member
Posts: 1992
Joined: 2013-06-04
RTX is the new PhysX: a joke.
Ray tracing may bring the promise of better image quality and photorealism, but having cores dedicated to it is just a bad idea.
Also, I've talked about price before and I'll say it again: a GPU shouldn't be bigger than 250 mm2, as space on a wafer is expensive. Nvidia is making bigger and bigger chips, and that translates into higher prices.
Senior Member
Posts: 14091
Joined: 2004-05-16
RTX is the new PhysX: a joke.
Ray tracing may bring the promise of better image quality and photorealism, but having cores dedicated to it is just a bad idea.
Also, I've talked about price before and I'll say it again: a GPU shouldn't be bigger than 250 mm2, as space on a wafer is expensive. Nvidia is making bigger and bigger chips, and that translates into higher prices.
The vast majority of Turing's size increase is the cache doubling and not the RT cores - which are extremely small.
Senior Member
Posts: 22303
Joined: 2008-08-28
Well said. If those features are so next-gen and necessary, why don't all your cards support them?
Are the RTX cards selling so badly that they need to pull this move?
Senior Member
Posts: 2760
Joined: 2012-10-22
Interesting. The question is how overpriced those almost useless RTX features are at the end of the day. I dare say useless, because we are in 2019 and even Nvidia doesn't seem to believe in RTX anymore, and neither do game devs! Only current RTX owners are happy, with only 2 games in 4 months.