New Signs of a GTX 1100 Series, GeForce GTX 1180 without RT cores?

Interesting. The question is how overpriced those almost useless RTX features really are at the end of the day. I dare say useless because we are into 2019 and even Nvidia doesn't believe in RTX anymore, and neither do the game devs! Only current RTX owners are happy with just 2 games in 4 months.
I don't think you need a complete redesign to have a card with no tensor cores. More likely, you just ship cards with the defective tensor cores disabled. Surely cards come off the line where the tensor cores etc. don't work but the basic rasterization works just fine? To me, it would be an obvious way to increase yields and satisfy gamers who (rightfully so) aren't interested in paying the premium for technology that comes with so many sacrifices.
RTX is the new PhysX: a joke. Ray tracing may bring the promise of better image quality and photorealism, but having cores dedicated to it is just a bad idea. Also, I've talked about price before and I'll say it again: a GPU shouldn't be bigger than 250 mm², as space on a wafer is expensive. Nvidia is making bigger and bigger chips, and that translates into bigger prices.
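
To put some rough numbers behind the wafer-cost point, here is a quick dies-per-wafer sketch using the standard approximation (defect yield ignored; the 545 mm² and 754 mm² figures are the commonly cited TU104 and TU102 die sizes, used purely for illustration):

```python
import math

# Rough dies-per-wafer estimate, using the standard approximation
# pi*r^2/A - pi*d/sqrt(2*A), with defect yield ignored.
# 250 mm^2 is the ceiling suggested above; 545 mm^2 and 754 mm^2 are
# the commonly cited TU104 (RTX 2080) and TU102 die sizes.
WAFER_DIAMETER_MM = 300.0  # typical 300 mm wafer

def dies_per_wafer(die_area_mm2: float) -> int:
    radius = WAFER_DIAMETER_MM / 2
    gross = math.pi * radius * radius / die_area_mm2
    edge_loss = math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

for area in (250, 545, 754):
    print(f"{area} mm^2 die -> ~{dies_per_wafer(area)} candidate dies per wafer")
```

Going from a 250 mm² die to a TU102-sized die drops the candidate dies per wafer from roughly 240 to roughly 69 before any yield losses, which is the cost pressure being described.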
Silva:

RTX is the new PhysX: a joke. Ray tracing may bring the promise of better image quality and photorealism, but having cores dedicated to it is just a bad idea. Also, I've talked about price before and I'll say it again: a GPU shouldn't be bigger than 250 mm², as space on a wafer is expensive. Nvidia is making bigger and bigger chips, and that translates into bigger prices.
The vast majority of Turing's size increase comes from the doubled cache, not from the RT cores, which are extremely small.
warlord:

Interesting. The question is how overpriced those almost useless RTX features really are at the end of the day. I dare say useless because we are into 2019 and even Nvidia doesn't believe in RTX anymore, and neither do the game devs! Only current RTX owners are happy with just 2 games in 4 months.
Well said. If those features are so next-gen and necessary, why don't all your cards support them? Are the RTX cards selling so badly that they need to pull a move like this?
Denial:

The vast majority of Turing's size increase comes from the doubled cache, not from the RT cores, which are extremely small.
The images I've found breaking down the die seem to suggest they're roughly 1/6 to 1/4 of the entire functioning GPU (depending on the version). That's not insignificant.
I think the main problem they have is that the sizeable and very significant group of current 1070-1080 Ti owners really doesn't have any acceptable upgrade path. The natural performance gain historically expected here would equate to a 2080 Ti; unfortunately, Nvidia put excess profit before sense (take the early hit you know you can afford) and messed things up quite a bit here. I really don't know how they can get out of this without a backlash from current 2080/Ti owners and early adopters, and that is a shame, since innovation (RT/X), something we all welcome, is at the centre of this. The second major backlash is from developers, who now have reduced optimism about supporting your innovation. Take a lesson from consoles and get your product out there to the masses... DON'T rip loyal and new customers off.
Damn circus. Just keep it 20x0 while you're at it (it should've been 11x0 from the start imo; they're just going to run out of numbers sooner). Work with the new prefix: RTX for the ray-tracing stuff and GTX for regular.
If AMD somehow managed to come out with cards that performed close to the 2xxx (with DXR disabled) at a much lower price, I could certainly see Nvidia countering with an 11xx series that has similar render performance and a competitive price but lacks RT cores. Especially if they could find a way to repurpose 2xxx chips that failed QA because of faulty RT cores.
dirthurts:

The images I've found breaking down the die seem to suggest they're roughly 1/6 to 1/4 of the entire functioning GPU (depending on the version). That's not insignificant.
You have a link?
dirthurts:

Seems to me most of the die is set aside for non-essentials (even up to half depending on which slide you trust).
That's just a graphic; it's not indicative of how the actual die is split. For reference, the Titan RTX has roughly the same CUDA cores per mm² as GP100, which is the only Pascal GPU with FP16v2, and it does that with double the cache size. On TSMC's 7.5T library there is basically no process advantage in terms of density. So there is no way 1/2, 1/4, or even 1/6 of these dies is made up of RT cores. Tensor cores aren't required for RT acceleration either, as demonstrated by BF5, which doesn't use them.
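
As a rough sanity check on that density comparison, here is a minimal sketch using full-die FP32 core counts and die sizes as commonly listed on public spec sheets (approximate figures, used only for illustration):

```python
# Back-of-envelope CUDA-cores-per-mm^2 comparison. Core counts and
# die sizes are the commonly listed public figures (approximate),
# not numbers taken from the post itself.
gpus = {
    "Titan RTX (TU102, 12 nm)": {"cuda_cores": 4608, "die_mm2": 754},
    "GP100 (16 nm)":            {"cuda_cores": 3840, "die_mm2": 610},
}

for name, spec in gpus.items():
    density = spec["cuda_cores"] / spec["die_mm2"]
    print(f"{name}: {density:.2f} CUDA cores per mm^2")

# Both land around 6 CUDA cores per mm^2, matching the
# "roughly the same CUDA/mm^2" comparison above.
```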
Looks like NGREEDIA is pulling an Intel move. "Fact is they are missing out on a lot of sales"... my ass. JUST LOWER THE PRICES OF THE RTX CARDS!!!
There is nothing wrong with the RTX lineup apart from the price. In a few years' time, when RT is the norm, people will eventually switch, but atm it's too much money for very little performance improvement over the 1080 Ti. RT remains a tough sell because we don't really see anything real come from it yet. Is it a case of the tech having outrun the software side of things? Because I see no RT games being talked about apart from those we already know about. Who knows what Nvidia's next move is; they seem to have gone off the idea of G-Sync and are looking to enable FreeSync instead. Is that a sign of a loss to AMD or what?
It's like Intel's latest generation of processors with the disabled iGPU. Same tactics, same ideology in a way. Perhaps they are going to answer the upcoming cheap RTG solutions properly; they can't convince medium-sized pockets anymore. If you add VRR support for monitors, they are starting to focus on the high-, mid- or even low-end gamers who traditionally chose AMD. Truth be told, the average gamer does not give a frak about ray tracing. He or she needs lots of frames at any resolution just to play! Few people stand still to enjoy sightseeing reflections and lights.
I think if they do, it's a smart move: suck up all the mainstream/high-end sales, plus they have a line of higher-priced cards, which makes keeping the lead easier.
Reddoguk:

Who knows what Nvidia's next move is; they seem to have gone off the idea of G-Sync and are looking to enable FreeSync instead. Is that a sign of a loss to AMD or what?
Well, HDMI 2.1 has Adaptive Sync built into the spec. They basically have no advantage over the free offering, especially now that it's going to be built into basically every monitor in the near future, unless DisplayPort somehow manages 100% saturation. G-Sync was always going to lose...
warlord:

Truth be told, the average gamer does not give a frak about ray tracing. He or she needs lots of frames at any resolution just to play! Few people stand still to enjoy sightseeing reflections and lights.
This is my stance. I'd rather have smaller dies without RT, or larger dies using that space to give me more rasterization performance. Cutting performance in half to turn the feature on, plus dedicating expensive die space to it, is just a double loss in my eyes. Sure, it may be the future, but it's not right now.
You guys will tell a different story when Metro comes out. It seems like RTX will look much better in comparison. If only the 2060/2070 can handle it.
Undying:

You guys will tell a different story when Metro comes out.
I'm not so sure. Primarily, those games tend to be huge performance sinks when they launch anyway, requiring high specs to even run well. Adding RTX effects on top of that? Plus, are the devs really going to rework the entire game to use global illumination, or just a few choice rooms here and there? Will you be able to tell the difference between them? I'm betting the implementation will be limited, as the "it just works" line is obviously not true; it requires huge amounts of optimization to run well.
Undying:

You guys will tell a different story when Metro comes out. It seems like RTX will look much better in comparison. If only the 2060/2070 can handle it.
Maybe yes, maybe not. It will be the first real RTX-enabled game since launch. Six months after the new generation's GPU release, it could finally show us the joy of having an RTX 20 series GPU. I can't wait. 😀 But the question is: if a GTX 11 series is coming to production, why is Nvidia choosing the RTX 20 series for laptops? We know they are going to be rather weak. They could have offered GTX for mobile.