New Signs of a GTX 1100 Series, GeForce GTX 1180 without RT cores?

https://forums.guru3d.com/data/avatars/m/250/250418.jpg
RTX is the new PhysX: a joke. Ray tracing may bring the promise of better image quality and photorealism, but having cores dedicated to it is just a bad idea. Also, I've talked about price before and I'll say it again: a GPU shouldn't be bigger than 250 mm², as space on a wafer is expensive. Nvidia is making bigger and bigger chips, and that's translating into bigger prices.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
Silva:

RTX is the new PhysX: a joke. Ray tracing may bring the promise of better image quality and photorealism, but having cores dedicated to it is just a bad idea. Also, I've talked about price before and I'll say it again: a GPU shouldn't be bigger than 250 mm², as space on a wafer is expensive. Nvidia is making bigger and bigger chips, and that's translating into bigger prices.
The vast majority of Turing's size increase is the cache doubling and not the RT cores - which are extremely small.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
warlord:

Interesting. The question is how overpriced those almost useless RTX features are at the end of the day. I dare say useless because we are into 2019 and even Nvidia doesn't seem to believe in RTX anymore, and neither do game devs! Only current RTX owners are happy, with just 2 games in 4 months.
Well said. If those features are so next-gen and necessary, why don't all your cards support them? Are the RTX cards selling so badly that they need to pull this move?
https://forums.guru3d.com/data/avatars/m/225/225706.jpg
Damn circus. Just keep it 20x0 while you're at it (it should've been 11x0 from the start imo; they're just going to run out of numbers sooner). Work with the new prefix: RTX for the ray tracing stuff and GTX for regular.
https://forums.guru3d.com/data/avatars/m/226/226864.jpg
If AMD somehow managed to come out with cards that performed close to the 2xxx (with DXR disabled) at a much lower price, I could certainly see Nvidia countering with an 11xx series that has similar render performance and a competitive price but lacks RT cores. Especially if they could find a way to repurpose 2xxx chips that failed QA because of faulty RT cores.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
dirthurts:

The images I've found breaking down the die seem to suggest they're roughly 1/6 to 1/4 of the entire functioning GPU (depending on the version). That's not insignificant.
You have a link?
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
dirthurts:

Seems to me most of the die is set aside for non-essentials (even up to half depending on which slide you trust).
That's just a graphic; it's not indicative of how the actual die is split. For reference, the Titan RTX has roughly the same CUDA cores per mm² as GP100, which is the only Pascal GPU with FP16v2, and it does that with double the cache size. On TSMC's 7.5T library there is basically no process advantage in terms of density. So there is no way 1/2, 1/4, or even 1/6 of these dies is made up of RT cores. Tensor cores aren't required for RT acceleration either, as demonstrated by BF5, which doesn't use them.
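For a rough sanity check of that density claim, here's a back-of-the-envelope comparison. The die sizes and core counts below are the commonly quoted full-die figures, so treat them as approximations for illustration, not exact measurements:

```python
# Back-of-the-envelope CUDA-core density check.
# Die sizes / core counts are the commonly quoted full-die figures
# (assumptions for illustration, not exact measurements).
chips = {
    "GP100 (Pascal, 16FF)":  {"cuda_cores": 3840, "die_mm2": 610},
    "TU102 (Turing, 12FFN)": {"cuda_cores": 4608, "die_mm2": 754},
}

for name, spec in chips.items():
    density = spec["cuda_cores"] / spec["die_mm2"]
    print(f"{name}: {density:.2f} CUDA cores / mm^2")

# Both land around 6 cores per mm^2, i.e. the Turing die didn't
# suddenly lose a third or half of its area to RT hardware.
```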
https://forums.guru3d.com/data/avatars/m/167/167379.jpg
Looks like NGREEDIA is pulling an Intel move. "Fact is they are missing out on a lot of sales" ..... my ass.. JUST LOWER THE PRICES OF THE RTX CARDS!!!
https://forums.guru3d.com/data/avatars/m/225/225084.jpg
There is nothing wrong with the RTX lineup apart from the price. In a few years' time, when RT is the norm, people will eventually switch, but atm it's too much money for very little performance improvement over the 1080 Ti. RT still remains a tough sell because we don't really see anything real come from it yet. Is it a case of the tech outrunning the software side of things? Because I see no RT games being talked about apart from those we already know about. Who knows what Nvidia's next move is; they seem to have gone off the idea of G-Sync and are looking to enable FreeSync instead. Is that a sign of a loss to AMD, or what?
https://forums.guru3d.com/data/avatars/m/122/122801.jpg
I think if they do, it's a smart move: suck up all the mainstream/high-end sales, plus they have a line of higher-priced cards, which makes keeping the lead easier.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
You guys will tell a different story when Metro comes out. Seems like RTX will look much better in comparison. If only the 2060/2070 can handle it.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
So this 1180 is basically a 1080 Ti. It has the same performance without ray tracing. Nvidia, this is a mess.
https://forums.guru3d.com/data/avatars/m/254/254725.jpg
Undying:

So this 1180 is basically a 1080 Ti. It has the same performance without ray tracing. Nvidia, this is a mess.
If it's a lot less than the MSRP of the 1080 Ti, I expect it'll do well.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
I still don't believe that an 1100 series is inbound, for the simple reason that it is too expensive to design two architectures; it just does not make much sense. But evidence and leaks are slowly proving my ideas on this wrong.
I don't think it is 2 different architectures. To my understanding, the tensor and RT cores function largely independently of the rest of the die, so as far as I'm concerned, Nvidia could just laser-cut those parts off (or just not include them at all) and everything should "just work". Think of it like Intel CPUs with and without an iGPU: the CPU architecture is exactly the same regardless of whether you have the iGPU or not, but the models with the iGPU often still have a monolithic chip.
https://forums.guru3d.com/data/avatars/m/235/235344.jpg
DX10 did not do much, DX11 was better, DX12 is still a "why?", so ray tracing... who gives a flying... The main issue was that the performance hit taken for ray tracing required DLSS to erase it. Taking advantage of the new effects requires learning and implementing two new items at once, so of course there are not going to be many jumping on the bandwagon to implement both. Historically, new tech is pushed to the enthusiast crowd first; then, after a good while, it filters down to the mainstream. Market mechanics have not changed. Innovation on this level has not been released for a good while. These one or two games are only for the early adopters, and if adoption does not move past the early adopters, then it is dead in the water. That is why the price of these initial RTX cards is so hefty. It has nothing to do with "Ngreedia." The early adopters foot the bill for the rest of us. The RTX 2060 had to be; I'm just surprised it happened so soon. If the 11XX series comes about, that's just plain sad. It would indicate that we may not see ray tracing in games until pushing the old tech is akin to beating a dead horse. We have been beating a horse... just not sure the horse being dead has been noticed yet.
data/avatar/default/avatar40.webp
Undying:

So this 1180 is basically a 1080 Ti. It has the same performance without ray tracing.
Historically, this is how it has always worked. Each new generation, GPUs step up about one tier: a new x70 roughly replaces the old x80, the new x80 replaces the old x80 Ti, and so forth. So with a new generation, you get one "step" of performance "for free". Only Pascal was different, because the performance boost was much bigger than in previous generations, and with Turing the new generation is a bit too expensive. So if these 11-series cards are in fact real and bring down the cost, it would be the usual generational step-up.
https://forums.guru3d.com/data/avatars/m/273/273822.jpg
nevcairiel:

Historically, this is how it has always worked. Each new generation, GPUs step up about one tier: a new x70 roughly replaces the old x80, the new x80 replaces the old x80 Ti, and so forth. So with a new generation, you get one "step" of performance "for free". Only Pascal was different, because the performance boost was much bigger than in previous generations, and with Turing the new generation is a bit too expensive. So if these 11-series cards are in fact real and bring down the cost, it would be the usual generational step-up.
You're wrong. The new x70 card matched or outperformed the previous generation's x80 Ti card: the 970 matched the 780 Ti and the 1070 matched the 980 Ti.
https://forums.guru3d.com/data/avatars/m/270/270319.jpg
I find the hate for RTX laughable. It's a good thing, and it is ultimately the direction rendering is going to go. The problem is that it is currently too expensive and that Nvidia, arguably, should not have primarily targeted the gaming community. However, I do think they should have released the cards and I'm glad they did. At some point the technology needs to be introduced to the market, so why not now? Sure, the costs are prohibitive, but there will always be a fortunate few who are able to enjoy these products and swallow the premium they demand, and I have absolutely no issue with that whatsoever. I'd love to drive around in a brand-new Ferrari but can't afford one; that doesn't mean others shouldn't be able to enjoy those levels of performance, luxury and features.

I've been writing 3D graphics software professionally for nearly 30 years and absolutely welcome this move towards ray/path tracing and dedicated hardware to accelerate the process. There's still a way to go, but this is just the first step. It can feasibly be argued that a ray tracing engine is actually easier to implement than a cutting-edge rasterization-based engine, as you don't have to develop novel solutions and workarounds to simulate light bouncing around an environment: ray tracing (and forms thereof) provides a good chunk of the rendering equation as-is. It's just such an incredibly computationally intensive process, and that's why real-time performance at high fps will take some time to arrive at the resolutions gamers expect these days.
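To give a feel for the "easier to implement" argument, here is a minimal toy sketch (my own illustration, not from the post or any real engine; the scene, constants and names are made up) of the core of a ray tracer: one ray-sphere intersection test and a single random diffuse bounce. The "light bouncing around" comes out of plain recursion rather than hand-built rasterization tricks, and even a tiny image already fires hundreds of rays, which is exactly why dedicated acceleration hardware and denoising matter at game resolutions:

```python
import math, random

def hit_sphere(center, radius, origin, direction):
    """Return the distance t to the nearest sphere hit in front of the ray, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 1e-4 else None

def trace(origin, direction, depth=0):
    """Shade one ray: a single diffuse sphere, a flat 'sky', one random bounce per hit."""
    sphere_center, sphere_radius = (0.0, 0.0, -3.0), 1.0
    t = hit_sphere(sphere_center, sphere_radius, origin, direction)
    if t is None or depth >= 4:
        return 0.7  # background / sky brightness
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - c) / sphere_radius for h, c in zip(hit, sphere_center)]
    # Indirect light is just recursion: pick a random direction around the
    # surface normal and keep tracing.
    bounce = [n + random.uniform(-1.0, 1.0) for n in normal]
    length = math.sqrt(sum(b * b for b in bounce)) or 1.0
    bounce = [b / length for b in bounce]
    return 0.5 * trace(hit, bounce, depth + 1)  # 0.5 = diffuse albedo

# One sample per pixel of a tiny ASCII "image" -- already hundreds of ray tests.
width, height = 48, 18
for y in range(height):
    row = ""
    for x in range(width):
        direction = ((x / width - 0.5) * 2.0, 0.5 - y / height, -1.0)
        row += "#" if trace((0.0, 0.0, 0.0), direction) > 0.4 else "."
    print(row)
```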
https://forums.guru3d.com/data/avatars/m/56/56686.jpg
Too many naming schemes, too many different setups; is Nvidia trying to confuse people?
https://forums.guru3d.com/data/avatars/m/271/271560.jpg
I'll try not to pile on too much. If Nvidia had launched the 11xx and 20xx at the same time, I would have worn out my fingers writing about how clever they were. Instead, I not only wrote about a premature launch with vaporware features, but also about how Turing was designed for a smaller node than what we have. It's still technically brilliant, but so is the neighbor's car, and you're not quite ready to buy one of those either.