New Signs of a GTX 1100 Series, GeForce GTX 1180 without RT cores?
It has been a rumor for a long time now: would NVIDIA be ballsy enough to release a 1100 series that has no Tensor and RT cores? Fact is, they are missing out on a lot of sales, as the current pricing stack is just too much to swallow for many people.
Meanwhile, Andreas over at Hardwareluxx (a German website) noticed that an NVIDIA GeForce GTX 1180 has appeared in the online database of GFXBench 4.0. The device ID is already recognized, and the hardware information also hints at a similarity to the Turing cards, as the entry is labeled "GeForce RTX 2080 / PCIe / SSE2".
The entry could be indicative of a complete GTX 11-series product line. Looking closer, the entry shown above points to a GeForce GTX 1180 with specs similar to the GeForce RTX 2080 (2,944 shader units), and correspondingly a GeForce GTX 1160 matching the GeForce RTX 2060 (1,920 shader units).
I still don't believe that a 1100 series is inbound, for the simple reason that it is too expensive to design two architectures; it just does not make much sense. But evidence and leaks are slowly proving my ideas on this wrong.
Senior Member
Posts: 2389
Joined: 2018-04-10
Historically, this is how it has always worked. With a new generation, GPUs step up about one tier: a new x70 roughly replaces the old x80, the new x80 replaces the old x80 Ti, and so forth. So with a new generation, you get one "step" of performance "for free". Only Pascal was different, because the performance boost was much bigger than in previous generations, and with Turing the new generation is a bit too expensive.
So if these 11-series cards are in fact real, and bring down the cost, then it would be the usual generational step-up.
You're wrong. The new x70 card matched or outperformed the previous generation's x80 Ti card.
The 970 matched the 780 Ti and the 1070 matched the 980 Ti.
Member
Posts: 21
Joined: 2016-06-20
Ray tracing as a technology is good and a step forward in graphics... from a tech point of view. But in reality, from a gamer's perspective, when you play BFV with RTX on versus off, is it a big deal in terms of immersion? I don't think so. Another problem with this PC-exclusive technology is that today the console market dictates what will become a standard and what will not (where are those DX12 games?). And a further problem is that this generation's performance is not good enough for RT games.
Senior Member
Posts: 487
Joined: 2016-10-25
I must have missed that, but what is the second game that supports RTX?

Junior Member
Posts: 6
Joined: 2017-03-20
I find the hate for RTX laughable. It's a good thing, and it is ultimately the direction rendering is going to go in the future. The problem is that it is currently too expensive and that nVidia, arguably, should not have primarily targeted the gaming community. However, I do think they should have released the cards, and I'm glad they did. At some point the technology needs to be introduced to the market, so why not now? Sure, the costs are prohibitive, but there will always be a fortunate few who are able to enjoy these products and swallow the premium they demand. I have absolutely no issue with that whatsoever. I'd love to drive around in a brand new Ferrari, but I can't afford it; that doesn't mean others should not be able to enjoy those levels of performance, luxury and features.
I've been writing 3D graphics software professionally now for nearly 30 years and absolutely welcome this direction towards ray / path tracing and dedicated hardware to accelerate the process. There's still a way to go, but this is just the first step. It can feasibly be argued that a ray tracing engine is actually easier to implement than a cutting-edge 3D rasterization-based engine, as you don't have to develop novel solutions and workarounds to simulate light bouncing around an environment. Ray tracing (and forms thereof) provides a good chunk of the results of the rendering equation as-is. It's just such an incredibly computationally intensive process, and that's why performance (i.e. real time at high fps) will take some time to arrive at the resolutions gamers expect these days.
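To make that last point concrete, here is a minimal diffuse path tracer in Python. It is purely an illustrative toy sketch with a few hard-coded spheres, random hemisphere bounces and a sky light; it is not anyone's production code and has nothing to do with RTX hardware. The whole renderer fits in a few dozen lines, which supports the "easier to implement" argument, yet even this toy scene performs roughly width x height x samples x bounces intersection tests, which is exactly why real-time ray tracing at the resolutions and frame rates gamers expect is so demanding.

```python
# Toy path tracer: a few spheres, diffuse bounces, a sky light.
# Monte Carlo averaging of random bounce paths approximates the rendering
# equation; resolution and sample count are kept tiny so it finishes quickly.
# Writes the result to "out.ppm".
import math
import random

def add(a, b):      return (a[0] + b[0], a[1] + b[1], a[2] + b[2])
def sub(a, b):      return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def mul(a, s):      return (a[0] * s, a[1] * s, a[2] * s)
def hadamard(a, b): return (a[0] * b[0], a[1] * b[1], a[2] * b[2])
def dot(a, b):      return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def norm(a):        return mul(a, 1.0 / math.sqrt(dot(a, a)))

# Scene: (center, radius, diffuse albedo). The huge sphere acts as the floor.
SPHERES = [
    ((0.0, -100.5, -1.0), 100.0, (0.8, 0.8, 0.8)),
    ((0.0,    0.0, -1.0),   0.5, (0.7, 0.3, 0.3)),
    ((1.0,    0.0, -1.0),   0.5, (0.3, 0.3, 0.7)),
]

def hit_sphere(center, radius, origin, direction):
    """Return the nearest positive ray parameter t, or None on a miss."""
    oc = sub(origin, center)
    a = dot(direction, direction)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 1e-3 else None

def trace(origin, direction, depth=0):
    """Follow one ray through the scene, accumulating diffuse bounces."""
    if depth >= 4:                       # cap the number of bounces
        return (0.0, 0.0, 0.0)
    nearest, hit = None, None
    for center, radius, albedo in SPHERES:
        t = hit_sphere(center, radius, origin, direction)
        if t is not None and (nearest is None or t < nearest):
            nearest, hit = t, (center, radius, albedo)
    if hit is None:                      # missed everything: return sky light
        return (0.6, 0.7, 1.0)
    center, radius, albedo = hit
    point = add(origin, mul(direction, nearest))
    normal = norm(sub(point, center))
    # Pick a random bounce direction biased around the surface normal
    while True:
        d = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
        if 0.0 < dot(d, d) <= 1.0:
            break
    bounce = norm(add(normal, norm(d)))
    return hadamard(albedo, trace(point, bounce, depth + 1))

def render(width=64, height=48, samples=8):
    with open("out.ppm", "w") as f:
        f.write("P3\n%d %d\n255\n" % (width, height))
        for j in range(height):
            for i in range(width):
                col = (0.0, 0.0, 0.0)
                for _ in range(samples):   # average several jittered rays per pixel
                    u = (i + random.random()) / width * 2 - 1
                    v = 1 - (j + random.random()) / height * 2
                    ray = norm((u * width / height, v, -1.0))
                    col = add(col, trace((0.0, 0.0, 0.0), ray))
                col = mul(col, 1.0 / samples)
                f.write("%d %d %d\n" % tuple(int(255 * min(1.0, c)) for c in col))

if __name__ == "__main__":
    render()
```

Bump the resolution or sample count and the cost explodes immediately; that brute-force scaling is the problem dedicated RT hardware is trying to attack.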