NVIDIA: GeForce 1000 Series To Sell into 2019 - likely no RTX for Lower End Cards

That sucks, but I'm feeling pretty good about buying two 1070s right when they first released. The price only went up with the mining craze, and now there's nothing new on the horizon (as a worthy replacement in the price range I prefer) for possibly the longest we've ever gone. Unless AMD pulls a rabbit out of their hat, it's looking stagnant. This may be like after Sandy Bridge released. I hope not - it took a long time for a Sandy to be worth replacing. That's STILL a solid CPU series, even now.
Venix:

There is no doubt in my mind that for this and at least the next generation, Nvidia's RTX will be what HairWorks is now. The performance tax is way too high, and support is lacking outside the high end. Most likely the 2xxx cards will never use it outside of a few games and benchmarks.
It's actually a very interesting and needed feature for the future, and AMD will probably have to adopt it too, but it will require a non-vendor-specific implementation, imo, for it to really take off. It's the next step in lighting. Not that I'm defending the 20XX cards - they're overpriced and it's too soon to care about RTX atm.
Buy used 10XX cards. They're crazy cheap now and there are tons of them. Count on really good deals this holiday season for new 10xx cards as well. The used market is flooded. Now is the best time to upgrade your 4-year-old card with a used 1080.
Maybe if they ever did decide to do ray tracing for the lower-tier cards like an RTX 2060 or RTX 2050 Ti, they'd include the same number of ray tracing & tensor cores as the RTX 2070 and then decrease the number of conventional CUDA cores - so it would be a less powerful product overall, but still retain a minimum acceptable level of ray tracing. After all, even the ray tracing cores in the RTX 2080 Ti are only targeting 1080p 60fps in one game so far, so it wouldn't be hard for the reduced number of CUDA cores in an RTX 2060 or 2050 Ti to push 1080p 60fps in pure conventional rendering. In fact this is probably a better match than the overpowered conventional rendering of the RTX 2080 Ti paired with its minimal ray tracing capabilities - the ray tracing would severely bottleneck the rest of the GPU. This is my hunch based on what we know so far - I don't see why they couldn't include ray tracing in their lower-tier cards, except perhaps on the point of cost/profit, although it might look bad if an RTX 2050 Ti were getting the same fps in ray tracing games as an RTX 2070!

Further on from these thoughts: after a generation or two of ray tracing cards, when the ray tracing portion of the card's performance is balanced against the overall rendering power of the conventional CUDA cores - that is when ray tracing will have truly arrived, without compromise! Maybe it will take more than one or two generations, I don't know. Maybe it will take a lot longer, and things will only be truly balanced when the only lighting scheme in games is ray tracing and developers no longer use any 'cheat lighting'. There'll probably be a whole new, fundamentally different GPU architecture by then, though!
Robbo9999:

Maybe if they ever did decide to do ray tracing for the lower-tier cards like an RTX 2060 or RTX 2050 Ti, they'd include the same number of ray tracing & tensor cores as the RTX 2070 and then decrease the number of conventional CUDA cores - so it would be a less powerful product overall, but still retain a minimum acceptable level of ray tracing. After all, even the ray tracing cores in the RTX 2080 Ti are only targeting 1080p 60fps in one game so far, so it wouldn't be hard for the reduced number of CUDA cores in an RTX 2060 or 2050 Ti to push 1080p 60fps in pure conventional rendering. In fact this is probably a better match than the overpowered conventional rendering of the RTX 2080 Ti paired with its minimal ray tracing capabilities - the ray tracing would severely bottleneck the rest of the GPU. This is my hunch based on what we know so far - I don't see why they couldn't include ray tracing in their lower-tier cards, except perhaps on the point of cost/profit, although it might look bad if an RTX 2050 Ti were getting the same fps in ray tracing games as an RTX 2070!
That's what I thought about AMD and their next cards with ray tracing capability. If there is hardware capable of doing RT, it would be meaningless to have just a few units; the card needs to deliver sufficient RT performance. And once it does, it's a pretty bad idea to pair that good RT performance with weak rasterization, so it has to have enough SP/TMU/ROPs (or clock). There is a bare minimum for RT, and once that is met, there is a bare minimum for rasterization - but either of them may be the deciding factor in maximum achievable fps, give or take a few percent. The same applies to nV. And while their RT additions do not cost them that many transistors, they do not want to make this investment for lower-end cards. From my point of view, though, even the current GTX 1070 has good enough rasterization capability to deserve RT cores, so if an RTX 2060 has the same rasterization performance, it would still be feasible to give it RT. I think they are more afraid of giving away those special computational cores for cheap, as it may prove to hurt their business class one day. And btw, I am sure that even with the same number and clock of RT cores, the card with stronger rasterization would deliver more fps. If RT takes 2ms of rendering time on both cards, and one card has twice as powerful rasterization, then one card's rasterization takes 4ms and the other's 8ms, so the final composition would be like 6ms vs. 10ms. Then post-processing and so on, but the difference would be there.
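A minimal sketch, in Python, of the serial frame-time arithmetic in the post above (the 2 ms RT cost and the 4 ms vs. 8 ms raster times are the post's hypothetical numbers, not measurements, and the function names are mine):

```python
# Toy serial frame-time model: total frame time = RT time + raster time.
# All numbers are the hypothetical figures from the post above.

def total_frame_ms(rt_ms: float, raster_ms: float) -> float:
    """Serial model: ray tracing runs, then rasterization."""
    return rt_ms + raster_ms

def fps(frame_ms: float) -> float:
    return 1000.0 / frame_ms

rt_ms = 2.0                                  # same RT hardware on both cards
fast = total_frame_ms(rt_ms, raster_ms=4.0)  # card with 2x the raster power
slow = total_frame_ms(rt_ms, raster_ms=8.0)  # card with half the raster power

print(f"fast card: {fast:.0f} ms/frame -> {fps(fast):.0f} fps")  # 6 ms -> 167 fps
print(f"slow card: {slow:.0f} ms/frame -> {fps(slow):.0f} fps")  # 10 ms -> 100 fps
```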
Fox2232:

I think they are more afraid of giving away those special computational cores for cheap, as it may prove to hurt their business class one day. And btw, I am sure that even with the same number and clock of RT cores, the card with stronger rasterization would deliver more fps. If RT takes 2ms of rendering time on both cards, and one card has twice as powerful rasterization, then one card's rasterization takes 4ms and the other's 8ms, so the final composition would be like 6ms vs. 10ms. Then post-processing and so on, but the difference would be there.
Yes, I think they may be concerned about giving away ray tracing too cheaply on their 2060 and lower cards - maybe it costs too much to add ray tracing to cheaper cards (not enough profit). I know that's a different angle from the 'hurt their business class' point you mentioned, but it's profit-related all the same. I'm not sure I agree with that last paragraph you wrote, though. Ray tracing costs a lot more than the 2ms you mentioned - in fact that's the point: ray tracing costs lots and lots of milliseconds, to the point that this portion takes longer than the rasterization. Ray tracing is the bottleneck, which is why BF V is targeting only 1080p 60fps with ray tracing turned on with an RTX 2080 Ti! So that may be why they wouldn't want to add that minimum level of acceptable ray tracing performance (let's say RTX 2070-level) to lower-tier cards: they would likely all perform the same in ray tracing games, which wouldn't provide sufficient differentiation and would just look bad for them.
Robbo9999:

Yes, I think they may be concerned about giving away ray tracing too cheaply on their 2060 and lower cards - maybe it costs too much to add ray tracing to cheaper cards (not enough profit). I know that's a different angle from the 'hurt their business class' point you mentioned, but it's profit-related all the same. I'm not sure I agree with that last paragraph you wrote, though. Ray tracing costs a lot more than the 2ms you mentioned - in fact that's the point: ray tracing costs lots and lots of milliseconds, to the point that this portion takes longer than the rasterization. Ray tracing is the bottleneck, which is why BF V is targeting only 1080p 60fps with ray tracing turned on with an RTX 2080 Ti! So that may be why they wouldn't want to add that minimum level of acceptable ray tracing performance (let's say RTX 2070-level) to lower-tier cards: they would likely all perform the same in ray tracing games, which wouldn't provide sufficient differentiation and would just look bad for them.
You are right about the time portion; it was just an example. Out of a 16ms budget for 60fps gaming, RT may be 8ms, since the game goes above 100fps without it. And I wondered about something there. In the videos, there were clearly portions with just a few reflective surfaces needing RT, and then there were places where reflective surfaces had massive screen coverage. Those should have choked performance badly, yet they did not. I think they have some clever tricks and can adjust ray depth/count per surface type. And if they added one extra layer which multiplied this ray depth/count based on rendering time, then RT would just be done at lower quality. Possibly adding a slider for "RT target fps", so one could sacrifice fps to gain visuals, or gain fps at the cost of visuals.
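A minimal sketch of the kind of feedback loop being speculated about here - scaling rays per pixel against a target frame time. The class, names, and thresholds are all made up for illustration; nothing below is any vendor's actual implementation:

```python
# Hypothetical "RT target fps" controller: adjust rays per pixel so the
# measured frame time converges on a user-chosen target. A sketch of the
# idea from the post above, not a real API.

class RayBudgetController:
    def __init__(self, target_fps: float, min_rays: int = 1, max_rays: int = 8):
        self.target_ms = 1000.0 / target_fps
        self.min_rays = min_rays
        self.max_rays = max_rays
        self.rays_per_pixel = max_rays

    def update(self, last_frame_ms: float) -> int:
        """Call once per frame with the measured frame time."""
        if last_frame_ms > self.target_ms * 1.05 and self.rays_per_pixel > self.min_rays:
            self.rays_per_pixel -= 1  # over budget: trade visuals for fps
        elif last_frame_ms < self.target_ms * 0.90 and self.rays_per_pixel < self.max_rays:
            self.rays_per_pixel += 1  # headroom: trade fps for visuals
        return self.rays_per_pixel

# A 60fps target; a heavy reflective scene (22 ms, 20 ms) drops the ray count.
ctrl = RayBudgetController(target_fps=60)
for frame_ms in (14.0, 22.0, 20.0, 15.0):
    print(f"{frame_ms} ms -> {ctrl.update(frame_ms)} rays/pixel")
```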
Fox2232:

You are right about the time portion; it was just an example. Out of a 16ms budget for 60fps gaming, RT may be 8ms, since the game goes above 100fps without it. And I wondered about something there. In the videos, there were clearly portions with just a few reflective surfaces needing RT, and then there were places where reflective surfaces had massive screen coverage. Those should have choked performance badly, yet they did not. I think they have some clever tricks and can adjust ray depth/count per surface type. And if they added one extra layer which multiplied this ray depth/count based on rendering time, then RT would just be done at lower quality. Possibly adding a slider for "RT target fps", so one could sacrifice fps to gain visuals, or gain fps at the cost of visuals.
Ah, I think ray tracing happens in parallel with the card's other rendering activities - I'm pretty sure I saw that during the NVidia reveal presentation. That was my main reason for calling RT a bottleneck, as the rest of the card would just be waiting for RT to 'finish'. But even if RT happens in serial with the card's other activities, it still probably represents the most significant chunk of frame time, so increased conventional rendering power would only reduce frame time marginally compared to cards with fewer CUDA cores. In that situation, if they offered RTX 2070-level ray tracing capability on lower cards, I don't think there would be a significant frames-per-second difference between any of them in ray tracing games - not enough to look good for NVidia, so that might be a driving force in not giving lower cards RT capability. Yes, the slider adjustment option for RT in games sounds viable; I've heard that before, it makes sense, and I think it will happen.
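A minimal sketch contrasting the two models being debated here - serial (frame time is the sum of the stages) vs. parallel (the slower stage sets the frame time). The 12 ms RT figure is an illustrative assumption standing in for "RT dominates the frame", not a measurement:

```python
# Two toy frame-time models for combining RT and rasterization.

def serial_ms(rt_ms: float, raster_ms: float) -> float:
    """RT and raster run one after the other."""
    return rt_ms + raster_ms

def parallel_ms(rt_ms: float, raster_ms: float) -> float:
    """RT and raster overlap; the slower stage sets the frame time."""
    return max(rt_ms, raster_ms)

rt_ms = 12.0
for raster_ms in (4.0, 8.0):
    print(f"raster {raster_ms} ms: serial {serial_ms(rt_ms, raster_ms)} ms, "
          f"parallel {parallel_ms(rt_ms, raster_ms)} ms")
# raster 4.0 ms: serial 16.0 ms, parallel 12.0 ms
# raster 8.0 ms: serial 20.0 ms, parallel 12.0 ms
# In the parallel model, doubling raster power changes nothing while RT
# dominates - which is why cards sharing the same RT hardware would all
# land on roughly the same fps.
```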
Robbo9999:

Ah, I think ray tracing happens in parallel with the card's other rendering activities - I'm pretty sure I saw that during the NVidia reveal presentation. That was my main reason for calling RT a bottleneck, as the rest of the card would just be waiting for RT to 'finish'. But even if RT happens in serial with the card's other activities, it still probably represents the most significant chunk of frame time, so increased conventional rendering power would only reduce frame time marginally compared to cards with fewer CUDA cores. In that situation, if they offered RTX 2070-level ray tracing capability on lower cards, I don't think there would be a significant frames-per-second difference between any of them in ray tracing games - not enough to look good for NVidia, so that might be a driving force in not giving lower cards RT capability. Yes, the slider adjustment option for RT in games sounds viable; I've heard that before, it makes sense, and I think it will happen.
If it goes in parallel, then it is a pretty bad failure on nVidia's side from an engineering perspective. They know as well as the next guy doing RT in software how many rays per second they need to produce for a given real-time result, and the cards should have been designed around that. Well, probably more tricks will be needed even for the RTX 2080 Ti once the next generation with 30~50% higher RT performance comes around.
Fox2232:

If it goes in parallel, then it is a pretty bad failure on nVidia's side from an engineering perspective. They know as well as the next guy doing RT in software how many rays per second they need to produce for a given real-time result, and the cards should have been designed around that. Well, probably more tricks will be needed even for the RTX 2080 Ti once the next generation with 30~50% higher RT performance comes around.
Parallel is more efficient than serial when it comes to overall frame time, but we were just hypothesising about the effect of putting a set minimum level of RT on lower-end cards with less rendering power - that is, how differing levels of conventional rendering power would reduce or increase frames per second in those situations. (That's a complicated sentence for a complicated topic, but we've discussed it, so I shouldn't try to add more words to it!) Yes, I agree, the generation of cards after Turing will increase that RT performance - I think those will be the ones to get.
DLD:

"Actually it is rumored that they likely do not get Raytracing cores, and actually might remain in the GTX line of naming. It could also mean it might take a while before cards like 2060 would hit the market." It looks like No-vidia is in a deep sh**. That is, the shelves are still full of (overpriced) cards and the new process is doubtful in terms of a yield and overall quality. Greediness takes it's toll. And most certainly (not just likely) they do not get my money in foreseeable future. Until then, they will offer "raytracing cores" for free, just to sell any kind of cards.
If they have excess supply, it would be due to everyone screaming at them to increase production during much of last year, because it was impossible to find most cards and retailers put a huge price increase on them. It seems like some of you just like to ignore what actually happened because you see an opportunity to talk s***. Also, AMD has stated that Navi won't hit until at least 2H 2019. That being the case, at least nVidia is releasing new cards while AMD is still selling Vega 64 for the same price as a 1080 Ti but with worse performance. Bravo!
Andrew LB:

If they have excess supply, it would be due to everyone screaming at them to increase production during much of last year, because it was impossible to find most cards and retailers put a huge price increase on them. It seems like some of you just like to ignore what actually happened because you see an opportunity to talk s***. Also, AMD has stated that Navi won't hit until at least 2H 2019. That being the case, at least nVidia is releasing new cards while AMD is still selling Vega 64 for the same price as a 1080 Ti but with worse performance. Bravo!
Not quite. Nvidia overproduced not because of gamers screaming at them, but because of windfall profits from crypto... which they then over-relied on. And AMD can do as it likes, as they have sold out all of their production without overproducing. The fact that a person can buy all the GPU they need to game, matched with a FreeSync monitor that is otherwise the same as a G-Sync monitor but costs hundreds less, is a potent combination. You are being a fanboy, and I say that as an owner of two Nvidia cards (1080 Ti, 1070).
Newegg has three models of Vega 64, the Sapphire Nitro being one at $499 with a three-game bundle. Not quite the price of a 1080 Ti, but hey.
I just want to point out that AIB vendors sign contracts with Nvidia for chip shipments. They estimate demand, not Nvidia. Nvidia doesn't even profit off the retail price hikes on GPUs during the crypto craze, just total sales... the cost is negotiated months in advance.
Denial:

I just want to point out that AIB vendors sign contracts with Nvidia for chip shipments. They estimate demand, not Nvidia. Nvidia doesn't even profit off the retail price hikes on GPUs during the crypto craze, just total sales... the cost is negotiated months in advance.
Actually, Huang mentioned in one of his "speeches" that they did increase pricing in the middle of the crypto craze.
So no 2060 anytime soon 🙁
Kaarme:

Considering the 2070 will cost more than a 1080, they kind of have the lower range covered for the time being even without releasing anything new. At least Nvidia didn't pull an AMD and rebrand old Pascals to cover the mainstream and entry levels of the 2000 series.
I don't know why, but it smells to me like the 2050 and 2060 will be rebranded Pascal. Wait and see.
Interesting move by Nvidia this time around; let's see where this is going...
Corrupt^:

It's actually a very interesting and needed feature for the future, and AMD will probably have to adopt it too, but it will require a non-vendor-specific implementation, imo, for it to really take off. It's the next step in lighting. Not that I'm defending the 20XX cards - they're overpriced and it's too soon to care about RTX atm.
I agree 100% with you. If this is going mainstream, it needs both vendors to support it at some level, and MOST certainly not only on high-end to enthusiast-class cards.