NVIDIA prepares to suspend production of the GeForce RTX 20 series

The 12nm node Turing uses must be so crowded... It's not like nVidia ordered (and paid for) a certain production capacity and then someone took it away from them.
I was looking at that new fancy A100. It's almost unbelievable they have an 826mm^2 7nm die. I imagine those 12nm fabs will be refitted pretty quickly in order to keep up with NVIDIA's insane demands lmao.
Meanwhile AMD isn't worried because they can always use excess old chips in rebranded "new" video cards.
Great news, release those 3000 cards already! But after watching Jayz2Cents' video about consoles and their AMD GPU, I guess they are doing their annoying "see what the other brand releases first" dance.
Yeah, because suddenly the RTX 2080 Ti can't handle 60 fps in the Watch Dogs Legion preview build at 1080p Ultra with RTX enabled (Digital Foundry notes). RTX, unfortunately, is another gimmick for the masses.
Turanis:

Yeah, because suddenly the RTX 2080 Ti can't handle 60 fps in the Watch Dogs Legion preview build at 1080p Ultra with RTX enabled (Digital Foundry notes). RTX, unfortunately, is another gimmick for the masses.
It should be noted that it did handle 30 fps without DLSS enabled. In that same video you quoted, they mentioned that and showed that the DLSS setting was off, while the gameplay videos of Cyberpunk have DLSS enabled. EDIT: It will be interesting to see how the next-gen consoles handle RTX.
Kaarme:

Meanwhile AMD isn't worried because they can always use excess old chips in rebranded "new" video cards.
Can't disagree, but AMD certainly isn't alone in that practice. Whenever you have a few minutes to waste, research how many "generations" the GF108, one of the most mediocre and recycled GPUs to ever exist, hung around for. It's like the Barnabas Collins of graphics processors. If Turing is ramping down, maybe they could let the GF108 have another go? Just for old times' sake.
mentor07825:

It should be noted that it did handle 30 fps without DLSS enabled. In that same video you quoted, they mentioned that and showed that the DLSS setting was off, while the gameplay videos of Cyberpunk have DLSS enabled. EDIT: It will be interesting to see how the next-gen consoles handle RTX.
They will not... it is DXR (DirectX Raytracing), not RTX.
It's something, though I thought DXR was just the D3D12 integration for ray tracing: the actual solution, and how it operates on capable hardware, would be handled by AMD or NVIDIA, unless Microsoft extends DXR so it can operate on its own, or game engines such as CryEngine handle alternate solutions via compute shaders (mesh shaders then, as D3D12_2 catches on?). That's how I think it was with the current solution, since trying to use DXR on its own was just very slow and more for testing or validation/compatibility. I could be misremembering the details a bit on how it all worked, though.

No idea at all what the PlayStation 5 will be doing; something Sony-specific, perhaps, and not quite the Vulkan API I assume, with Microsoft doing D3D12.x on the Xbox Next or whatever its name actually ends up being.

Going to be an odd little divide here, I expect, at least on PC with the different hardware, even after NVIDIA 3000 and AMD Navi 20 availability, before ray tracing can become a mainstream thing even in part: either it's RTX and thus NVIDIA-only, or there's some support for however AMD's solution works if they have something similar (I doubt their hardware will work with RTX unless NVIDIA and AMD work out a deal here), or we get various game-engine solutions and however those utilize D3D12 and/or Vulkan ray-tracing capabilities on AMD or NVIDIA hardware, through DXR (D3D12) or extensions (Vulkan), with whatever ups, downs, and hoops this leads to initially.

Hmm, now that NVIDIA is slowing down production of the 2000 series, I also expect availability and pricing will start to shift a bit. There should still be plenty of stock in stores for a while, but after this news the cost might go up a bit. (And once the 3000 series, or whatever it will be named, comes out, the remaining retailer inventory will probably get sold off via discounts and deals and all that.)

EDIT: "Capable hardware" sounds like some ray-tracing engine thing, like the tessellation stuff, come to think of it. RT cores are just large bits on the GPU chip for handling the heavy math, assuming I remember it correctly, heavily focused on floating point (FP16 calculations?), but good at what they do; even if it's nothing unique, it still helps. 🙂 (Well, that's Turing and the current solution; it remains to be seen what Ampere will do and what AMD has in Navi 20 for handling this.)
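For the PC side, that split is visible right at the API level: a D3D12 application first asks the runtime whether the device exposes DXR at all, and otherwise falls back to a compute-shader or engine-specific path. A minimal sketch of that check, assuming a valid, already-created ID3D12Device* named device and a Windows SDK recent enough to ship the DXR headers:

```cpp
#include <d3d12.h>

// Ask D3D12 whether this device/driver exposes DXR, and at which tier.
// Assumes `device` is a valid, already-created ID3D12Device*.
bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;

    // TIER_NOT_SUPPORTED means no DXR here: fall back to a compute-shader
    // or engine-specific solution instead.
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```

On Vulkan the same question is asked through extension queries instead (VK_NV_ray_tracing at first, later the cross-vendor KHR extensions), which is exactly the PC-side divide described above.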
Wonder what our cards will be worth once it's time to sell 'em.
Fox2232:

It's not like nVidia ordered (and paid for) a certain production capacity and then someone took it away from them
Warehouses cost money for floor space.
Netherwind:

Wonder what our cards will be worth once it's time to sell 'em.
Same question here. It might be the first time I'm actually selling old hardware, because usually I found somebody I know who wanted it and gifted it away; even my 1080 Ti is now running in my best friend's PC. This time I might sell that 2080 Ti thing 😀
fantaskarsef:

Same question here. It might be the first time I'm actually selling old hardware, because usually I found somebody I know who wanted it and gifted it away; even my 1080 Ti is now running in my best friend's PC. This time I might sell that 2080 Ti thing 😀
It'd be very generous to gift a 2080 Ti 🙂
Netherwind:

It'd be very generous to gift a 2080 Ti 🙂
Being generous is easy when you have enough, and I'm lucky to be in such a position. That said, people around me have nice GPUs right now, so not sure what I'll do with this card 😀
kakiharaFRS:

Great news, release those 3000 cards already! But after watching Jayz2Cents' video about consoles and their AMD GPU, I guess they are doing their annoying "see what the other brand releases first" dance.
I 100% think it is this. Why would NV put something new out when they really do not need to yet? Hopefully whatever AMD has is good enough to maybe push NV prices down.
Astyanax:

Warehouses cost money for floor space.
You have missed the meaning of my post. It is about decision making, which is on nVidia's side. The article states: "According to news from Juxun Gate last Thursday, due to insufficient output, NVIDIA's high-end graphics cards may face shortages..." In reality, the root cause of any shortage would not be insufficient output, but nVidia's choice to limit output.
Fox2232:

You have missed the meaning of my post. It is about decision making, which is on nVidia's side. The article states: "According to news from Juxun Gate last Thursday, due to insufficient output, NVIDIA's high-end graphics cards may face shortages..." In reality, the root cause of any shortage would not be insufficient output, but nVidia's choice to limit output.
If the chip is as big as rumoured, on Samsung's 8nm mobile process, they are going to have atrocious yields, at single digits per 300mm wafer. So Nvidia would have to limit supply to avoid losing a lot of money; it would only produce, in volume, the parts that are profitable. We see the same thing with Intel: HEDT & server have been stuck in the same corner for over 2 years now, and the mainstream platform since last year. Intel barely produces the top HEDT chips of the 10th series, because they are too expensive with crap yields, and even if someone wants to buy them, they do not exist. Nvidia is in an even worse situation because their dies are even bigger. That is the result of Intel & Nvidia deciding to rest on their laurels, using the same archaic, 10-year-old "brute force" designs, not expecting serious competition.
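That "single digits per wafer" claim can be sanity-checked with a back-of-the-envelope calculation: the classic dies-per-wafer approximation combined with a simple Poisson yield model, Y = exp(-D0 * A). A small sketch, where the defect density D0 is an illustrative guess rather than published process data, and the die size is the 826mm^2 figure mentioned earlier in the thread:

```cpp
#include <cmath>
#include <cstdio>

int main()
{
    const double kPi      = 3.14159265358979323846;
    const double waferDmm = 300.0;  // standard 300mm wafer
    const double dieMm2   = 826.0;  // the 826mm^2 die size mentioned above
    const double d0PerCm2 = 0.2;    // ASSUMED defect density (defects/cm^2), illustrative only

    // Classic dies-per-wafer approximation with an edge-loss correction term.
    double dpw = kPi * (waferDmm / 2.0) * (waferDmm / 2.0) / dieMm2
               - kPi * waferDmm / std::sqrt(2.0 * dieMm2);

    // Simple Poisson yield model: Y = exp(-D0 * A), with A converted to cm^2.
    double yield = std::exp(-d0PerCm2 * dieMm2 / 100.0);

    std::printf("candidates/wafer: %.0f  yield: %.0f%%  good dies/wafer: %.0f\n",
                dpw, 100.0 * yield, dpw * yield);
    return 0;
}
```

With these assumed numbers you get roughly 62 candidate dies and about 12 good ones per wafer; "single digits" would require an even worse defect density, but either way the point stands that at this die size, yield dominates the economics.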
Fediuld:

If the chip is as big as rumoured, on Samsung's 8nm mobile process, they are going to have atrocious yields, at single digits per 300mm wafer. So Nvidia would have to limit supply to avoid losing a lot of money; it would only produce, in volume, the parts that are profitable. We see the same thing with Intel: HEDT & server have been stuck in the same corner for over 2 years now, and the mainstream platform since last year. Intel barely produces the top HEDT chips of the 10th series, because they are too expensive with crap yields, and even if someone wants to buy them, they do not exist. Nvidia is in an even worse situation because their dies are even bigger. That is the result of Intel & Nvidia deciding to rest on their laurels, using the same archaic, 10-year-old "brute force" designs, not expecting serious competition.
We're in a thread about 12nm Turing and its hypothetical shortages.
Fox2232:

In reality, the root cause of any shortage would not be insufficient output, but nVidia's choice to limit output
Well yes, there's a choice to make on whether you want to pay XXX to outbid someone for fab capacity on a product that you've reached the end of production on.
Regardless of what's coming, you really need to ask whether your current card is getting the job done. At this point, there really is little reason for me to move on from my GTX 1080, unless I upgrade my monitor first.