Poor availability of RX 6800 (XT) and RTX 3060 Ti and 3070 may be due to a shortage of GDDR6 chips

My friends in the States just got lucky and nabbed a 3060 Ti and a second- or third-shipment 3070 at Microcenter. The sad part is they were going to buy two 3080s! So they will either use these as "rentals" and get a 3080 Ti or 6900 XT later. Why? Given the lack of stock, they can save up the difference by the time those cards actually become available for sale!
Now it's GDDR6's fault. It's always someone else's fault. I wonder how they plan to do a 20 GB version if the problem is the RAM.
asturur:

Now it's GDDR6's fault. It's always someone else's fault. I wonder how they plan to do a 20 GB version if the problem is the RAM.
If the 3080 Ti gets 20 GB, it will be GDDR6X, not GDDR6. What the 3080 Ti gets depends on bus width: a 320-bit bus like the 3080 means 20 GB; a 384-bit bus like the 3090 means 12 GB.
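The capacities above follow directly from the bus width: each GDDR6/GDDR6X chip has a 32-bit interface, so the chip count is the bus width divided by 32, and capacity is chip count times per-chip density. A quick sketch of that arithmetic (the 3080 Ti configurations are the speculated ones from this thread, not confirmed specs):

```python
# Capacity implied by bus width: each GDDR6/GDDR6X chip exposes a
# 32-bit interface, so chips = bus_width / 32, capacity = chips * density.
def vram_capacity_gb(bus_width_bits: int, chip_density_gb: int) -> int:
    chips = bus_width_bits // 32
    return chips * chip_density_gb

print(vram_capacity_gb(320, 2))  # 320-bit bus with 2 GB chips -> 20 GB
print(vram_capacity_gb(384, 1))  # 384-bit bus with 1 GB chips -> 12 GB
```

(The 3090 itself reaches 24 GB on the same 384-bit bus by mounting two 1 GB chips per 32-bit channel in clamshell mode.)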
Sovsefanden:

If the 3080 Ti gets 20 GB, it will be GDDR6X, not GDDR6. What the 3080 Ti gets depends on bus width: a 320-bit bus like the 3080 means 20 GB; a 384-bit bus like the 3090 means 12 GB.
The issue is even worse for GDDR6X.
Undying:

The issue is even worse for GDDR6X.
No one knows; all of this is speculation. Yesterday it was a lack of wafers, today it's a lack of RAM. In the end it might just be that demand is huge... because it is. People are literally standing in line to get their hands on an Ampere card. Nvidia only fulfilled approximately 10% of orders worldwide. AMD fulfilled 0.1% with the 6800 series, though... LOL
All prices are going up. I got an RX 5700 XT for €380 a year ago (24% VAT included); prices are nowhere near that at the moment. I could also get a Ryzen 7 3700X for around €275 before the Ryzen 5xxx launch, and I said "the new series will push the prices of the older series down". Well, the 3700X is 20% more expensive now.
Sovsefanden:

No one knows; all of this is speculation. Yesterday it was a lack of wafers, today it's a lack of RAM. In the end it might just be that demand is huge... because it is. People are literally standing in line to get their hands on an Ampere card. Nvidia only fulfilled approximately 10% of orders worldwide. AMD fulfilled 0.1% with the 6800 series, though... LOL
Nvidia's cards have already been out for a while, so let's wait out those 6-8 weeks in which AMD said their cards would become more available.
Undying:

Nvidia's cards have already been out for a while, so let's wait out those 6-8 weeks in which AMD said their cards would become more available.
The problem for AMD is that not many people will pay $600+ for an AMD GPU; the 6800 has worse performance per dollar than the 3070. The most successful AMD GPUs on the Steam Hardware Survey are sub-$200 cards like the RX 570 and RX 580. Most gamers won't pay a dime above $400-500, which is why AMD needs to hurry with the 6700 series or they simply won't ship many 6000-series GPUs.

Demand is huge right now. Nvidia cards are flying off the shelves as we speak, while AMD isn't selling much anyway. The 6800 series is unavailable, with far worse availability than Ampere, and that won't improve anytime soon. My two 6800 XT orders, reference cards from two different companies, both say March, lol. Nvidia lacked cards at release, sure, but AMD did a true paper launch, with close to zero cards at release. I expect people will be waiting deep into 2021 to get the 6800 series and the 6900 XT. If Nvidia releases the 3060 non-Ti and the 3050 Ti in January together with the 3080 Ti, they will have an almost complete lineup. AMD only has the top end (with no availability), meaning low sales.
Sure is a lot of BS flying around these days.
Sovsefanden:

The problem for AMD is that not many people will pay $600+ for an AMD GPU; the 6800 has worse performance per dollar than the 3070. The most successful AMD GPUs on the Steam Hardware Survey are sub-$200 cards like the RX 570 and RX 580. Most gamers won't pay a dime above $400-500, which is why AMD needs to hurry with the 6700 series or they simply won't ship many 6000-series GPUs.

Demand is huge right now. Nvidia cards are flying off the shelves as we speak, while AMD isn't selling much anyway. The 6800 series is unavailable, with far worse availability than Ampere, and that won't improve anytime soon. My two 6800 XT orders, reference cards from two different companies, both say March, lol. Nvidia lacked cards at release, sure, but AMD did a true paper launch, with close to zero cards at release. I expect people will be waiting deep into 2021 to get the 6800 series and the 6900 XT. If Nvidia releases the 3060 non-Ti and the 3050 Ti in January together with the 3080 Ti, they will have an almost complete lineup. AMD only has the top end (with no availability), meaning low sales.
Poor speculation. Sure, a card with 16 GB needs more VRAM than one with 8 GB, but those are 2 GB versus 1 GB chips. It is not as if Nvidia or anyone else is stealing 2 GB chips from AMD. Manufacturers sell everything they can make, and that applies to AMD too.

People did manage to get cards, even with the failed launch here, where shops were very late and could not secure many cards. The same thing happened with the Ryzen launch. But yesterday our country got a bigger batch of Ryzen 5800X CPUs; it was gone in 30 minutes. Today another batch arrived, and it is still available. If chips were not moving slowly through the logistics chain, even this batch would already be gone, given the demand.

Both AMD and Nvidia are moving cards. It is just a question of who in the logistics chain has more pull and can secure supply. Some countries get more of one product, some more of another. And while historically we have seen the biggest volume in cheaper cards, 2020 is different: people did not spend as much on the usual expenses of life, and many can afford a more expensive upgrade as a result. Add to that the people who felt the need to upgrade during Turing but did not pull the trigger, and who now see better value.

The real question is: "Does AMD want to sell 3x 6800 (XT) with 16 GB of GDDR6, or 4x 6700 (XT) with 12 GB of GDDR6?" I would guess they make more money selling 3x 6800 (XT) when GDDR6 availability is the limiting factor.

It is apparent that neither company can sell at the desired volume, so it is a question of demand and the ability to satisfy it. There are still old RX 570/580 cards available to buy, and the same applies to Nvidia's older cards. Not the best value proposition, but people can get them. Releasing low-end cards would kill the chance to sell that stock, so instead both companies are trying to satisfy the newly introduced performance tier first. And there is still more demand than supply.

The answer to all of these problems is patience.
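The "3x 6800 versus 4x 6700" trade-off in that question can be made concrete. Assuming 2 GB GDDR6 chips in both cases (the 6800 series uses eight for 16 GB; a hypothetical 12 GB 6700 would use six), a fixed chip supply divides up like this (the supply figure is purely illustrative):

```python
CHIP_DENSITY_GB = 2  # 2 GB GDDR6 chips, as on the 6800 series

def cards_from_chip_supply(chip_supply: int, vram_gb: int) -> int:
    """How many cards of a given VRAM size a fixed chip supply can build."""
    chips_per_card = vram_gb // CHIP_DENSITY_GB
    return chip_supply // chips_per_card

supply = 24_000  # hypothetical number of GDDR6 chips available
print(cards_from_chip_supply(supply, 16))  # 6800 (XT): 8 chips/card -> 3000 cards
print(cards_from_chip_supply(supply, 12))  # 6700 (XT): 6 chips/card -> 4000 cards
```

The same chip supply builds four 12 GB cards for every three 16 GB cards, which is exactly the 3:4 ratio the post is weighing against per-card margin.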
Hope the power outage at the Micron fab didn't affect any of the GDDR6 or GDDR6X production.
Fox2232:

Poor speculation. Sure, a card with 16 GB needs more VRAM than one with 8 GB, but those are 2 GB versus 1 GB chips. It is not as if Nvidia or anyone else is stealing 2 GB chips from AMD. Manufacturers sell everything they can make, and that applies to AMD too.

People did manage to get cards, even with the failed launch here, where shops were very late and could not secure many cards. The same thing happened with the Ryzen launch. But yesterday our country got a bigger batch of Ryzen 5800X CPUs; it was gone in 30 minutes. Today another batch arrived, and it is still available. If chips were not moving slowly through the logistics chain, even this batch would already be gone, given the demand.

Both AMD and Nvidia are moving cards. It is just a question of who in the logistics chain has more pull and can secure supply. Some countries get more of one product, some more of another. And while historically we have seen the biggest volume in cheaper cards, 2020 is different: people did not spend as much on the usual expenses of life, and many can afford a more expensive upgrade as a result. Add to that the people who felt the need to upgrade during Turing but did not pull the trigger, and who now see better value.

The real question is: "Does AMD want to sell 3x 6800 (XT) with 16 GB of GDDR6, or 4x 6700 (XT) with 12 GB of GDDR6?" I would guess they make more money selling 3x 6800 (XT) when GDDR6 availability is the limiting factor.

It is apparent that neither company can sell at the desired volume, so it is a question of demand and the ability to satisfy it. There are still old RX 570/580 cards available to buy, and the same applies to Nvidia's older cards. Not the best value proposition, but people can get them. Releasing low-end cards would kill the chance to sell that stock, so instead both companies are trying to satisfy the newly introduced performance tier first. And there is still more demand than supply.

The answer to all of these problems is patience.
The 3090 uses 1 GB chips; GDDR6X does not come in 2 GB chips yet. Most 6800 buyers received the non-XT, at above-MSRP prices, and very few received cards at all, even on this forum. The 3080 already has the same market share on the Steam Hardware Survey as the AMD 5700 series, which has been on the market for 18 months or more; the 6800 series is not even listed.

I'd never buy those older-generation cards, because their price/performance is terrible compared to the 3060 Ti / 3070. The 5700 XT is still $350-400 (if in stock) and 25% slower than the 3060 Ti, with a lot fewer features, like DLSS and ray tracing, to name a few. RTX cards literally have tons of features that streamers and regular gamers can use, like RTX Voice or Nvidia Reflex. AMD sells the bare minimum; all the focus is on rasterization performance, with barely any extra features. Ray tracing is poor, and the performance hit too high, making it useless even on top-end 6000 GPUs like the 6800 XT (worse RT performance than the 3060 Ti). And still AMD demands a high price.

Wait till people see how much DLSS 2.0 boosts performance in Cyberpunk 2077... it will pretty much be the only way to play this game maxed out at 4K, and even the 3080 will need it. I would not be surprised if the 6800 XT dips below $500 when it's actually in stock everywhere (meaning 3-6 months from now). If not, most people will go with the Nvidia 3000 series.
Sovsefanden:

The 3090 uses 1 GB chips; GDDR6X does not come in 2 GB chips yet. Most 6800 buyers received the non-XT, at above-MSRP prices, and very few received cards at all, even on this forum. The 3080 already has the same market share on the Steam Hardware Survey as the AMD 5700 series, which has been on the market for 18 months or more; the 6800 series is not even listed.

I'd never buy those older-generation cards, because their price/performance is terrible compared to the 3060 Ti / 3070. The 5700 XT is still $350-400 (if in stock) and 25% slower than the 3060 Ti, with a lot fewer features, like DLSS and ray tracing, to name a few. RTX cards literally have tons of features that streamers and regular gamers can use, like RTX Voice or Nvidia Reflex. AMD sells the bare minimum; all the focus is on rasterization performance, with barely any extra features. Ray tracing is poor, and the performance hit too high, making it useless even on top-end 6000 GPUs like the 6800 XT (worse RT performance than the 3060 Ti). And still AMD demands a high price.

Wait till people see how much DLSS 2.0 boosts performance in Cyberpunk 2077... it will pretty much be the only way to play this game maxed out at 4K, and even the 3080 will need it. I would not be surprised if the 6800 XT dips below $500 when it's actually in stock everywhere (meaning 3-6 months from now). If not, most people will go with the Nvidia 3000 series.
I see you missed it. Nvidia uses X memory, AMD uses non-X. Even if lower Nvidia cards used non-X, they are not using 2 GB chips like AMD's current new offerings. I hope it is starting to click now in terms of who can steal chips from whom: they simply can't. Literally nobody has a reason to trust your speculation about a paper launch, or magical numbers telling us what percentage of demand has been satisfied.
Sovsefanden:

AMD fulfilled 0.1% with the 6800 series, though... LOL
You do not even think about what you write. Our country received around 200 cards in the first batch of reference cards. That sucks, but if it satisfied 0.1% of demand, then our market has room for 200,000 RX 6800 (XT) cards or more, and since then we have received other batches.

Likewise, your opinion on AMD's and Nvidia's stock is irrelevant; it has no bearing on their business decision-making. Old stock has to go, and in the current situation, producing cards that do not collide with existing stock is a sound business strategy. Both companies used it. Ranting about the 3060 Ti or anything that has not even gone on sale is irrelevant: people buy what is available. That is why poor old cards like the 5700 (XT) are almost sold out too, as is Turing, which has a much worse value proposition.

You would not be surprised by a naked Yoda dancing in the Sea of Tranquility, because nothing would surprise you... as long as you already expect even the RX 6800 XT to drop below the price of the RTX 3070 while being 24% faster on average (according to TPU's aggregate scoring).

And one more thing: I forgot that everyone is a pro streamer today. Literally 1 in 5,000 people is going to use those features, but everyone's decision should be based on them. (Good logic, man.)

As far as DX-R goes: tell me how many games you played this year with DX-R enabled, and how many hours it took you to finish them. Then do the same for non-DX-R games. Does not compute? Can't really understand? Then look at AMD's VLIW4 and why cards based on it were a failure: a good architecture for DX11, which was new at the time, but bad for anything older. There were not enough DX11 games, and people were not stupid enough to buy cards that would be weak in 95% of the gaming scenarios they intended to use them in. DX-R may be the future of cheap game development, with the cost transferred to the gamer, but we are not there yet. Developers are still learning about good and bad use cases, and more often than not they end up with bad ones.

If I play two DX-R-enabled games this year, that will be at best 80 hours of single-player, and I will still play at 80-100 fps. If those games have compelling multiplayer, DX-R will end up turned off to get maximum fps at minimal visual penalty. What is 80 hours out of the 1,000-2,000 hours gamers spend playing in a year? And many people with weaker cards in both camps will not even enable DX-R when it is available.
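The implied-demand arithmetic in that first paragraph is just the fulfilled fraction scaled up: if roughly 200 cards covered 0.1% of demand, total demand works out to 200 / 0.001 (a rough back-of-the-envelope figure, resting entirely on the thread's disputed 0.1% number):

```python
# If a batch of `fulfilled` cards covers fraction `fraction` of demand,
# total demand is fulfilled / fraction.
def implied_demand(fulfilled: int, fraction: float) -> int:
    return round(fulfilled / fraction)

print(implied_demand(200, 0.001))  # 0.1% fulfilled -> 200,000 cards of demand
```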
No doubt COVID-19 slowed everything down, and there is no plan (and it would be really hard to implement one) to keep supply flowing. That said, it would have helped if they had kept sales free from online bots: sell only in physical stores, one card per happy customer. @Fox2232 I agree with your last paragraph: until RT is either a mind-blowing visual enhancement or outright mandatory, I'll disable it and prioritize maxing out other game settings and high FPS (even if I have the hardware for it).
Fox2232:

I see you missed it. Nvidia uses X memory, AMD uses non-X. Even if lower Nvidia cards used non-X, they are not using 2 GB chips like AMD's current new offerings. I hope it is starting to click now in terms of who can steal chips from whom: they simply can't. Literally nobody has a reason to trust your speculation about a paper launch, or magical numbers telling us what percentage of demand has been satisfied.

You do not even think about what you write. Our country received around 200 cards in the first batch of reference cards. That sucks, but if it satisfied 0.1% of demand, then our market has room for 200,000 RX 6800 (XT) cards or more, and since then we have received other batches.

Likewise, your opinion on AMD's and Nvidia's stock is irrelevant; it has no bearing on their business decision-making. Old stock has to go, and in the current situation, producing cards that do not collide with existing stock is a sound business strategy. Both companies used it. Ranting about the 3060 Ti or anything that has not even gone on sale is irrelevant: people buy what is available. That is why poor old cards like the 5700 (XT) are almost sold out too, as is Turing, which has a much worse value proposition.

You would not be surprised by a naked Yoda dancing in the Sea of Tranquility, because nothing would surprise you... as long as you already expect even the RX 6800 XT to drop below the price of the RTX 3070 while being 24% faster on average (according to TPU's aggregate scoring).

And one more thing: I forgot that everyone is a pro streamer today. Literally 1 in 5,000 people is going to use those features, but everyone's decision should be based on them. (Good logic, man.)

As far as DX-R goes: tell me how many games you played this year with DX-R enabled, and how many hours it took you to finish them. Then do the same for non-DX-R games. Does not compute? Can't really understand? Then look at AMD's VLIW4 and why cards based on it were a failure: a good architecture for DX11, which was new at the time, but bad for anything older. There were not enough DX11 games, and people were not stupid enough to buy cards that would be weak in 95% of the gaming scenarios they intended to use them in. DX-R may be the future of cheap game development, with the cost transferred to the gamer, but we are not there yet. Developers are still learning about good and bad use cases, and more often than not they end up with bad ones.

If I play two DX-R-enabled games this year, that will be at best 80 hours of single-player, and I will still play at 80-100 fps. If those games have compelling multiplayer, DX-R will end up turned off to get maximum fps at minimal visual penalty. What is 80 hours out of the 1,000-2,000 hours gamers spend playing in a year? And many people with weaker cards in both camps will not even enable DX-R when it is available.
I have already played many titles with RT, single-player only, and Cyberpunk will have it enabled for sure too, 1-2 days from now. It's going to look insane with RT on. Luckily it has DLSS 2.0 as well, so fps will be high anyway. I can't wait to play around with this highly anticipated game; it's been years since a title like this came out, and it has all the RTX features. I expect the 3080 Ti will be bundled with this game; Nvidia worked closely with CDPR, and pretty much all the gameplay videos were made on RTX cards. I expect the game to look pretty dull on non-RTX cards.
Silva:

No doubt COVID-19 slowed everything down, and there is no plan (and it would be really hard to implement one) to keep supply flowing. That said, it would have helped if they had kept sales free from online bots: sell only in physical stores, one card per happy customer. @Fox2232 I agree with your last paragraph: until RT is either a mind-blowing visual enhancement or outright mandatory, I'll disable it and prioritize maxing out other game settings and high FPS (even if I have the hardware for it).
There are good use cases. In time, bad use will diminish, and developers will learn to use it where it makes sense, because sensible use of DX-R is the only sustainable way forward.

Like I wrote when Turing came out: ray tracing can gobble up as much performance as the developer wants. One more bounce, one more ray per pixel; image quality improves and performance goes down again. If developers target higher-end cards, as they did with Turing, lower-end cards will suffer. Then Nvidia practically doubles DX-R performance again, and AMD follows. The current generation's high-end $1,000-1,500 cards will become the next generation's entry-level cards in terms of ray tracing, and mainstream Turing will become quite unusable.

If we were to judge cards purely by ray-tracing performance, no sane person would buy even a 3080 or 6800 XT unless they were fine with the card losing 60-80% of its value in two years. And if TSMC enables it, in two years we will have cards that perform that much better in ray tracing: not much better in rasterization, but ray-tracing performance will go up considerably. That is a big part of the problem with the DX-R argument: I am not going to base the value of a product on something that loses its value this fast.

@Sovsefanden Yeah, you are right. Pretty much all legs are right too. Left-legged people are so dull.
Fox2232:

There are good use cases. In time, bad use will diminish, and developers will learn to use it where it makes sense, because sensible use of DX-R is the only sustainable way forward.

Like I wrote when Turing came out: ray tracing can gobble up as much performance as the developer wants. One more bounce, one more ray per pixel; image quality improves and performance goes down again. If developers target higher-end cards, as they did with Turing, lower-end cards will suffer. Then Nvidia practically doubles DX-R performance again, and AMD follows. The current generation's high-end $1,000-1,500 cards will become the next generation's entry-level cards in terms of ray tracing, and mainstream Turing will become quite unusable.

If we were to judge cards purely by ray-tracing performance, no sane person would buy even a 3080 or 6800 XT unless they were fine with the card losing 60-80% of its value in two years. And if TSMC enables it, in two years we will have cards that perform that much better in ray tracing: not much better in rasterization, but ray-tracing performance will go up considerably. That is a big part of the problem with the DX-R argument: I am not going to base the value of a product on something that loses its value this fast.

@Sovsefanden Yeah, you are right. Pretty much all legs are right too. Left-legged people are so dull.
I didn't expect an AMD user to care about RT or DLSS in this game; after all, only RTX users get to use those features. Non-RTX users are constantly hating on RT and DLSS everywhere because, well, you know why: they have no experience with them and can't use them anyway, so they act like they aren't missing out. That's what people do.
Sovsefanden:

I didn't expect an AMD user to care about RT or DLSS in this game; after all, only RTX users get to use those features. Non-RTX users are constantly hating on RT and DLSS everywhere because, well, you know why: they have no experience with them and can't use them anyway, so they act like they aren't missing out. That's what people do.
Sorry, RTX != RT; it's DX-R. Did you get into GPUs because you saw them as a cash-grab opportunity? You are definitely out of your depth when it comes to everything else. Something like 90-95% of what you write is incorrect, and the rest is not about computers.
Fox2232:

Sorry, RTX != RT; it's DX-R. Did you get into GPUs because you saw them as a cash-grab opportunity? You are definitely out of your depth when it comes to everything else. Something like 90-95% of what you write is incorrect, and the rest is not about computers.
If Cyberpunk uses DXR, why doesn't the AMD 6000 series get RT support in it, though? LOL. It will get it much later, sometime in mid or late 2021, CDPR said. However, it will be useless without a feature like DLSS 2.0; the fps hit on the 6800 XT is insane when DXR is enabled, with performance dropping below 3060 Ti level (using RT). The 3060 Ti is generally 5-10% faster than the 6800 XT in ray tracing, and that is without DLSS 2.0 enabled... LOL
Honestly, in Canada nothing is available. A lot of monitors are out of stock. I've never seen so many out-of-stock computer products on Amazon.ca.