Leak: ASRock Radeon RX 5600 video cards likely get 6GB of graphics memory

https://forums.guru3d.com/data/avatars/m/189/189980.jpg
Well, that's the reference model. Partners could easily fit 2GB more on that model and have a good card on their hands that would sell pretty well, providing good FPS at a reasonable price for its category.
https://forums.guru3d.com/data/avatars/m/279/279265.jpg
When will a 5800 XT appear as a competitor in the 2080 Ti performance class?
https://forums.guru3d.com/data/avatars/m/72/72485.jpg
6GB 192-bit will still more than likely outperform 8GB 128-bit. However, with four different models all releasing within two months, it'll be interesting to see how pricing ends up.
data/avatar/default/avatar29.webp
anticupidon:

Well, that's the reference model. Partners could easily fit 2GB more on that model and have a good card on their hands that would sell pretty well, providing good FPS at a reasonable price for its category.
No, they can't. This is a 192-bit card, which means it needs six memory modules. So you either go with 3GB, 6GB or 12GB.
https://forums.guru3d.com/data/avatars/m/189/189980.jpg
Well, that's very helpful. I just didn't know how the memory chips are distributed in proportion to the bus width.
data/avatar/default/avatar14.webp
GuruBjörn:

When will a 5800 XT appear as a competitor in the 2080 Ti performance class?
I do not see that coming. The 5700 XT, while cheaper, is a competitor to the 2070 Super; the 2080 is already a stretch. I wonder if an eventual 5800 XT can get close to the 2080 Super.
data/avatar/default/avatar14.webp
anticupidon:

Well, that's very helpful. I just didn't know how the memory chips are distributed in proportion to the bus width.
Yep, basically:
96 bits → 3 memory modules
128 bits → 4 memory modules
192 bits → 6 memory modules
256 bits → 8 memory modules
384 bits → 12 memory modules
512 bits → 16 memory modules
192 bits is always tricky; that's the reason we had a GTX 1060 3GB 😀 They simply couldn't put 4GB in there, unless they pulled off a GTX 970 3.5GB all over again, where they had 3.5GB on a 224-bit bus and a disconnected 512MB on a 32-bit bus. Same thing with integrated GPUs: if you have a single DDR module (which contains 2 chips), you are using a 64-bit memory bus, but if you use dual channel, you get 128 bits 🙂
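The mapping above follows from each GDDR chip having a 32-bit interface, so module count = bus width / 32 and capacity = module count × per-chip density. A minimal sketch of that arithmetic (not from the thread; the function names are my own, and I assume uniform 1GB or 2GB GDDR6 chip densities):

```python
def module_count(bus_width_bits):
    """Number of 32-bit memory chips needed to populate a given bus width."""
    assert bus_width_bits % 32 == 0, "bus width must be a multiple of 32"
    return bus_width_bits // 32

def vram_options(bus_width_bits, densities_gb=(1, 2)):
    """Possible uniform-density VRAM capacities (GB) for a bus width."""
    n = module_count(bus_width_bits)
    return [n * d for d in densities_gb]

print(module_count(192))   # 6 chips
print(vram_options(192))   # [6, 12] -- no clean 8GB option on 192 bits
print(vram_options(256))   # [8, 16]
```

This is why a 192-bit card naturally lands on 3GB, 6GB or 12GB when all chips share one density.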
https://forums.guru3d.com/data/avatars/m/275/275145.jpg
If this is true, it will be fun to see what the people who criticized Nvidia for the 6GB will say now. 😎
data/avatar/default/avatar17.webp
Also, I understood why Nvidia went with 192 bits. At the time, the RX 580 (GCN) didn't have the efficiency of Nvidia's latest architectures, so Nvidia simply didn't need a 256-bit bus to compete with the memory bandwidth AMD had at 256 bits, and 128 bits was too low. So they went with 192 bits, saving on power consumption and production costs.
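The bandwidth trade-off described above can be checked with the standard formula, peak bandwidth = (bus width / 8) × effective data rate. A rough sketch using the real 8 Gbps GDDR5 speeds of the GTX 1060 6GB (192-bit) and RX 580 (256-bit):

```python
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s for a bus width and effective data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(192, 8))  # 192.0 GB/s -- GTX 1060 6GB
print(bandwidth_gbs(256, 8))  # 256.0 GB/s -- RX 580
```

A more bandwidth-efficient architecture can close that 64 GB/s gap through better compression and caching, which is the argument the post makes.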
data/avatar/default/avatar22.webp
GuruBjörn:

When will a 5800 XT appear as a competitor in the 2080 Ti performance class?
RX 5800 XT as a competitor for the 2080 Ti, RX 5900 XT beating their 3080 Ti Super /s (even though I hope so)
https://forums.guru3d.com/data/avatars/m/261/261894.jpg
6GB is more than enough to play any game at 1080p Extreme settings... it's hard to find a game that uses 5GB of VRAM.
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
kings:

If this is true, it will be fun to see what the people who criticized Nvidia for the 6GB will say now. 😎
I don't quite understand what you are saying. If someone couldn't afford the (overpriced) 2070 and had to settle for a 2060 with 6GB of memory, isn't it their own business to criticise Nvidia for it? Still, 6GB was more than likely enough. There are stranger cards around, like the AMD Fury (X) with only 4GB of memory despite being the flagship; it soon ran into real issues with that amount. The GTX 1060 with only 3GB might have been questionable as well, even at that point in time. In any case, like @Borys said, 6GB is enough for 1080p, and a 2060 owner shouldn't dream of much more than 1080p anyway. The RX 5600 isn't going to be the Lamborghini of video cards either.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
Borys:

6GB is more than enough to play any game at 1080p Extreme settings... it's hard to find a game that uses 5GB of VRAM.
There are a few games already using 6GB+, even at 1080p. These 6GB GPUs are not future-proof. I would skip this 5600 and the 1660s.
https://forums.guru3d.com/data/avatars/m/274/274425.jpg
ObscureangelPT:

Yep, basically:
96 bits → 3 memory modules
128 bits → 4 memory modules
192 bits → 6 memory modules
256 bits → 8 memory modules
384 bits → 12 memory modules
512 bits → 16 memory modules
192 bits is always tricky; that's the reason we had a GTX 1060 3GB 😀 They simply couldn't put 4GB in there, unless they pulled off a GTX 970 3.5GB all over again, where they had 3.5GB on a 224-bit bus and a disconnected 512MB on a 32-bit bus. Same thing with integrated GPUs: if you have a single DDR module (which contains 2 chips), you are using a 64-bit memory bus, but if you use dual channel, you get 128 bits 🙂
I feel certain ObscureangelPT knows this, but Nvidia used to mismatch memory amounts and board bus widths now and then back in the Fermi and Kepler days. The 1GB GTX 550 Ti and the 2GB GTX 660 Ti, which I owned for a bit, come to mind; they were both on 192-bit boards. With regard to the RX 5600, I suppose the economy of producing a 192-bit PCB versus a 256-bit alternative, plus using two fewer GDDR6 memory modules, represents a savings that just seems too irresistible to ignore. (???) A 6GB design launching in 2020 seems, at least to me, a bit too much of a potential limit. While I've been pleasantly surprised by how well some of my 4GB cards have fared thus far, even at 2K, I also must admit I'm not playing pseudo-photorealistic, 64-player twitchfests either. (Not that there is anything wrong with that.) If a card that sells for, let's say, $249.99 USD is supposed to provide a good two to three years of service for those who feel that sort of price should represent a meaningful investment, I think something with an additional 2GB of VRAM hanging on it would be quite welcome come 2021 or thereabouts.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
ObscureangelPT:

Yep, basically:
96 bits → 3 memory modules
128 bits → 4 memory modules
192 bits → 6 memory modules
256 bits → 8 memory modules
384 bits → 12 memory modules
512 bits → 16 memory modules
192 bits is always tricky; that's the reason we had a GTX 1060 3GB 😀 They simply couldn't put 4GB in there, unless they pulled off a GTX 970 3.5GB all over again, where they had 3.5GB on a 224-bit bus and a disconnected 512MB on a 32-bit bus. Same thing with integrated GPUs: if you have a single DDR module (which contains 2 chips), you are using a 64-bit memory bus, but if you use dual channel, you get 128 bits 🙂
That's incorrect. The Nvidia example is completely wrong and not related to bus width. A 192-bit bus can have 8GB of VRAM in a 4x 1GB + 2x 2GB configuration. Not that it is really needed for a GPU of this performance level, as 6GB would be enough for any use case outside of those meant to break it.
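A quick arithmetic check of the mixed-density layout claimed above (a hypothetical 192-bit card with four 1GB chips and two 2GB chips, one chip per 32-bit channel):

```python
# Hypothetical mixed-density layout: six 32-bit channels,
# four populated with 1GB chips and two with 2GB chips.
chips_gb = [1, 1, 1, 1, 2, 2]

total_gb = sum(chips_gb)        # 8GB total capacity
bus_bits = 32 * len(chips_gb)   # 192-bit total bus
print(total_gb, bus_bits)
```

The capacity works out, though as with the 2GB GTX 660 Ti mentioned earlier in the thread, such asymmetric layouts leave part of the capacity behind a narrower effective bus, so the extra gigabytes are slower to access.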
https://forums.guru3d.com/data/avatars/m/269/269649.jpg
For the most powerful GPUs, AMD is surely waiting for 7nm+.