AMD Might Replace RX 500 Cards with RX Vega 28 and 32

https://forums.guru3d.com/data/avatars/m/182/182690.jpg
Go home AMD, you're drunk. Releasing a mid-range card with no more than 4GB of RAM is nuts. It's going to be gimped like the GTX 1060 3GB is.
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
Considering how the current Vega cards are memory bandwidth starved due to cutting costs, this would only make sense if they used the same memory configuration for less powerful GPUs that need less bandwidth. Cutting the amount of memory in half seems so weird at this point I find it hard to believe.
data/avatar/default/avatar17.webp
Sure, but this is only unofficial info at this point, can't determine their strategy from just that.
data/avatar/default/avatar26.webp
Knox:

Go home AMD, you're drunk. Releasing a mid-range card with no more than 4GB of RAM is nuts. It's going to be gimped like the GTX 1060 3GB is.
Well, let me tell you something: Vega's memory architecture is very powerful. Sure, they're using HBM2, which is expensive, but they've introduced a lot of other changes as well. They've basically made it so good that Vega needs at most 50% of the VRAM of previous generations. I'm not making this up; Raja Koduri himself said it in an interview. Also, since the bandwidth is so large, memory can be swapped in and out much faster if needed.
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
SalazarTennyson:

Well, let me tell you something: Vega's memory architecture is very powerful. Sure, they're using HBM2, which is expensive, but they've introduced a lot of other changes as well. They've basically made it so good that Vega needs at most 50% of the VRAM of previous generations. I'm not making this up; Raja Koduri himself said it in an interview. Also, since the bandwidth is so large, memory can be swapped in and out much faster if needed.
The bandwidth is not particularly large; just look at the specs. It would have been huge if they had used more than two stacks, like Nvidia did in their pro cards, but undoubtedly due to cost AMD went with two. The result is a bandwidth only similar to the original 1080 Ti's, and less than the newer 1080 Ti version with faster memory.
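As a sanity check on the two-stack point, here's a quick back-of-envelope calculation from the published bus widths and per-pin data rates (rounded spec-sheet figures, so treat the results as approximate):

```python
# bandwidth (GB/s) = bus width (bits) * per-pin data rate (Gbps) / 8
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

# Vega 64: two HBM2 stacks, 1024 bits each, ~1.89 Gbps effective
vega64 = bandwidth_gbs(2048, 1.89)
# GTX 1080 Ti: 352-bit GDDR5X at 11 Gbps
gtx1080ti = bandwidth_gbs(352, 11.0)

print(f"Vega 64:     {vega64:.0f} GB/s")     # ~484 GB/s
print(f"GTX 1080 Ti: {gtx1080ti:.0f} GB/s")  # ~484 GB/s
```

With four stacks the same math would roughly double Vega's number, which is exactly why the four-stack pro parts pull ahead.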
data/avatar/default/avatar08.webp
@SalazarTennyson The GPU can be quick to swap assets, but you need to remember that it swaps over I/O with HDDs or SSDs, so even if the GPU is quick, the HDD or SSD will delay it anyway, unless you use HBCC, which consumes a lot of extra RAM. Considering how poorly Vega performs IPC-wise compared to Fiji, they really need to come up with insane clocks just to stay at the same level as the RX 500 series, since the RX 580 has 2304 stream processors. I'd even risk saying that these cards will be slower and more expensive than the RX 580, though probably with much better power consumption. For content creators doing 3D rendering and workstation work, it will probably be a steal considering how well RX Vega performs there. For gaming, though, I think it will be a disaster! I'm curious. Cheers
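To put the swap argument in numbers, here's a rough sketch of how long moving 1 GB of assets takes at typical sequential throughputs (ballpark figures I'm assuming, not measurements); the storage end of the chain dominates no matter how fast the VRAM is:

```python
# Approximate sequential throughput in GB/s (ballpark assumptions)
rates_gbs = {
    "HBM2 (Vega 64)": 484.0,
    "PCIe 3.0 x16":    15.8,
    "NVMe SSD":         3.0,
    "SATA SSD":         0.55,
    "HDD":              0.15,
}

for device, rate in rates_gbs.items():
    ms_per_gb = 1.0 / rate * 1000  # time to move 1 GB, in ms
    print(f"{device:15s}: {ms_per_gb:7.1f} ms per GB")
```

At these rates the VRAM side finishes in a couple of milliseconds while an HDD needs several seconds, so the swap is bottlenecked by storage, not by HBM2.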
https://forums.guru3d.com/data/avatars/m/260/260826.jpg
SalazarTennyson:

Well, let me tell you something: Vega's memory architecture is very powerful. Sure, they're using HBM2, which is expensive, but they've introduced a lot of other changes as well. They've basically made it so good that Vega needs at most 50% of the VRAM of previous generations. I'm not making this up; Raja Koduri himself said it in an interview. Also, since the bandwidth is so large, memory can be swapped in and out much faster if needed.
The "more efficient VRAM (less amount needed)" HBM fairy tale was already told the previous iteration, with the Fury X vs. the 980 Ti, and we know how that ended. In real life (read: gaming performance) it had little to no effect.
https://forums.guru3d.com/data/avatars/m/166/166706.jpg
sammarbella:

The "more efficient VRAM (less amount needed)" HBM fairy tale was already told the previous iteration, with the Fury X vs. the 980 Ti, and we know how that ended. In real life (read: gaming performance) it had little to no effect.
In real life, yes. In reality, HBM/HBM2 is needed to optimize the Radeon memory controller's power consumption, which is still much higher than GeForce's. AMD should rework the memory controller in the first place, but that's a bit harder; hence why we see a Vega 32 and 28. AMD's pricing needs a big fix too, and so does availability...
https://forums.guru3d.com/data/avatars/m/250/250418.jpg
Man, HBM2? I was hoping to buy this card because I couldn't get an RX 570, but now I'm worried about the price...
https://forums.guru3d.com/data/avatars/m/236/236670.jpg
If it can game like a 1060 at a 1050 Ti price, then it's a winner...
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
Honestly, I don't know if AMD is even attempting to aim for the gaming market anymore. Meanwhile, nVidia is laughing its ass off selling year-old cards at a premium. So if I want a GPU upgrade I can either: 1 - Get screwed by AMD and pay a fortune for sub-par performance. 2 - Get screwed by nVidia and pay a fortune for year-old technology. Sounds legit.
https://forums.guru3d.com/data/avatars/m/250/250418.jpg
airbud7:

If it can game like a 1060 at a 1050 Ti price, then it's a winner...
If it doesn't beat them, AMD will have a hard time justifying the price due to HBM2.
Neo Cyrus:

Honestly, I don't know if AMD is even attempting to aim for the gaming market anymore. Meanwhile, nVidia is laughing its ass off selling year-old cards at a premium. So if I want a GPU upgrade I can either: 1 - Get screwed by AMD and pay a fortune for sub-par performance. 2 - Get screwed by nVidia and pay a fortune for year-old technology. Sounds legit.
Vega has the performance (12,665 GFLOPS), it just failed at gaming (which is the main selling factor). And I was afraid they couldn't just swap the memory controller to be able to use GDDR5(X). I guess Navi will be using HBM2/3? This will suck if AMD can't get performance closer to Nvidia's. AMD cards' MSRP is actually fine; the problem is this never-ending mining craze... It's not like you desperately need an upgrade, but the 1070 Ti is a good deal. That or a discounted 1070.
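The 12,665 GFLOPS figure does check out from Vega 64's published shader count and boost clock, assuming the usual FP32 peak formula (each shader retires one FMA, i.e. two operations, per clock):

```python
def peak_gflops(shaders, boost_clock_ghz, ops_per_clock=2):
    # FP32 peak = shaders * 2 ops (fused multiply-add) * clock (GHz)
    return shaders * ops_per_clock * boost_clock_ghz

# Vega 64: 4096 stream processors at a 1546 MHz boost clock
print(f"{peak_gflops(4096, 1.546):.0f} GFLOPS")  # ~12665
```

Peak FLOPS says nothing about how well the shaders are fed in games, which is exactly the gap between Vega's spec sheet and its gaming results.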
https://forums.guru3d.com/data/avatars/m/105/105985.jpg
Perfect for the new consoles. I hope these smaller chips don't need a 6+6 pin setup :p
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
Silva:

If it doesn't beat them, AMD will have a hard time justifying the price due to HBM2. Vega has the performance (12,665 GFLOPS), it just failed at gaming (which is the main selling factor). And I was afraid they couldn't just swap the memory controller to be able to use GDDR5(X). I guess Navi will be using HBM2/3? This will suck if AMD can't get performance closer to Nvidia's. AMD cards' MSRP is actually fine; the problem is this never-ending mining craze... It's not like you desperately need an upgrade, but the 1070 Ti is a good deal. That or a discounted 1070.
I'd be fine with this OC for a while longer if I played at 1080p, but I've almost never used that resolution. At 1440p a 970 falls just short of holding 60 fps in modern games with the settings I'd want (max textures and effects if possible), even at 1.6GHz (which has been unstable lately, so I turned it down; not sure if it's due to age/degradation, drivers, or both).

I don't know about their lower-end cards, but the MSRP for the cheapest Vega 64 I can find (here, anyway) is absurd: $1005.7 ($890 + 13%) for the shitty blower-fan reference card. That price is in 1080 Ti territory, which is already inflated. I'm sitting here wondering if it's really necessary to use HBM on a low-end card. It feels like the price is going to be terrible because of it. A new memory controller must cost more than straight up using HBM, but I'd bet the price will still be bad for the gaming performance these cards provide, if Vega 64 is anything to go by.
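For what it's worth, the quoted $1005.7 is just the $890 sticker price with 13% sales tax applied (local figures from the post above, not official MSRP):

```python
base_price = 890.00  # cheapest Vega 64 reference card (quoted figure)
tax_rate = 0.13      # 13% local sales tax

total = round(base_price * (1 + tax_rate), 2)
print(total)  # 1005.7
```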
https://forums.guru3d.com/data/avatars/m/31/31371.jpg
Neo Cyrus:

Honestly, I don't know if AMD is even attempting to aim for the gaming market anymore. Meanwhile, nVidia is laughing its ass off selling year-old cards at a premium.
That could be true.
Neo Cyrus:

So if I want a GPU upgrade I can either: 1 - Get screwed by AMD and pay a fortune for sub-par performance. 2 - Get screwed by nVidia and pay a fortune for year old technology. Sounds legit.
1 - No, it's not sub-par performance; you're only looking at DirectX 11 performance, where DirectX 12 and Vulkan tell a different story depending on the game. 2 - Well, that may be partly true, but that's not where we're getting screwed; it's the G-Sync technology we pay through the nose for on the monitor. 3 - Where we really get screwed is all the price gouging on cards, even nVidia ones, but let's hope this all changes, or sales will take a big nose dive this holiday season.
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
SHS:

That could be true. 1 - No, it's not sub-par performance; you're only looking at DirectX 11 performance, where DirectX 12 and Vulkan tell a different story depending on the game. 2 - Well, that may be partly true, but that's not where we're getting screwed; it's the G-Sync technology we pay through the nose for on the monitor. 3 - Where we really get screwed is all the price gouging on cards, even nVidia ones, but let's hope this all changes, or sales will take a big nose dive this holiday season.
Well, I mean for the price it is sub-par. That, and the amount of time it took to get here. The benchmarks in the games I'd want a Vega card for, in DX12/whatever, lost to a 1080. Not a 1080 Ti, just a 1080... which can be found for $200 less for a reference model that would match those benchmarks. A reference MSI 1080 ends up $195.5 less than a reference (blower-fan) Vega 64. Good job on that pricing, AMD.
https://forums.guru3d.com/data/avatars/m/216/216349.jpg
Neo Cyrus:

Honestly, I don't even know if AMD is even attempting to aim for the gaming market anymore. Meanwhile, nVidia is laughing its ass off selling year old cards at a premium. So if I want a GPU upgrade I can either: 1 - Get screwed by AMD and pay a fortune for sub-par performance. 2 - Get screwed by nVidia and pay a fortune for year old technology. Sounds legit.
Welcome to the wonderful free-market world!... Seriously, this is what happens when we only have 2 companies in a single market: if one of them screws up, the customer is fucked one way or another...
data/avatar/default/avatar17.webp
The GTX 1060 is by far the superior choice and more future-proof (the 6GB version), or a used 980 Ti / 1070 (non-Ti). Who the hell is gonna buy a 4GB card and call himself a gamer in 2018? ROFL. Come on AMD, you are digging your own grave this year.
https://forums.guru3d.com/data/avatars/m/31/31371.jpg
Neo Cyrus:

Well, I mean for the price it is sub-par. That, and the amount of time it took to get here. The benchmarks in the games I'd want a Vega card for, in DX12/whatever, lost to a 1080. Not a 1080 Ti, just a 1080... which can be found for $200 less for a reference model that would match those benchmarks. A reference MSI 1080 ends up $195.5 less than a reference (blower-fan) Vega 64. Good job on that pricing, AMD.
In that case, Neo Cyrus, you may be partly right when it comes down to the price of the device, but we are better off spending a few extra bucks on a 1080 Ti over the Vega 64 in the long run. As you said, it really makes no sense to even buy a Vega 56, since the regular 1080 is still faster at about the same price going by today's pricing, even though the Vega 56 just dropped in price a bit. It's still $100 😡 too much in my book.
warlord:

The GTX 1060 is by far the superior choice and more future-proof (the 6GB version), or a used 980 Ti / 1070 (non-Ti). Who the hell is gonna buy a 4GB card and call himself a gamer in 2018? ROFL. Come on AMD, you are digging your own grave this year.
I know a few people that only have a 3GB or 4GB model, and there's nothing wrong with that. Do keep in mind that not everybody is a hardcore gamer; the types of games they play may not need major FPS, like run-of-the-mill RTS games, or titles from 10 years ago, and so on, warlord.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
A card meant for 1080p gaming? 4GB VRAM? No problem. There has been only one GPU which ran out of VRAM before horsepower: GK104 paired with 2GB of VRAM. Yes, one can use a 4K texture pack at 1080p resolution; placebo works. But it's not a question of HW configuration or of calling a VRAM amount future-proof, it's about sanity. At 1080p, 4K textures improve clarity by maybe 5%. If they choke the memory controller or cause some other kind of stutter, then it's a matter of choice. Secondly, those 4K textures have to be processed, so even with enough VRAM and memory bandwidth, there is a considerable performance impact on the average gaming GPU.