AMD APU Display Architecture to remain VEGA, phase out in 2022, mention of Rembrandt

https://forums.guru3d.com/data/avatars/m/248/248994.jpg
Inconceivable if true. I actually hope Intel's Xe iGPUs will leave AMD's ancient Vega based offerings to eat the dust. Sometimes bitter medicine is what people need to move forward. It's all the weirder when you keep in mind that AMD developed the APUs for the next gen consoles a while ago. They most certainly won't be based on any GCN anymore. So, AMD has already basically done the work, but still refuses to let go of GCN in their own products.
data/avatar/default/avatar39.webp
The reason behind this is probably that the GPU part of an APU, especially in laptops, is used mainly for compute tasks, with the majority of gaming workloads (even on laptops) being done on a discrete GPU (except obviously for Ultrabook-like devices). Probably when RDNA (2? 3?) becomes clearly better than GCN for compute tasks (perf per watt, without pushing it out of its efficiency window), the switch will be made...
https://forums.guru3d.com/data/avatars/m/250/250418.jpg
Kaarme:

Inconceivable if true. I actually hope Intel's Xe iGPUs will leave AMD's ancient Vega based offerings to eat the dust. Sometimes bitter medicine is what people need to move forward. It's all the weirder when you keep in mind that AMD developed the APUs for the next gen consoles a while ago. They most certainly won't be based on any GCN anymore. So, AMD has already basically done the work, but still refuses to let go of GCN in their own products.
I highly doubt Xe will do anything to AMD.
https://forums.guru3d.com/data/avatars/m/259/259654.jpg
Kaarme:

Inconceivable if true. I actually hope Intel's Xe iGPUs will leave AMD's ancient Vega based offerings to eat the dust. Sometimes bitter medicine is what people need to move forward. It's all the weirder when you keep in mind that AMD developed the APUs for the next gen consoles a while ago. They most certainly won't be based on any GCN anymore. So, AMD has already basically done the work, but still refuses to let go of GCN in their own products.
Not at all. Vega sucks when a lot of voltage needs to be pumped into it, but in small packages and at lower voltages and frequencies, it's actually quite efficient. It was also designed with specific package height limitations and other laptop-specific constraints in mind, so I don't see any issue with it. Just the fact that it has a driver from a real GPU manufacturer makes it better than anything Intel might pump out. And according to rumors, Xe seems like a really slow Vega 2 (i.e., hot and slow), so I wouldn't hold my breath about it.
https://forums.guru3d.com/data/avatars/m/271/271684.jpg
Kaarme:

Inconceivable if true. I actually hope Intel's Xe iGPUs will leave AMD's ancient Vega based offerings to eat the dust. Sometimes bitter medicine is what people need to move forward. It's all the weirder when you keep in mind that AMD developed the APUs for the next gen consoles a while ago. They most certainly won't be based on any GCN anymore. So, AMD has already basically done the work, but still refuses to let go of GCN in their own products.
I don't see a reason why they should move from Vega in their APUs. They do what they were designed for, and their performance leaves everything Intel has to offer in the dust. Also, I think that AMD's Vega-based APUs are quite possibly at the top limit of performance you can get out of APUs, since they are always going to be heavily bottlenecked by RAM. Another thing is that moving to a newer GPU arch would most probably require some sort of redesign = work = money. AMD would be crazy to invest anything in a low-cost, low-margin product which already has no competition and works fine. EDIT: grammar and typos.
data/avatar/default/avatar27.webp
Kaarme:

Inconceivable if true. I actually hope Intel's Xe iGPUs will leave AMD's ancient Vega based offerings to eat the dust. Sometimes bitter medicine is what people need to move forward. It's all the weirder when you keep in mind that AMD developed the APUs for the next gen consoles a while ago. They most certainly won't be based on any GCN anymore. So, AMD has already basically done the work, but still refuses to let go of GCN in their own products.
We do not know how much effort it takes to get Navi into the APU. This is not `ancient vega`; I think it is a sort of revised Vega, and they could also probably just increase the GPU unit count a bit. Don't forget those are $100-150 products.
https://forums.guru3d.com/data/avatars/m/186/186805.jpg
Could this be a license deal with Sony and/or Microsoft? That the RDNA APU design is exclusive to the consoles and can't be used within desktop APUs? Or do I have my tin foil hat on? :P Regardless, it costs a lot of time, money and resources to implement a new architecture such as RDNA into an APU design, and these chips need to be around the sub-£150 mark. Like others have said, the majority of these will be inside a laptop, which for 90%+ of users means basic tasks like video encoding, hardware acceleration in browsers, and maybe casual gaming. Why change something that works and is currently dominating its market? AMD obviously has a roadmap like every other company, and they obviously see more potential in the Vega design when used in these parts. Not only that, it should help keep prices low and profits high.
https://forums.guru3d.com/data/avatars/m/175/175902.jpg
Kaarme:

It's all the weirder when you keep in mind that AMD developed the APUs for the next gen consoles a while ago. They most certainly won't be based on any GCN anymore.
All the current consoles are still GCN and work quite well... (production cost...) APUs with Vega work really well for their price; there is no equivalent in Intel's line (even if they are way better than before the APUs launched). For 99 euros you get a modern quad core and a GPU part that permits a lot of things in the current AMD line. I don't get the point of having RDNA in an APU if it raises the price over 200 euros, even more so when AMD APUs are limited to x8 on the motherboard's PCIe x16 slot because of the GPU inside. If you need more, then get a discrete GPU with a CPU.
https://forums.guru3d.com/data/avatars/m/175/175902.jpg
CPC_RedDawn:

Could this be a license deal with Sony and/or Microsoft? That the RDNA APU design is exclusive to the consoles and can't be used within desktop APUs? Or do I have my tin foil hat on? 😛
Nope. You have to consider those APUs as part of a whole system; even if the core and graphics parts are more or less the same as a PC's, everything else is different (example: memory management).
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
rl66:

Nope. You have to consider those APUs as part of a whole system; even if the core and graphics parts are more or less the same as a PC's, everything else is different (example: memory management).
On top of that, I am pretty sure that if anyone approached AMD with a deal to make them a semi-custom APU, they would do it, like they did for that Subor Z. It is likely that they see no extra market-share gain from integrating it. Most people can't even begin to care about generational differences in Intel's iGPUs. If they did, maybe AMD would see a reason to show that they deliver the newest and greatest they can.

The Ryzen 4700/4800 U/H are particularly nice things to get into a mobile workstation, with 8C/16T + 7 Vega CUs. If they paired 6C/12T with, let's say, 20 Vega CUs or an RDNA/2 equivalent (HBM2 on an interposer with the APU), it would be a really nice gaming APU. The question is: is there a good market for an ultra-portable, reasonably powerful gaming laptop? The "reasonably powerful" part is where the issue comes from. An iGPU needs bandwidth to its memory, the same way a dGPU does. With the DDR4 that's being put into most laptops (slow, even while faster sticks are available at a similar price), really powerful iGPUs are not doable. AMD waiting for DDR5 mass production is the likely reason here.

On the other hand, every time I see 2133 MHz DDR4 instead of 2666/2933 MHz, or just a single stick present in a new product, I am really sad. This trend has continued for a very long time, and it sometimes happens even with Intel's laptops. I honestly think that AMD and Intel should give a boot-time warning when the CPU operates with a single stick. Then even an uneducated customer might start to feel that he has been robbed of something.
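The single-stick penalty described above is easy to quantify: each DDR4 channel is 64 bits (8 bytes) wide, so one stick halves the peak bandwidth the iGPU can draw on. A back-of-the-envelope sketch (the 2666 MT/s figure is just an illustrative speed, not tied to any specific laptop):

```python
# Each DDR4 channel is 64 bits (8 bytes) wide; peak bandwidth scales
# linearly with the number of populated channels.
def ddr4_peak_gbs(mt_per_s, channels):
    """Theoretical peak bandwidth in GB/s at a given transfer rate."""
    return mt_per_s * 1e6 * 8 * channels / 1e9

single = ddr4_peak_gbs(2666, 1)  # one stick: single channel
dual = ddr4_peak_gbs(2666, 2)    # two sticks: dual channel
print(f"single channel: {single:.1f} GB/s, dual channel: {dual:.1f} GB/s")
```

Real-world sustained bandwidth is lower than these theoretical peaks, but the 2x ratio between one and two sticks holds either way, which is exactly what an iGPU feels.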
https://forums.guru3d.com/data/avatars/m/248/248627.jpg
6C/12T with let's say 20 Vega CUs or RDNA/2 equivalent, it would be really nice gaming APU. (HBM2 on interposer with APU.)

If AMD made such a thing that I could slot into my media PC, I would buy it right away. I just hope that, with them keeping Vega for so long, they can at least beef up the memory controller and maybe loosen up some of the power restrictions on the APUs, at least on the desktop side. Currently, using the 2400G with the CPU underclocked and the iGPU/RAM overclocked, it can actually game decently.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
It's very much worth pointing out that, at least for AM4, it might not be possible to use a new architecture. Remember, much of the socket was purpose-built to have an on-die GPU. The electrical infrastructure behind that likely revolved around Vega and doesn't have much wiggle room. Overall, the Vega GPU is actually pretty good for APUs. I wouldn't call it great, but it's a solid processor. The part where it really crumbles is memory bandwidth. There just isn't anywhere near enough. As you increase memory frequency or tighten timings, performance scales almost perfectly proportionately, no matter how far you go. That shows how severely the GPU is starved for bandwidth. The only reason Vega should be replaced is because of how it really isn't optimized at all for DDR4. AMD desperately needs a toggleable option for more aggressive memory compression. Granted, anything gamers can do to lower VRAM usage would help performance. For example, in modern AAA titles you might as well lower texture detail to medium, because high texture detail in today's games is really meant for 4K, and let's face it, you're not going to get a good 4K experience on an APU.
data/avatar/default/avatar29.webp
Since when was Vega RDNA and Navi RDNA2? Vega is GCN and Navi is RDNA... somebody should correct that in the report. And concerning Vega and APUs... Vega may not be that great for gaming and it's a power-hungry beast, but for business applications, office and multimedia it is good enough. And since the likes of Dell and HP really need quite a lot of time until they bring out notebooks, laptops and desktops for professionals, Vega is the better choice to rake in some cash for AMD. Almost feels as if Navi was a bit rushed... maybe they wanted RDNA2 but saw they could not deliver the ray-tracing part in time and used Navi as a stop-gap? From personal experience, Navi is not that good at compute... my Vega 64 eats a 5700 for supper in F@H, even while on medium settings and underclocked and undervolted.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
WoenK:

Since when was Vega RDNA and Navi RDNA2? Vega is GCN and Navi is RDNA... somebody should correct that in the report.
Agreed - I noticed that too.
And concerning Vega and APUs... Vega may not be that great for gaming and it's a power-hungry beast
As pointed out by others, only high-end Vega chips are power-hungry. The performance-per-watt is perfectly reasonable when you don't push the clocks so high.
Almost feels as if Navi was a bit rushed...maybe they wanted that RDNA2 but saw they could not deliver the raytracing part in time and used Navi as a stop-gap?
I was thinking the same thing. Navi feels very experimental to me. I'm not so sure RT had much to do with it, but AMD was already taking too long to get RDNA out even as "soon" as it was, which likely worried investors, so I'm sure they were like "well, what we've done so far works and is better [on paper], so let's just release that to buy us more time".
From personal experience, Navi is not that good at compute... my Vega 64 eats a 5700 for supper in F@H, even while on medium settings and underclocked and undervolted.
As is always the case, different architectures are better at different things. I have an old FirePro V7900 (TeraScale2) and in some BOINC workloads, that has performance to match GCN models that use double the power. The GCN architecture was purpose-built for compute, whereas RDNA seems to be a little more "diverse".
https://forums.guru3d.com/data/avatars/m/266/266726.jpg
Kaarme:

Inconceivable if true. I actually hope Intel's Xe iGPUs will leave AMD's ancient Vega based offerings to eat the dust. Sometimes bitter medicine is what people need to move forward. It's all the weirder when you keep in mind that AMD developed the APUs for the next gen consoles a while ago. They most certainly won't be based on any GCN anymore. So, AMD has already basically done the work, but still refuses to let go of GCN in their own products.
Die space is at a premium, and memory bandwidth is a more limiting factor than raw GFLOPS on APUs for laptops/desktops, which don't have high-frequency memory like GDDR and are effectively limited to a 128-bit bus width. If a new uarch uses more die space but doesn't appreciably improve memory performance, it potentially wouldn't make sense to switch.
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
user1:

Die space is at a premium, and memory bandwidth is a more limiting factor than raw GFLOPS on APUs for laptops/desktops, which don't have high-frequency memory like GDDR and are effectively limited to a 128-bit bus width. If a new uarch uses more die space but doesn't appreciably improve memory performance, it potentially wouldn't make sense to switch.
I'm sure AMD has some reason not to do it, but I doubt efficiency is one of them, in any form. If anything, wouldn't Vega be more bandwidth-dependent, because it was designed for HBM? In dedicated video cards, Vega only exists with HBM, after all. Navi wasn't; it works with regular GDDR. In my opinion the difference between Vega and Navi is very glaring in the performance charts, and it's not in Vega's favour. There might be some computation work where Vega could fare better, but that's hardly relevant with APUs.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
WoenK:

From personal experience, Navi is not that good at compute... my Vega 64 eats a 5700 for supper in F@H, even while on medium settings and underclocked and undervolted.
RDNA/CDNA
https://forums.guru3d.com/data/avatars/m/266/266726.jpg
Kaarme:

I'm sure AMD has some reason not to do it, but I doubt efficiency is one of them, in any form. If anything, wouldn't Vega be more bandwidth dependent because it was designed for HBM? In dedicated video cards, Vega only exists with HBM, after all. Navi wasn't, it works with regular GDDR. In my opinion the difference between Vega and Navi is very glaring in the performance charts, and it's not in Vega's favour. There might be some computation work where Vega could fare better, but that's hardly relevant with APUs.
It's not really about efficiency. It's more that if I have 50 GB/s of bandwidth and a GPU running at 1000 MHz saturates it, running the GPU at 2000 MHz won't double performance, if it scales at all. Unless Navi has much better memory compression than Vega (unlikely), it probably won't make much difference. Dual-channel DDR4 at 3200 MT/s nets about 50 GB/s peak, which isn't really enough for anything over 720p; it's much less than an 8800 GTX (90 GB/s), and that was 13 years ago. Any GPU is going to struggle with such a handicap. Vega is quite good with bandwidth utilization, and Vega in APUs is different than the PCIe cards (Vega 10/Vega 20); the extra HBM bandwidth doesn't help Vega all that much in average graphics workloads, but it helps a lot for compute, which is what those cards are built for.
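Those bandwidth figures can be sanity-checked with the standard peak-bandwidth formula (effective transfer rate times bus width in bytes). A quick sketch; the 8800 GTX numbers below (1800 MT/s effective GDDR3 on a 384-bit bus) are my own assumption for the comparison, and real sustained bandwidth is always somewhat below these theoretical peaks:

```python
# Peak theoretical bandwidth = transfers per second * bus width in bytes.
def peak_bandwidth_gbs(mt_per_s, bus_bits):
    """Theoretical peak bandwidth in GB/s (decimal)."""
    return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

# Dual-channel DDR4-3200: two 64-bit channels, 128 bits effective.
print(f"DDR4-3200 dual channel: {peak_bandwidth_gbs(3200, 128):.1f} GB/s")
# GeForce 8800 GTX: assumed 1800 MT/s effective GDDR3 on a 384-bit bus.
print(f"8800 GTX: {peak_bandwidth_gbs(1800, 384):.1f} GB/s")
```

The first figure comes out near the "about 50 GB/s" quoted above, and the 2006-era card still lands well ahead of it, which is the whole point being made.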