AMD to launch Zen 4 and RDNA 3 both in Q4 2022

I guess my current HW will last longer 🙂 Which is good 🙂
31/12/2022.
That's OK. With my current activity level, and the back catalogue of games I have yet to play, there's no reason to replace anything until that time anyway 🙂 Hopefully Apple will have moved on to something like 3nm or something before then 😛
PrMinisterGR:

31/12/2022.
Which means we can buy one in March/April 2023 🙂
Ryzen 7000 and Radeon 7000. I see what they did there. Anyway, that's still far away. We can't even get a Radeon 6000 at a normal price.
Is AMD going to have a CPU and MB refresh before then? Is there going to be a Zen 3+ and/or new X570 (X570 MBs are a bit old now) motherboards? I've still been trying to get a video card the past 7 months or so. I had an RX 5700 XT, but sold it just before November and downgraded to a GTX 1070 which I only intended to use as a stopgap card 🙁. I should have kept the 5700 XT as I would have been content to wait longer if I still had it. Oh well, didn't know these shortages were going to happen.
Great, my next upgrade then; Ryzen 7 + Radeon 7 sounds nice.
Undying:

Ryzen 7000 and Radeon 7000. I see what they did there. Anyway, that's still far away. We can't even get a Radeon 6000 at a normal price.
Gives me a chance to save up for it. Is Ryzen 7000 gonna be a new socket? A 7800X, DDR5, X670* sounds pretty damn tasty.
IceZero09:

Is AMD going to have a CPU and MB refresh before then? Is there going to be a Zen 3+ and/or new X570 (X570 MBs are a bit old now) motherboards? I've still been trying to get a video card the past 7 months or so. I had an RX 5700 XT, but sold it just before November and downgraded to a GTX 1070 which I only intended to use as a stopgap card 🙁. I should have kept the 5700 XT as I would have been content to wait longer if I still had it. Oh well, didn't know these shortages were going to happen.
Exactly why I stayed with my 5700 XT until I landed a 6800 XT.
IceZero09:

Is AMD going to have a CPU and MB refresh before then? Is there going to be a Zen 3+ and/or new X570 (X570 MBs are a bit old now) motherboards? I've still been trying to get a video card the past 7 months or so. I had an RX 5700 XT, but sold it just before November and downgraded to a GTX 1070 which I only intended to use as a stopgap card 🙁. I should have kept the 5700 XT as I would have been content to wait longer if I still had it. Oh well, didn't know these shortages were going to happen.
My guess is there will be 6000-series APUs, but for desktop we will just get an updated 5000 series with 3D cache.
Agonist:

Gives me a chance to save up for it. Is Ryzen 7000 gonna be a new socket? A 7800X, DDR5, X670* sounds pretty damn tasty. Exactly why I stayed with my 5700 XT until I landed a 6800 XT.
I didn't upgrade my 2700X for that reason. Zen 4 will be blazing fast.
Good to hear, as maybe by that time the hardware market will have stabilized; current prices on all kinds of hardware are out of whack at the moment. It seems the new RX 6900 XTX-H stepping GPU arrived just in time to battle the RTX 3080 Ti and RTX 3090, and it has closed the performance gap, trading blows with the RTX 3090 in games; either one performs more or less equally in most titles. As for the new CPUs and socket, this gives us more time, support, and longevity with AM4, as most of us are not ready to change sockets and go for expensive DDR5 memory yet. In my opinion, this is a good move from AMD.
Undying:

My guess is there will be 6000-series APUs, but for desktop we will just get an updated 5000 series with 3D cache.
It would be interesting, but I don't think they are going to release a 5000 series with 3D cache. I'd say they just slapped it on the 5000 series for now for testing, development, and demonstration purposes. It would make a lot more sense to keep it for brand-new products.
Even though it's very far away, it is likely a wise decision. It would be pointless to release such new architectures in a market like this. By that time, even if the crypto/scalper craze hasn't gone away, at least production should have adapted to the new "standards" demand-wise. It also wouldn't surprise me if they launch even later than that.
Well, this gives me some time in case I decide to bite the bullet and buy a 5600/5800X; the question is whether I should...
As a person who has never owned a single Apple device, all I can say is that "Apple monopolizing almost all of the 5nm wafers produced by TSMC" is disgusting. Understandable, of course, considering Apple's bottomless pockets and a customer base that doesn't have the word "expensive" in their vocabulary.
Kaarme:

As a person who has never owned a single Apple device, all I can say is that "Apple monopolizing almost all of the 5nm wafers produced by TSMC" is disgusting. Understandable, of course, considering Apple's bottomless pockets and a customer base that doesn't have the word "expensive" in their vocabulary.
I'm not a fan of Apple, but you really can't blame them on this one; it's not their fault that there's insufficient foundry capacity worldwide. Ironically, we should thank them for the current capacity and wafer advancement, because they are one of the few companies rich enough to help finance all this in advance; that's why they have priority on the best nodes and on wafer capacity.
Kaarme:

As a person who has never owned a single Apple device, all I can say is that "Apple monopolizing almost all of the 5nm wafers produced by TSMC" is disgusting. Understandable, of course, considering Apple's bottomless pockets and a customer base that doesn't have the word "expensive" in their vocabulary.
People are willing to pay; Apple is just securing the supply by paying whatever TSMC wants. Remember that buying wafer allocation is a bidding process: the highest bidder wins. So the only ones to blame are the sheep who buy a new Apple device every year just for the social status. Those who keep their devices for at least 3 years are the only normal users I respect.
AlmondMan:

That's OK. With my current activity level, and the back catalogue of games I have yet to play, there's no reason to replace anything until that time anyway 🙂 Hopefully Apple will have moved on to something like 3nm or something before then 😛
I wish I could say the same, but my R9 290 doesn't exactly keep up with 4K these days. I've mostly just been revisiting games from the mid-2000s. Honestly, it's kinda refreshing: back then, games weren't loaded with microtransactions and were more focused on being fun than on being immersive.
Kaarme:

As a person who has never owned a single Apple device, all I can say is that "Apple monopolizing almost all of the 5nm wafers produced by TSMC" is disgusting. Understandable, of course, considering Apple's bottomless pockets and a customer base that doesn't have the word "expensive" in their vocabulary.
I can't help but wonder if Apple is actually selling all those chips. Sure, the M1 is decently popular, but is it popular enough to take up all of TSMC's capacity?
Kaarme:

As a person who has never owned a single Apple device, all I can say is that "Apple monopolizing almost all of the 5nm wafers produced by TSMC" is disgusting. Understandable, of course, considering Apple's bottomless pockets and a customer base that doesn't have the word "expensive" in their vocabulary.
As a person using several, but owning none, I would say that they deserve it, and there are good reasons they can do it.
Content with my 5950X and Asus Dark Hero; plus, my 1080 Ti is more than enough.