AMD Mentions It Is Already Working on Zen 5

Also, AMD might just skip the number 4 for Zen; 4 is considered an unlucky number in parts of Asia.
It's Zen after all 🙂
4 is not loved in Asia...? So the Intel Pentium 4 never was a success in Asia?
Well, not surprised. It takes some time to design an architectural upgrade, they're doing their homework.
x86... oh my god, for how long will they stay on that dinosaur architecture? No wonder we don't get any improvements in speed. This is insulting. I will stop buying into that architecture until they change it. It's ridiculous to bother us with this BS. It seems they'll press money out of this architecture forever, or are they afraid of giving us the real stuff? Maybe they need the real architecture for their CIA & co. and don't want us to use it. I can't be bothered anymore keeping 'up to date' with 1970s-1980s tech. ^^
undorich:

x86... oh my god, for how long will they stay on that dinosaur architecture? No wonder we don't get any improvements in speed. This is insulting. I will stop buying into that architecture until they change it. It's ridiculous to bother us with this BS. It seems they'll press money out of this architecture forever, or are they afraid of giving us the real stuff? Maybe they need the real architecture for their CIA & co. and don't want us to use it. I can't be bothered anymore keeping 'up to date' with 1970s-1980s tech. ^^
You could just go to Apple in a few years; apparently they have plans to ditch x86 and fab something of their own.
undorich:

x86... oh my god, for how long will they stay on that dinosaur architecture? No wonder we don't get any improvements in speed. This is insulting. I will stop buying into that architecture until they change it. It's ridiculous to bother us with this BS. It seems they'll press money out of this architecture forever, or are they afraid of giving us the real stuff? Maybe they need the real architecture for their CIA & co. and don't want us to use it. I can't be bothered anymore keeping 'up to date' with 1970s-1980s tech. ^^
Calling a modern processor an "x86 architecture" is a big misconception. They have an x86 decoder on the front end that converts x86 instructions into an optimized internal format; the internal architecture bears no resemblance to a traditional x86 processor. The decoder isn't even close to a bottleneck; the only thing it impacts is power, slightly, at the sub-10 W level (hence ARM).
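To make that concrete, here's a toy sketch in C of what the front end conceptually does: it cracks one CISC-style read-modify-write x86 instruction into the simple RISC-like micro-ops the back end actually executes. The instruction format, names, and micro-op set are all invented for illustration; real decoders are enormously more complex.

```c
/* Toy model of an x86 front-end decoder: one CISC-style
 * "add [mem], reg" instruction is cracked into three simple
 * micro-ops. All names and formats here are invented for
 * illustration; real decoders are vastly more complex. */
#include <stdio.h>

typedef enum { UOP_LOAD, UOP_ADD, UOP_STORE } uop_kind;

typedef struct {
    uop_kind kind;
    const char *dst, *src;   /* register or memory operand names */
} uop;

/* "Decode" the macro-instruction add [addr], reg into micro-ops. */
static int decode_add_mem_reg(const char *addr, const char *reg,
                              uop out[], int max)
{
    if (max < 3) return 0;
    out[0] = (uop){ UOP_LOAD,  "tmp0", addr   };  /* tmp0 <- [addr]     */
    out[1] = (uop){ UOP_ADD,   "tmp0", reg    };  /* tmp0 <- tmp0 + reg */
    out[2] = (uop){ UOP_STORE, addr,   "tmp0" };  /* [addr] <- tmp0     */
    return 3;
}

int main(void)
{
    static const char *names[] = { "load", "add", "store" };
    uop uops[3];
    int n = decode_add_mem_reg("0x1000", "eax", uops, 3);
    for (int i = 0; i < n; i++)
        printf("uop%d: %-5s %s, %s\n", i, names[uops[i].kind],
               uops[i].dst, uops[i].src);
    return 0;
}
```

On real hardware this cracking happens in the decode stage, and the rest of the pipeline only ever sees micro-ops, which is why the "dinosaur" ISA says almost nothing about the modern internals.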
Considering five-year development cycles have been the norm in this industry for a good 20 years now, this is kind of expected. If they said they WEREN'T working on Zen 5 at this point, that would be news.
10 nm, 7 nm, 4 nm, and then what? I read somewhere that they can't shrink any more than that.
NaturalViolence:

Considering five-year development cycles have been the norm in this industry for a good 20 years now, this is kind of expected. If they said they WEREN'T working on Zen 5 at this point, that would be news.
It's nice to be reminded, since we live in a short-attention-span world these days. 😉
RamaARG:

10 nm, 7 nm, 4 nm, and then what? I read somewhere that they can't shrink any more than that.
Positronic brains, what else could it be?
undorich:

x86... oh my god, for how long will they stay on that dinosaur architecture? No wonder we don't get any improvements in speed. This is insulting. I will stop buying into that architecture until they change it. It's ridiculous to bother us with this BS. It seems they'll press money out of this architecture forever, or are they afraid of giving us the real stuff? Maybe they need the real architecture for their CIA & co. and don't want us to use it. I can't be bothered anymore keeping 'up to date' with 1970s-1980s tech. ^^
You are not going to buy x86 CPUs? Denial's answer pretty much covered the technical part; other than that, I guess have fun with phones and tablets!
I hope AMD continues to push the envelope. I wouldn't mind having a 12-core mainstream processor, or a 32-core HEDT chip. I was able to increase my computing power nearly sevenfold last year, and I would love to do it again. 🙂
undorich:

x86... oh my god, for how long will they stay on that dinosaur architecture? No wonder we don't get any improvements in speed. This is insulting. I will stop buying into that architecture until they change it. It's ridiculous to bother us with this BS. It seems they'll press money out of this architecture forever, or are they afraid of giving us the real stuff? Maybe they need the real architecture for their CIA & co. and don't want us to use it. I can't be bothered anymore keeping 'up to date' with 1970s-1980s tech. ^^
Intel already tried that with their EPIC architecture; FYI, it did not end well. Despite its flaws, the improvements made to x86 over the years have kept it viable for both consumers and servers (e.g., SIMD instructions, SMT/hyper-threading, speculative execution). Current x86 chips are very different from the 8086.
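To illustrate one of those extensions, here's a minimal C sketch using real SSE intrinsics: one vector instruction adds four floats at once, something the original 8086 simply couldn't express.

```c
/* Minimal SSE example: add four floats at once with one SIMD
 * instruction. Build with e.g. gcc -msse -O2 simd.c; SSE has been
 * baseline on x86-64 from the start. */
#include <stdio.h>
#include <xmmintrin.h>   /* SSE intrinsics */

int main(void)
{
    float a[4] = { 1.0f, 2.0f, 3.0f, 4.0f };
    float b[4] = { 10.0f, 20.0f, 30.0f, 40.0f };
    float c[4];

    __m128 va = _mm_loadu_ps(a);      /* load 4 floats            */
    __m128 vb = _mm_loadu_ps(b);
    __m128 vc = _mm_add_ps(va, vb);   /* one vector add = 4 scalar adds */
    _mm_storeu_ps(c, vc);

    printf("%g %g %g %g\n", c[0], c[1], c[2], c[3]);
    return 0;
}
```

Compiled for x86-64, any mainstream compiler emits a single addps instruction for the _mm_add_ps call.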
undorich:

x86... oh my god, for how long will they stay on that dinosaur architecture? No wonder we don't get any improvements in speed. This is insulting. I will stop buying into that architecture until they change it. It's ridiculous to bother us with this BS. It seems they'll press money out of this architecture forever, or are they afraid of giving us the real stuff? Maybe they need the real architecture for their CIA & co. and don't want us to use it. I can't be bothered anymore keeping 'up to date' with 1970s-1980s tech. ^^
"x86" refers to the backwards software compatibility, immeasurably important--today's x86 cpu designs are modern SoA--they are sort of a cross between exotic RISC/CISC designs (As I like to think about it) and they bear no resemblance to the old actual *physical* x86 cpus of yore...;) We haven't been on "x86" cpus since 1999, if not earlier. Split hairs between the '486 and the early Pentiums being the last "x86" cpus--probably the '486. AMD's successful K7 (the grandfather of the Intel Core 2--AMD64 was it's "daddy"..;)) in 1999 marks the year when things started to really diverge from classical "x86."
x86 is a core part of AMD CPUs, but then again, AMD's x64 extension especially is very much a part of what makes modern processors as good as they can be now and going forward. Without Intel we would not have x86; without AMD we would not have x64 support. ^.^
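For what it's worth, the AMD64/x86-64 split is visible straight from C; here's a small sketch using the standard predefined compiler macros (GCC/Clang's __x86_64__/__i386__ and MSVC's _M_X64/_M_IX86):

```c
/* Report whether we were compiled for 32-bit x86 or AMD's 64-bit
 * extension (x86-64), using the compilers' predefined macros. */
#include <stdio.h>

int main(void)
{
#if defined(__x86_64__) || defined(_M_X64)
    puts("Compiled for x86-64 (AMD64).");
#elif defined(__i386__) || defined(_M_IX86)
    puts("Compiled for 32-bit x86.");
#else
    puts("Compiled for some other architecture.");
#endif
    printf("Pointer width: %zu bits\n", 8 * sizeof(void *));
    return 0;
}
```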
As for what the lad above asked about what comes beyond 4 nm: who knows, but I would imagine pure optical interconnects instead of the current electrical ones through silicon. There is an upper limit to how far they can shrink transistors, and to frequency, power, temperature and density, whereas optical or quantum computing is far less constrained by those limits; it just takes time to happen. Intel has already claimed they will be ditching silicon beyond 7 nm, but who knows, seeing as they were stuck on some lithography nodes for far longer than they had wanted (could be cost, could be unforeseen problems, could be lack of competition). Time will tell, that much is certain.

IMO they will pair silicon with optical at least initially once they hit, say, 7 nm (and once the various kinks are worked out to make it "worth" doing), but given the cost, speed and power advantages optical will offer (and possibly a much less complex design), they will eventually switch fully to optical or whatever else comes along. Hell, they could even start the cycle all over again at, say, 180 nm, since pure optical does not have the same limitations that bulk, SOI, FD-SOI and the like currently have (a fairly low ceiling around 5.5-6.2 GHz without insane power draw or exotic cooling); from what I have read, the starting point for optical would be ~10 GHz at a fraction of the power consumed.

Likely they are making sure the software and hardware are fully ready by then, because it would be a real paradigm shift (look how long it took dual-core and multi-core to become viable, and in many ways they still are not, and those run at a fraction of the speed optical is capable of, let alone how much more memory could be stuffed into a design once electricity is no longer the direct constraint). And likely, if they do have it, they will milk profits as long as they can before bothering to switch to an even better way of doing things. ^.^
Dragonstongue:

x86 is a core part of AMD CPUs, but then again, AMD's x64 extension especially is very much a part of what makes modern processors as good as they can be now and going forward. Without Intel we would not have x86; without AMD we would not have x64 support. ^.^
Actually, Intel did develop a 64-bit instruction set. It was marketed as "IA-64"; it's the instruction set the Itanium processors use(d). The only drawback to IA-64 at the time of launch was the lack of backward compatibility with 32-bit (x86) software.
Go AMD! Thanks for giving us a real alternative to Intel and forcing them to stop dragging their feet. You've got my business for sure.
RamaARG:

10 nm, 7 nm, 4 nm, and then what? I read somewhere that they can't shrink any more than that.
Gallium nitride is an option; it is already used in radar circuitry.