New Leak reveals AMD Zen2 and Zen3 codenames and adds a Timeline

[youtube=UDxLk-QBHx8]
Lol dat voice accent..
Rich_Guy:

This guy just called a GTX 1080 a mid-range graphics card..... Also, the guy is complaining about the 10 watt increase in TDP for the 2700X, seemingly not understanding throughout the entire video why that may be, even though he also mentioned the fact that XFR2 and Precision Boost Overdrive have much higher frequencies for many more cores, rather than dropping off after 2.... Am I crazy, or does that not explain it right there? How exactly would higher boost speeds beyond 2 cores not create a higher TDP, after only one year? https://i.imgur.com/HhtuEFB.jpg
Aura89:

This guy just called a GTX 1080 a mid-range graphics card..... https://i.imgur.com/HhtuEFB.jpg Also, the guy is complaining about the 10 watt increase in TDP for the 2700X, seemingly not understanding throughout the entire video why that may be, even though he also mentioned the fact that XFR2 and Precision Boost Overdrive have much higher frequencies for many more cores, rather than dropping off after 2.... Am I crazy, or does that not explain it right there? How exactly would higher boost speeds beyond 2 cores not create a higher TDP, after only one year?
I'm inclined to agree with him on the 1080 - it's a cut-down, mid-range graphics card that Nvidia sold like a high-end graphics card (the Titan X, and the 1080 Ti later on, were the true high-end). As for power consumption, Zen+ is supposed to use a smaller node, which should deliver more power-efficiency - having a higher TDP seems to go against that idea.
Hilbert, spotted a typo: it should be "be", not "bee" (between the first and second slide). OUCH. Feel free to delete this post.
D3M1G0D:

I'm inclined to agree with him on the 1080 - it's a cut-down, mid-range graphics card that Nvidia sold like a high-end graphics card (the Titan X, and the 1080 Ti later on, were the true high-end). As for power consumption, Zen+ is supposed to use a smaller node, which should deliver more power-efficiency - having a higher TDP seems to go against that idea.
I'm sorry, but a $500 card, no matter what people believe that $500 should be, will never be a "mid-range" card. As for the power-efficiency portion: again, your statement disregards the fact that XFR2 and Precision Boost 2/Overdrive exist. For example, you can't very well compare Zen+ to Zen TDP-wise directly, since we have no idea what the 1700-1800X CPUs' TDP would have been if XFR/Precision Boost didn't only boost 2 cores. For all we know, if they had boosted all of the cores to different degrees, like XFR2/Precision Boost 2, they would have been 125-140 watt CPUs. If it were JUST a small increase in frequency with a more energy-efficient node, then sure, the 10 extra watts would be very confusing. But that's not what is being discussed here; it's not just a small increase in base frequency and a more energy-efficient node, it's also a fairly major increase in boost frequencies across all cores, not just 2.
Aura89:

I'm sorry, but a $500 card, no matter what people believe that $500 should be, will never be a "mid-range" card.
Like I said, they sold it like a high-end card. The 1080, like the 1070, is a 7 bn transistor chip, while the 1080 Ti and Titans are 12 bn transistor chips. Nvidia basically sold the 1080 as the high-end chip and made bucket loads of money by doing so (which went to investors like me 😛) and then later on released the 1080 Ti as the new high-end.
Aura89:

As for the power-efficiency portion: again, your statement disregards the fact that XFR2 and Precision Boost 2/Overdrive exist. For example, you can't very well compare Zen+ to Zen TDP-wise directly, since we have no idea what the 1700-1800X CPUs' TDP would have been if XFR/Precision Boost didn't only boost 2 cores. For all we know, if they had boosted all of the cores to different degrees, like XFR2/Precision Boost 2, they would have been 125-140 watt CPUs. If it were JUST a small increase in frequency with a more energy-efficient node, then sure, the 10 extra watts would be very confusing. But that's not what is being discussed here; it's not just a small increase in base frequency and a more energy-efficient node, it's also a fairly major increase in boost frequencies across all cores, not just 2.
Perhaps, but Zen+ is supposed to be a refinement of Zen, and I wouldn't expect them to blow up the TDP like that. I mean, Intel added two more physical cores to the 8700K (compared to the 7700K) but the TDP was only increased by 4 watts. A 10 watt increase, while using a smaller node, is definitely suspicious. All in all, I treat such rumors with a huge grain of salt.
Aura89:

This guy just called a GTX 1080 a mid-range graphics card.....
I caught that too....I guess GTX 1070/1070ti would be a budget card...Lol
D3M1G0D:

I mean, Intel added two more physical cores to the 8700K (compared to the 7700K) but the TDP was only increased by 4 watts.
I mean, if that were actually TRUE, then sure, but it's not.

https://techreport.com/r.x/2017_10_09_Revisiting_power_consumption_and_efficiency_for_Intel_s_Core_i7_8700K/loadpower.png
https://eteknix-eteknixltd.netdna-ssl.com/wp-content/uploads/2017/10/17-800x600.jpg
http://www.guru3d.com/index.php?ct=articles&action=file&id=34766&admin=0a8fcaad6b03da6a6895d1ada2e171002a287bc1

(^ The second and third links do not show the 7700K, but they do show the 7740K, a more power-hungry 7700K.)

Of course, these companies can say whatever they want with their TDP (within reason, from a legal standpoint). My whole point is that we can't say a processor with so much higher boost across all cores, and only a minor energy-efficiency upgrade, shouldn't have a higher TDP, without knowing what the original processor's TDP would have been with the same boost. The fact of the matter is these boosts could be very, very beneficial to multi-threaded programs (probably why we saw a decent boost in multi-threaded scores in leaked benchmarks), and a small TDP increase for that much better performance is literally nothing to worry about or complain about.
D3M1G0D:

I mean, Intel added two more physical cores to the 8700K (compared to the 7700K) but the TDP was only increased by 4 watts.
I know I'm quoting the same thing I just did above, but I just looked at the "TDP" of the 7700K, 7740K, and 8700K. Wtf is Intel's math on these? The 8700K most definitely takes more power than either the 7700K or 7740K, by far, yet wtf at the TDP of the 7740K?

7700K - 91W TDP
7740K - 112W TDP
8700K - 95W TDP

???????? That seems so screwed up, and such a lie, it's not even funny lol. It'd make more sense if it were:

7700K - 91W TDP
7740K - 95W TDP
8700K - 112W TDP
Aura89:

I'm sorry, but a $500 card, no matter what people believe that $500 should be, will never be a "mid-range" card.
Price doesn't determine GPU tier, especially with the mining craze going on. It has to be compared against other models that are built on the same architecture, including performance and specs.
sverek:

Price doesn't determine GPU tier, especially with the mining craze going on. It has to be compared against other models that are built on the same architecture, including performance and specs.
I never talked about the mining craze, I talked about MSRP. And MSRP does determine GPU tier, just as naming determines GPU tiers. Never has the x800 or x80 series been mid-range. It is a fact, and not an opinion, that the GTX 1080 is by no means a mid-range card. Arguing against it is nonsense. It may not be the "highest" end, but that does not mean it's not high end. The GTX 1080 and Ti are high end, the Titans are ultra-high end, the 1060 through 1070 Ti are mid-range, and the 1050 Ti and lower are low end. You could even say below the 1050 is very low end.

Lastly, we, the consumers, are not the ones who get to determine which code-name for a GPU sits in which tier; that's up to the company that produces them. This idea that the GTX 1080 being GP104 makes it mid-range is pure nonsense, as whether it is GP104 or GP102 is up to Nvidia and does not dictate if it's high end or mid-range. The performance, the cost (MSRP, which Nvidia sets, and what Nvidia sets something to price-wise is them stating what tier it is), and ultimately the name they give it dictate that. It is fully, 100% up to the manufacturer to determine what is high end; xx80 is not mid-range naming, so obviously, it is not mid-range. It's that simple.
@Aura89 : Intel often plays with their TDP/SDP values. I learned that the hard way. Intel's power efficiency is kind of a myth; once OC'd, CPU temperature hints at a lot, even with a good cooler and a delid. That's why I like AMD's XFR approach... better cooling, better clock. It is a silent way to admit that the CPU eats (& heats) a lot more at a higher clock. I mean, does someone believe that their 95W stock CPU, which can easily be kept under 50C, eats less than 150W upon OC and heating to 75C? 95W => 30 degrees above ambient, ???W => 55 degrees above ambient. And I like that the 2700X shows a 105W TDP, because X470/B450 boards will be designed with this chip and OC in mind, and will have somewhat improved VRMs.
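To illustrate that temperature argument with a rough, purely hypothetical sketch (it assumes the cooler's thermal resistance stays roughly constant, so dissipated power scales about linearly with the temperature rise over ambient; real CPUs and coolers are messier than that):

[code]
# Hypothetical back-of-the-envelope estimate, not a measurement.
# If a fixed cooler holds a "95 W" load at ~30 C over ambient, what power does
# the same cooler imply once the OC'd chip sits at ~55 C over ambient?
# Assumes power is roughly proportional to delta-T (constant thermal resistance).

stock_power_w = 95.0   # advertised TDP at stock
stock_delta_c = 30.0   # temperature above ambient at stock
oc_delta_c = 55.0      # temperature above ambient when overclocked

thermal_resistance = stock_delta_c / stock_power_w    # ~0.32 C per watt
oc_power_w = oc_delta_c / thermal_resistance          # ~174 W

print(f"Implied OC power draw: ~{oc_power_w:.0f} W")  # well above 150 W
[/code]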
Fox2232:

@Aura89 : Intel often plays with their TDP/SDP values. I learned that the hard way. Intel's power efficiency is kind of a myth; once OC'd, CPU temperature hints at a lot, even with a good cooler and a delid. That's why I like AMD's XFR approach... better cooling, better clock. It is a silent way to admit that the CPU eats (& heats) a lot more at a higher clock. I mean, does someone believe that their 95W stock CPU, which can easily be kept under 50C, eats less than 150W upon OC and heating to 75C? 95W => 30 degrees above ambient, ???W => 55 degrees above ambient. And I like that the 2700X shows a 105W TDP, because X470/B450 boards will be designed with this chip and OC in mind, and will have somewhat improved VRMs.
Yeah, a 95W CPU can easily eat 150W once OC'd. Problem? A 1700X can eat that much without OC (Blender). Intel's power explodes because they can be OC'd to ~5GHz. AMD's does not, because its OC clocks are rather conservative. I don't see any advantage for AMD in this department tbh. Both AMD's and Intel's given TDP figures are somewhat misleading, and on a similar level. So they are both "playing", if you will. All in all, AMD/Intel are on par right now when it comes to power consumption. I don't see why we need to slam Intel because AMD finally caught up in the perf/Watt department.
Aura89:

This guy just called a GTX 1080 a mid-range graphics card.....
If you go to his channel, AdoredTV, and look at his History of Nvidia GeForce, Part 2 - The Way You Were Meant To Be Played (see also part 1), you would know why he is saying that.
airbud7:

I caught that too....I guess GTX 1070/1070ti would be a budget card...Lol
Noisiv:

Yeah, a 95W CPU can easily eat 150W once OC'd. Problem? A 1700X can eat that much without OC (Blender). Intel's power explodes because they can be OC'd to ~5GHz. AMD's does not, because its OC clocks are rather conservative. I don't see any advantage for AMD in this department tbh. Both AMD's and Intel's given TDP figures are somewhat misleading, and on a similar level. So they are both "playing", if you will. All in all, AMD/Intel are on par right now when it comes to power consumption. I don't see why we need to slam Intel because AMD finally caught up in the perf/Watt department.
Neither Aura89 nor I am slamming Intel for TDP. It is a reaction to the misconception of others who see Intel's TDP value as some kind of holy word. And you are right that AMD is in quite a similar position in power efficiency as Intel is. For 160W on the 1700X, I would like to know the board and BIOS + BIOS settings, because there are boards where the BIOS has some power-efficiency option disabled (which improves stock performance). Then there is another part of that power consumption, and that is work done. That's the important part: people running workloads want them done as fast as possible. So a 200W chip which does 1.1 times more work per Watt than a 100W chip will finish 2.2 times faster, and that 200W chip will make its owner very happy.
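For what it's worth, here is that work-per-watt arithmetic spelled out as a tiny sketch (the 100 W / 200 W chips and the 1.1x perf-per-watt figure are just the hypothetical numbers from the example above):

[code]
# Hypothetical example: a 200 W chip doing 1.1x more work per watt than a 100 W chip.
low_power_w = 100.0
high_power_w = 200.0
perf_per_watt_ratio = 1.1  # high-power chip's work per watt relative to the low-power chip

# Throughput scales with power draw multiplied by work done per watt.
throughput_ratio = (high_power_w / low_power_w) * perf_per_watt_ratio
print(f"The 200 W chip finishes the same job ~{throughput_ratio:.1f}x faster")  # ~2.2x
[/code]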
The 1080 sure ain't a high-end card, wake up.. It's what NV has done with midget chips since the Kepler days, selling them as high end. It started with Kepler's GK104, and GM204 or the upcoming GV104 aren't high end either.
When Intel (and probably AMD) quote TDP figures, they're for the base frequencies of the CPU. Hence a 7700K is 95W @ 4.2GHz and the 8700K is 95W @ 3.7GHz (both all-core TDPs). I'm sure if Coffee Lake hadn't been refined/reworked it would have used a bit more power. Intel's definition of TDP, quoted from their site: "Thermal Design Power (TDP) represents the average power, in watts, the processor dissipates when operating at Base Frequency with all cores active under an Intel-defined, high-complexity workload. Refer to Datasheet for thermal solution requirements."
Stop linking AdoredTV videos before HH goes mad. Even I don't like the dude.
Silva:

Stop linking AdoredTV videos before HH goes mad. Even I don't like the dude.
Where in the rules is it stated that users cannot post a video if we think it proves the point? That you and HH don't like the person is not relevant. Even if HH (Hilbert Hagedoorn) is a staff member, we expect him to respect the rules and users alike as a moderator. If you don't like what some people post, put them on the ignore list of your own account; then you don't have to see what they post.
Fox2232:

@Aura89 : Intel often plays with their TDP/SDP values. I learned that the hard way. Intel's power efficiency is kind of a myth; once OC'd, CPU temperature hints at a lot, even with a good cooler and a delid.
But shouldn't they at least be graded the same among Intel processors? It's crazy to me that the 7700K is not hugely different in power from the 7740K (though there is a difference, of course), yet the 7740K's TDP looks so much higher. Whereas the 8700K is more power hungry than both of them, by far, and has a TDP just slightly above the 7700K. Makes no sense to me lol.
Fox2232:

And I like that the 2700X shows a 105W TDP, because X470/B450 boards will be designed with this chip and OC in mind, and will have somewhat improved VRMs.
There is one question I do have about the 105 watt TDP on the 2700X, though. To my understanding, all 2000 series Ryzens should work (with a BIOS update) on the 300 series chipsets. But the 300 series chipsets, at max, support a 95 watt TDP. Does that mean the 2700X won't work on 300 series chipsets? Or is the "max TDP 95 watts" on the 300 series not really true, and only stated because that was the max TDP of the 1000 series Ryzens?
Dragonetti:

If you go to his channel, AdoredTV, and look at his History of Nvidia GeForce, Part 2 - The Way You Were Meant To Be Played (see also part 1), you would know why he is saying that.
I don't really need to do research into why someone is doing and saying something that is both objectively and factually wrong.
-Tj-:

The 1080 sure ain't a high-end card, wake up..
Mmmkay, Mr. Consumer who is always right. The fact of the matter is we are consumers; we don't determine what is high end or not, that is up to companies to determine. Competition keeps the high end from decreasing in "value", but that doesn't mean that if there is no competition, the high end disappears from what it is. We are consumers; again, we don't determine this stuff. If you want to be so blind as to say you do, then I guess go for it, nothing is stopping you. It doesn't change the facts.

Nowhere else in life do we go to a market's high end and then decide as consumers that it's not high end, and that magically whatever market we are in has no high end. The market, the manufacturers, the producers, etc. choose what is high end or not, not consumers. If consumers do not like what is high end, then they need to create their own company, create a competing product, and force the hand of other companies to change what is high end. Until that happens, high end will stay at whatever the current companies state is high end, NOT at where make-believe consumers decide high end is.
Jagman:

When Intel (and probably AMD) quote TDP figures, they're for the base frequencies of the CPU. Hence a 7700K is 95W @ 4.2GHz and the 8700K is 95W @ 3.7GHz (both all-core TDPs). I'm sure if Coffee Lake hadn't been refined/reworked it would have used a bit more power.
That in no way explains the 7740K. That's basically saying a 100MHz increase from the 7700K, which is something like a 2.7% increase in speed, creates almost a 24% increase in TDP? Whereas an 8700K at 3.7GHz, with two additional cores, only increases the TDP by about 4%, even though the total processing speed (no boost) is about 32% more? I can guarantee you that if people were to do a specific test on the 7700K, 7740K, and 8700K, lock their speeds at their base speeds, and test them in multiple environments, the 8700K - with a much smaller TDP than the 7740K and an almost identical TDP to the 7700K - will always use more wattage than both of them, by a decent chunk.

And I'm not saying specifically that AMD is any better in this regard, but my whole point was that comparing AMD's 10 watt increase in TDP on the 2700X to the 4 watt TDP increase from the 7700K to the 8700K, as a reason why the 10 watt TDP increase on the 2700X "looks bad", doesn't mean anything. There's nothing to compare there; you can't even compare Intel's own processors' TDPs with each other and get anything logical out of them, so how exactly can people compare Intel vs AMD at that point?
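As a quick sanity check on those percentages, here is a small sketch using the base clocks, core counts and the TDP figures quoted earlier in the thread (cores multiplied by base clock is only a crude stand-in for throughput, and the "7740K" is the Kaby Lake-X part usually listed as the 7740X):

[code]
# Rough percentage comparison using base clocks, core counts and the TDPs quoted above.
chips = {
    "7700K": {"cores": 4, "base_ghz": 4.2, "tdp_w": 91},
    "7740K": {"cores": 4, "base_ghz": 4.3, "tdp_w": 112},  # Kaby Lake-X, usually listed as 7740X
    "8700K": {"cores": 6, "base_ghz": 3.7, "tdp_w": 95},
}

base = chips["7700K"]
for name in ("7740K", "8700K"):
    chip = chips[name]
    clock_gain = (chip["cores"] * chip["base_ghz"]) / (base["cores"] * base["base_ghz"]) - 1
    tdp_gain = chip["tdp_w"] / base["tdp_w"] - 1
    print(f"{name} vs 7700K: {clock_gain:+.0%} aggregate base clock, {tdp_gain:+.0%} TDP")

# 7740K: roughly +2% aggregate clock for +23% TDP; 8700K: roughly +32% aggregate clock for only +4% TDP.
[/code]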