AMD Ryzen 3000 Matisse Spotted - 12 Cores 24 Threads

@man_daddio:

I can't imagine this being an actual mainstream processor. AMD is just wasting money. These chips are only good for people who know how to use them; otherwise it's essentially useless cores. Recently someone I know said that they bought a Threadripper, I assume just because the name sounded cool. And I asked the questions: why did you buy a Threadripper, and do you know how they work? The answer to both was "I don't know." Can I suspect that a lot of people out there have no clue? Just AMD hype. Let's just make things for the sake of making things, causing all kinds of unnecessary waste. There's more integrity in charging high prices for a product than doing that. Humanity is going backwards.
There's a thing called "progress". Were you saying the same when quad cores replaced dual cores? Or when octa cores replaced quad cores? If dual cores were still mainstream, we would still be playing CS 1.6 or StarCraft and coding on Windows XP. Giving more powerful CPUs to general consumers means developers can stretch further and create more demanding content. There's a game called "They Are Billions" and I really want to actually see more than thousands of zombies on the screen. It was impossible before, 'cause no engine would be able to render so many entities. With stronger CPUs being available for everyone, we get what was impossible before. I'm still happy with a dual-core CPU at work; it's good enough to copy-paste code from Stack Overflow. Humanity has been going backwards since we stopped being monkeys, btw. I'd be happier just eating bananas and banging.
With the cores split between two dies on this CPU, I wonder if you will need to enable "game mode" to deactivate one die in order to avoid any latency issues. This looks like Threadripper with its cores split between dies, and without game mode those game really badly. I'm concerned about the 12 and 16 core Zen 2 chips having issues of this nature.
Moonbogg:

With the cores split between two dies on this CPU, I wonder if you will need to enable "game mode" to deactivate one die in order to avoid any latency issues. This looks like Threadripper with its cores split between dies, and without game mode those game really badly. I'm concerned about the 12 and 16 core Zen 2 chips having issues of this nature.
From what I understand, they will reduce latency with the I/O die and further improve IPC. But again, we can argue non-stop until we actually see third-party benchmarks.
LM2014:

Thanks for the news 🙂 But I'm waiting for Zen 3 and DDR5 😀
For DDR5 you're going to wait a long time for very little advantage. Now, I don't know about Ryzen 3, but I'm not overly optimistic. I just got a Z390 with an i5 8500; my mistake, and my goal at the same time, was the i7 9700, which I'll get.
schmidtbag:

The 9900K is also very expensive, hard to find, and generally a pretty stupid CPU to get if gaming is your top priority. An 8700K is a much smarter choice, and even then, an 8600K or 9600K is more than enough for the vast majority of gamers out there (but, so is a 2700X).
"For the vast majority" you can go a little down. 9600K is not the vast majority.
NiColaoS:

For DDR5 you're going to wait a long time for very little advantage. Now, I don't know about Ryzen 3, but I'm not overly optimistic. I just got a Z390 with an i5 8500; my mistake, and my goal at the same time, was the i7 9700, which I'll get.
Don't mix naming schemes; Ryzen 3 is the entry level of the Ryzen brand lineup. [Zen1] Ryzen 3 1200/1300X, Ryzen 3 2200U, etc. [Zen+] Ryzen 3 2300X, Ryzen 3 3300U, etc. https://en.wikipedia.org/wiki/Ryzen We don't know if AMD will use the Ryzen brand for the Zen 2 microarchitecture. I'd rather refer to these as mainstream Zen 2 processors, or "Ryzen gen 3" if you must.
@man_daddio Someone not knowing the how and what of a TR CPU has zero to do with hype: "promote or publicize (a product or idea) intensively, often exaggerating its importance or benefits." TR is far from that.
Geryboy:

Well, right now the 9900K is a lot better in gaming than the 2700X, and even with the 30% more performance, the Ice Lake 10nm lineup will be faster again for gaming. So not everybody will jump to AMD, but they are getting very attractive.
So, you need to bring a 50% more expensive Intel chip into the comparison to actually beat AMD? And in what scenario; how big a percentage of games is CPU-bound enough to make it worth it? Is it a 2080 Ti at 1080p resolution? On top of that, Intel's offerings in the price range of the 2700X are all 4C/8T or 6C/6T (even the locked chips cost that much). Those locked chips are definitely not winning over the 2700X in gaming consistently. And the unlocked chips... who would pair those low-core-count chips with so powerful a GPU? Then you can get higher fps, but with side effects in terms of stutter, not being able to run background applications which may eat CPU cycles, and being a victim of Windows Defender and other internal processes of Windows itself. At the price the 2700X has, Intel offers only a good feeling of delusion.
Geryboy:

If you want to go 240 Hz at 1080p, or even max out 165 Hz at 1440p, the Intel chip gets you there; the AMD one does not. It is a high-end chip with unmatched performance, and I'm not talking about the money.
The 2080 Ti benchmarks say otherwise. Want to go 1080p at 240+ fps on an RTX 2080 Ti? Sacrifice details. And then you'll find out that even a 9900K will not get you there anyway.
Fox2232:

The 2080 Ti benchmarks say otherwise. Want to go 1080p at 240+ fps on an RTX 2080 Ti? Sacrifice details. And then you'll find out that even a 9900K will not get you there anyway.
Cut him some slack; with lowered video settings the GPU won't bottleneck (similar scenario at 720p). Then it's up to the CPU to carry frames. Even at 1080p with max settings the 9900K will have 40 more frames over the 2700X. See the Guru3D benchmarks for yourself. Some games can finally reach 144 fps. Yet the difference between the 9900K and the other Intel flagship CPUs is smaller. However, the 9900K remains the ultimate gaming and streaming CPU. Whether it's worth the money or not is a different topic.
Seeing as I can push upwards of 100 frames at max settings in either Overwatch or ARK at 2K, I think you meant 4K? A 1050 Ti can push 100+ frames at 1080p... Not sure you meant 1080p...
NiColaoS:

"For the vast majority" you can go a little down. 9600K is not the vast majority.
Note how I said "more than enough".
Moonbogg:

With the cores split between two dies on this CPU, I wonder if you will need to enable "game mode" to deactivate one die in order to avoid any latency issues. This looks like Threadripper with its cores split between dies, and without game mode those game really badly. I'm concerned about the 12 and 16 core Zen 2 chips having issues of this nature.
This uses a different design (chiplets) so I'm not sure it'll suffer the same issues.
sverek:

There's a thing called "progress". Were you saying the same when quad cores replaced dual cores? Or when octa cores replaced quad cores? If dual cores were still mainstream, we would still be playing CS 1.6 or StarCraft and coding on Windows XP.
I see your point, but I don't think it's that simple. Everyday workloads aren't really adding more demand on CPUs. There's a point of diminishing returns, and as far as I'm concerned, an 8c/16t CPU is exactly that for just about everyone who doesn't do stuff like rendering, encoding, compiling, or an absurd amount of multitasking. Diminishing returns can be anything from cost to efficiency to compromises because of efficiency (such as physical size, PSU upgrades, or noise).

In other words, adding more cores doesn't improve the performance of tasks that can't take advantage of it, and it won't speed up your workflow or improve efficiency if you have entire cores left unused. The vast majority of people don't need more than 4 cores, and I'm not saying that's because they're using old software that can't utilize more threads; it's because the software they are running doesn't even max out the cores they've got. One could argue "yeah, but with more cores becoming more common, software will become more multi-threaded", and that's not going to happen. Most calculations in most applications will get better performance if you just stick with 1 thread.

It's actually not really all that different from the automobile industry. Unless you're a race car driver or a hauler, nobody needs more than 4 cylinders. More power has a minimal impact on your everyday commute.

My gaming PC has a 4c/8t CPU in it. When I built that PC, I had the money to get a 7980XE with a nice liquid cooling system, but I opted not to because I knew for a fact I was not going to really take advantage of the specs. There have been times I've done encoding or compiling, but none of that takes long enough for me to complain or wish I bought something better.

TL;DR: More cores in and of itself doesn't make progress for desktop users. It definitely encourages progress, but probably not in the way that many of us think/hope (more applications becoming more multi-threaded).
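To illustrate why small everyday tasks don't automatically get faster with more cores, here's a minimal sketch in Python (the workload size and worker count are arbitrary assumptions of mine): splitting a tiny job across several workers adds coordination overhead that can outweigh the work itself.

```python
# Sum a modest list single-threaded vs. split across 8 worker processes.
# For a workload this small, process startup and data transfer tend to
# dominate, so the "parallel" version is often slower.
import time
from concurrent.futures import ProcessPoolExecutor

def partial_sum(chunk):
    return sum(chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))  # small, everyday-sized workload

    t0 = time.perf_counter()
    total_single = sum(data)                      # one thread, no coordination
    t_single = time.perf_counter() - t0

    chunks = [data[i::8] for i in range(8)]       # split the work into 8 pieces
    t0 = time.perf_counter()
    with ProcessPoolExecutor(max_workers=8) as pool:
        total_parallel = sum(pool.map(partial_sum, chunks))
    t_parallel = time.perf_counter() - t0

    assert total_single == total_parallel
    print(f"single thread: {t_single:.4f}s, 8 workers: {t_parallel:.4f}s")
```

Once the per-task work is big enough (rendering, encoding, compiling), the overhead stops mattering and the extra cores pay off, which is exactly the dividing line I'm describing.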
schmidtbag:

Yeah, but the 32-core TRs are a very niche product due to the memory and scheduler issues. The 16 core TRs are still the more sensible product for a wider audience.
I don't think memory is an issue on 32 core TR, and there is a software fix for the scheduler issues, as discovered by Wendell from Level1Techs (a variant of which will probably be incorporated into Windows at some time in the future). It's a niche product but so is TR in general, and for those who want a lot of cores it's a very attractive option (frankly, I would have gotten a 2990WX if I could have afforded it).
schmidtbag:

But like I said, it's more of an issue compared to previous-generation stuff. If I owned either a 1st or 2nd gen TR with 12-16 cores, I think I'd be a little bothered that there will be AM4 CPUs that will most likely be much cheaper while potentially having better performance (when you account for the smaller transistors and probably higher clock speeds). Of course, TR builds will still be better in terms of stuff like PCIe lanes and memory bandwidth, but I wouldn't be surprised if most of these systems only use 1 GPU and 1 M.2 drive. Of course, the natural evolution of things suggests that these parts will inevitably be obsoleted; I guess what I'm saying is AMD is forcing obsolescence a little bit too fast IMO.
AMD already doubled the core count on Threadripper and will soon do so with EPYC. It's only natural that consumer Ryzen chips would follow suit. As a first-gen Ryzen and Threadripper customer I am eagerly looking forward to upgrading; socket compatibility was important for me since I always intended to upgrade to more powerful chips down the line. Plus, we all knew that Zen 2 and 7nm would bring about some big changes.
@man_daddio:

I can't imagine this being an actual mainstream processor. AMD is just wasting money. These chips are only good for people who know how to use them; otherwise it's essentially useless cores. Recently someone I know said that they bought a Threadripper, I assume just because the name sounded cool. And I asked the questions: why did you buy a Threadripper, and do you know how they work? The answer to both was "I don't know." Can I suspect that a lot of people out there have no clue? Just AMD hype. Let's just make things for the sake of making things, causing all kinds of unnecessary waste. There's more integrity in charging high prices for a product than doing that. Humanity is going backwards.
If the cores are being used, they're not useless..... Whether a specific piece of software can effectively use the additional cores or not, they can still impact performance. I moved from an i5 6600K to my R5 1600. The R5 1600 has lower IPC and lower clock frequency, but considerably better overall performance. If you have a problem with progress, why are you on an enthusiast forum? Nearly every company on the planet makes a new product, simply because they can. People buy those products, because they exist. If a company stops making new products, eventually people stop buying their products entirely. Then the company ceases to exist. It's funny that you only appear to apply your statement to AMD though, but not Intel....who conveniently does the same thing....
Borys:

When the AMD Ryzen 3XXX series reaches the market... who will buy Intel processors?!?! Only people that know nothing about hardware... this 3XXX series will smash Intel in a way they never imagined...
There will always be those that will never believe. Just the other day a guy was spouting off about how a 2700X takes more wattage and is hotter than an 8600, none of which is true, and after showing him proof, he said it was all lies, etc. People who want to believe one thing, regardless of facts, will believe whatever they want. So there will be people buying Intel CPUs, even if Intel doesn't react like they should to Zen 2, and even if Zen 2 is what is rumored.
schmidtbag:

Of course, the natural evolution of things suggests that these parts will inevitably be obsoleted; I guess what I'm saying is AMD is forcing obsolescence a little bit too fast IMO.
It'd only be forcing obsolescence if AMD was somehow talking to developers and making software work worse on older or lower-core-count CPUs, or releasing "driver" updates that did that. In reality, what this is "forcing" is getting developers to realize that multi-core programming to its fullest potential is the future, and to take advantage of it.
sverek:

There's a thing called "progress". Were you saying the same when quad cores replaced dual cores? Or when octa cores replaced quad cores? If dual cores were still mainstream, we would still be playing CS 1.6 or StarCraft and coding on Windows XP.
I still remember when my dad first heard about dual-cores being a thing, saying "it'll never catch on, it's only good for server environments"
Geryboy:

If you want to go 240 Hz at 1080p, or even max out 165 Hz at 1440p, the Intel chip gets you there; the AMD one does not. It is a high-end chip with unmatched performance, and I'm not talking about the money.
In any game that actually matters for such high FPS to match a high refresh rate, my 2080 (non-Ti) at 1440p, with a 1700X, manages 165 fps+, quite often 200 fps+. Note: I said games that matter, "twitch" shooters and the like. Heavily graphical games, obviously not; they are generally in the 120 fps+ range. But if you're playing a game like that, it's not a "twitch" shooter, and you don't get the benefit of a high refresh rate anyway. Heck, in CS:GO at 4K I can manage 200 fps+, and that's obviously one of the biggest "twitch" shooters; I can't imagine what 1080p would be, such a small and easy resolution to get high FPS at.

Plus the fact that 240 Hz vs 144 Hz is a minuscule difference. Sure, 60 Hz to 120/144 Hz was a very big jump in fluidity, but the same cannot be said about 144 Hz to 240 Hz. I'll quote another place on the internet:

"Display refresh rate: 60 Hz can display a new image every 16 ms, 144 Hz every 7 ms, 240 Hz every 4 ms. Input lag: 60 Hz with 70 FPS = 28 ms, 144 Hz with 154 FPS = 18 ms, 240 Hz with 250 FPS = 14 ms."

Not a big difference. That's not to say I, personally, am not interested in 240 Hz; I'm looking forward to 1440p 240 Hz monitors, but it's just icing on the cake. Anyone who "needs" a 240 Hz monitor with 240 fps or more is missing the big picture and needs to get their priorities straight.

But again, the most important part: a 1700X, let alone a 2700X, is able to achieve this with a high-end GPU, no problem. If you don't believe this, go out and actually look for information on it; people have posted here countless times with non-cherry-picked situations.
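If you want to sanity-check those quoted frame times yourself, it's just 1000 ms divided by the refresh rate (a quick sketch for illustration only; real input lag depends on more than this), and it also shows why 144 Hz to 240 Hz saves far less time per frame than 60 Hz to 144 Hz did:

```python
# Frame time = 1000 ms / refresh rate.
for hz in (60, 144, 240):
    print(f"{hz} Hz -> new image every {1000 / hz:.1f} ms")

# The jump from 60 Hz to 144 Hz saves far more time per frame
# than the jump from 144 Hz to 240 Hz.
print(f"60 -> 144 Hz saves {1000 / 60 - 1000 / 144:.1f} ms per frame")
print(f"144 -> 240 Hz saves {1000 / 144 - 1000 / 240:.1f} ms per frame")
```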
Geryboy:

will still get you 30%+ closer to it than anything else AMD has to offer. So what now? And the 2080 Ti wouldn't run at 100% GPU load, so why sacrifice details?
Draw calls.
Aura89:

It'd only be forcing obsolescence if AMD was somehow talking to developers and making software work worse on older or lower-core-count CPUs, or releasing "driver" updates that did that. In reality, what this is "forcing" is getting developers to realize that multi-core programming to its fullest potential is the future, and to take advantage of it.
Sorry, I wasn't being very clear: I meant more of a metaphorical obsolescence. Literally, the old hardware will run just fine and is still plenty good enough. But my point was more the idea that someone spent extra money on a TR which, at the time, was a high-end product, and within a short amount of time this consumer-grade socket AM4 product comes along for much cheaper and probably performs better. The TR is still a good product, but all of that value/prestige it carried is suddenly diminished. Again - this progression is inevitable, I'm just saying it's happening a bit too fast. If I had a TR, I would be annoyed that I could've waited a year and got something more fitting for my needs for a much lower price. Of course, for anyone who doesn't own a TR, this is fantastic news.
Geryboy:

Nice theoretical argument without any practical backup as to where the actual middle ground might be, or how severe the impact could be in real scenarios.
I think you're making the performance differences out to be far more extreme than they really are. Due to AMD getting scheduler improvements and Intel suffering performance losses from various security vulnerabilities, the performance gap is much tighter than our first impressions left us with.
Like I said, my next PC will be AMD Ryzen. Screw Intel if they keep pushing their prices when AMD is right on their heels now, at half the price and with similar multi-threaded performance. I might not care for the TDP these fancy multi-core CPUs are now using in general, but it was bound to happen. Now that the multi-core war is on, it should only be a matter of time before all programs start using multi-core CPUs correctly, and only a matter of time before Intel stops pushing those crazy prices, when their multi-core performance in most cases is barely better than AMD's, or actually releases CPUs worth those crazy prices. Because if AMD keeps on the track they are on and keeps prices where they are, and Intel keeps trying to push their crazy prices for a little extra performance, AMD will have the last laugh. It might take a while, but they will. In my eyes the performance gap between Intel and AMD is gone now, and AMD is doing it at half the price. And no, I only play my games at 60 fps; at best it would be 120 Hz, if I ever choose to buy one of those monitors, which I probably won't.
schmidtbag:

Sorry, I wasn't being very clear: I meant more of a metaphorical obsolescence. Literally, the old hardware will run just fine and is still plenty good enough. But my point was more the idea that someone spent extra money on a TR which, at the time, was a high-end product, and within a short amount of time this consumer-grade socket AM4 product comes along for much cheaper and probably performs better. The TR is still a good product, but all of that value/prestige it carried is suddenly diminished. Again - this progression is inevitable, I'm just saying it's happening a bit too fast. If I had a TR, I would be annoyed that I could've waited a year and got something more fitting for my needs for a much lower price. Of course, for anyone who doesn't own a TR, this is fantastic news.
Makes sense. It can be hard as a consumer to feel like you should ever buy something if the next year it's drastically "out-done"; it makes you want to always wait, and then you'll be in a waiting game forever. That being said, I'm not sure if I, personally, would want it any other way. For me, progress can never be too quick lol
Geryboy:

benchmarks on youtube
Found your issue.
Geryboy:

I'm referring to various benchmarks on YouTube which show 30+% higher fps at 1080p; I don't have first-hand experience. But several people get the same results. Conspiracy?
You know, I know I'm quoting you again in a different post, even though I quoted you above as well, but you sound very similar to a guy I quoted in a different thread, with your exact 30%, so I'll just quote him and my reply to him. Take note: I did the math, and all the information in what is quoted is directly from Guru3D, not some random YouTube channel that "claims" they know how to test.
Glottiz:

Nonsense, usually there's a 30% difference in framerate in high-refresh-rate gaming between the 2700X and the 9700K. That's a massive difference just from the CPU. While yes, the 9700K is more expensive, that's a one-time cost to always have better framerates.
And my original, unaltered reply: Curious to know where you got your "30" from. https://www.guru3d.com/articles_pages/amd_ryzen_7_2700x_review,20.html

At 720p, where the difference between a 2700X and an 8700K would be most notable, and which is also a resolution almost no gamer plays at so it really isn't even relevant... anyway, there's a 12% average performance difference. At 1080p it's not as critical, but hey, at least more people game at this resolution. At this resolution there's a 3% average performance difference. Let's not even go to 1440p/4K, since it'd only go down from there.

Now, you may be going "But I didn't say 8700K! I said 9700K!" Yes, that's true. If you want to go and find the REAL averages on some other benchmark website, then go ahead and do it. But actually do the work, as I have done, rather than making up some random number and calling it good. The reality is, in most games the 8700K performs better in, the 9700K would likely perform similarly, since the 8700K has a slightly higher base frequency, the 9700K has a slightly higher maximum frequency, and the addition of 2 cores, in the majority of games where the 2700X again performs worse, would not matter. Plus, the 8700K has hyperthreading whereas the 9700K doesn't. From what I've seen, there's an average of 1-5 fps difference, sometimes in the 8700K's favor, which would not move the average percent differences much at all. If you want, you can call it 15% at 720p and 4-5% at 1080p.

Again, I didn't just make these numbers up. I took the data provided in the above Guru3D review, worked out the percentage difference per game, and then averaged the percent differences (see the sketch below). You guys gotta stop making stuff up.
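For what it's worth, the averaging I'm describing is just this; a sketch with placeholder FPS pairs, not the actual review data, so plug in the per-game numbers from the review yourself to reproduce the 12% / 3% style averages:

```python
# Average the per-game percentage lead of one CPU over another.
def average_percent_difference(results):
    """results: list of (fps_intel, fps_amd) tuples, one per game."""
    diffs = [(a - b) / b * 100 for a, b in results]   # per-game % lead
    return sum(diffs) / len(diffs)

games_720p = [(160, 140), (200, 180), (120, 110)]     # hypothetical per-game FPS
games_1080p = [(144, 140), (130, 127), (110, 108)]    # hypothetical per-game FPS

print(f"720p average difference:  {average_percent_difference(games_720p):.1f}%")
print(f"1080p average difference: {average_percent_difference(games_1080p):.1f}%")
```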
Geryboy:

cute, valid argument.
Can't help that you have that issue. Not really sure what you're getting at; read the second post. If you wish to go out to a VALID review website, if you don't believe Guru3D that is, get VALID results, average them out to see the average performance differences at various resolutions, and post your results as I have, go right ahead; no one is stopping you. Edit: Also, no one is saying there aren't specific situations where a 30% increase is valid; I'm sure there's a game where there is, but that's not the average. One-off games don't make the average. It's up to each individual person to decide if the one-off games are worth it to them or not, or to go with the average. Since most people play more than just one game, the average is what is most important to most people. The point is, stop posting that there's a mythical "30%" difference between the processors when you're literally quoting one sample. If other people did the same thing, they could claim the 2080 Ti is 80% faster than the 1080 Ti based on one sample, and that's obviously not the truth.