AMD Ryzen 3 3300X Has a fully enabled CCX, unlike the Ryzen 3 3100

Astyanax:

Ugh, StarCraft II should be on DX11 and multithreaded, but Blizzard just hasn't cared enough to do it.
At least that - but it's stuck on the WC3 engine with some "modern" improvements.
DLD:

If the goal of buying a new CPU is building/upgrading a serious machine, then anything less than 6 cores/12 threads is a bad investment nowadays. One buys new hardware not for yesterday or the day before, but for today, for tomorrow, and for a few years to come. This move by AMD is just a sign of their greed, or rather an attempt at selling as much existing silicon as possible. In market-strategy terms it is known as "clearing the shelves"...
If it was about greed then AMD's entire lineup would be twice as expensive.
DLD:

If the goal of buying a new CPU is building/upgrading a serious machine, then anything less than 6 cores/12 threads is a bad investment nowadays. One buys new hardware not for yesterday or the day before, but for today, for tomorrow, and for a few years to come. This move by AMD is just a sign of their greed, or rather an attempt at selling as much existing silicon as possible. In market-strategy terms it is known as "clearing the shelves"...
How is AMD being greedy in offering a 4-core, 8-thread CPU for as little as $99? A certain company based in Santa Clara, California was selling 4C/8T CPUs for years, and they sure as hell didn't cost $99. And, that same company would likely *still* be selling 4C/8T chips, (again, *not* for $99), if AMD hadn't come along and, you know, cleared their shelves.
DLD:

If the goal of buying a new CPU is building/upgrading a serious machine, then anything less than 6 cores/12 threads is a bad investment nowadays. One buys new hardware not for yesterday or the day before, but for today, for tomorrow, and for a few years to come. This move by AMD is just a sign of their greed, or rather an attempt at selling as much existing silicon as possible. In market-strategy terms it is known as "clearing the shelves"...
How dare they cover the whole spectrum of CPU price categories on the market? Everyone point at them and yell SHAME! Now, jokes aside, I don't get your logic. Both companies sell everything from 2 cores up and try to have CPUs in every price category - what's wrong with that? Should I put a 3950X in my NAS to future-proof it? Should I make sure my gf, who only uses Facebook and YouTube, gets at least a Threadripper to future-proof? You've fallen into the trap of assuming that whatever you want in a PC is what other people want/need.
https://forums.guru3d.com/data/avatars/m/45/45709.jpg
OK, that's the logic of capital - sell anything you can sell. Yeah, price-wise, these CPUs ARE affordable. But be sure that when they sell one for $99.99, it costs THEM $0.99... THAT is the greed I'm referring to... And I am perfectly sure that "a certain company based in Santa Clara, California" is no "santa" (saint) either...
https://forums.guru3d.com/data/avatars/m/220/220626.jpg
... 99 cents?
DLD:

OK, that's the logic of capital - sell anything you can sell. Yeah, price-wise, these CPUs ARE affordable. But be sure that when they sell one for $99.99, it costs THEM $0.99... THAT is the greed I'm referring to... And I am perfectly sure that "a certain company based in Santa Clara, California" is no "santa" (saint) either...
They cost exactly the same to make as the 3800X/3900X (depending on the die config); the difference is that they are partially defective, so they are marked down. Think of it as cost recovery.
https://forums.guru3d.com/data/avatars/m/220/220626.jpg
@user1 He seems perfectly aware of that. He also seems to think a low-tier binned part is worth 99 cents. For reasons beyond our understanding, I'm sure.
moab600:

RTS games should use all available cores and threads efficiently; one wrong way to do it is what Blizzard did with SC2. Even on an Intel system, StarCraft II is one of the worst-optimized games ever made: especially in big battles the game tends to use LESS CPU rather than more, and no more than 2 cores - even that, barely... I upgraded from a 4790K to a 3700X and it was a huge upgrade. In many games the 0.1% lows improved a lot and gameplay became smoother (like the awfully optimized Borderlands 3), but the biggest upgrade was in image processing and developing in Lightroom.
Heck yeah it is an upgrade! I also had the same Intel Haswell chip and used that PC for almost 5 years. The 3700X system I built (all new) last July is WAY faster - sometimes 3-5x as fast on stuff that uses new instructions or memory bandwidth (which is much, much better now). It's amazingly smooth and it NEVER bogs down.

An 8-thread file zip op on my 4790K = mouse cursor at 1-3 fps... and it still took 10+ minutes to back up my multi-GB project on ultra compression with 7-Zip. A 16-thread file zip op on my 3700X = I don't really notice it's even doing anything in the background - the desktop stays just as responsive so I can continue working - and it only took 2-3 minutes to do what needed 10+ minutes on the 4790K. The difference is amazing. Even RIMWORLD went way faster (well over a 2x increase in max speed), and instead of only running 8-11 vehicles in BeamNG.drive on my 4790K, I can now run 16-21 vehicles before it gets too slow.

To top it off, all of the above is done with the cheapest X570 board available - an ASRock Phantom Gaming 4 - and 32GB (2x16GB) of 3000MHz Micron RAM I snagged out of the bargain bin. The only buyer's remorse I have is not getting a 12- or 16-core 3900X/3950X when they were available (I still might; BeamNG.drive can use all the cores you throw at it and then some to run traffic).

EDIT: It's so nice just being able to leave the stock cooler as-is and have a CPU that doesn't crack 80C, doesn't make a ton of noise or heat, and doesn't bake me out of the room on long work sessions. The 4790K, by comparison, I had to delid (bought a delidder kit), put liquid metal under and on top of the IHS, and top with a $100 air cooler (designer colored, cost a little extra), just to keep it at its 4.4GHz turbo speed. I never could get even a 100MHz overclock out of it without hitting 85C. Remounted it many times but it refused - a real pig of a chip.

I was very displeased with the whole thing after buying into a fancy motherboard, fast RAM, expensive cooling, a K-series CPU, etc. Might as well have had the non-K and a basic motherboard. So this time I bought a basic motherboard on the newest chipset and a CPU that was more than what was needed, and I am extremely happy with my purchases. Thank you, AMD. There will be a 3300X sale with my name on it when they're out, once we finally phase out the Athlon II X2 Regor PC after 11.5 years of email/word-processing service. Hard pass on the 3100 though, no thanks. (edit: fix for double-post)
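The 7-Zip comparison above works because compression is an embarrassingly parallel job: split the input into independent chunks and compress them concurrently. A minimal sketch of that idea in Python, using zlib as a stand-in for 7-Zip's LZMA (zlib releases the GIL on large buffers, so even a thread pool gets real parallelism); the chunk size and sample data are illustrative:

```python
# Sketch: chunked parallel compression, the reason zip jobs scale with cores.
# zlib is a stand-in for 7-Zip's LZMA; it releases the GIL on large buffers.
import zlib
from concurrent.futures import ThreadPoolExecutor

def compress_chunk(chunk: bytes) -> bytes:
    return zlib.compress(chunk, level=9)  # max compression, like "ultra"

def parallel_compress(data: bytes, chunk_size: int = 1 << 20) -> list[bytes]:
    # Split into independent 1 MiB chunks, then compress them concurrently.
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ThreadPoolExecutor() as pool:  # defaults to roughly one worker per core
        return list(pool.map(compress_chunk, chunks))

sample = bytes(range(256)) * 40_000     # ~10 MB of sample data
compressed = parallel_compress(sample)
print(len(compressed), "chunks")
```

The trade-off is that per-chunk compression can't share a dictionary across chunk boundaries, so the ratio is slightly worse than one monolithic stream - which is essentially the same trade multithreaded 7-Zip makes.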
DLD:

If the goal of buying a new CPU is building/upgrading a serious machine, then anything less than 6 cores/12 threads is a bad investment nowadays. One buys new hardware not for yesterday or the day before, but for today, for tomorrow, and for a few years to come. This move by AMD is just a sign of their greed, or rather an attempt at selling as much existing silicon as possible. In market-strategy terms it is known as "clearing the shelves"...
There's certainly a market for both chips. Not everyone needs something super fancy these days: the machine I'm building as an internet terminal for the garage; the email computer in the den where my retired mother does church stuff; the 11-year-old niece or daughter who likes Minecraft but is too good to be given an old pre-Bulldozer Athlon; the guy at the parts counter of your local automotive supply store. Not everyone needs a 6-core/12-thread chip - plenty of people would do just dandy with 2 cores/4 threads or 4 cores/8 threads - but conversely, us gamers and content creators can always use high-core-count chips. Look at how often you see a basic Dell/HP/Lenovo/etc. when you're out next time. Even things like ATMs and POS (point of sale!) computers use anything from an Intel Atom to chips like these. Those are PCs where you'd never notice the difference between heavily salvaged chips like the 3100 and better ones like the 3300X/3600/3600X/3700X/3800X, which - while binned and hence salvaged - are all single (enabled) chiplet + I/O die designs.

AMD is in business to make money. This is just how it works. It behooves them to bring things like this out as their obligation to GlobalFoundries comes to a close (a minimum chip commitment; without fulfilling it, they pay a penalty equal to what they would have paid for good chips). AMD also needs newer stuff to appease OEMs, and that's the single biggest driver of this - when's the last time you heard of AMD trying to penetrate the OEM market? Real often, if you read tech news, and this is it. So in the end, I don't care who is being cheap, greedy, or whatever; as long as both Intel and AMD fight for dominance, we win and save big while winning. No complaints from this man. If you like gaming in immersive open worlds, be doubly thankful for this progress at ALL levels of PC power.

Then finally the games will catch up and USE that processing power, and we can all have a great time with the money we've sunk into PC hardware. As for me, I'll just remain thankful I finally got off quad-cores last summer for the first time since 2008 or 2009 (whenever the D-stepping i7 920 / Nehalem came out). THAT was long overdue!
The real question is: can Ryzen 3 be unlocked via BIOS or some softmod, or is it a laser-cut job? I still remember 3-core Athlon IIs with the Deneb core unlocking to a 4-core Phenom II with 6MB of L3 cache via a BIOS unlocker. Good old cheap times.
wavetrex:

^ (Almost) completely false. The large majority of e-sports games barely use 2 cores - basically 1 primary thread for logic and rendering, with various low-load threads (network, sound, etc.) on the rest of the cores. 4 cores are totally fine for the large majority of gamers in the world. Also, the very large majority of indie games are (still) single-threaded. BUT... if you're talking strictly about AAA titles on the newest engines, then yes, 6 cores or more are better.
Have you ever, while gaming, exited to the desktop, opened a browser, and gone back into the game with 1, 2, or even 4 cores???
DLD:

This move by AMD is just a sign of their greed, or rather an attempt at selling as much existing silicon as possible.
Are you really trying to advocate that companies should be wasteful? How does that help...anyone?
DLD:

they cost THEM $0.99, each...
....No. Not even the raw materials alone would be that cheap. Heck, the box and packaging likely cost more than that to make, let alone the heatsink. And as stated earlier in the thread, it costs the same to make as the others; they lower the price to move the inventory and hope to make something. And the material costs don't tell the whole story: if I make a product for $3 and sell it for $20, I may not make any money until I've sold 10,000 units, due to my development cost. Then, of course, I want to make some money beyond my investment, and beyond that to help fund the next product. It is never, ever as simple as "here are the costs, this is why you're charging too much," and to speak as though you know otherwise is downright foolish.
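A quick back-of-the-envelope for the $3/$20 example above; the development cost is an assumed figure, chosen only so the numbers match "no money until 10,000 units sold":

```python
# Unit economics sketch using the illustrative numbers from the post.
unit_cost = 3          # $ to manufacture one unit
price = 20             # $ selling price
dev_cost = 170_000     # $ up-front development cost (assumed for illustration)

margin = price - unit_cost                 # $17 gross margin per unit
break_even_units = dev_cost // margin      # units sold before any profit
print(break_even_units)                    # 10000
```

Until that break-even point, every sale is recouping sunk cost, which is why per-unit material cost says almost nothing about whether a price is "greedy".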
@DLD In CPUs and GPUs, lower-binned parts exist because not all chips coming off the wafer work fully. The wafer that gives you 250-300 chiplets costs $4-6k (I'm too lazy to go check the exact price, but I'm confident I'm close), so on that alone it's impossible for one to cost just 99 cents. AMD would love to have 100% yields, and so would Intel and Nvidia; the alternative is to throw those chips in the garbage bin. And again, it's not that simple - often they'll take perfectly working parts and sell them as a lower SKU just to cover demand, because at the end of the day... a sale is better than no sale.
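Putting rough numbers on that: the wafer price and die count below come from the post ($4-6k, 250-300 chiplets), while the yield figure is an assumption for illustration.

```python
# Rough per-die cost sketch for the yield argument. All figures approximate;
# the yield rate is an assumed value, not a published number.
wafer_cost = 5_000      # $ per wafer (middle of the quoted $4-6k range)
dies_per_wafer = 275    # chiplet candidates per wafer (middle of 250-300)
yield_rate = 0.70       # assumed fraction of fully working dies

good_dies = dies_per_wafer * yield_rate          # fully working chiplets
cost_per_good_die = wafer_cost / good_dies       # ~$26 per working die
print(f"${cost_per_good_die:.2f} per good die")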
bobblunderton:

Heck yeah it is an upgrade! I also had the same Intel Haswell chip, used that PC for almost 5 years. The 3700X system I built (all new) last July is WAY faster... *snip* ...THAT was long overdue!
It's really hard to tell someone the difference between those two chips; you actually have to experience it. Before the "human malware" virus I travelled abroad every two months and took many photos with my D850 (huge raw files). The 4790K would take around an hour and a half to process them all, sometimes less, sometimes more; the 3700X does it in 20-30 minutes with ease. It's just amazing. I think the Ryzen 3 3100 exists to pull people towards the 3300X, as I believe that one provides better value.
https://forums.guru3d.com/data/avatars/m/220/220626.jpg
Feeling the need to hide reality by calling it human malware is unfortunate. You're not a YouTuber trying to keep monetization. @moab600
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
A bit odd how they jumped from 3100 to 3300X. I can't imagine the performance difference is that significant, and I figure a 2+2 configuration would be able to clock higher, due to the heat being more spread out. Seems to me they should've been called 3300 and 3300X. I guess we'll have to wait and see how big a performance difference the core configuration makes.
JethroTu11:

I didn't really expect to see a configuration like the 3100. I thought the yields were good enough that we wouldn't see such a salvage operation with Zen2, but AMD might as well use all the silicon they can.
Well, remember, the whole point of this chiplet design is to profit off of failed chips; it's much more expensive to have one giant die and scrap a large chunk of it because of one inconveniently placed defect. That said, although I agree AMD likely doesn't have many chips that are only 2+2, the 3100 probably isn't going to sell as well as the 3300X.
HeavyHemi:

Extra cores have been a factor for over a half decade. IMO, nobody buying a new processor for gaming should go anywhere near a 4 core, H/T or not. This is entirely processor agnostic.
Nobody should buy a new 4c/8t processor for modern AAA titles, but 4c/8t still holds up for everything else. For those of you saying games can use more than 8 threads: that still doesn't mean you need 8 threads. Most people only have a 60Hz display, and as long as the game can hold 60FPS without stuttering, any performance loss from an inferior CPU is irrelevant to them.
D1stRU3T0R:

^ If you're talking about League of Legends, then EVEN THAT PIECE OF GARBAGE with its 10-year-old engine uses 4 threads AT LEAST!
No, it doesn't. It dumps nearly the entire load on 1 thread while the others sit mostly idle. What you see in Task Manager isn't what you think it is: it's not smoothly using X threads, it's choking at the max of 1 core while the scheduler spreads that load across different threads.
Neo Cyrus:

No, it doesn't. It dumps nearly the entire load on 1 thread while the others sit mostly idle. What you see in Task Manager isn't what you think it is: it's not smoothly using X threads, it's choking at the max of 1 core while the scheduler spreads that load across different threads.
I tested on an FX-9830P and the load was evenly distributed across the cores, hmm. But yeah, on the FX-6300 I guess it was distributed too. So it might be 1 thread that is MAYBE overloaded and doing the main work, but it's "threaded"... well, not nicely, because the game is poorly made, but still, somehow... ok.
D1stRU3T0R:

I tested on an FX-9830P and the load was evenly distributed across the cores, hmm. But yeah, on the FX-6300 I guess it was distributed too. So it might be 1 thread that is MAYBE overloaded and doing the main work, but it's "threaded"... well, not nicely, because the game is poorly made, but still, somehow... ok.
I wish I had a clear answer for what the hell is going on in those scenarios, but to this day I don't. What I can tell you is that the game is heavily limited by the maximum speed of 2 cores. I've run into many things which are effectively limited almost entirely by the power of 1 or 2 cores, but show an even load across all threads. Maybe League has changed, though, depending on how recently you checked it - I forgot to keep that in mind; I haven't been playing all year.
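One way to test this "even load but really one core" effect is to compare process CPU time against wall-clock time instead of trusting per-core graphs: a minimal sketch of the idea, with an assumed spin duration.

```python
# Sketch: tell "really multithreaded" apart from one core's worth of work
# smeared across threads by the OS scheduler. If total process CPU time is
# roughly equal to wall-clock time, the workload is effectively single-core,
# no matter how evenly Task Manager spreads the per-core graphs.
import time

def busy(seconds: float) -> None:
    """Spin one thread for roughly `seconds` of wall time."""
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        pass

wall_start = time.perf_counter()
cpu_start = time.process_time()
busy(0.5)
wall = time.perf_counter() - wall_start
cpu = time.process_time() - cpu_start

effective_cores = cpu / wall  # ~1.0 here; a true N-core workload reports ~N
print(f"effective cores: {effective_cores:.1f}")
```

Running the same measurement against a game process (via its profiler or an external monitor) would show a ratio near 1-2 for titles like this, regardless of how many threads appear loaded.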
Neo Cyrus:

I wish I had a clear answer for what the hell is going on in those scenarios, but to this day I don't. What I can tell you is that the game is heavily limited by the maximum speed of 2 cores. I've run into many things which are effectively limited almost entirely by the power of 1 or 2 cores, but show an even load across all threads. Maybe League has changed, though, depending on how recently you checked it - I forgot to keep that in mind; I haven't been playing all year.
I don't play it, but I tested it pretty recently. So this means an AMD Athlon X2 can still run the game just fine? That would be good, though.