AMD FX-6330 Black Edition Six-core Processor Surfaces

You can hit 60fps now, but that does not mean much if it does not stay near 60fps, or if it has frame-sync issues or stuttering.
As many reviewers have pointed out (I think even Hilbert), you can dip to 45FPS and will still have an enjoyable experience. In some cases, you might not even notice the frame drop.
Plus, look at it like this... having a CPU that can hit a higher FPS does add some future-proofing, even if you don't intend to play at that FPS.
Right, but keep in mind the processors in question were released in late 2012 and 2015 respectively. One of these processors is the 2nd (sort of 3rd) iteration of its architecture, the other is the 6th and highly refined. When you consider that, comparing the two on future-proofing doesn't mean all that much.
I notice 50 to 45 drops so it matters to me. I'm not saying buy an i3 instead of an FX-6330... It depends entirely on usage. Compared to the i3, the FX-6330 will excel in multi-threaded applications.
Moderator
As far as the experience goes, I would say that depends on what the consumer thinks is enjoyable. Personally, for me it's enjoyable as low as, say, 20 fps or so lol. People think I'm crazy for that, but that's the lowest I go! 😀 As for your second point, I would say architecture differences are not relevant, because both CPU1 and CPU2 are being marketed and sold in, say, the Q1 2018 market. Having an updated architecture obviously does help, as we have seen, but this is not always the case. Also, this is just me, but I don't want a CPU that will be just enough for performance. I want one that gives me great performance without having to upgrade for years down the road. My system now is the first high-end build I have ever done for myself, and I probably could have cut costs down some and kept the same performance, but I decided to splurge. 😀
I notice 50 to 45 drops so it matters to me.
As long as there are no anomalies, a drop from 50fps to 45fps would be completely unnoticeable by anyone. The only way that drop in framerate would be noticeable is if it results in stuttering or other anomalies, unless you have superhuman vision... Right now, I have my i5 6600K, a Celeron J1800, an Athlon 5350 and an A8-6410. When running Ubuntu, the J1800 is the fastest of them; when running Windows, the i5 6600K is faster. More cores is not always better. It heavily depends on the instructions being executed. For some uses, fewer cores at a higher clock rate is best; for other uses, more cores is best. For gaming, it depends on the exact game. Some games benefit from more cores while others benefit from faster cores. There is no real "one size fits all" solution. Different processors exist to fit different usage cases.
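To put rough numbers on that claim: a drop from 50fps to 45fps only stretches the average frame time from 20 ms to about 22 ms, while a stutter is a single frame taking far longer than its neighbours. A minimal sketch of the difference (Python, with invented frame times and an arbitrary threshold, purely for illustration):

```python
# Average FPS hides per-frame behaviour; frame times show it.
def frame_time_ms(fps):
    return 1000.0 / fps

print(frame_time_ms(50))  # 20.0 ms
print(frame_time_ms(45))  # ~22.2 ms -> barely 2 ms longer per frame

# A stutter is different: one frame spikes well above its neighbours.
# Flag frames that take more than `factor` times the average frame time
# (the factor is arbitrary, chosen only for this example).
def find_stutters(frame_times, factor=2.0):
    avg = sum(frame_times) / len(frame_times)
    return [(i, t) for i, t in enumerate(frame_times) if t > factor * avg]

samples = [20, 21, 20, 22, 95, 21, 20]  # invented frame times in ms
print(find_stutters(samples))           # [(4, 95)] -- the spike is what you feel, not the average
```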
Something causes the stutter, which then drops the FPS. That's what I'm talking about, not the pacing difference between 50 and 45. I should have said that more clearly 🙂
Yeah, and the sky is blue... As I said before, in most cases, the GPU is the bottleneck. So what's your point? Just seems like petty nit-picking to me.
Which "most cases"? The sky is blue, and Nvidia's DX11 driver handles API overhead at least 50% (up to 100%) better than AMD's, and this shows in almost any game that has view-distance, shadow and/or grass settings; GameBrokes "features" could also be added to this list. In the DX11 AAA titles of the last few years I don't see 100% GPU usage unless I lower settings like view distance, grass and shadow quality. 4K, or VSR at 1800p, also helps a lot to get better or maximum GPU usage by putting more load on the GPU. You can cherry-pick some titles that are GPU-maxed at 1080p, but the norm is that the bottleneck is at the DX11 API/driver level, and at that level AMD's drivers are doing a very bad job.
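For what it's worth, the usual way to settle this kind of argument is to compare GPU busy time against total frame time: if the GPU sat idle for a chunk of the frame, the CPU side (game code plus driver/API overhead) was the limit. A toy sketch with invented numbers (real figures would come from a GPU profiler, not from this):

```python
# Toy bottleneck check: was the GPU busy for most of the frame?
# All numbers below are invented for illustration only.
def classify(frame_ms, gpu_busy_ms, threshold=0.9):
    """GPU-bound if the GPU was busy for >= threshold of the frame, else CPU/driver-bound."""
    return "GPU-bound" if gpu_busy_ms / frame_ms >= threshold else "CPU/driver-bound"

# 1080p: the frame took 16 ms but the GPU only worked for 9 of them.
print(classify(16.0, 9.0))    # CPU/driver-bound
# Same scene at 4K / VSR 1800p: GPU work grows, CPU-side cost stays roughly flat.
print(classify(24.0, 23.0))   # GPU-bound
```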
This thread also has nothing to do with GPUs or graphics drivers. This thread concerns the FX-6330; GPUs and related drivers are irrelevant here.
Nice cpu, I wonder how well it overclocks.
My guess is it won't overclock any more or less than the 6300, since most will operate at these frequencies at stock volts anyway. AMD always overvolts the crap out of their CPUs at stock.
I got all happy until I read it was just another spin-off lol. The box it comes in looks WICKED, I must say. Is this for overseas markets or will we get 'em in the States as well? :)
Why do people still insist the FX line aren't "real" 8-, 6-, and 4-core designs? Two cores only share an FPU.
Moderator
I don't think anyone did...
...B. Get a triple core with poor power efficiency...
Most games use Gump physics anyways, so I wouldn't imagine games using FPUs as much as the integer units. The only exception would be the Crysis 3 grass levels.
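To illustrate the shared-FPU point from the posts above (two integer cores per module sharing one FPU) with a toy model, here is a sketch where each of the six integer cores retires one integer op per cycle and each of the three module FPUs retires one FP op per cycle. The per-unit rates are invented, not real Piledriver throughput figures; the point is only the shape of the bottleneck.

```python
# Toy model of a 3-module / 6-core FX-style design: 6 integer cores,
# 3 FPUs each shared by a pair of cores. Rates (1 op/cycle per unit)
# are made up purely to show where the ceiling lands.
def sustainable_ops_per_cycle(fp_fraction, int_cores=6, shared_fpus=3):
    if fp_fraction <= 0.0:
        return float(int_cores)      # purely integer: all six cores usable
    if fp_fraction >= 1.0:
        return float(shared_fpus)    # purely FP: limited by the three FPUs
    # Total rate T must satisfy T*(1-f) <= int_cores and T*f <= shared_fpus.
    return min(int_cores / (1.0 - fp_fraction), shared_fpus / fp_fraction)

for f in (0.0, 0.5, 0.75, 0.9):
    print(f"FP share {f:.0%}: ~{sustainable_ops_per_cycle(f):.1f} ops/cycle")
# Integer-heavy code scales across all six cores; FP-heavy code runs into the three shared FPUs.
```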
STP (single-thread performance) is still king in a lot of programs, so the question is how does its STP compare to the i3's? AMD has always been better at multi-threading than Intel in the same price range.
Seems like a decent value. I've been gaming on an FX-6300 @ 4.6GHz 1.40v for a while. I've had zero issues with it and I've enjoyed owning it.
Looks like a decent chip to me. I am on the blue team now, but I still have fond memories of my FX-8350, and in some scenarios it outperformed my 4790K. If I were building a budget system I wouldn't worry about throwing this chip in there; it's easily capable of running any game out today, and now with a lower TDP, what's not to like?
Guys, I have a weaker FX 4300 here at my place that I built to mess around with. That chip can handle 1080p with a single GTX 780 and pull 60 FPS in Dark Souls II SOTFS. It also pulled 60 FPS in The Lord of the Rings: SoM. Of course, minimal AA in both titles to achieve that, 2-4x MSAA. Never saw dips below the mid 40's in SoM. Now, when it comes to PC loading and desktop functionality, my Intel runs circles around it, especially when cranking max settings on my 3440x1440 ultrawide monitor. When it comes to higher resolution gaming and multi-GPU, Intel is still the better choice, even for AMD cards.
I don't know anything about that, because I have an 8350 at stock clocks at 4GHz and I have no problems whatsoever. An i3 can beat out a low-end FX CPU, but it's 2 cores when most new games need more than 2 cores. I think it's a bandwagon thing people get on; it becomes an Intel vs AMD thing, which it should not, because in the end we game and we use what we can buy.
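On the "2 cores vs more cores" point, Amdahl's law is the usual back-of-the-envelope check: how much of a game's frame work actually runs in parallel decides whether six slower cores or two faster ones win. A sketch with invented per-core speeds (the 1.4x figure is an assumption for illustration, not a benchmark):

```python
# Amdahl's law: speedup over one core, given the fraction of work that parallelizes.
def amdahl_speedup(parallel_fraction, n_cores):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cores)

# Hypothetical comparison: six slow cores vs two cores with ~40% higher
# per-core throughput (the 1.4 factor is invented for this example).
for p in (0.5, 0.8, 0.95):
    six_slow = amdahl_speedup(p, 6) * 1.0
    two_fast = amdahl_speedup(p, 2) * 1.4
    print(f"parallel fraction {p:.2f}: 6 slow cores {six_slow:.2f}x, 2 fast cores {two_fast:.2f}x")
# Lightly threaded games favour the two fast cores; well-threaded ones favour the six.
```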
I said "generally speaking", meaning, not always. Also as pointed out before, if you only intend to get to 60FPS, then getting higher than that doesn't matter. If you intend to get beyond 60FPS, then what's the point of wasting time and money considering a mediocre CPU? If you're an average gamer, a mildly overclocked FX-6330 will suffice.
A stock FX-6300 will suffice for average gaming.