AMD FX-4350 and FX-6350 Piledriver CPUs

data/avatar/default/avatar12.webp
Just a heads up, Hilbert. Looks like the links don't work. Both on the site and the forum. "Error. The story does not exist."
https://forums.guru3d.com/data/avatars/m/16/16662.jpg
Administrator
thanks man, fixed 😉
data/avatar/default/avatar21.webp
12 & 14 MB L2 and L3 cache... Don't think that's right, or am I wrong?
https://forums.guru3d.com/data/avatars/m/229/229509.jpg
12 & 14 MB L2 and L3 cache... Don't think that's right, or am I wrong?
That's total cache...
data/avatar/default/avatar12.webp
That's total cache...
Yeah, I figured that already. Just 4+8 and 6+8 MB.
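As a quick sketch of where those totals come from (assuming the usual 2 MB of L2 per Piledriver module plus the shared 8 MB L3 - an illustration, not something from the article):

[code]
#include <stdio.h>

/* Total cache = per-module L2 (2 MB each) + the shared 8 MB L3. */
int main(void) {
    struct { const char *name; int modules; } cpus[] = {
        { "FX-4350", 2 },   /* 2 modules = 4 cores */
        { "FX-6350", 3 },   /* 3 modules = 6 cores */
    };
    const int l2_per_module_mb = 2, l3_mb = 8;

    for (int i = 0; i < 2; i++) {
        int l2 = cpus[i].modules * l2_per_module_mb;
        printf("%s: %d MB L2 + %d MB L3 = %d MB total\n",
               cpus[i].name, l2, l3_mb, l2 + l3_mb);
    }
    return 0;   /* prints the 12 MB and 14 MB totals */
}
[/code]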
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
What are their stock speeds? I was considering getting a 6300, which I intend to overclock. Whichever overclocks better on air is likely the one I'll be getting.
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
I was considering getting a 6300
Protip: Don't. Haswell, June 4th.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Protip: Don't. Haswell, June 4th.
The only thing that interests me about Haswell is the power consumption. Otherwise, I'm not paying that much for something I have little need for. I'm choosing the 6300 over the 8350 because I don't need the 8350's extra performance, and the 6300 seems to be the best-value AMD CPU today in terms of processing performance. It'll also make a nice last upgrade for my AM3 system (I have a beta BIOS that supports AM3+ CPUs).

Also, considering that the PS4 and likely the new Xbox are both going to be Piledriver based, it wouldn't surprise me if games perform better on a 6300 than on any Haswell i5, maybe even an i7. I doubt the first year of games will do that, though, since devs likely won't have figured out the kinks of micro-optimizing yet.

Anyway, I currently own an Athlon II X3 at 3.7GHz and so far it's difficult to justify upgrading it. It doesn't max out in any live tasks such as gaming. It's relatively slow at other things like compression or encoding, but I can deal with the wait since I don't do those very often. I have no need for 8 cores, but I'm likely going to need more than 4 pretty soon. Also, I'm mostly a Linux user, and BD/PD processors tend to perform a little better in Linux than they do in Windows.
data/avatar/default/avatar12.webp
Also, considering that the PS4 and likely the new Xbox are both going to be Piledriver based, it wouldn't surprise me if games perform better on a 6300 than on any Haswell i5, maybe even an i7. I doubt the first year of games will do that, though, since devs likely won't have figured out the kinks of micro-optimizing yet.
The PS4 and the new Xbox are likely to have that technique (hUMA) implemented. Your system and Piledriver-based desktop CPUs, however, won't have it. So the games aren't likely to perform better on a 6300 than on a similarly spec'd i5 system, much less an i7.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
The PS4 and the new Xbox are likely to have that technique (hUMA) implemented. Your system and Piledriver-based desktop CPUs, however, won't have it. So the games aren't likely to perform better on a 6300 than on a similarly spec'd i5 system, much less an i7.
I don't see how hUMA would make that black-and-white of a difference. It's still based on the same general CPU architecture, so whatever performance I'd lose by not having hUMA, Intel would lose as well. So I don't see how that's a valid point. Also note that hUMA is not ideal for discrete GPUs - you would lose performance if you attempted to make a discrete GPU use your system memory.
https://forums.guru3d.com/data/avatars/m/202/202509.jpg
In games like BF3 and Crysis 2, the difference between an i5 and a 6300 is barely noticeable. In some games the AMD will be faster by a few frames and in some games the i5 will have a few frames more; it's not like having an Intel will give you 50fps more lol.. So unless you're playing old games that only use 1 core, I wouldn't spend the extra $100+ bucks on the Intel. Plus old games that only use 1 core will prolly run 200+fps on any newer system.. 😉
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
In games like BF3 and Crysis 2, the difference between an i5 and a 6300 is barely noticeable. In some games the AMD will be faster by a few frames and in some games the i5 will have a few frames more; it's not like having an Intel will give you 50fps more lol.. So unless you're playing old games that only use 1 core, I wouldn't spend the extra $100+ bucks on the Intel. Plus old games that only use 1 core will prolly run 200+fps on any newer system.. 😉
Very valid point, which is one of the reasons I'm doing a CPU upgrade rather than a system upgrade.
data/avatar/default/avatar13.webp
I don't see how hUMA would make that black-and-white of a difference. It's still based on the same general CPU architecture, so whatever performance I'd lose by not having hUMA, Intel would lose as well. So I don't see how that's a valid point. Also note that hUMA is not ideal for discrete GPUs - you would lose performance if you attempted to make a discrete GPU use your system memory.
My point was that the games are designed to run on systems likely to be based on hUMA, not on systems (with a discrete GPU) like yours. You just said it yourself that you'd only lose whatever advantage(s) hUMA would provide, just like any Intel system, so why would a 6300 have the advantage over a similarly spec'd Haswell i5, or even i7, when the games aren't likely to be optimized for non-hUMA systems? Besides, your first argument was that games would run better on (vanilla) AMD systems because PS4 and Xbox are based on AMD hardware, but that was before hUMA entered the picture. Care to elaborate?

And it seems you kinda miss the point with hUMA. It's not about the GPU sharing the same memory being used by the CPU; it's more about the CPU being able to read what the GPU has already processed (and vice versa) without having to wait for the result to be copied back and forth between VRAM and system RAM. On a traditional shared-memory system, which is what you were thinking of, the CPU and GPU still can't see what's in each other's share of the memory space. So there's still the overhead of copying whatever is in the GPU's share over to the CPU's whenever there's CPU-GPU work to do, which actually results in duplicate contents in memory. With hUMA, you both reduce memory usage and remove the overhead of copying values.

Also, hUMA would still be relevant on systems with a discrete GPU. The gfx card can still keep its VRAM for graphics processing, but it'll be faster for CPU-GPU processes if the GPU can just read from and write to system RAM, which means no more wasted cycles copying values back and forth between system RAM and VRAM.
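To make the copy-back overhead described above concrete, here is a minimal CPU-only toy in C: two plain heap buffers stand in for VRAM and system RAM, and no GPU or hUMA is actually involved - the 64 MiB size and the buffer names are arbitrary assumptions for illustration.

[code]
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

int main(void) {
    size_t n = 64u * 1024 * 1024;         /* 64 MiB "result" the GPU side produced */
    unsigned char *vram   = malloc(n);    /* stand-in for the GPU's memory pool    */
    unsigned char *sysram = malloc(n);    /* stand-in for CPU-visible system RAM   */
    if (!vram || !sysram) return 1;
    memset(vram, 0xAB, n);                /* pretend the GPU wrote its result here */

    /* Traditional split memory: the CPU waits for an explicit copy-back. */
    clock_t t0 = clock();
    memcpy(sysram, vram, n);
    double copy_ms = 1000.0 * (double)(clock() - t0) / CLOCKS_PER_SEC;

    /* hUMA-style shared allocation: the CPU just reads the same bytes in place. */
    t0 = clock();
    volatile unsigned char first = vram[0];
    double view_ms = 1000.0 * (double)(clock() - t0) / CLOCKS_PER_SEC;

    printf("copy-back: %.2f ms, shared read: %.4f ms (byte 0x%02X)\n",
           copy_ms, view_ms, first);
    free(vram);
    free(sysram);
    return 0;
}
[/code]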
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
You just said it yourself that you'd only lose whatever advantage(s) hUMA would provide, just like any Intel system, so why would a 6300 have the advantage over a similarly spec'd Haswell i5, or even i7, when the games aren't likely to be optimized for non-hUMA systems?
Because I said it's the same CPU architecture. When you optimize a program to work on Ivy Bridge, whether you have an i3 or an i7, you'll see a noticeable performance gain on both, but an AMD CPU with the same instruction sets likely won't get that performance bonus because it's architecturally different (see the compiler-flag sketch after this post).
Besides, your first argument was that games would run better on (vanilla) AMD systems because PS4 and Xbox are based on AMD hardware, but that was before hUMA entered the picture. Care to elaborate?
Again, hUMA is not going to make that immense of an impact. On an APU it would give a noticeable performance improvement, but you're acting like games for the PS4 or Xbox will lose all performance gains on an AMD system solely because of hUMA. That's like saying a sprinter gets nearly all his speed from his shoes. Shoes make a difference, but it's the legs that do all the work.
And it seems you kinda miss the point with hUMA.
I know the point of hUMA, but I don't think it's going to have as dramatic an impact as you think it will on higher-end systems. At best, it fixes latency problems - which are a big deal, but there's probably a point beyond which hUMA can't improve performance.
Also, hUMA would still be relevant on systems with a discrete GPU. The gfx card can still keep its VRAM for graphics processing, but it'll be faster for CPU-GPU processes if the GPU can just read from and write to system RAM, which means no more wasted cycles copying values back and forth between system RAM and VRAM.
Agreed. But I don't see that happening any time soon, since such a scenario would be way too restrictive or too variable. hUMA works nicely on an APU because the CPU and GPU are fused together and you don't get another option. Suppose there was a time when a CPU, northbridge, and discrete GPU were all hUMA compatible - it wouldn't surprise me if the next generation of any one of those parts broke compatibility.
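As a rough, hypothetical illustration of the architecture-tuning point made earlier in this post (the file name and loop are made up; the -march=ivybridge and -march=bdver2 targets are real GCC options for the two microarchitectures being compared):

[code]
/* hot.c - hypothetical FP-heavy loop; the interesting part is the build flags:
 *   gcc -O3 -march=ivybridge hot.c -o hot_ivb   (tuned/scheduled for Ivy Bridge)
 *   gcc -O3 -march=bdver2    hot.c -o hot_pd    (tuned/scheduled for Piledriver)
 * Each binary is vectorised and scheduled for its own core; running one build
 * on the other vendor's chip loses most of that tuning even though the
 * instruction sets largely overlap. */
#include <stdio.h>

int main(void) {
    float a[4096], sum = 0.0f;
    for (int i = 0; i < 4096; i++)
        a[i] = (float)i * 0.5f;
    for (int r = 0; r < 100000; r++)        /* loop the compiler can vectorise */
        for (int i = 0; i < 4096; i++)
            sum += a[i] * a[i];
    printf("%f\n", sum);                    /* keep the result live */
    return 0;
}
[/code]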
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
It doesn't max out in any live tasks such as gaming.
I dunno what games you're playing, but even a first-gen i7 (my CPU) at 4GHz gets maxed out in some games. You probably think your CPU is not a bottleneck because you see it at 12 or 25% usage spread across all the cores. That's 1-2 threads maxed out with their time slices distributed among all the cores; you're still limited by the maximum that 1 or 2 cores can pull off. Do whatever works for you, but the single-threaded performance of AMD CPUs is too much of a handicap for my uses. And it's just too weak in anything that requires high FPU usage; take a look at the mighty FX 8350 in this FPU benchmark compared to my CPU at the same frequency with screwed-up RAM timings (no idea why, don't care, upgrading to Haswell): [spoiler]http://i.imgur.com/qkDH5t7.png[/spoiler] Wondering how it did in all the other ones? Abysmal.
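A minimal sketch of the effect described above - a couple of pegged threads showing up as low overall usage on a many-core CPU. It assumes a POSIX system with pthreads (build with gcc -pthread); the thread count and 30-second window are arbitrary.

[code]
#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

/* Each spinner pegs exactly one core at 100%. */
static void *spin(void *arg) {
    volatile unsigned long x = 0;
    (void)arg;
    for (;;)
        x++;
}

int main(void) {
    pthread_t t[2];
    for (int i = 0; i < 2; i++)
        pthread_create(&t[i], NULL, spin, NULL);

    /* Watch top / Task Manager: the per-core view shows two cores maxed out,
     * while total usage reads only ~25% on an 8-thread CPU - yet the program
     * is still completely CPU-bound. */
    puts("Two threads spinning for 30 seconds...");
    sleep(30);
    return 0;
}
[/code]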
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
I dunno what games you're playing, but even a first-gen i7 (my CPU) at 4GHz gets maxed out in some games. You probably think your CPU is not a bottleneck because you see it at 12 or 25% usage spread across all the cores. That's 1-2 threads maxed out with their time slices distributed among all the cores; you're still limited by the maximum that 1 or 2 cores can pull off.
That's a possibility, but I also have an HD5750, which could be the real bottleneck. As long as I get 45-60FPS at near-full detail, I don't really care if something is pushed to its limits; any more than that on my single 1080p 60Hz screen is pointless to achieve. I'm sure if I got a 2nd 5750, or something to replace it with, my current CPU might become the real bottleneck. As of right now my computer is almost head to head with a PS3 in gaming performance, with equal or better visual detail. The FX 6300 may have 2 fewer cores than the APUs in the PS4 and Xbox, but each core is considerably more powerful. Since they're based on the same architecture, even if I only benefit from 75% of the optimizations, an overclocked 6300 ought to perform very similarly, albeit with considerably higher power consumption.
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
Oh yes, the 5750 would be the bottleneck before the CPU in any new/intensive game. My entire point was that in some things your CPU is going to be a bottleneck, and if you upgrade to a Bullpile CPU you're going to have the same problem. You really shouldn't compare to consoles, for various reasons. As the person before said, you're not going to gain any advantage over an Intel CPU, at all, and it's not just because of the hUMA that he pointed out. I can say with certainty Intel CPUs will still have the obvious lead in any gaming until it becomes a thread race. So unless the new games are 8-threaded when ported to PC, Intel CPUs will still win by a landslide, and then it'll only be a matter of time before Intel just makes octo-cores the standard to match AMD. Throw in the HT and Intel wins harder. Anyway, you got my advice already (to avoid Bullpile like it's a disease); I don't need any convincing or justification of why you're choosing a Bullpile CPU. Just remember that we told you it won't hold any advantage in future games due to its architecture.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
You really shouldn't compare to consoles, for various reasons. As the person before said, you're not going to gain any advantage over an Intel CPU, at all, and it's not just because of the hUMA that he pointed out.
I compare to consoles in terms of what's considered acceptable gameplay. Aside from hUMA, what makes you think modern AMD processors won't gain an advantage? Tests have proven that BD/PD are potentially as good as an i7, IF the software is optimized for it. Most software isn't, so the architecture is considered to be generally crappy in the PC world. In Intel's terminology, AMD has more of a tick-tick-tick-tock strategy, so I highly doubt that the APUs are going to vary too drastically (architecturally) from their regular x86-64 products; therefore, optimizations for consoles should carry over nicely. Not perfectly, but enough to make a noticeable difference for AMD users. I'm not saying going for Intel will be a bad choice; what I'm saying is that, due to the console optimizations, AMD users could probably pay considerably less than an Intel user and get a near-identical experience. Obviously you need to consider more than just gaming on a PC, but since Linux is very BD/PD friendly, I don't have much to worry about.
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
I compare to consoles in terms of what's considered acceptable gameplay. Aside from hUMA, what makes you think modern AMD processors won't gain an advantage? Tests have proven that BD/PD are potentially as good as an i7
I stopped reading there. I tried to help and you don't want to listen, that's fine. If you actually need some answers you can Google from here, I don't have the patience for this.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
I stopped reading there. I tried to help and you don't want to listen, that's fine. If you actually need some answers you can Google from here, I don't have the patience for this.
You tried to help me by simply telling me I'm wrong and to go for a CPU I don't need. You have yet to prove why or how optimizations won't work when they make all the difference. IIRC, you had this same problem in another topic on these forums - a topic I wasn't as involved in.

Look at Resident Evil 4 for the GameCube, for example: abysmal hardware, but it runs a game that looks decent even by today's standards. A game like that on a console that crappy can only be achieved through really system-specific development. PC games demand significantly higher specs because they need to be generalized across all hardware, but if the next-gen consoles are roughly x86 based, devs won't have to work so hard on generalizing their code since it's already nearly halfway there. Since the new consoles have similar pipelines to AMD's current models, AMD users will therefore get many of the optimizations the consoles get, on the CPU side anyway. Not sure how that's such a hard concept to grasp. It's obviously a lot more complicated than that, but I didn't say it'd be a seamless transition.