AMD FX-4350 and FX-6350 Piledriver CPUs

Are these only overclocked versions of the FX-4300 and FX-6300? Nothing else?
schmidtbag has an 890FX motherboard, it doesn't support any AMD FX series CPU... deltatux
You tried to help me by simply telling me I'm wrong and to go for a CPU that's unnecessary. You have yet to prove why or how optimizations won't work when they make all the difference. IIRC, you had this same problem in another topic on these forums - a topic I wasn't as involved in.
I don't have to prove anything, you're defensive and don't want to listen, that's your problem. Prove optimizations won't work? LOL, if you don't realize what's wrong with that sentence right there you really should stick with consoles. Not to mention you're slinging dirt at the wrong person. Enjoy your Piledriver CPU, you deserve it.
schmidtbag has an 890FX motherboard, it doesn't support any AMD FX series CPU... deltatux
I know.
I know.
I wasn't noting it for you, I was referring to schmidtbag directly. Should have phrased my sentence better lol. I think it'll be cheaper if schmidtbag upgrades to a Thuban. Even if he buys it used, it'll still be a worthwhile upgrade (depending on how he uses his rig). Else, just overclock the darn thing and wait until he can rebuild it. EDIT: Something like this: http://www.ebay.com/itm/USED-AMD-Phenom-II-X6-1090T-3-2-GHz-HDT90ZFBK6DGR-WITH-SCYTHE-KATANA-3-COOLER-/330916582148?pt=CPUs&hash=item4d0c2a5304 deltatux
Because I said it's the same CPU architecture. When you optimize a program to work on Ivy Bridge, whether you have an i3 or an i7, you'll see a noticeable performance gain on both platforms, but an AMD CPU with the same instruction sets likely won't get that performance bonus because it's architecturally different.
And yet, unless you're running an application that can truly utilize a multi-core CPU with parallelism, an AMD CPU doesn't really stand out much compared to an Intel one. Moreover, you strictly compared the 6300 to the upcoming Haswell i5 & i7. If you have a source with game performance comparisons between the 6300 and current i5s & i7s, I'd like to read it.
Again, hUMA is not going to make that immense an impact. On an APU it would give a noticeable performance improvement, but you're acting like games for the PS4 or Xbox will lose all performance gains on an AMD system solely because of hUMA. That's like saying a sprinter gets nearly all his speed through his shoes. Shoes make a difference, but it's the legs that do all the work.
And hUMA isn't exactly the shoes. It's an optimization technique/standard, i.e. how to work out the legs so you can run efficiently. And again, I'm not 'acting like games for PS4 or Xbox will lose all performance gains on an AMD system solely because of hUMA'; it's more 'if ported games can't or don't utilize hUMA, why would they run faster on an AMD CPU than on an Intel CPU'. The point of contention here is your statement regarding the performance advantage of AMD CPUs against Intel's in games ported from next-gen consoles. Feel free to keep nitpicking about 'unnoticeable performance difference with/without hUMA', but that's not something I wanted to discuss in the first place.
I know the point of hUMA, but I don't think it's going to have as dramatic an impact as you think it will on higher-end systems. At best, it fixes latency problems - which are a big deal, but there's probably a point where hUMA can't improve performance.
Many articles have shown that AMD's advantage in core count and parallelism over current Intel CPUs doesn't really bring improvements outside specific applications that use parallelism extensively. My point is, unless the game devs utilize hUMA extensively on the consoles, the ported games won't differ much in performance from the console version. And since the cat hasn't even been let out of the bag yet, neither of us can know how much impact hUMA will make (if it's really going to be implemented on said consoles, mind you). That's exactly why I'm not discussing it, and am instead focusing on your statement about AMD CPUs' advantage in ported games.
Agreed. But I don't see that happening any time soon, since such a scenario would be way too restrictive or variable. hUMA works nicely on an APU because the CPU and GPU are fused together and you don't get another option.
APUs can be paired with a discrete GPU; most laptops with A8 or A6 APUs are usually configured this way, like Samsung's 5 series IIRC. hUMA will work nicely on that setup as well if it's implemented properly.
Suppose there was a time when there was a CPU, north bridge, and discrete GPU that were all hUMA-compatible. It wouldn't surprise me if the next generation of any one of those parts broke compatibility.
Break compatibility as in what? Hardware-wise, Intel has changed sockets more often than Taylor Swift breaks up, and AMD has also changed sockets for their CPUs, though (way) less often; not many people seem to mind that. We also had ISA make way for PCI, AGP for PCIe, and so on. Software-wise, apps are usually hardware-agnostic unless you compile them with specific optimizations and/or extensions for a CPU generation, in which case they usually either run unoptimized or fail to run. All I'm saying is, hUMA will 'break compatibility' about as much as AMD or Intel adding specific instructions to their CPUs. It's also no different from using or not using AHCI for your HDD/SSD. Apps running on a hUMA platform that aren't coded specifically for it would simply not use the hUMA features; they wouldn't stop working altogether.
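To make that "apps just fall back" point concrete, here is a minimal sketch in C (a hypothetical OpenCL helper, assuming an existing context, device and queue; not taken from any post above): where the CPU and GPU share physical memory, the buffer can simply wrap the host pointer (zero-copy, roughly what hUMA-style unified memory enables), while everything else takes the explicit-copy path. The application works either way; it just loses the shortcut.

[code]
/* Minimal sketch (hypothetical helper): choose zero-copy when host and
 * device share memory, otherwise fall back to an explicit copy. */
#include <CL/cl.h>
#include <stddef.h>

cl_mem make_input_buffer(cl_context ctx, cl_device_id dev,
                         cl_command_queue queue, float *host_data,
                         size_t bytes, cl_int *err)
{
    cl_bool unified = CL_FALSE;
    clGetDeviceInfo(dev, CL_DEVICE_HOST_UNIFIED_MEMORY,
                    sizeof(unified), &unified, NULL);

    if (unified) {
        /* CPU and GPU see the same memory: alias the host allocation, no copy. */
        return clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_USE_HOST_PTR,
                              bytes, host_data, err);
    }

    /* Discrete-style path: allocate device memory and copy the data over. */
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_ONLY, bytes, NULL, err);
    if (buf)
        clEnqueueWriteBuffer(queue, buf, CL_TRUE, 0, bytes, host_data,
                             0, NULL, NULL);
    return buf;
}
[/code]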
The only thing that interests me about Haswell is the power consumption. Otherwise, I'm not paying that much for something I have little need for. I'm choosing the 6300 over the 8350 because I don't even need the extra performance of the 8350, and the 6300 seems to be the best-value AMD CPU today in terms of processing performance. It'll also make a nice last upgrade for my AM3 system (I have a beta BIOS that supports AM3+ CPUs).
The FX-6300 is really great, and it doesn't get that hot either. I'm using a Scythe Grand Kama Cross Rev. B and my temps are really low. 🙂 It idles at 26°C and hits 52°C at full load.
Hardware-wise, Intel has changed sockets more often than Taylor Swift breaks up, and AMD has also changed sockets for their CPUs, though (way) less often; not many people seem to mind that.
This claim is 100% false. Since 1999 when AMD released the "Slot A" cartridge processor, they have had a total of 12 different sockets for the desktop market. Intel has only had 7. I don't fault you for the inaccurate statement because I'm sure you're just repeating what you've heard a hundred times over on various forums from people who just make sh*t up. :bang: btw... I counted from 1999 when AMD entered the market for obvious reasons. http://www.hardwaresecrets.com/article/A-Complete-List-of-CPU-Sockets/373/2
This claim is 100% false. Since 1999 when AMD released the "Slot A" cartridge processor, they have had a total of 12 different sockets for the desktop market. Intel has only had 7. I don't fault you for the inaccurate statement because I'm sure you're just repeating what you've heard a hundred times over on various forums from people who just make sh*t up. :bang: btw... I counted from 1999 when AMD entered the market for obvious reasons. http://www.hardwaresecrets.com/article/A-Complete-List-of-CPU-Sockets/373/2
The difference is that AMD CPUs have retained backward compatibility since AM2. EDIT - FM1/FM2 are designed exclusively for APUs, while Socket 940/F are for servers... so they can't really be counted in that list.
I dunno what games you're playing, but even a first-gen i7 (my CPU) at 4 GHz gets maxed out in some games. You probably think your CPU is not a bottleneck because you see it at 12 or 25% usage spread across all the cores. That's 1-2 threads maxed out with their time slices distributed among all the cores; you're still limited by the maximum 1 or 2 cores can pull off. Do whatever works for you, but the single-threaded performance of AMD CPUs is too much of a handicap for my uses. And it's just too weak in anything that requires high FPU usage; take a look at the mighty FX-8350 in this FPU benchmark compared to my CPU at the same frequency with screwed-up RAM timings (no idea why, don't care, upgrading to Haswell): [spoiler]http://i.imgur.com/qkDH5t7.png[/spoiler] Wondering how it did in all the other ones? Abysmal.
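As an aside, the "low overall usage can still hide a maxed-out thread" effect described above is easy to reproduce with a toy program. A minimal sketch in C (hypothetical, assuming pthreads; build with gcc -pthread): one spinning thread reads as roughly 100/N percent total usage on an N-thread CPU once the scheduler migrates it around, yet that single thread still caps throughput.

[code]
/* Minimal sketch: one fully busy thread on an 8-thread CPU shows up as only
 * ~12-13% total CPU usage once the scheduler spreads its time slices across
 * cores, even though that one thread is the bottleneck. Build: gcc -pthread */
#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

static void *spin(void *arg)
{
    (void)arg;
    volatile unsigned long counter = 0;
    for (;;)
        counter++;                          /* saturates exactly one hardware thread */
    return NULL;
}

int main(void)
{
    pthread_t t;
    pthread_create(&t, NULL, spin, NULL);   /* the "maxed out" game thread */
    puts("Watch a CPU monitor: total usage sits near 100/N percent.");
    sleep(60);                              /* give yourself time to look */
    return 0;
}
[/code]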
LOL, you really hate AMD, don't you. It makes me want to puke sometimes that you can't even see that for just "gaming", the difference between an AMD system and an Intel system is maybe 5% in almost all games out there. Well, maybe in 4-year-old games that are single threaded Intel would be better. Pff. We are talking about gaming here, not using the PC to make arts and crafts and edit porn. If I wanted the best single-threaded performance, I'd definitely sidegrade to an Intel. But since I don't do anything involving editing and whatnot, I'll stick with something I can have fun with and end up blowing up for half the cost of a new top-end Intel. I do like AMDs; I've never had a problem with one, and I've done nasty things to every one of them. Sometime I'm sure I'll upgrade to an Intel system... but only if there's a true reason to. So far I see no reason to spend hundreds more. schmidtbag, do whatever you see fit for you. If you go AMD you'll have hundreds left over for upgrading to a better video card.
Remember the last time we were having a discussion and you said that... and posted a biased GPU limited benchmark (where the FX chips were still causing frame loss)... then you got about a thousand benchmarks in response proving otherwise and you had nothing to say? Yeah, I remember. Edit: Here you go -
Awww... get ready to feel the pain: http://images.hardwarecanucks.com/image//skymtl/CPU/FX-6300-FX-4300/FX-6300-FX-4300-63.jpg http://images.hardwarecanucks.com/image//skymtl/CPU/FX-6300-FX-4300/FX-6300-FX-4300-67.jpg http://images.hardwarecanucks.com/image//skymtl/CPU/FX-6300-FX-4300/FX-6300-FX-4300-69.jpg http://images.hardwarecanucks.com/image//skymtl/CPU/FX-6300-FX-4300/FX-6300-FX-4300-71.jpg http://media.bestofmicro.com/O/M/375430/original/Crysis3-CPU.png http://media.bestofmicro.com/F/E/371210/original/Skyrim.png http://media.bestofmicro.com/F/H/371213/original/StarCraft2.png
5% difference and HUNDREDS of dollars saved? Thanks for the lulz man, I needed it.
Remember the last time we were having a discussion and you said that... and posted a biased GPU limited benchmark (where the FX chips were still causing frame loss)... then you got about a thousand benchmarks in response proving otherwise and you had nothing to say? Yeah, I remember. Edit: Here you go - 5% difference and HUNDREDS of dollars saved? Thanks for the lulz man, I needed it.
Don't be a dick. I've said this already, but do you have any idea how old those games are? Even Skyrim is based on a 5-year-old engine. And as for Crysis 3 - since when was Piledriver competing with the i7??
Btw, I should add I'm actually playing Skyrim as we speak, everything maxed with all the HD mods etc., and my fps haven't dropped below 65. And I'm capped at 65 fps using Afterburner..... OK, they did drop to 50 fps, but my CPU cores are only hitting 60% max on 2 cores; the others are not being used.
He throws out accusations and makes up numbers; I gave him some hard facts. Those games are relatively new, and ones which are regularly played. Personally, I never got around to finishing Skyrim; I'm waiting for a new card, since my current one fails too hard. But post some more recent benchmarks if you want; I just pulled those up since they were already posted. Those games are far from the worst-case scenario. What will happen when something is demanding on the FPU? In those cases he'd wish the results were equal to the benchmarks posted; the FX chips only have one weak FPU per module, and that includes the upcoming Steamroller or whatever it's called. That's why they're destroyed in those FPU benchmarks. The best we can hope for is that future PC ports are 8-threaded. We'll see by early next year hopefully, since the consoles should be out by the end of the year.
OK, they did drop to 50 fps, but my CPU cores are only hitting 60% max on 2 cores; the others are not being used.
Are you sure that's actually what's happening and it's not distributed time slices? How many threads does Skyrim use anyway? Something seems wrong, since even being GPU-limited never resulted in dips to 50 for me. I just want it to always be at 60 (or higher) since it's such a long game; if I'm going to spend so much time on one thing, I want it to be the best experience possible (on a reasonable budget).
You tried to help me by simply telling me I'm wrong and to go for a CPU that's unnecessary. You have yet to prove why or how optimizations won't work when they make all the difference. IIRC, you had this same problem in another topic on these forums - a topic I wasn't as involved in. Look at Resident Evil 4 for the GameCube, for example. Abysmal hardware, but it can play a game that looks decent even by today's standards. A game like that on a console that crappy can only be achieved through really system-specific development. PC games demand significantly higher specs because they need to be generalized across all hardware, but if the new-gen consoles are roughly x86-based, devs won't have to work so hard on generalizing their code since it's already nearly halfway there. Since the new consoles have similar pipelines to AMD's current models, AMD users therefore get many of the optimizations the consoles get - on the CPU side, anyway. Not sure how that's such a hard concept to grasp. It's obviously a lot more complicated than that, but I didn't say it'd be a seamless transition.
The GameCube doesn't even run in HD.
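On the broader "optimizations carrying over to the same pipeline" point, targeting a microarchitecture is mostly a build-time or dispatch-time choice. A minimal sketch, assuming GCC; the flags and CPU names below are illustrative examples, not anything the console toolchains are confirmed to use:

[code]
/* Build-time tuning (GCC):
 *   gcc -O2 -march=ivybridge app.c   // schedule/vectorize for Ivy Bridge
 *   gcc -O2 -march=bdver2    app.c   // schedule/vectorize for Piledriver
 * A binary tuned for one pipeline doesn't automatically help the other,
 * which is what the optimization argument above is about. Runtime dispatch: */
#include <stdio.h>

int main(void)
{
    __builtin_cpu_init();                  /* populate GCC's CPU model info */
    if (__builtin_cpu_is("ivybridge"))
        puts("Ivy Bridge detected: take the Intel-tuned code path");
    else if (__builtin_cpu_is("bdver2"))
        puts("Piledriver detected: take the AMD-tuned code path");
    else
        puts("Other CPU: fall back to the generic code path");
    return 0;
}
[/code]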
This claim is 100% false. Since 1999 when AMD released the "Slot A" cartridge processor, they have had a total of 12 different sockets for the desktop market. Intel has only had 7.
Hahah, yeah, "100% false". I also don't fault you for the inaccurate statement because I'm sure you're just repeating what you've heard a hundred times over on various forums from people who just make sh*t up. :bang:
I don't fault you for the inaccurate statement because I'm sure you're just repeating what you've heard a hundred times over on various forums from people who just make sh*t up. :bang:
It's good you didn't fault me, because you'd be wrong. That statement, which you assume was copy-pasted from several beeellion forums, actually came from my experience tinkering with PCs since before 1999. So yeah, you could actually blame me for not specifically starting from 1999, but sorry, you missed that opportunity.
btw... I counted from 1999 when AMD entered the market for obvious reasons.
For obvious reasons? Is there more than just the one reason, that AMD actually started their own socket design in that year? Tell me, tell me; I wanna know! Btw, you probably missed the fact that AMD entered the market before 1980. Or the fact that the K6 was released around... 1996, I think? Or was it 1997? Of course it didn't use their own socket, but I just found that claim about '1999 when AMD entered the market' 100% false.
The difference is that AMD CPUs have retained backward compatibility since AM2. EDIT - FM1/FM2 are designed exclusively for APUs, while Socket 940/F are for servers... so they can't really be counted in that list.
Shush, give the guy a break, eh? =b To be honest, LGA 1156 and 1366 could also be counted as one, because they target different performance segments within the same generation. But yeah, I concur with your statement that AMD sockets (in the last few years) do have more backward compatibility than Intel's. @schmidtbag Personally, I'm not going to insist you're better off going Intel, as I never intended to. I'm just disagreeing with your claim about the 6300's performance compared to Haswell i5 (and i7) performance, when there's no way you could have the hard numbers to compare them with. If you're still going to get AMD for your upgrade, then by all means - get what's most sensible for your current needs and available funds. Just don't carelessly spew out questionable statements and be so defensive about them next time around.
We'll see by early next year hopefully, since the consoles should be out by the end of the year. Are you sure that's actually what's happening and it's not distributed time slices? How many threads does Skyrim use anyway? Something seems wrong, since even being GPU-limited never resulted in dips to 50 for me. I just want it to always be at 60 (or higher) since it's such a long game; if I'm going to spend so much time on one thing, I want it to be the best experience possible (on a reasonable budget).
I don't know, probably. I'd have to open up Process Explorer to see. Anyway, Skyrim has fps drops on the best PCs at certain places on the map; it's just the game. That's what happens to me: 95% of the time I'm running at my 65 fps cap, then if I look at a particular building or something it'll drop to 50. Still not really noticeable though... it doesn't lag or stutter. I'm over the game anyway, mainly due to the poor voice acting. Not only that, but I think Bethesda used about 4 actors to play 100 different characters. lol Gimme Mass Effect any day....
This claim is 100% false. Since 1999 when AMD released the "Slot A" cartridge processor, they have had a total of 12 different sockets for the desktop market. Intel has only had 7. I don't fault you for the inaccurate statement because I'm sure you're just repeating what you've heard a hundred times over on various forums from people who just make sh*t up. :bang: btw... I counted from 1999 when AMD entered the market for obvious reasons. http://www.hardwaresecrets.com/article/A-Complete-List-of-CPU-Sockets/373/2
Lol.. what. You're oversimplifying it, you know you're oversimplifying it, and you're misrepresenting it in an effort to "be right". Counting sockets, yes, AMD has more sockets. That said, AMD's upgrade path for each segment of the market has also been much cleaner. The sockets also tend to be separated by computer type: APUs work in the APU socket, and desktop CPUs work in their own socket. Rarely would you need an upgrade path between them. While each Intel iteration requires a socket upgrade, AMD's haven't. AM2 boards could socket AM2+ chips, and AM2+ boards could socket AM2 chips. Once you had your AM2+ system, AM3 processors would work in it, so you could upgrade to an AM3 motherboard at your leisure. And finally, there is also some interoperability between AM3 and AM3+. AM3+ is expected to support Bulldozer, Piledriver, and Steamroller by the time all is said and done. Flexibility. You knew this. Come on.
Remember the last time we were having a discussion and you said that... and posted a biased GPU limited benchmark (where the FX chips were still causing frame loss)... then you got about a thousand benchmarks in response proving otherwise and you had nothing to say? Yeah, I remember. Edit: Here you go - 5% difference and HUNDREDS of dollars saved? Thanks for the lulz man, I needed it.
zing!