AMD A10-6800K benchmarked and tested


https://forums.guru3d.com/data/avatars/m/238/238354.jpg
I think Kaveri (coming later this year) will be the real deal: a Steamroller CPU with GCN cores.
https://forums.guru3d.com/data/avatars/m/128/128096.jpg
Here are two graphs that show AMD's problem: peak power draw (http://techreport.com/r.x/core-i7-4770k/power-peak.png) and the energy used to complete a workload (http://techreport.com/r.x/core-i7-4770k/power-task-energy.png).
The 4770K requires the least power to complete the encoding task even though its peak power draw is slightly higher than the 3770K's. Credit for that win should go to the AVX2 and FMA extensions in the Haswell core, which are supported in the version of the x264 encoder we're using. They help the 4770K finish the encoding task sooner. If you want a single set of numbers to summarize AMD's struggles of late, look no further than the chart above. Even though the FX-8350 also supports FMA, it requires more than twice the energy to complete the same task as the 4770K. The FX processor's absolute performance is lower and its peak power draw is substantially higher. Not a recipe for success. http://techreport.com/review/24879/intel-core-i7-4770k-and-4950hq-haswell-processors-reviewed/7
This means that not only does the Intel CPU run faster and draw less peak power, but because it finishes the workload sooner it uses less total energy than peak-power graphs alone would suggest. Not to mention that the new Intel Iris Pro (5200) runs circles around AMD's top APU, GPU portion included, while using far less energy. http://techreport.com/r.x/core-i7-4770k/igp-g2-fps.png http://techreport.com/r.x/core-i7-4770k/igp-metro-fps.png
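The peak-power vs. task-energy distinction above can be sketched in a few lines of Python. The numbers below are hypothetical, not taken from the review; the point is only that a chip drawing more power can still use less energy if it finishes sooner:

```python
def task_energy_wh(avg_power_watts, seconds):
    """Energy consumed over a workload, in watt-hours (power x time / 3600)."""
    return avg_power_watts * seconds / 3600.0

# Hypothetical chips: higher draw but half the runtime wins on total energy.
fast_chip = task_energy_wh(avg_power_watts=120, seconds=300)  # 10.0 Wh
slow_chip = task_energy_wh(avg_power_watts=100, seconds=600)  # ~16.7 Wh
assert fast_chip < slow_chip
```

This is why task-energy charts can rank chips differently from peak-power charts.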
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
Generally speaking, it's flexible and can be changed easily, since it was designed to be modular, following AMD's M-SPACE design principle. The PS3 had one PPE and eight SPEs (one reserved for the OS). The PPE was really just a very smart controller; the SPEs did all the heavy lifting. They were not "cores" per se but specialized processing units, just like the "cores" in your graphics card, which aren't really cores either, yet are sometimes marketed as such.
I don't know anything about M-SPACE (unless it's the shared memory space; what are the odds that's what M-SPACE stands for? :wanker:), but everything I've learned about Jaguar points to it being significantly weaker than even ancient CPUs. As for the PS3 part... that's a load of crap, and that's my problem: by that terminology, an ALU or pretty much any other core component is a "specialized processing unit" that can be "sometimes marketed" as a core. The SPEs were nothing more than off-core mini components. And the PS3 might as well have had 5 in the best case and 0 in 99% of cases, because that was the reality: 1 disabled so defective chips could still be sold, 1 reserved for "security", and 1 for outright spying on you, sending logs of everything it can log to Sony's servers every time you're connected online, whether or not you log in. That's why PS3s with custom firmware would get banned even without logging in. Sony abused the ignorance of the general public and perpetuated the myth that the PS3 has 8 cores, while the 5 available vector units were almost never used. I can count on one hand the games I know for a fact used all 5, and they're all exclusives. It would sound awfully bad if the general public knew the 360 used the same core type but had 3 of them and 6 threads, versus 1 core and 2 threads. The PS3 had a third of the cores of the 360 and a significantly weaker GPU, yet almost every console owner believes the PS3 has more powerful hardware that was never fully utilized. :puke2: I know you know this stuff; I just felt like writing it out. The killer part is that those horribly weak consoles now seem relatively reasonable (for their time) compared to what's coming out. I have a toaster with more processing power than the upcoming consoles.
data/avatar/default/avatar12.webp
You are the one writing false info on the PS3 Cell. The Cell SPE units are not only fully functional but full DSP units for very fast vector processing. The Cell architecture was derived from IBM's high-end, expensive Power CPUs but used a PowerPC core because it was cheaper to include. Sony and IBM spent billions of dollars and years of R&D on Cell. When Cell was released, Intel didn't have anything able to compete with it; it was far behind. And AMD was already going to lose big with Intel's Centrino-derived CPUs on the rise, after the Tejas/Prescott mess and billions lost on fake projects in India, with some managers inside Intel stealing a lot of money and nearly bankrupting the company; but the Centrino R&D team in Israel not only saved the corporation, it gave Intel such a boost that AMD still has no chance of even remotely achieving the same results. Still, the Cell was far too advanced when it hit the market. Sony and IBM managers were idiots for not marketing it properly, and even inside the PS3 the Cell is used at less than 70% of its true performance. Anyone can see that the upcoming "The Last Of Us" PS3 game, thanks to Cell and proper coding, achieves graphics better than anything seen on expensive PCs with multiple GPUs in SLI, and that on old hardware with an old GPU. Cell lets programmers offload a lot of complex math onto the SPEs and get massive speed gains, much more than even the latest Intel AVX2 vector units can deliver. With Cell, IBM once again proved that if they had only wanted to, they could have smashed Intel and bankrupted AMD in the blink of an eye. IBM's expensive Power CPUs are extremely advanced; IBM has the R&D resources to release better products than Intel. The fact is that IBM's managers are narrow-minded, and they lost a huge opportunity with Cell to invade the desktop and server markets for real.
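The vector-offload argument being fought over here (both the SPE claim and the AVX2/FMA point earlier in the thread) can be loosely illustrated with a toy cycle-count model in Python. This is not a real timing model; the 4-wide width and one-FMA-per-cycle throughput are simplifying assumptions:

```python
import math

# Toy model (illustrative only): cycles to multiply-accumulate n values
# on a scalar unit vs. a 4-wide SIMD unit with fused multiply-add (FMA).
def cycles_scalar(n):
    # one multiply plus one add per element
    return 2 * n

def cycles_simd_fma(n, width=4):
    # one fused multiply-add retires `width` elements per cycle
    return math.ceil(n / width)

n = 1024
print(cycles_scalar(n))    # 2048
print(cycles_simd_fma(n))  # 256
```

Under these assumptions a 4-wide FMA unit needs 8x fewer cycles, which is why both the SPEs and AVX2 matter so much for math-heavy workloads like encoding.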
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
What did I write that was false?
https://forums.guru3d.com/data/avatars/m/186/186805.jpg
http://www.escapistmagazine.com/news/view/121896-The-Last-of-Us-Squeezes-Every-Last-Drop-of-Power-From-PS3 Neo Cyrus is right in what he said. Cell was very powerful, but it could never run on the desktop as it's not an x86/x64 CPU; it's an IBM PowerPC design, and IBM would have had to pay for an x86 license. It was very powerful, but to say that games done on the PS3 could not be done better and more efficiently on a PC is just ludicrous. Even mid-range PCs from two years ago were leaps and bounds above the Cell in the PS3. Cell was a bombshell for Sony and everyone else involved; it cost them far too much money, and it's taken seven years for a first-party studio to get 100% out of it. I don't know about you, but to me that just shows it was built wrong in the first place: it was far too complex for most devs. Sony also shot themselves in the foot with the documentation meant to help devs understand the Cell, sending it out in Japanese to EVERY dev, even those in the UK and USA. Also, I have played The Last Of Us demo on PS3 on a VERY good 50" plasma TV, and to be honest it looks pretty dated: low-resolution textures, basically zero antialiasing, and a pretty poor draw distance. The AI also gets confused from time to time. It's a stellar game in terms of acting, script, and gameplay, but technically it looks pretty washed out. Still pretty decent for seven-year-old tech.
The part about the toaster was a little white lie 😉
data/avatar/default/avatar23.webp
The Last Of Us looks dated to you? Low textures? Expensive, power-hungry PCs with three 200-watt GPUs in SLI can't run better 3D engines than the one The Last Of Us runs on the Cell. That is reality. One can be blind and go around claiming that, thanks to antialiasing, expensive multi-SLI PCs are more powerful and their games look better because the hardware is expensive and new, but that is not the case. It's just not true. Games running on those expensive configurations are far from optimized in any way; the 3D engines are outdated. Using huge textures with little or no compression doesn't automagically make everything look more photorealistic and less dated. If you seriously believe that Sony didn't turn a profit for 7 years on the PS3 due to Cell R&D costs, then you are either naive or writing nonsense to push an agenda. Do you also believe the claims Warner Bros told the press about the Harry Potter franchise? http://www.cinemablend.com/new/Leaked-Report-Claims-Warner-Bros-Lost-Money-On-2007-Harry-Potter-Film-19433.html ("Leaked Report Claims Warner Bros. Lost Money On 2007 Harry Potter Film", Eric Eisenberg, 2010-07-06) Seriously? Sure, WB can claim to have lost money, but anyone trusting such an obvious lie must be really naive, blind, and living under a rock. Sony lost money on the PS3 about as much as WB lost money on Harry Potter. And what is all this babbling about Sony or IBM needing an x86/x64 license for the desktop market? If they had ever tried to compete against Intel and AMD by invading the desktop, server, and notebook markets with Cell, why would they have needed an x86/x64 license? They could have just built their own UNIX OS for their own hardware, just like Apple's OS X.
IBM also has plenty of UNIX experts, so delivering a new OS on par with OS X to attack Microsoft could have been done without much effort; they just needed to spend the money to pay designers and coders and sign agreements with hardware manufacturers. They had, and still have, the money to do it, but when the managers are cowards unwilling to take the risk, they can't grow the group or win back lost market segments.
data/avatar/default/avatar04.webp
Or we need new battery technology, something better than li-ion or li-polymer: higher capacity in a small footprint with high current delivery. Somehow I haven't seen real battery advancement since the first li-polymer cells hit the market. I'm not talking specifically about this processor.
https://forums.guru3d.com/data/avatars/m/156/156133.jpg
Moderator
You pretty much just answered your own statement. Newer hardware will easily outdo the Cell and Xenon, no issue. The Cell has what it's good at: vector computing. But for the general-purpose computing the public actually does, it's behind. For visuals, look at the difference between BF3 on the PS3 versus the PC. That's a newer, optimized engine, and even my low/mid-range 5800K runs miles around the PS3, CPU- and GPU-wise. There's also a reason Apple switched from PPC to x86/x64: PPC is harder to code for, uses more power, has higher costs for stable core yields, and is less efficient for general computing than x86/x64. It has its niche in supercomputers, but with AMD's server designs I wouldn't be surprised to see more APU-based servers. If IBM ever wanted to make an x86/x64 platform, they would have to get licenses from Intel, who owns the rights. You make it out to be one giant conspiracy, but it's not. It's simply aging and hardware limitations; that's basic computing evolution.
data/avatar/default/avatar05.webp
I was about to write a post about everything wrong with yours. Then I looked at your name. Seeing as you're likely the same chojin that posts on ExtremeTech, I'll choose not to feed you.
data/avatar/default/avatar10.webp
Hell with it. x86 is the entire desktop market. It would take far more than a theoretically powerful chip and a UNIX kernel to unseat the guys at the top of this space. Apple is barely successful there compared to the entirety of PC OEMs and Microsoft, and it abandoned Xserve. The lack of third-party software and hardware support, plus likely slow adoption, would leave them DOA. Their area will probably forever be specialized systems.