AMD A10 6800K benchmarked, tested
Titan29
I think Kaveri (coming later this year) will be the real deal: Steamroller CPU cores with GCN graphics.
Chillin
Here are a couple of graphs that show AMD's problem:
http://techreport.com/r.x/core-i7-4770k/power-peak.png
http://techreport.com/r.x/core-i7-4770k/power-task-energy.png
The second graph shows the amount of energy used to complete a workload.
That means the Intel CPU not only runs faster and draws less peak power, but because it finishes the workload sooner, it also uses less total energy than the usual power graphs suggest.
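To put rough numbers on that point (these are made-up figures for illustration, not values from the linked graphs): task energy is roughly average power times completion time, so a chip that draws more watts but finishes much sooner can still burn less energy overall. A quick Python sketch:

```python
# Hypothetical figures for illustration only, not taken from the linked graphs.
def task_energy_joules(avg_power_watts, completion_time_s):
    """Energy used to finish a fixed workload: power x time."""
    return avg_power_watts * completion_time_s

# Two imaginary chips running the same workload:
slower_lower_power = task_energy_joules(avg_power_watts=100, completion_time_s=60)   # 6000 J
faster_higher_power = task_energy_joules(avg_power_watts=120, completion_time_s=40)  # 4800 J

print(f"Slower, lower-power chip:  {slower_lower_power:.0f} J")
print(f"Faster, higher-power chip: {faster_higher_power:.0f} J")
```

The faster chip draws more power at its peak, yet the job costs it less energy overall, and race-to-idle only widens that gap in practice.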
Not to mention that Intel's new Iris Pro (5200) runs circles around AMD's top APU even on the GPU side, while using far less energy.
http://techreport.com/r.x/core-i7-4770k/igp-g2-fps.png
http://techreport.com/r.x/core-i7-4770k/igp-metro-fps.png
Neo Cyrus
chojin996
Neo Cyrus
What did I write that was false?
CPC_RedDawn
http://www.escapistmagazine.com/news/view/121896-The-Last-of-Us-Squeezes-Every-Last-Drop-of-Power-From-PS3
Neo Cyrus is right in what he said. Cell was very powerful, but it could never run on the desktop because it's not an x86/64 CPU, it's an IBM PPC design; IBM would have had to pay for an x86 license. It was very powerful, but to say that games done on the PS3 could not be done better and more efficiently on a PC and its hardware is just ludicrous. Even mid-range PCs from two years ago were leaps and bounds above the Cell in the PS3. Cell was a bombshell for Sony and the rest involved: it cost them far too much money, and it's taken seven years for a first-party studio to get 100% out of it. I don't know about you, but to me that just shows it was built wrong in the first place. It was far too complex for most devs. Sony also shot themselves in the foot with the documentation meant to help devs understand the Cell: they sent it out in Japanese to EVERY dev, even those in the UK and USA.
Also, I have played The Last of Us demo on PS3 on a VERY good 50" plasma TV, and to be honest it looks pretty dated: low-res textures, basically zero anti-aliasing, and a pretty poor draw distance. The AI also gets pretty confused from time to time. It's a stellar game in terms of acting, script and gameplay, but technically it looks pretty washed out. Still pretty decent for seven-year-old tech.
The part about the toaster was a little white lie 😉
chojin996
http://www.cinemablend.com/new/Leaked-Report-Claims-Warner-Bros-Lost-Money-On-2007-Harry-Potter-Film-19433.html
Leaked Report Claims Warner Bros. Lost Money On 2007 Harry Potter Film
Author: Eric Eisenberg | published: 2010-07-06 22:17:15
Seriously? Yeah, sure... WB can claim to have lost money, but anyone trusting such an obvious lie must be really naive, blind and living under a rock... indeed.
Sony lost money on the PS3 as much as WB lost money on Harry Potter.
Then what is with all the babbling about Sony or IBM needing an x86/x64 license for the desktop market?
If they had ever tried to do what they simply didn't, competing against Intel and AMD by pushing Cell into the desktop, server and notebook markets, why would they have needed an x86/x64 license?
They could have just built their own UNIX OS for their own hardware, just like Apple's OS X. IBM has plenty of UNIX experts, so delivering a new OS on par with OS X to attack Microsoft could have been done without much effort; they just needed to put up the money to pay designers and coders and to sign agreements with hardware manufacturers.
They had, and still would have, the money to do it, but when the managers are just cowards unwilling to take the risk, they can't grow the group or win back lost market segments.
The Last of Us looks dated to you? Low textures?
Expensive, power-hungry PCs with three 200-watt GPUs in SLI can't run a better 3D engine than the one The Last of Us runs on the Cell.
That is reality.
One can be blind and go around claiming that, thanks to anti-aliasing, their expensive multi-SLI PCs are more powerful and the games look better simply because the hardware is expensive and new... but that is not the case. It's just not true.
Games running on those expensive configurations are far from optimized in any possible way, and the 3D engines are outdated.
Using huge textures with little or no compression doesn't automagically mean that everything looks more photorealistic, new and not dated.
If you seriously believe that Sony didn't make a profit on the PS3 for seven years because of Cell R&D costs, then you are either naive or writing nonsense to push an agenda.
So do you even believe the claims Warner Bros made to the press about the Harry Potter franchise?
NAMEk
Or we need new battery technology, something better than Li-ion or Li-polymer: higher capacity in a small package with high current output. Somehow I haven't seen much advancement in batteries since the first Li-polymer cells hit the market. I'm not talking specifically about this processor.
vbetts
Moderator
blkspade
I was about to make a post about everything wrong with your post. Then I looked at your name. Seeing as you're likely the same chojin that posts on ExtremeTech, I'll choose not to feed you.
blkspade
Hell with it. Because x86 is the entire desktop market. It would take way more than a theoretically powerful chip and a UNIX kernel to unseat the guys at the top of this space. Apple is barely successful there compared to the entirety of the PC OEMs and Microsoft, and it abandoned Xserve. The lack of third-party software and hardware support, plus likely slow adoption, would leave them DOA. Their area will probably forever be specialized systems.