AMD A10 6800K review
We review the 145 EUR AMD A10-6800K APU, the processor you guys know under the codename Richland. Based on the Piledriver architecture, this processor/graphics hybrid called an APU remains hard to beat in terms of features, performance and, well, all the goodness you can expect from a great APU. Value and fun are what the platform offers. Our conclusion stands: an A10-6800K with an A75 or A85X based motherboard is just fine for normal daily usage; it's great for HTPCs and even a game or two, albeit at lower resolutions and quality levels.
blkspade
Senior Member
Posts: 613
Posted on: 06/07/2013 09:02 PM
I was about to make a post about everything wrong with your post. Then I looked at your name. Seeing as you're likely the same chojin that posts on ExtremeTech, I'll choose not to feed you.
The Last of Us looks dated to you? Low textures?
Expensive, power-hungry PCs with three 200 W GPUs in SLI can't run better 3D engines than the one The Last of Us runs on the Cell.
That is reality.
One can be blind and go around claiming that, thanks to antialiasing, their expensive multi-SLI PC is more powerful and the games look better simply because the hardware is expensive and new... but that is not the case. It's just not true.
Games running on those expensive configurations are far from optimized in any possible way; the 3D engines are outdated.
Using huge textures with little or no compression doesn't automagically mean that everything looks more photorealistic, new and not dated.
If you seriously believe that Sony didn't turn a profit on the PS3 for seven years due to Cell R&D costs, then you are surely naive, or you are writing nonsense following an agenda.
So do you actually believe the claims Warner Bros made to the press about the Harry Potter franchise?
http://www.cinemablend.com/new/Leaked-Report-Claims-Warner-Bros-Lost-Money-On-2007-Harry-Potter-Film-19433.html
Leaked Report Claims Warner Bros. Lost Money On 2007 Harry Potter Film
Author: Eric Eisenberg | published: 2010-07-06 22:17:15
Seriously? Yeah, sure... WB can claim to have lost money, but anyone trusting such an obvious lie must be really naive, blind and living under a rock... indeed.
Sony lost money on the PS3 about as much as WB lost money on Harry Potter.
Then what is all this babbling about Sony or IBM needing an x86/x64 license for the desktop market?
If they had ever tried what they in fact never did, competing against Intel and AMD by pushing Cell into the desktop, server and notebook markets, why would they have needed an x86/x64 license?
They could have just built their own UNIX OS for their own hardware, like Apple did with OS X. IBM has plenty of UNIX experts, so delivering a new OS on par with OS X to attack Microsoft could have been done without that much effort; they just needed to spend the money to pay designers and coders and to sign agreements with hardware manufacturers.
They had, and still would have, the money to do it, but when the managers are cowards unwilling to take the risk, they really can't grow the group or win back lost market segments.
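The texture-compression point above is easy to put numbers on. Here is a quick back-of-envelope sketch of a texture's VRAM footprint with and without block compression; the 4096x4096 size and the BC1/DXT1 rate (a 4x4 pixel block stored in 8 bytes, i.e. 0.5 bytes per pixel) are illustrative assumptions, not figures from the thread.

```python
# Back-of-envelope VRAM cost of a texture, uncompressed vs block-compressed.
# Sizes and formats here are illustrative assumptions for the discussion above.

def texture_bytes(width, height, bytes_per_pixel, mipmaps=True):
    """Approximate footprint; a full mip chain adds roughly one third."""
    base = width * height * bytes_per_pixel
    return int(base * 4 / 3) if mipmaps else int(base)

# 4096x4096 texture: uncompressed RGBA8 costs 4 bytes/pixel, while
# BC1/DXT1 block compression stores a 4x4 block in 8 bytes (0.5 B/px).
uncompressed = texture_bytes(4096, 4096, 4)    # ~85.3 MiB with mips
bc1 = texture_bytes(4096, 4096, 0.5)           # ~10.7 MiB with mips

print(f"RGBA8: {uncompressed / 2**20:.1f} MiB, BC1: {bc1 / 2**20:.1f} MiB")
```

An 8:1 size difference like this is why engines ship compressed textures; the extra memory spent on uncompressed assets buys very little visible quality, which is the poster's point.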
blkspade
Senior Member
Posts: 613
Posted on: 06/07/2013 09:17 PM
Hell with it. Because x86 is the entire desktop market. It would take way more than a theoretically powerful chip and a UNIX kernel to unseat the guys at the top of this space. Apple is barely successful there compared to the entirety of PC OEMs and Microsoft, and it abandoned Xserve. The lack of third-party software and hardware support, and likely slow adoption, would leave them DOA. Their area will probably forever be specialized systems.
(quoting chojin's post above)
You pretty much just answered your own statement. Newer hardware will easily outdo the Cell and Xenon, no issue. The Cell has what it's good at: vector computing. But at the general computing the general public does, it's behind. For visuals, look at the difference between BF3 on the PS3 versus the PC. That's a newer, optimized engine; even my low/mid-range 5800K runs miles around the PS3, CPU- and GPU-wise.
Also, there's a reason Apple switched from PPC to x86/x64: PPC is harder to code for, uses more power, has high costs for stable PPC core yields, and is less efficient for general computing than x86/x64. It has its niche in supercomputers, though with AMD's server designs I wouldn't be surprised to see more APU-based servers. If IBM ever wanted to make an x86/x64 platform, they would have to get licenses from Intel, since Intel owns the rights to it.
You make it out to be one giant conspiracy, but it's not. It's simple aging and hardware limitations. That's just basic computing evolution.
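The vector-versus-general-purpose split mentioned above can be sketched in two toy functions; the function names and data are made up for illustration, not taken from the thread.

```python
# Rough illustration of the workload split discussed above: "vector computing"
# streams one arithmetic operation across many elements with no branching,
# which is what Cell's SPE-style cores excelled at, while general-purpose
# code is dominated by data-dependent branches and irregular control flow.

def saxpy(a, xs, ys):
    """Streaming, branch-free math: a * x + y over whole arrays."""
    return [a * x + y for x, y in zip(xs, ys)]

def count_matches(items, predicate):
    """Branchy, data-dependent control flow: territory of out-of-order x86 cores."""
    return sum(1 for item in items if predicate(item))

print(saxpy(2.0, [1.0, 2.0], [10.0, 20.0]))  # [12.0, 24.0]
```

The first pattern maps cleanly onto wide SIMD units; the second stresses branch predictors and caches, which is where a conventional CPU pulls ahead.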