

AMD A10 6800K review

Posted by: Hilbert Hagedoorn on: 06/05/2013 07:09 AM [ 33 comment(s) ]

We review the 145 EUR AMD A10 6800K APU, the processor you guys know under the codename Richland. Based on the Piledriver architecture, this processor/graphics hybrid called an APU remains hard to beat in terms of features, performance and, well, all the goodness you can expect from a great APU. Value and fun are what the platform offers. Our conclusion stands: an A10 6800K paired with an A75 or A85X based motherboard is just fine for normal daily usage, great for HTPCs and even a game or two, albeit at lower resolutions and quality levels.

Read article





Tagged as: review, amd, apu, a10


vbetts
Posts: 15139
Posted on: 06/07/2013 07:48 PM
The Last Of Us looks dated to you? Low textures?
Expensive, power-hungry PCs with three 200-watt GPUs in SLI can't run better 3D engines than the one The Last Of Us runs on the Cell.
That is reality.
One can be blind and go around claiming that, thanks to antialiasing, their expensive multi-SLI PCs are more powerful because they are expensive, and that the games look better because the hardware is expensive and new... BUT that is not the case. It's just not true.
Games running on those expensive configurations are far from optimized in any possible way; the 3D engines are outdated.
Using huge textures with little or no compression doesn't automagically mean that everything looks more photorealistic, new and not dated (see the sketch after this quote).

If you seriously believe that Sony didn't turn a profit on the PS3 for seven years because of Cell R&D costs, then you are either naive or writing nonsense to push an agenda.
So do you also believe the claims Warner Bros made to the press about the Harry Potter franchise?

http://www.cinemablend.com/new/Leaked-Report-Claims-Warner-Bros-Lost-Money-On-2007-Harry-Potter-Film-19433.html
"Leaked Report Claims Warner Bros. Lost Money On 2007 Harry Potter Film", Eric Eisenberg, CinemaBlend, 2010-07-06

Seriously? Yeah, sure... WB can claim to have lost money, BUT anyone trusting such an obvious lie must be really naive, blind and living under a rock... indeed.

Sony lost money on the PS3 as much as WB lost money on Harry Potter.

Then what is all the babbling about Sony or IBM needing an x86/x64 license for the desktop market?
If they had tried what they in fact never did, competing against Intel and AMD by pushing Cell into the desktop, server and notebook markets, why would they have needed an x86/x64 license?
They could have just built their own UNIX OS, like Apple's OS X, for their own hardware. IBM has plenty of UNIX experts, so delivering a new OS on par with OS X to attack Microsoft could have been done without much effort; they just needed to spend the money to pay designers and coders and to sign agreements with hardware manufacturers.
They had, and still would have, the money to do it, but when the managers are cowards unwilling to take the risk, they can't expand the group or win back lost market segments.
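A rough illustration of the texture-compression point in the quote above (the numbers are standard block-compression rates I'm supplying for illustration, not anything from the thread): a single 4096x4096 RGBA8 texture is 64 MiB raw, while BC1/BC7 block compression brings the same texture down to 8-16 MiB. A minimal C sketch of the arithmetic:

    #include <stdio.h>

    /* Back-of-the-envelope texture memory footprints, ignoring mipmaps
       (a full mip chain adds roughly 33%). Bytes per texel: uncompressed
       RGBA8 = 4, BC1 = 0.5 (8:1 ratio), BC7 = 1 (4:1 ratio). */
    int main(void) {
        const double texels = 4096.0 * 4096.0;   /* one 4K x 4K texture */
        const double mib = 1024.0 * 1024.0;
        printf("RGBA8 uncompressed: %6.1f MiB\n", texels * 4.0 / mib);
        printf("BC1 (8:1)         : %6.1f MiB\n", texels * 0.5 / mib);
        printf("BC7 (4:1)         : %6.1f MiB\n", texels * 1.0 / mib);
        return 0;
    }

Four such raw textures would already fill the PS3's 256 MB of video memory, which is why console engines lean so hard on compression; bigger uncompressed textures mostly buy memory pressure, not photorealism.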

You pretty much just answered your own statement. Newer hardware will easily outdo the Cell and Xenon, no issue. The Cell has what it's good at: vector computing (see the sketch below). But for the general computing the general public actually does, it's behind. For visuals, look at the difference between BF3 on the PS3 versus the PC. That's a newer, optimized engine, and even my low/mid-range 5800K runs miles around the PS3, CPU- and GPU-wise.
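For readers wondering what "vector computing" means here: the Cell's SPEs are SIMD units that apply one operation to a whole batch of values per instruction, which is great for data-parallel loops and poor for branchy general-purpose code. A minimal C sketch of the same idea on x86 using SSE intrinsics (my own illustration of SIMD in general, assuming an x86/x64 compiler, not anything specific to the Cell's ISA):

    #include <stdio.h>
    #include <xmmintrin.h>  /* SSE intrinsics, available on any x86/x64 compiler */

    /* Scalar SAXPY: one multiply-add per loop iteration. */
    static void saxpy_scalar(float a, const float *x, float *y, int n) {
        for (int i = 0; i < n; i++)
            y[i] = a * x[i] + y[i];
    }

    /* Vector SAXPY: four multiply-adds per iteration via 128-bit SSE
       registers. This is the kind of data-parallel loop SIMD hardware is
       built for; n is assumed to be a multiple of 4 to keep the sketch short. */
    static void saxpy_sse(float a, const float *x, float *y, int n) {
        __m128 va = _mm_set1_ps(a);              /* broadcast a to 4 lanes */
        for (int i = 0; i < n; i += 4) {
            __m128 vx = _mm_loadu_ps(&x[i]);
            __m128 vy = _mm_loadu_ps(&y[i]);
            _mm_storeu_ps(&y[i], _mm_add_ps(_mm_mul_ps(va, vx), vy));
        }
    }

    int main(void) {
        float x[8] = {1, 2, 3, 4, 5, 6, 7, 8}, y[8] = {0};
        saxpy_sse(2.0f, x, y, 8);
        for (int i = 0; i < 8; i++)
            printf("%.0f ", y[i]);                /* prints: 2 4 6 8 10 12 14 16 */
        printf("\n");
        return 0;
    }

Both functions compute the same result; the vector version simply retires four lanes per instruction, which is the one axis on which Cell-style hardware shines, while ordinary branchy desktop code gets no such speedup.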

There's also a reason why Apple switched from PPC to x86/64. PPC was harder to code for, used more power, had high costs for stable PPC core yields, and was less efficient for general computing versus x86/64. It has its niche in supercomputers, but with AMD's server designs I wouldn't be surprised to see more APU-based servers. If IBM ever wanted to make an x86/64 platform, they would have to get licenses from Intel, since Intel owns the rights to it.

You make it out to be one giant conspiracy, but it's not. It's simple aging and hardware limitations. That's just basic computing evolution.

blkspade
Senior Member
Posts: 613
Posted on: 06/07/2013 09:02 PM
I was about to make a post about everything wrong with your post. Then I looked at your name. Seeing as you're likely the same chojin that posts on ExtremeTech, I'll choose not to feed you.


blkspade
Senior Member
Posts: 613
Posted on: 06/07/2013 09:17 PM
Hell with it. Because x86 is the entire desktop market. It would take way more than a theoretically powerful chip and a UNIX kernel to unseat the guys at the top of this space. Apple is barely successful there compared to the entirety of the PC OEMs and Microsoft, and it abandoned Xserve. The lack of third-party software and hardware support, plus likely slow adoption, would leave them DOA. Their area will probably forever be specialized systems.
