
AMD A10 6800K review 4

Posted by: Hilbert Hagedoorn on: 06/05/2013 07:09 AM [ 33 comment(s) ]

We review the 145 EUR AMD A10 6800K APU processor, which you may know under the codename Richland. Based on the Piledriver architecture, this processor/graphics hybrid called an APU remains hard to beat in terms of features, performance and all the goodness you can expect from a great APU. Value and fun are what the platform offers. Our conclusion stands: an A10 6800K with an A75 or A85X based motherboard is just fine for normal daily usage; it's great for HTPCs and even a game or two, albeit at lower resolutions and quality levels.




Tagged as: review, amd, apu, a10



chojin996
Junior Member



Posts: 19
Posted on: 06/06/2013 06:31 PM
Neo Cyrus wrote:

I don't know anything about M-SPACE (unless it's the shared memory space; what are the odds that's what M-SPACE stands for? :wanker:), but all I've learned about Jaguar points to it being significantly weaker than even ancient CPUs.

As for the PS3 part...

That's a load of crap though, and that's my problem: using that terminology, an ALU or pretty much any other core component is a specialized processing unit that can be "sometimes marketed" as a core. The SPEs were nothing more than off-core mini components. And the PS3 might as well have had 5 in best-case scenarios and 0 in 99% of cases, because that's what the reality was: 1 disabled to allow selling defective chips, 1 reserved for "security", 1 for outright spying on you, sending logs of EVERYTHING it can log to Sony's servers every time you're connected online, whether or not you log in. That's why PS3s with custom firmware would get banned even without logging in.

Sony abused the ignorance and stupidity of the general public and perpetuated the myth that the PS3 has 8 cores, while the 5 available vector units were almost never used. I can count on one hand the number of games I know for a fact used all 5, and they're all exclusives. It would sound awfully bad if the general public knew the 360 used the same core type but had 3 of them and 6 threads, versus 1 core and 2 threads.

The PS3 had a third of the cores of the 360 and a significantly weaker GPU, yet almost every console owner believes the PS3 has more powerful hardware that was never fully utilized. :puke2:

I know you know this stuff, I just felt like writing it out.

The killer part about this is that those horribly weak consoles seem relatively reasonable now (for the time) compared to what's coming out. I have a toaster with more processing power than the upcoming consoles.

You are the one writing nonsense and false info about the PS3's Cell.

The Cell SPE units are not only fully functional, they are full DSP units for very fast vector processing.
The Cell architecture was derived from IBM's expensive high-end POWER CPUs but used a PowerPC core because it was cheaper to include.
Sony and IBM spent billions of dollars and quite a few years on R&D for Cell.
When Cell was released, Intel didn't have anything able to compete with it; it was far behind. And AMD was already going to lose big time with Intel's Centrino-derived CPUs on the rise, after the Tejas/Prescott mess and the billions lost on fake projects in India, where some managers inside Intel stole a lot of money and almost caused Intel to go bankrupt. But the Centrino R&D team Intel had in Israel not only saved the corporation but gave it such a boost that AMD still has no chance of even remotely achieving the same results that Intel now can.
Still, the Cell was far too advanced when it hit the market. Sony and IBM managers were idiots for not being able to market it properly, and even inside the PS3 the Cell is used at less than 70% of its true performance.
Anyone can see that the upcoming PS3 game "The Last Of Us", thanks to Cell and proper coding, achieves high-quality graphics better than anything seen on expensive PCs with multiple GPUs in SLI mode. And that on old hardware with an old GPU. Cell allows programmers to offload a lot of complex math onto the SPEs and get massive speed gains, much more than even the latest Intel AVX2 vector units can manage.
With Cell, IBM once again proved that if they had only wanted to, they could have smashed Intel and caused AMD to go bankrupt in the blink of an eye.
IBM's expensive POWER CPUs are extremely advanced. IBM has the R&D resources to release better products than Intel. The fact is that IBM's managers are narrow-minded, and they lost a huge opportunity with Cell to really invade the desktop and server markets.
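
For context on the vector units both posters invoke (Cell's SPEs and Intel's AVX2): below is a minimal, purely illustrative sketch of the kind of data-parallel math such units accelerate, written as an AVX2 SAXPY kernel in C. The function name and the choice of SAXPY are assumptions made for illustration; this is not code from the Cell SDK, the article, or any poster.

/* Illustrative only: y = a*x + y, processing 8 floats per vector instruction.
   Build with something like: gcc -O2 -mavx2 saxpy.c */
#include <immintrin.h>
#include <stddef.h>

void saxpy_avx2(float a, const float *x, float *y, size_t n)
{
    const __m256 va = _mm256_set1_ps(a);            /* broadcast a into all 8 lanes  */
    size_t i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256 vx = _mm256_loadu_ps(x + i);         /* load 8 floats from x          */
        __m256 vy = _mm256_loadu_ps(y + i);         /* load 8 floats from y          */
        vy = _mm256_add_ps(_mm256_mul_ps(va, vx), vy);  /* a*x + y across 8 lanes    */
        _mm256_storeu_ps(y + i, vy);                /* store the 8 results back      */
    }
    for (; i < n; ++i)                              /* scalar tail for the remainder */
        y[i] = a * x[i] + y[i];
}

On Cell, a kernel like this would instead be partitioned by hand across the SPEs, with explicit DMA transfers into each SPE's local store, which is a large part of why the chip was considered hard to program.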

Neo Cyrus
Senior Member



Posts: 10090
Posted on: 06/06/2013 08:43 PM
What did I write that was false?

CPC_RedDawn
Senior Member



Posts: 9351
Posted on: 06/06/2013 09:04 PM

http://www.escapistmagazine.com/news/view/121896-The-Last-of-Us-Squeezes-Every-Last-Drop-of-Power-From-PS3

Neo Cyrus is right in what he said. Cell was very powerful, but it could never run on the desktop, as it's not an x86/64 CPU, it's an IBM PPC; IBM would have had to pay for an x86 license. It was very powerful, but to say that games done on the PS3 could not be done better and more efficiently on a PC and its hardware is just ludicrous. Even mid-range PCs from two years ago were leaps and bounds above the Cell in the PS3. Cell was a bombshell for Sony and the rest involved; it cost them far too much money, and it's taken seven years for a first-party company to get 100% out of it. I don't know about you, but to me that just shows it was built wrong in the first place. It was far too complex for most devs. Sony also shot themselves in the foot by sending out instructions to devs to help them better understand the Cell, and they sent them out in Japanese to EVERY dev, even those in the UK and USA.

Also, I have played The Last Of Us demo on PS3 on a VERY good 50" Plasma TV and, to be honest, it looks pretty dated. Low textures, basically zero antialiasing, and a pretty poor draw distance. Also, the AI gets pretty confused from time to time. It's a stellar game in terms of acting, script and gameplay, but technically it looks pretty washed out. Still pretty decent for seven-year-old tech.

Neo Cyrus wrote: What did I write that was false?

The part about the toaster was a little white lie ;)

chojin996
Junior Member



Posts: 19
Posted on: 06/06/2013 11:31 PM

The Last Of Us looks dated to you? Low textures?
Expensive, power-hungry PCs with three 200-watt GPUs in SLI can't run a better 3D engine than the one The Last Of Us runs on the Cell.
That is reality.
Then one can be blind and go around claiming that, due to antialiasing, their expensive multi-SLI PCs are more powerful because they are expensive, and that the games look better because the hardware is expensive and new... BUT that is not the case. It's just not true.
Games running on those expensive configurations are far from optimized in any possible way; the 3D engines are outdated.
Using huge textures with less compression, or none at all, doesn't automagically mean that everything looks more photorealistic, new and not dated.

If you seriously believe that Sony didn't make a profit on the PS3 for 7 years due to Cell R&D costs, then you surely are naive, or you are writing nonsense following an agenda.
So do you even believe the claims Warner Bros told the press about the Harry Potter franchise?

http://www.cinemablend.com/new/Leaked-Report-Claims-Warner-Bros-Lost-Money-On-2007-Harry-Potter-Film-19433.html
Leaked Report Claims Warner Bros. Lost Money On 2007 Harry Potter Film
Author: Eric Eisenberg | Published: 2010-07-06 22:17:15

Seriously? Yeah, sure... WB can claim to have lost money BUT anyone trusting such an obvious lie must be really naive, blind and living under a rock... indeed.

Sony lost money on the PS3 as much as WB lost money on Harry Potter.

Then what is with all the babbling about Sony or IBM needing an x86/x64 license for the desktop market?
If they had ever tried to do what they in fact didn't (compete against Intel and AMD by invading the desktop, server and notebook markets with Cell), why would they have needed an x86/x64 license?
They could have just built their own UNIX OS, just like Apple's OS X, for their own hardware. IBM has plenty of UNIX experts, so delivering a new OS on par with OS X to attack Microsoft could have been done without that much effort; they just needed to put up the money to pay designers and coders and to sign agreements with hardware manufacturers.
They had, and still would have, the money to do it, but when the managers are just stupid cowards not willing to take the risk, they really can't grow the group or win back lost market segments.

NAMEk
Senior Member



Posts: 658
Posted on: 06/07/2013 12:47 PM
Or we need new battery technology, something better than Li-ion or Li-polymer: higher capacity in a small area with high current. Somehow I haven't seen any battery advancement since the first Li-polymer hit the market. I'm not talking specifically about this processor.
