AMD Hawaii is 30% smaller than Kepler GK110
mohiuddin
Die size, huh? Is that really important?
If they match Titan while being smaller, then that is something.
Redemption80
The fact they keep saying they aren't going for the Ultra Enthusiast market means it's not likely to match Titan.
780 performance for a bit less money would be great though, and with a nice game bundle I would definitely be considering one.
HeavyHemi
“They’re coming in Q4. I can’t reveal a price point, but we’re looking at more traditional enthusiast GPU price points. We’re not targeting a $999 single-GPU solution like our competition, because we believe not a lot of people have that $999. We normally address what we call the ultra-enthusiast segment with a dual-GPU offering like the 7990. So this next-generation line is targeting more of the enthusiast market versus the ultra-enthusiast one.
“It’s also extremely efficient. [Nvidia's Kepler] GK110 is nearly 30% bigger from a die-size point of view. We believe we have the best performance for the die size for the enthusiast GPU.”
Translation: our next top-of-the-line single-GPU card will fall somewhere between the GTX 770 and the GTX 780 in performance, at a competitive price.
Loophole35
No, they should match the 780 in performance.
---TK---
My guess is around 780 performance for a cheaper price, plus 200 free games.
Darkest
Around 780 performance for a good price with a decent bundle sounds like a deal to me.
IcE
We'll see what they actually come up with. The die-size argument doesn't really impress me, given that GK104 has a drastically smaller die and is probably the most efficient architecture I've seen to date.
warlord
Well then these are mid-to-high-range cards, not even enthusiast, if their flagship only reaches the GTX 780. How are they going to compete with next year's Maxwell like that? Disaster, omg AMD! :'(
Chillin
Because 99% of people don't care about the top GPUs? Nearly all discrete GPUs sold are low to mid range.
Here is a Steam Hardware (read: average-enthusiast gamer systems) breakdown of GPUs in use. I marked the top-of-the-line GPUs with red arrows and the cards a single step down with blue. Notice the percentage of the entire market they hold:
http://s17.postimg.org/5x2s3yqzj/gsdg.jpg
I'm with AMD on this choice, even if they are not the highest performing. As long as they bring down power levels and can produce their chips more cheaply, they win. It's the mid-range that matters, not the top end.
HeavyHemi
Chillin
Fine, take Best Buy's best selling list then:
http://www.bestbuy.com/site/Computer-Cards-Components/Video-Graphics-Cards/abcat0507002.c?id=abcat0507002&gf=y
You have to reach spot #30 before you even come across a high-end card; the next one is at spot #53.
kosh_neranek
What a FAIL. Is he really proud of making a chip that is slower in performance but smaller in die size, a year after the competition? Next, who really cares about TressFX (other than those few who still obsess over Lara)? And they are not attacking ultra-enthusiasts? Who does? Does he think Nvidia makes much money off Titans? Surely not. They just have it because they created it a year ago and their only competition still has no means of fighting it, so why not sell it at an insane price? Look at the 3960X/4960X: same story.
What is worse, it seems that now that they have put their APUs into the poor-performing "next-gen" consoles, they believe they can stop pushing forward, since those consoles will hold back 3D graphics for years. And I am afraid they will... C'mon Nvidia, we need you more than ever to come up with something that's gonna kick ass.
Chillin
There are plenty of times I might come across as hard on AMD; this is definitely not one of them. I am very happy with what I hear in this decision. It sounds like they are starting to act more responsibly and focus on what makes their core money, rather than branching out all over the place.
So what if the next generation doesn't beat the 780? As long as it is priced right, uses less power than the 780, and is cheaper to produce (smaller die) than the competition, then AMD has made a successful product.
What AMD needs to do is come up with a working competitor (not the half-assed Enduro) to Nvidia's Optimus switchable graphics; then we might start seeing them in more notebooks/ultrabooks.
The only place this hurts AMD is in the HPC market as they have no viable competitor to the top Tesla systems; however, the HPC market can be targeted with a different architecture if played right.
Texter
GK110 being 30% larger than Hawaii means Hawaii is about 23% smaller than GK110, since 1/1.3 ≈ 0.77.
It appears my now-ancient 430 mm² estimate for a decently sized 28nm AMD GPU was pretty close, if not spot on.
When you consider how nVidia designed GK110, with completely redundant double-precision units scattered all over the die next to the CUDA cores, it's no surprise that anyone minded to combine logic instead of separating it can achieve a significantly smaller die with more or less the same theoretical peak performance. Tessellation takes up a relatively large amount of logic in a GCN SP, which probably explains why the difference is so much less than 40%. In the end performance should be about equal, and sadly for nVidia, so should double-precision performance.
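The "30% larger" vs. "23% smaller" arithmetic above can be sanity-checked in a few lines. The 561 mm² figure is the commonly reported GK110 die size; the Hawaii number here is only what the 30% claim would imply, not a confirmed spec:

```python
# Sanity check: if GK110 is 30% larger than Hawaii,
# how much smaller is Hawaii than GK110?
gk110 = 561.0            # mm^2, commonly reported GK110 die size
hawaii = gk110 / 1.30    # implied Hawaii die size (assumption, not a spec)
smaller_frac = 1 - hawaii / gk110

print(f"Implied Hawaii die: {hawaii:.0f} mm^2")       # ~432 mm^2
print(f"Hawaii is {smaller_frac:.0%} smaller than GK110")  # 23%
```

Note that the implied ~432 mm² lands right next to the 430 mm² estimate mentioned above, and that "X% larger" inverts to 1 − 1/(1+X), not simply X% smaller.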
moab600
AMD had better beat the mighty GK110. I think it might end up faster than the 780, or even a bit faster than Titan. Or not... anyway, I'm waiting for Maxwell...
schmidtbag
What amazes me is that people here are comparing apples to oranges and saying oranges are simply better because they're bigger. We don't know the actual performance of these new GPUs; core for core, clock for clock, and dollar for dollar, they could still end up better than Nvidia's. Chillin is right: AMD doesn't have a compelling reason to compete with something like the Titan, because nobody, including most of the whiners here, will actually pay for it. It seems all people want to see is something literally "off the charts", and those results become meaningless when they offer performance you can't take advantage of until two years later, at which point there will be a GPU at least $200 cheaper and much more power efficient.