Radeon R9 380X with HBM high-bandwidth memory
PrMinisterGR
It's 300W because everybody is stuck on 28nm and AMD apparently won't compromise on compute performance *cough* Maxwell *cough*
fantaskarsef
Not really impressed with the 300W... could run 2 980s with that kind of power, couldn't I? I'm pretty curious what that HBM can do though.
Undying
Meanwhile at Nvidia headquarters, 128-bit cards are ready to be released. 😀
AMD is serious about 4k gaming.
Lane
Battlefieldprin
Well, what are companies making 1200 W and 1500 W power supplies for, then? If the technology and performance are worth it, why not, even if it consumes 300 W? There is, however, the issue of heat, which I myself cannot tolerate.
Goldie
No answer to Maxwell's lightweight power consumption.
Hope it's out soon and that it sells well, though.
shymi
Another day - another rumor...
-Tj-
Who gives a shizzle about power consumption? This 3D memory will lower heat output a lot.
I'm just curious if it will deliver like in that leak.
The GTX 980 is also "power friendly", but in the end it's what, 15-30W less than the GTX 780? Wow... :P
[spoiler]
http://tpucdn.com/reviews/MSI/GTX_980_Gaming/images/power_peak.gif
[/spoiler]
Denial
-Tj-
I personally never cared about lower power consumption. It's just a marketing thing to make people go "wow, it's more efficient", bla bla. Same with the GTX 580 vs. GTX 680.
Or the same thing now with this AMD chip. But it's still a new type of memory, and up to 50% lower power will reduce heat for sure.
Just like the GTX 980 runs cooler now with less power, but obviously nothing as extreme as 3D memory. IMO a 50% difference between GDDR5 and 3D memory is something else.
http://s2.postimg.org/5opekm8cp/image.jpg
http://s2.postimg.org/nssf59615/image.jpg
VultureX
Denial
GhostXL
All rumors, I agree. I'm not paying as much attention to the current card releases as I usually do. The main reason is I've yet to find any game at 1440p, maxed out, that one overclocked 980 can't handle.
I know AMD is trying to push 4K.
I've seen 4K, upscaled 4K, etc. It does not look that much better than 1440p. There is very little difference in clarity. Sure, it's there, but it just does not justify spending the money.
-Tj-
GhostXL
dean469
I don't really understand all the complaints about power usage lately. Sure, if you're running a server farm or something. But private home users?
The average price of a kilowatt-hour in the U.S. is about 12 cents. If you game 4 hours a day, 5 days a week, at 100 watts of extra draw, that's 2 kWh extra a week, about 104 kWh a year, or roughly 12 extra dollars a YEAR.
For home users, I just don't see the big deal.
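The arithmetic above can be sketched as a small Python snippet; the 12-cent rate and the usage figures are the poster's assumptions, not measured values:

```python
def annual_extra_cost(extra_watts, hours_per_day, days_per_week,
                      price_per_kwh=0.12, weeks_per_year=52):
    """Extra yearly electricity cost from a higher-draw GPU.

    extra_watts   -- additional power draw vs. the comparison card
    price_per_kwh -- assumed U.S. average rate in dollars (~$0.12/kWh)
    """
    kwh_per_week = extra_watts / 1000 * hours_per_day * days_per_week
    return kwh_per_week * weeks_per_year * price_per_kwh

# 100 W extra at 4 h/day, 5 days/week -> 2 kWh/week
print(f"${annual_extra_cost(100, 4, 5):.2f} per year")  # prints "$12.48 per year"
```

So even a 100 W gap between cards works out to roughly a dollar a month at typical U.S. rates, which is the poster's point.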
Denial
pbvider
What comes with a high TDP? Oh, that's right. Heat!
Extraordinary
Redemption80
I also don't care about power consumption from a financial point of view, and I agree it does get thrown around as bragging rights by Nvidia users, in the same way AMD fans did back in the Fermi days.
It's heat and noise that concern me.
They could put a really good cooler on it, but that will bump up the price of something that might already be quite expensive.
Hoping it is something great, though. Tech-wise, things have been a touch boring over the last few years.