Radeon R9 380X with HBM high-bandwidth memory


https://forums.guru3d.com/data/avatars/m/259/259654.jpg
It's 300W because everybody is stuck on 28nm and AMD apparently won't compromise on compute performance *cough* Maxwell *cough*
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
Not really impressed with the 300W... could run 2 980s with that kind of power, couldn't I? I'm pretty curious what that HBM can do though.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
Meanwhile at Nvidia headquarters, 128-bit cards are ready to be released. 😀 AMD is serious about 4K gaming.
data/avatar/default/avatar01.webp
Not really impressed with the 300W... could run 2 980s with that kind of power, couldn't I? I'm pretty curious what that HBM can do though.
No, you can't. A stock 980 has a TDP of 181W (the BIOS max power limit), but most OC models have a boost TDP limit of 230W (the magic number of 163W is just an average calculated by Nvidia). In fact, the 980 has basically the same TDP as the GTX 680 (which is already incredible when you see the performance gain on the same process). That said, most GPUs are designed to run at 300W (which is the limit set); it concerns what the board's power delivery and cooling are rated for, not the power actually used... Based on this one source, it's a bit hard to know whether the GPU really draws that much power or whether the figure is based on something else. (Anyway, if it's still on 28nm with a full DP setup, scalar units etc., I don't expect miracles.)
data/avatar/default/avatar22.webp
Well, what are companies making 1200W and 1500W PSUs for, then? If the technology and the performance are worth it, why not, even if it consumes 300W. However, there is the issue of heat, which I myself cannot tolerate.
https://forums.guru3d.com/data/avatars/m/256/256367.jpg
No answer to Maxwell's lightweight power consumption. I hope it's out soon and that it sells well, though.
data/avatar/default/avatar37.webp
Another day - another rumor...
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
Who gives a shizzle about power consumption? This 3D memory will lower heat output a lot. I'm just curious whether it will deliver like in that leak... The GTX 980 is also "power friendly", but in the end it's what, 15-30W less than the GTX 780? Wow.. :P [spoiler] http://tpucdn.com/reviews/MSI/GTX_980_Gaming/images/power_peak.gif [/spoiler]
I agree for the most part: as long as it's 300W or lower I don't really care, but 3D memory isn't going to drastically reduce heat/power. Also, it's 30W less than the GTX 780 while being as fast as a 780 Ti, and that's without a die shrink. It's impressive, especially since that's under load; it gets even better efficiency when driving miscellaneous tasks, which is an obvious byproduct of their work in mobile computing.
Meanwhile at Nvidia headquarters, 128-bit cards are ready to be released. 😀 AMD is serious about 4K gaming.
I think Nvidia should have put a larger bus on the 980, but a 960 isn't going to run 4K even if it had a 10-billion-bit bus.
https://forums.guru3d.com/data/avatars/m/242/242471.jpg
I personally never cared about lower power consumption... It's just a marketing thing to make people go "wow, it's more efficient", bla bla. Same with the GTX 580 vs the GTX 680... Or the same thing now with this AMD chip. Still, a new type of memory drawing up to 50% less power will lower heat for sure, just like the GTX 980 runs cooler now because it draws less power. But obviously not to the same extreme as 3D memory; IMO a 50% difference between GDDR5 and 3D memory is something else. http://s2.postimg.org/5opekm8cp/image.jpg http://s2.postimg.org/nssf59615/image.jpg
data/avatar/default/avatar34.webp
Meanwhile at Nvidia headquarters, 128-bit cards are ready to be released. 😀 AMD is serious about 4K gaming.
Yes, memory bandwidth really is a downside of nVidia products with the 9 series. I'd like to see a 512-bit 8GB version of their cards.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
It will lower power consumption by up to 50%? And that also means less heat.
It lowers the power consumption of the memory, which isn't a significant source of heat/power on a video card in the first place. If you take a card that hits 88°C with GDDR5 and toss some HBM in, that card isn't going to magically drop to 60°C, or use something like 100W less power. I mean, it's a nice gain, but the way you worded your statement makes it sound like it's going to magically decrease heat across the entire chip or something.
https://forums.guru3d.com/data/avatars/m/93/93080.jpg
All rumors, I agree. I'm not paying as much attention to the current cards being released as I usually do. The main reason is that I've yet to find any game at 1440p, maxed out, that one OC'd 980 can't handle. I know AMD is trying to push 4K. I've seen 4K, upscaled 4K, etc. It does not look that much better than 1440p; very little difference in clarity. Sure, it's there, but it just does not justify spending the money.
https://forums.guru3d.com/data/avatars/m/242/242471.jpg
It lowers the power consumption of the memory, which isn't a significant source of heat/power on a video card in the first place. If you take a card that hits 88°C with GDDR5 and toss some HBM in, that card isn't going to magically drop to 60°C, or use something like 100W less power. I mean, it's a nice gain, but the way you worded your statement makes it sound like it's going to magically decrease heat across the entire chip or something.
Yes it is. Why does the 290X run so hot then? The 512-bit bus is taking its toll. They even had to downclock the memory to 5GHz or it would heat up and consume too much power, and it was still hot as hell unless you put a custom cooler on it.
https://forums.guru3d.com/data/avatars/m/93/93080.jpg
Yes it is. Why does the 290X run so hot? The 512-bit bus is taking its toll; they even had to downclock the memory to 5GHz or it would heat up and consume too much power.
Yeah, and that 512-bit bus does not shine as it should due to the architecture. It's why I'm very happy with the GTX 980.
data/avatar/default/avatar27.webp
I don't really understand all the complaints about power usage lately. Sure, if you're running a server farm or something. But private home users? The average price of a kilowatt-hour in the U.S. is about 12 cents. If you game 4 hours a day, 5 days a week on average, with 100 watts of extra draw, that's 2 kWh extra a week. It's like 13 extra dollars a YEAR. For home users, I just don't see the big deal.
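As a quick check of that arithmetic, here is a minimal Python sketch using only the figures from the post above (12 cents per kWh, 4 hours a day, 5 days a week, 100W of extra draw); all of them are the poster's assumptions, not measured values:
[code]
# Quick check of the electricity-cost arithmetic from the post above.
# Every figure here is the poster's assumption, not a measured value.
price_per_kwh_usd = 0.12      # quoted average US price per kWh
hours_per_week = 4 * 5        # 4 hours a day, 5 days a week
extra_watts = 100             # extra draw of the hungrier card

extra_kwh_per_week = extra_watts * hours_per_week / 1000
extra_cost_per_year = extra_kwh_per_week * 52 * price_per_kwh_usd

print(f"{extra_kwh_per_week:.0f} kWh extra per week")    # 2 kWh
print(f"${extra_cost_per_year:.2f} extra per year")      # ~$12.48, i.e. about $13
[/code]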
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
Yes it is. Why does the 290X run so hot then? The 512-bit bus is taking its toll. They even had to downclock the memory to 5GHz or it would heat up and consume too much power, and it was still hot as hell unless you put a custom cooler on it.
Because you design your architecture, then put a bus and memory on it. If the architecture uses 200W, then you only have 100W for everything else to hit your 300W target (obviously heavily simplified; see the sketch after this post). 300W is 300W: it doesn't matter what is generating it, the heat will be the same. I'd actually argue that a 300W card with HBM would show higher core temps, since a larger % of that 300W is coming off the core and not off the outer extremities where the memory resides. But I don't know; if AMD improves their cooler, temps will drop. Regardless, if the card is a 300W card, it doesn't matter how much they lower memory power consumption, it's still 300W of heat.
I don't really understand all the complaints about power usage lately. Sure, if you're running a server farm or something. But private home users? The average price of a kilowatt-hour in the U.S. is about 12 cents. If you game 4 hours a day, 5 days a week on average, with 100 watts of extra draw, that's 2 kWh extra a week. It's like 13 extra dollars a YEAR. For home users, I just don't see the big deal.
It's not a big deal, honestly. But it does give you some insight into how performance could scale. Nvidia could easily create a 300W Maxwell variant that would be roughly ~30% faster than the current one without making a single optimization. For the most part I agree, though: as long as it's 300W or under, I couldn't care less.
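A minimal Python sketch of the simplified budget argument in the post above; the 300W target and 200W core figure are the illustrative numbers already used in the thread, and the 30W HBM saving is purely a made-up example, not a measurement for any real card:
[code]
# Heavily simplified board power budget, echoing the post above.
# Every number here is illustrative, not a real measurement.
board_target_w = 300                    # the 300 W design target discussed
core_w = 200                            # hypothetical core/architecture draw
everything_else_w = board_target_w - core_w
print(f"Left for memory, VRMs, fans, etc.: {everything_else_w} W")   # 100 W

# If HBM trims, say, 30 W off that remainder (again purely illustrative),
# a designer would normally spend those watts back on the core to stay at
# the 300 W target, so a 300 W card still dumps 300 W of heat either way.
hbm_saving_w = 30
core_budget_with_hbm_w = core_w + hbm_saving_w
print(f"Core budget with the HBM savings reinvested: {core_budget_with_hbm_w} W")
[/code]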
https://forums.guru3d.com/data/avatars/m/244/244064.jpg
What comes with a high TDP? Oh, that's right. Heat!
https://forums.guru3d.com/data/avatars/m/224/224067.jpg
I don't really understand all the complaints about power usage lately. Sure, if you're running a server farm or something. But private home users? The average price of a kilowatt-hour in the U.S. is about 12 cents. If you game 4 hours a day, 5 days a week on average, with 100 watts of extra draw, that's 2 kWh extra a week. It's like 13 extra dollars a YEAR. For home users, I just don't see the big deal.
Not everyone lives in the US, and some of us have to pay over £0.14 per kWh.
https://forums.guru3d.com/data/avatars/m/206/206288.jpg
I also don't care about power consumption from a financial point of view, and I agree it gets thrown around as bragging rights by Nvidia users in the same way AMD fans did it back in the Fermi days. It's the heat and noise that concern me. They could put a really good cooler on it, but that would bump up the price of something that might already be quite expensive. Hoping it's something great though; tech-wise, things have been a touch boring over the last few years.