Radeon R9 380X with HBM high-bandwidth memory

For me it's about heat too. It's not that I care about a single cent on my electricity bill; I just don't want to heat up my living room in summer as if I'd had the stove on all day...
Same bullsh#$ with heat. Like video cards haven't been putting out 60-80 C, or even more, FOREVER. I game with headphones on, so noise isn't an issue. Edit: Some heat numbers for a GTX 580: this card ran 42 degrees C at idle, which is very normal. When the GPU is stressed 100% for several minutes the card reaches roughly 85 to 87 degrees C. For a GeForce GTX 580 these are rather normal numbers. Also, we measure at a room temperature of 21 degrees Celsius.
Same bullsh#$ with heat. Like video cards haven't been putting out 60-80 C, or even more, FOREVER. I game with headphones on, so noise isn't an issue.
Noise and heat are an issue, unless you live in a cave and a single 300W GPU will keep you warm.
Same bullsh#$ with heat. Like video cards haven't been putting out 60-80 C, or even more, FOREVER. I game with headphones on, so noise isn't an issue.
Uhm well, just to tell you a secret, not everybody plays with headphones 😉
Lucky for me, I have air conditioning. So still not an issue. I realize not everyone uses headphones, but if the noise bothers you that much, it's an option to think about.
Yeah, I hate headphone audio so I only use them if I'm not at home. If you do like headphones then it's a non-issue, but a noisy card would have to be surprisingly cheap or fast for me to consider it.
Well, the HBM should reduce power consumption somewhat, but not massively. What it does mean is a 512-bit bus with fewer memory chips, which = a less complex PCB = lower overall costs, making it cheaper to produce.
300W is fine by me..... people who have already gotten their hands on the samples swear that even a possible 980 Ti will lag far behind the reference 380X.....
Not really impressed with the 300W... could run 2 980s with that kind of power, couldn't I? I'm pretty curious what that HBM can do though.
If it's faster than 2x 980s though 😀 Won't be, though :P
Clarified below.
Because you design your architecture, then put a bus and memory on it. If the architecture uses 200W, then you only have 100W for everything else to hit your 300W target (obviously heavily simplified). 300W is 300W -- it doesn't matter what is generating it, the heat will be the same. I'd actually argue that a 300W card with HBM would show higher core temps, since a larger % of that 300W is coming off the core and not off the outer extremities where the memory resides. But idk, if AMD improves their cooler, temps will drop. Regardless, if the card is a 300W card, it doesn't matter how much they lower memory power consumption, it's still 300W of heat.
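To make the budget arithmetic in the post above concrete, here is a minimal Python sketch; the 300W target and the 200W/100W split are just the illustrative numbers from that post, not real 380X figures.
[code]
# Toy power-budget split for a hypothetical 300 W board; the 200 W / 100 W split is just
# the illustrative number from the post above, not a real 380X figure.
BOARD_TARGET_W = 300
gpu_core_w = 200                              # assumed architecture/core consumption
remaining_w = BOARD_TARGET_W - gpu_core_w     # left over for memory, VRM losses, fans, ...
print(f"Budget left for memory/VRM/etc.: {remaining_w} W")    # 100 W

# Whatever the split, essentially all of the board power ends up as heat in the case/room,
# which is the point being made: 300 W is 300 W regardless of where it is dissipated.
print(f"Total heat dumped into the room: {gpu_core_w + remaining_w} W")   # 300 W
[/code]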
Ok, let me put it this way, 290X vs 780 Ti vs GTX 980 at maximum peak:
290X (512-bit) can use up to 324W
780 Ti (384-bit) can use up to 266W
GTX 980 (256-bit) now uses up to 190W (stock), and it still reaches 80C with the stock cooler..
This "300W is 300W" is kind of far-fetched then. I mean, the 780 Ti heats up the same with the stock cooler, ok 82C, so if 266W is 266W then it should heat up to 85-90C, no? It's an extra +80W compared to the GTX 980 after all, and both use the same cooler with a similar noise ratio (~4dB louder on the 780 Ti). [spoiler]http://tpucdn.com/reviews/MSI/GTX_980_Gaming/images/power_maximum.gif http://img.hexus.net/v2/graphics_cards/nvidia/Gigabyte/GTX980G1/13.png[/spoiler] They didn't save a total of ~80W just by improving the core design; they obviously had to sacrifice bus width and come up with a new compression algorithm to compensate for the lower bandwidth, and also to lower power output & heat. Anyway, I still believe this new 380X will heat up less even if it has a 300W TDP spec, at least 10C lower with the same kind of stock cooler as the 290X (94C)..
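As a side note on the "X watts must mean Y degrees" line of argument, the rough first-order sketch below is my own illustration, using only the wattages and temperatures quoted in the post above; it also hints at why the simple proportionality doesn't hold with real coolers.
[code]
# Rough first-order cooler model: T_gpu = T_ambient + P * R_th, with R_th the cooler's
# thermal resistance in K/W. Wattages and temperatures are the ones quoted in the post
# above (review figures), not my own measurements; the model itself is a simplification.
t_ambient = 21.0                        # degrees C, as in the review quote
p_780ti, t_780ti = 266.0, 82.0          # quoted peak power draw and load temperature
r_th = (t_780ti - t_ambient) / p_780ti  # implied ~0.23 K/W at that fan speed

p_980 = 190.0
print(f"Naive prediction for a 980 on the same cooler: {t_ambient + p_980 * r_th:.0f} C")  # ~65 C

# In reality both cards sit near 80 C because the fan curve targets a temperature and simply
# spins slower (quieter) at lower power, so "X watts => Y degrees" is not a fixed relationship.
[/code]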
I personally never cared about lower power consumption.. It's just a marketing thing to make people go wow, it's more efficient, bla bla. Same with the GTX 580 vs GTX 680..
Yes, and it's really very weird when you consider we're only talking about the power consumption of a few light bulbs, and we're *not* talking about a dire need to conserve power to get the most out of our batteries...:D (Thank goodness! I get really tired of battery-powered, have-to-recharge devices which seem more trouble than they're worth, lately. Ugh.) Of course, no real info yet...but it's certainly interesting. ~4k monitor prices are preparing to drop through the floor and the real GPU performance race will be to see who can drive them best, first...;) There's a real market there (~4k monitors), no mistake about it, and it's going to mature pretty quickly because initially there will be price wars of the kind we all like to see....:D
Here u go buddy, u will need one for all those warm gaming sessions.
I have an FX-9590 and two 7970's, well over 300W, yet I never feel any heat coming off my computer, nor do the few hours of gaming where the PC works hardest show up as a huge strain on the electricity bill.. I mean, if you want a less-than-200W card, go buy a 980 or 970; they are great cards. Some of us, though, just like going for maximum performance. In my case my watercooling will handle several 300W cards while staying quiet, and as long as the idle draw is good too, it likely won't show on your electricity bill unless you're some economical green serial PC gamer who can't function with less than 8 hours of gaming every day. I did once experience my PC heating up my apartment, but it was an insanely insulated shoebox of an apartment and the PC at the time consumed maybe the same as an entire 980. I'm sure there will be some other 300-series card that won't consume 300W though.
Ok, let me put it this way, 290X vs 780 Ti vs GTX 980 at maximum peak:
290X (512-bit) can use up to 324W
780 Ti (384-bit) can use up to 266W
GTX 980 (256-bit) now uses up to 190W (stock), and it still reaches 80C with the stock cooler.. [spoiler]http://tpucdn.com/reviews/MSI/GTX_980_Gaming/images/power_maximum.gif http://img.hexus.net/v2/graphics_cards/nvidia/Gigabyte/GTX980G1/13.png[/spoiler] Anyway, I still believe this new 380X will heat up less even if it has a 300W TDP spec, at least 10C lower with the same kind of stock cooler as the 290X (94C)..
Note, the numbers you are shown are from Furmark at 1280x1024.... I don't even know how they keep from triggering the driver profile that detects Furmark and cuts the power limit.
I did once experience my PC heating up my apartment, but it was an insanely insulated shoebox of an apartment and the PC at the time consumed maybe the same as an entire 980. I'm sure there will be some other 300-series card that won't consume 300W though.
Well, the 300W might be the total power target or something, but my guess is that there won't be much missing from it, especially if you're going to OC that GPU. The cooler, a hybrid one as far as I know, will be able to handle more than 300W, so I don't expect that 300 number is tied to its cooler. And well, I'm living in a rather nice apartment, with some 30m² of living room, and there are two things actually heating it up in the summer: the TV, and my gaming rig. With both on, I see a rise of 2 to 4°C after a few hours in summer, when the temperature is already high (above 28°C). With the TV off it's only a 1 to 2°C rise, but I still notice it. On top of that, my rig sits in a fairly enclosed corner (which I will change), so maybe that makes me overly sensitive to it. But the difference between 28 and 32°C can be felt, so I won't go for the heat-production factories.
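For anyone curious how far a few hundred watts goes in a living room, the crude Python estimate below uses my own rough assumptions about ceiling height, air properties and total draw (only the 30m² figure comes from the post above); it mostly shows how much the walls and ventilation carry away.
[code]
# Crude back-of-envelope: how much would ~400 W (rig + TV) warm the air of a sealed 30 m^2
# room over a few hours? All assumptions (ceiling height, air properties, zero losses) are
# mine, not from the post; real rooms leak heat through walls and ventilation.
room_area_m2, ceiling_m = 30.0, 2.5
air_density, air_cp = 1.2, 1005.0                      # kg/m^3 and J/(kg*K) for air
air_mass_kg = room_area_m2 * ceiling_m * air_density   # ~90 kg of air

power_w, hours = 400.0, 3.0
energy_j = power_w * hours * 3600.0

delta_t = energy_j / (air_mass_kg * air_cp)
print(f"Air-only temperature rise with zero losses: {delta_t:.0f} K")   # ~48 K (!)
# That the observed rise is only 2-4 C shows most of the heat is absorbed by walls/furniture
# or carried away by ventilation, not that a few hundred watts is negligible.
[/code]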
If it's faster than 2x 980s though 😀 Won't be, though :P
I think it'll be faster than a 980, I have little doubt about it, but more like 1.5 times as fast, not twice, at least not at 1440p and below. This might behave differently at 4K with all the eye candy though.
I'll clarify my earlier post. Doesn't this seem a bit suspicious to anyone else? Today's leak:
More and more rumors are starting to surface on the Radeon R9 380X using HBM memory. And yeah, we know the name now as well, R9 380X, as on LinkedIn Ilana Shternshain, an AMD ASIC physical design engineer, named this product. Ilana, "responsible for full-chip timing methodology and closure for AMD's most exciting chips," apparently has been working on the PlayStation 4's SoC, the Radeon R9 290X discrete GPU, .... and the R9 380X, which is described as the "largest in 'King of the hill' line of products."
Ilana Shternshain
https://www.linkedin.com/in/ilanashternshain
http://www.guru3d.com/index.php?ct=news&action=file&id=9266
http://i1227.photobucket.com/albums/ee426/pillmonsta/Capture-83.png%7Eoriginal
Identical news item from 2012, identical LinkedIn profile with a different first name and gender.
The recent firings from Advanced Micro Devices apparently reduced not only marketing and PR personnel, but also engineers. Fired in November 2011, a hardware designer from AMD has disclosed the code-name of the company's next generation of graphics processors – Sea Islands – in his profile on the LinkedIn social network. Alexander Shternshain, a former MTS design engineer at AMD who is now looking for a new job, has unveiled the name of AMD's future graphics processors: Sea Islands. The new family will likely emerge in 2012 or 2013 and will most probably be made using a 28nm fabrication process at Taiwan Semiconductor Manufacturing Company. According to the VR-Zone web-site, the Sea Islands name "directly refers to the chain of islands on the USA's Atlantic coast". Based on his LinkedIn profile, Mr. Shternshain was involved in the design of the latest AMD families of graphics processors, starting with the Evergreen and Northern Islands (Radeon HD) GPU families, the Fusion APUs (Ontario, Llano, Krishna), and the Southern Islands and Sea Islands GPU families.
http://www.xbitlabs.com/news/graphics/display/20120102144434_AMD_Readies_Sea_Islands_Family_of_Graphics_Processors.html
http://i1227.photobucket.com/albums/ee426/pillmonsta/add.png%7Eoriginal
Alexander Shternshain
https://www.linkedin.com/in/alexandershternshain
http://i1227.photobucket.com/albums/ee426/pillmonsta/Capture-81.png%7Eoriginal
http://i1227.photobucket.com/albums/ee426/pillmonsta/Capture-84.png%7Eoriginal
Maybe transgender surgery took place somewhere along the line? Make of it what you will.
Yeah, and that 512-bit bus does not shine as it should due to the architecture. It's why I'm very happy with the GTX 980.
The 290X had 6.2B transistors, the 780 Ti 7.1B transistors; that is approximately the difference in performance between them. Yes, Maxwell is better. It is back down to 5.2B transistors with the 980, and that is the sole reason for the lower power consumption. But a good feat on nV's side nonetheless; what's sad is the price they have set in comparison to the 780 Ti. Huang marketed it as twice the performance per transistor and per watt; while that was a big exaggeration, it showed that nV saved a lot per GPU in manufacturing, and I consider the 980 a slap in the face of customers. And they gladly took that slap and encouraged nV to do even bigger slaps in the future. I for one am glad for a full 300W PCIe graphics card, as it is within the standard. HBM saves a few more watts, which means they can be eaten by the GPU = more power. I do not think AMD has the same performance-per-transistor boost in store as Maxwell, but let's see real-world benchmarks first.
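To make the transistor-count comparison above explicit, here is a quick ratio check using only the counts quoted in that post; performance figures are deliberately left out.
[code]
# Quick ratio check on the transistor counts quoted in the post above (billions);
# only the counts are compared, no performance numbers are assumed.
transistors_b = {"290X": 6.2, "780 Ti": 7.1, "GTX 980": 5.2}

print(f"780 Ti vs 290X:    {transistors_b['780 Ti'] / transistors_b['290X']:.2f}x the transistors")  # ~1.15x
print(f"GTX 980 vs 780 Ti: {transistors_b['GTX 980'] / transistors_b['780 Ti']:.2f}x")               # ~0.73x
[/code]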
No you can't.. a stock 980 has a TDP of 181W (BIOS max power limit), but most OC models have a boost TDP limit of 230W... (the magic number of 163W is just an average calculated by Nvidia). In fact, the 980 has basically the same TDP as the GTX 680 (which is already incredible when you see the performance gain on the same process). This said, most GPUs are designed to run at 300W (which is the limit set); that concerns the electrical board power and cooling capability warranty, not the power actually used... Based on that alone, it's a bit hard to know if the GPU is drawing that much power or if the figure is based on something else. (Anyway, if it's still on 28nm with a full DP setup / scalar units etc., I don't believe in miracles)...
TDP isn't the power limit; TDP is what's allowed at stock before overclocking. The total power limit is set as a percentage above TDP, usually up to 300W, since 300W is what a 6+8-pin configuration provides (including the board slot). TDP can vary even between identical cards due to thermal dissipation and leakage, so there can't really be a hard and fast rule for all vendors. It is up to the vendor to program the BIOS, but since AMD/Nvidia have to account for the worst-case scenario, actual draw may be less than the official specs. Sometimes it's a marketing trick as well, such as gimping cards where the disparity in leakage between dies is substantial. Instead of having all GPUs with good clocks but a high draw, and some that could do the same with much less power, the clocks are crippled across the board to keep all cards of a generation within a similar TDP range. The low-leakage dies can then be sold at a premium to vendors who raise the stock clocks and market them accordingly. So it would seem power, not speed, is more important when it comes to sales..... I don't really understand that part tbh.
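For reference on the numbers in the last two posts, the small sketch below shows the PCIe connector arithmetic behind the 300W ceiling and the "power limit as a percentage above TDP" idea; the 181W stock figure is the one quoted above, while the +25% slider value is purely hypothetical, for illustration only.
[code]
# Where the 300 W ceiling comes from: standard PCIe power delivery limits are 75 W from the
# slot, 75 W from a 6-pin connector and 150 W from an 8-pin connector.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150
board_limit_w = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(f"6+8-pin board ceiling: {board_limit_w} W")                     # 300 W

# "Power limit as a percentage above TDP": 181 W is the stock figure quoted in the post
# above; the 125% slider is a hypothetical example, not a vendor spec.
stock_tdp_w = 181
power_limit_pct = 125
print(f"Raised limit: {stock_tdp_w * power_limit_pct / 100:.0f} W")    # ~226 W, near the quoted 230 W
[/code]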
Like I said months ago: "will run hotter, use more power and will not perform as well as the Nvidia equivalent". Same old, same old; AMD needs to change the record. 20nm AMD or forget it.