Meet the XFX Radeon R9 390 Double Dissipation?

The memory controller is not a significant portion of power. If it uses, say, 15% of the GPU's total power (I think it's actually more like 10%), cutting that in half only saves you 7.5%. That's about 22 W on a 300 W card. I'd also argue that the 2.5D memory configuration shrinks the area available for heat dissipation, since the memory modules sit significantly closer to the chip.
My R9 290 in Metro Last Light shows 25 W for "GPU Memory" and 175 W for "GPU VRM" in AIDA, making VRAM about 14% of total GPU consumption. This is probably not very accurate, if at all, but I wonder if it's in the right ballpark. It's funny, because I was sure something like 1/3 of Hawaii's power went to VRAM operations.
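To sanity-check that ballpark, here is a minimal Python sketch using the AIDA readings quoted above (their accuracy is not guaranteed) together with the ~42% HBM saving cited later in the thread:
[code]
# Rough check of the VRAM share implied by the AIDA readings above.
vram_w = 25.0   # "GPU Memory" reading, watts
vrm_w  = 175.0  # "GPU VRM" reading, watts

share = vram_w / (vram_w + vrm_w)
print(f"VRAM share of total: {share:.1%}")   # ~12.5% (the post's 14% is 25/175)

# If HBM cuts memory power by ~42% (figure cited below in the thread),
# the absolute saving is small next to the whole card:
print(f"HBM saving: {vram_w * 0.42:.1f} W")  # ~10.5 W
[/code]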
[quote]My R9 290 in Metro Last Light shows 25 W for "GPU Memory" and 175 W for "GPU VRM" in AIDA, making VRAM about 14% of total GPU consumption. This is probably not very accurate, if at all, but I wonder if it's in the right ballpark. It's funny, because I was sure something like 1/3 of Hawaii's power went to VRAM operations.[/quote]
It probably is in that range. Let's say it's accurate. HBM shows a 42% reduction over GDDR5, so a 10.5 W reduction on a 290. That's really good considering it also significantly increases performance. But it's not like we're suddenly going to get fanless designs or dramatic changes to the heatspreader/fans. And like I said, with a 2.5D config you're putting all the memory modules a lot closer to the main source of heat, so you're essentially dissipating heat from a smaller section of the card, which makes it harder to produce smaller, thinner cooling solutions.
If it's really in that range (~15%), that's bad, but it would explain the hell out of the R9 300's late release. I get what you mean about a smaller dissipation area, but the data paths are shorter, so there is less heat to get rid of, and HBM and 2.5D/3D stacking are clearly the way forward. It's just that AMD is paying the price of the early quirks and illnesses that come with leading the way on new technology.
Although I used to like XFX video cards, I haven't bought one since they stopped making Nvidia-based cards. I now buy EVGA cards for my Nvidia fix. XFX, bring Nvidia cards back to your lineup!
Real card? Maybe, maybe not, lol. I'm always entertained when a possible card release hits this forum. It's like a car company produced a car with no engine, and you're all standing around the empty engine compartment yelling at each other about how fast the car is, lol. :speed: I'll wait till it's released, wait till it drops to half price, then buy it. Then wait for the next card release, or maybe not, and watch for more entertainment here. :banana:
[quote]If it's really in that range (~15%), that's bad, but it would explain the hell out of the R9 300's late release. I get what you mean about a smaller dissipation area, but the data paths are shorter, so there is less heat to get rid of, and HBM and 2.5D/3D stacking are clearly the way forward. It's just that AMD is paying the price of the early quirks and illnesses that come with leading the way on new technology.[/quote]
I definitely agree HBM and 2.5D/3D are the way forward. There is no doubt about it. TJ just makes it sound like it's going to allow for more relaxed cooling solutions. I don't think it will, especially when AMD will probably take that 10.5 W saving and just clock the card 5 MHz faster or whatever to make up the difference. The card is definitely going to be good at 4K, though. Now all I want is a 120/144 Hz 4K monitor with DP 1.3.
Color me skeptical...;) Q: Why bother to take pictures of the front of the card but not the back? (Either side breaks NDA.) A: Because the back of the card would quickly reveal the fake.
[quote]Color me skeptical...;) Q: Why bother to take pictures of the front of the card but not the back? (Either side breaks NDA.) A: Because the back of the card would quickly reveal the fake.[/quote]
The backside may not reveal anything at all, because there may be no need for SMDs there: HBM sits next to the GPU, and there is plenty of free space where the 8-16 memory chips used to be.
[quote]I definitely agree HBM and 2.5D/3D are the way forward. There is no doubt about it. TJ just makes it sound like it's going to allow for more relaxed cooling solutions. I don't think it will, especially when AMD will probably take that 10.5 W saving and just clock the card 5 MHz faster or whatever to make up the difference. The card is definitely going to be good at 4K, though. Now all I want is a 120/144 Hz 4K monitor with DP 1.3.[/quote]
Yes, I make it sound that way... Don't be surprised if I'm right about what I said there. It will run cooler than a 290X with the same custom cooler, no doubt.
[quote]Yes, I make it sound that way, and don't be surprised if I'm right about what I said there. It will run cooler than a 290X with the same custom cooler, no doubt.[/quote]
So you're saying a card at 28 nm with 40% more shaders, the same architecture and the same clock speeds will have the same or lower power consumption as a 290X because of HBM? If the 390X somehow manages the same power consumption as a 290X, it will be primarily due to architecture changes and definitely not HBM. As I've shown, with sources, HBM saves at most about 20 W; if the 25 W number is accurate, it's even less -- 10 W. There is no way 40% more shaders costs less than 20 W, let alone 10 W. That difference will have to come from an enhanced architecture, not HBM, which is what you originally claimed. Edit: http://www.cs.utah.edu/thememoryforum/mike.pdf -- GDDR5 is 20 pJ/bit, so a 336 GB/s interface takes about 54 W (Titan Black). HBM is 7 pJ/bit, so a 640 GB/s interface is about 36 W. An 18 W reduction if AMD is targeting a ~640 GB/s interface. So AMD would have to add 40% more shaders and design the architecture so that the increase costs 18 W or less to achieve what you're claiming.
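For reference, the same pJ/bit arithmetic in a small Python sketch; the pJ/bit figures come from the linked slides, while the 640 GB/s target is the rumoured figure, not a confirmed spec:
[code]
# Memory interface power from energy-per-bit figures
# (pJ/bit values from the Memory Forum slides linked above).

def interface_power_w(bandwidth_gb_s, pj_per_bit):
    """Watts drawn by an interface moving bandwidth_gb_s GB/s
    at pj_per_bit picojoules per bit."""
    bits_per_s = bandwidth_gb_s * 1e9 * 8
    return bits_per_s * pj_per_bit * 1e-12

gddr5 = interface_power_w(336, 20)  # ~53.8 W (Titan Black class)
hbm   = interface_power_w(640, 7)   # ~35.8 W (rumoured 390X target)
print(f"GDDR5: {gddr5:.1f} W, HBM: {hbm:.1f} W, saving: {gddr5 - hbm:.1f} W")
[/code]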
I've had a few XFX (ATI/AMD) products over the years; nothing special, imho. The cards were hot and noisy and eventually died within 2-3 years. Ever since, I've tended to stick to MSI and Gigabyte, though I've also had great luck with my current Inno3D GTX 770 4GB. I really wouldn't be interested in XFX again.
[quote]So you're saying a card at 28 nm with 40% more shaders, the same architecture and the same clock speeds will have the same or lower power consumption as a 290X because of HBM? If the 390X somehow manages the same power consumption as a 290X, it will be primarily due to architecture changes and definitely not HBM.[/quote]
He always drools over the next AMD release, then ends up buying a GeForce 😀 Just like Nursie in Blackadder pondering what costume to wear to the masked party, although everyone already knows the answer... [spoiler] [youtube]Wa2JhlyU-8M[/youtube][/spoiler]
[quote]So you're saying a card at 28 nm with 40% more shaders, the same architecture and the same clock speeds will have the same or lower power consumption as a 290X because of HBM? If the 390X somehow manages the same power consumption as a 290X, it will be primarily due to architecture changes and definitely not HBM. As I've shown, with sources, HBM saves at most about 20 W; if the 25 W number is accurate, it's even less -- 10 W. There is no way 40% more shaders costs less than 20 W, let alone 10 W. That difference will have to come from an enhanced architecture, not HBM, which is what you originally claimed.[/quote]
I think HBM will save up to 15 W on the memory bricks themselves, with up to another 10 W saved on the GPU from the new memory controller. The rest will have to come from advanced power control/gating of each "shader cluster", so that when they have no load between frames or batches they don't run at all, like Maxwell does, plus whatever else from AMD's power-efficiency roadmap applies to GPUs. They had to come up with something special to push 4096 shaders into a GPU and keep it under 300 W, as the 290X was already around 285 W. And we don't know whether that something was just enough to get under 300 W or gave a much bigger improvement. It's much the same as Maxwell's architectural changes: Nvidia knew they had a good direction, but at design time they could not have predicted that a 5.2B-transistor Maxwell chip would outperform a 7.1B-transistor Kepler.
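Putting rough numbers on that budget, a back-of-the-envelope Python sketch; the 15 W and 10 W savings are the guesses above, the 285 W baseline is the commonly cited 290X figure, and scaling total power linearly with shader count is deliberately naive:
[code]
# Back-of-the-envelope power budget for the rumoured 4096-shader part.
baseline_w        = 285.0  # commonly cited 290X board power
hbm_saving_w      = 15.0   # guess from the post above: memory bricks
controller_saving = 10.0   # guess from the post above: memory controller
shader_increase   = 0.40   # ~40% more shaders, per the thread

# Naively scaling power with shader count overstates things,
# but it shows how large the architectural gap really is:
naive_w = baseline_w * (1 + shader_increase)                  # ~399 W
gap_w   = naive_w - hbm_saving_w - controller_saving - 300.0  # vs 300 W cap
print(f"naive: {naive_w:.0f} W, gap gating must close: {gap_w:.0f} W")  # ~74 W
[/code]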
The cooler is too small. If they want to get serious on a 300 W card, they should consider using two PCI slots and making the cooler wider than the actual card, a monster Arctic Cooling S1 type of cooler. The thing they're using now will not be silent.
Fiji XT will be 14 nm from GF and will probably have a 250 W max TDP. They only added the extra power connectors for overclocking headroom. Besides, the DD cooling, with basically the same design, works well on my XFX R9 290X.
[quote]I think HBM will save up to 15 W on the memory bricks themselves, with up to another 10 W saved on the GPU from the new memory controller. The rest will have to come from advanced power control/gating of each "shader cluster", so that when they have no load between frames or batches they don't run at all, like Maxwell does, plus whatever else from AMD's power-efficiency roadmap applies to GPUs. They had to come up with something special to push 4096 shaders into a GPU and keep it under 300 W, as the 290X was already around 285 W. And we don't know whether that something was just enough to get under 300 W or gave a much bigger improvement. It's much the same as Maxwell's architectural changes: Nvidia knew they had a good direction, but at design time they could not have predicted that a 5.2B-transistor Maxwell chip would outperform a 7.1B-transistor Kepler.[/quote]
The pJ/bit figure factors in leakage/refresh on the memory module. So the entire memory module/interface/controller takes about 36 W at 640 GB/s, an 18 W reduction over a 336 GB/s setup. That is damn impressive considering it's nearly doubling the bandwidth. And yeah, again, I understand that architecture improvements will definitely lead to better power consumption. But the original argument was that HBM would make the difference in power consumption. It definitely helps, but it's not offsetting a 40% increase in shaders; architecture evolution will. Regardless, hopefully AMD keeps the cost low. A 2.5D interposer is expensive to manufacture, and if you lose a single channel on an HBM module during assembly you've essentially lost the entire interposer. The yields are going to be interesting, to say the least. The 390X should be a beast performance-wise.
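On the yield point, a toy Python model makes the worry concrete; the per-channel yield here is a made-up illustrative number (first-gen HBM stacks expose 8 channels each):
[code]
# Toy 2.5D yield model: if any HBM channel is dead after bonding,
# the whole interposer assembly is scrap.
per_channel_yield  = 0.995  # hypothetical, not a published figure
stacks             = 4      # rumoured stack count
channels_per_stack = 8      # 8 channels per first-gen HBM stack

assembly_yield = per_channel_yield ** (stacks * channels_per_stack)
print(f"assemblies surviving bonding: {assembly_yield:.1%}")  # ~85.2%
[/code]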
[quote]The cooler is too small. If they want to get serious on a 300 W card, they should consider using two PCI slots and making the cooler wider than the actual card, a monster Arctic Cooling S1 type of cooler. The thing they're using now will not be silent.[/quote]
Where are you getting that 300 W figure from? Actual power consumption is 197 W, lower than a 290X, which means it'll run at 280X temperature levels. And in that case XFX's DD cooler isn't going to help, because goddamn it sucks even on a 280. Believe me, I researched it.
If you look closely at the card, it says 380, not 390, lol. Or maybe it's my crap screen 🙂
[quote]Where are you getting that 300 W figure from? Actual power consumption is 197 W, lower than a 290X, which means it'll run at 280X temperature levels. And in that case XFX's DD cooler isn't going to help, because goddamn it sucks even on a 280. Believe me, I researched it.[/quote]
The article you linked that references 197 W is from back in November, and considering the same site later leaked performance figures for the Titan X (which were all wrong, as were their specs), I'd take anything from that site with a grain of salt, especially when the rumor is so old. What we know, and what has been consistent, is the 40% increase in shaders and most likely the 28 nm node. Even if AMD somehow significantly improves leakage, HBM, etc., they would most likely end up with the same TDP as a 290X. There is no way they hit 100 W less without a die shrink. 20 nm is garbage, and 16nm FF+ is 6+ months away from full production. Samsung/GF 14 nm isn't ready for GPU production, and neither is regular 16nm FF.
[quote]The pJ/bit figure factors in leakage/refresh on the memory module. So the entire memory module/interface/controller takes about 36 W at 640 GB/s, an 18 W reduction over a 336 GB/s setup. That is damn impressive considering it's nearly doubling the bandwidth. And yeah, again, I understand that architecture improvements will definitely lead to better power consumption. But the original argument was that HBM would make the difference in power consumption. It definitely helps, but it's not offsetting a 40% increase in shaders; architecture evolution will. Regardless, hopefully AMD keeps the cost low. A 2.5D interposer is expensive to manufacture, and if you lose a single channel on an HBM module during assembly you've essentially lost the entire interposer. The yields are going to be interesting, to say the least. The 390X should be a beast performance-wise.[/quote]
Agreed, and thanks for pointing out interposer costs and the defect-rate possibility. I hadn't taken it into consideration until now, since it was mostly ignored while HBM itself looked like the bigger challenge. I'll have to check the details, if any are available.