AMD RX Vega HBM 2 8GB Memory Stack Reportedly Costs $160

HBM2 is the fastest memory available. What are you babbling about? The fastest (top OC'd tier) DDR3/DDR4 memories are still more expensive. Move on, "experts". Now, maybe it would be wise for only the top-end Vega to use HBM2; mid-tier and low-tier should use GDDR5X or simply GDDR5. I can't accept that the cheapest Vega, ranked in #3 place, could cost $460+. If this $160 is for real, we could see prices like: Vega #1: $860+, Vega #2: $660+, Vega #3: $460+. Godawful. :puke2:
Early adopters always pay a premium. The question is whether Vega will match or exceed the 1080 Ti at a lower cost. This is not a high-volume segment of the GPU market, so AMD can probably withstand thin margins here until production efficiencies reduce the cost. I for one will pay the luxury tax to support AMD given what they've done for consumers in both the CPU and GPU markets, as long as Vega delivers a significant price/performance edge over the Ti. I won't buy any GPU for over $600 though. To me it's an absolute waste and highway robbery by shareholders.
Early adopters always pay a premium. The question is whether Vega will match or exceed the 1080 Ti at a lower cost. This is not a high-volume segment of the GPU market, so AMD can probably withstand thin margins here until production efficiencies reduce the cost. I for one will pay the luxury tax to support AMD given what they've done for consumers in both the CPU and GPU markets, as long as Vega delivers a significant price/performance edge over the Ti. I won't buy any GPU for over $600 though. To me it's an absolute waste and highway robbery by shareholders.
This is why I try to stay around $400 for GPUs; anything more really is a bad investment in the long run.
Does it cost $160, or does it add $160 to the cost of the chip? HBM1 cost $41, but it added ~$120 to the cost of the Fury X vs. GDDR5 (for a 30% margin, retail works out to roughly 3x material cost for high-volume products). I would think HBM2 would be cheaper due to improved manufacturing, plus it's only using two stacks, but with all the rumors of supply issues it's possible it's more. So if they are saying it adds $160 to the cost of the chip, that's really not that far-fetched given HBM1's cost.
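The ~3x bill-of-materials-to-retail multiplier described in that comment is easy to sanity-check. A minimal sketch, assuming the rumored $41 (HBM1) and $160 (HBM2) figures from the thread and the commenter's 3x rule of thumb, none of which are confirmed numbers:

```python
def added_retail_cost(bom_delta, multiplier=3.0):
    """Estimated retail-price impact of a bill-of-materials cost delta.

    The 3x multiplier is the rule of thumb from the comment above
    (a ~30% margin at each step of the chain), not an official figure.
    """
    return bom_delta * multiplier

# HBM1 on Fury X: a ~$41 BOM cost reportedly showed up as ~$120 at retail.
print(added_retail_cost(41))   # 123.0, close to the ~$120 cited

# If HBM2 really adds $160 at the BOM level, the same multiplier implies:
print(added_retail_cost(160))  # 480.0 added at retail
```

Under that assumption the distinction the commenter draws matters a lot: $160 as a BOM figure would imply several hundred dollars at retail, whereas $160 as the final added retail cost would be comparable to HBM1's impact on the Fury X.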
It seems history is repeating itself. The Fury X was way overpriced, being slower than what Nvidia offered at the time. Vega will be a power-hungry card that will need watercooling just to come near 1080 Ti performance levels; add HBM2 costs to that, and the price will not be competitive at all.
Does it cost $160, or does it add $160 to the cost of the chip? HBM1 cost $41, but it added ~$120 to the cost of the Fury X vs. GDDR5 (for a 30% margin, retail works out to roughly 3x material cost for high-volume products). I would think HBM2 would be cheaper due to improved manufacturing, plus it's only using two stacks, but with all the rumors of supply issues it's possible it's more. So if they are saying it adds $160 to the cost of the chip, that's really not that far-fetched given HBM1's cost.
That's the actual question. By default the two stacks will be cheaper to use on the interconnect, and this time there is more than one fab manufacturing this. I would actually argue that if production scales properly, this will be cheaper than the equivalent Fury X cost.
It seems history is repeating itself. The Fury X was way overpriced, being slower than what Nvidia offered at the time. Vega will be a power-hungry card that will need watercooling just to come near 1080 Ti performance levels; add HBM2 costs to that, and the price will not be competitive at all.
How do you know anything about performance/watt? Also, at 1440p and above, the Fury X is actually faster than the 980 Ti.
That's the actual question. By default the two stacks will be cheaper to use on the interconnect, and this time there is more than one fab manufacturing this. I would actually argue that if production scales properly, this will be cheaper than the equivalent Fury X cost. How do you know anything about performance/watt? Also, at 1440p and above, the Fury X is actually faster than the 980 Ti.
Considering how power-hungry Polaris is for the performance it delivers, I'm not seeing Vega's top card coming in at less than 300W.
Considering how power-hungry Polaris is for the performance it delivers, I'm not seeing Vega's top card coming in at less than 300W.
Vega is IP 9.0. It has different rendering back ends and different CUs. It's like saying that Kepler should clock low because Fermi was a furnace. Also, the Polaris refresh has no issues at all hitting 1.4GHz at acceptable power draws. From the Instinct reveals I would say that in teraflops/watt, Vega is better than Pascal. That's not exactly a surprise either. The Nano had better performance/watt than the GTX 980, on top of better general performance too.
Vega is IP 9.0. It has different rendering back ends and different CUs. It's like saying that Kepler should clock low because Fermi was a furnace. Also, the Polaris refresh has no issues at all hitting 1.4GHz at acceptable power draws. From the Instinct reveals I would say that in teraflops/watt, Vega is better than Pascal. That's not exactly a surprise either. The Nano had better performance/watt than the GTX 980, on top of better general performance too.
Polaris actually has a great performance/watt ratio when undervolted and underclocked. I feel AMD was just forced to pump more voltage and MHz through Polaris than they originally wanted to.
Polaris actually has a great performance/watt ratio when undervolted and underclocked. I feel AMD was just forced to pump more voltage and MHz through Polaris than they originally wanted to.
That's unfortunately dictated by the manufacturing process. AMD can use certain methods to reduce power consumption, but they can't change how much power each transistor switch consumes at a given clock, or what voltage a transistor needs at a given clock to ensure that its final state is a 1 or a 0 as desired and not something in-between.
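The undervolting point above follows from the usual first-order model of dynamic switching power, P ≈ C·V²·f: power scales quadratically with voltage but only linearly with clock. A small sketch of that relationship; the voltages and clock below are illustrative placeholder values, not measured Polaris figures:

```python
def relative_dynamic_power(v, f, v_ref, f_ref):
    """Dynamic switching power relative to a reference operating point.

    Uses the first-order model P ~ C * V^2 * f; the capacitance term C
    cancels out when comparing two operating points of the same chip.
    """
    return (v / v_ref) ** 2 * (f / f_ref)

# Example: undervolt from 1.15 V to 1.00 V at the same 1266 MHz clock
# (hypothetical numbers, purely to show the quadratic effect).
print(round(relative_dynamic_power(1.00, 1266, 1.15, 1266), 2))  # 0.76
```

A ~13% voltage cut yielding roughly a quarter less dynamic power is why undervolted Polaris cards look so much better in performance/watt, and also why pushing extra voltage for higher stock clocks hurts so badly.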
It looks like déjà vu. IF the AMD GPU will AGAIN deliver (Fury X HBM 1.0 vs. 980 Ti GDDR5) the same or less performance than the competition, for the same or higher price: HBM 2.0 = Fury X fail... 2.0?
Polaris actually has a great performance/watt ratio when undervolted and underclocked. I feel AMD was just forced to pump more voltage and MHz through Polaris than they originally wanted to.
True. On the other hand, the specific voltage/MHz they push through it with the 570/580 cards isn't exactly a deal-breaker, especially if you consider that this is basically Fiji's smaller "tock", internal-design-wise.
That's unfortunately dictated by the manufacturing process. AMD can use certain methods to reduce power consumption, but they can't change how much power each transistor switch consumes at a given clock, or what voltage a transistor needs at a given clock to ensure that its final state is a 1 or a 0 as desired and not something in-between.
There are a lot of tricks that chip designers employ, from packed libraries to dark silicon (intentionally inactive parts of the chip that improve thermal conditions). It looks like Vega is the first AMD design since Tahiti to target high clocks.
It looks like déjà vu. IF the AMD GPU will AGAIN deliver (Fury X HBM 1.0 vs. 980 Ti GDDR5) the same or less performance than the competition, for the same or higher price: HBM 2.0 = Fury X fail... 2.0?
Well, the Fury X wasn't that bad. Today it's a surprisingly decent card, and it's actually faster than the 980Ti in higher resolutions. Was it a success? No. Would I buy it with 4GB of VRAM? Nope. Vega doesn't seem to have any of these issues though. It seems to clock really high, and 8GB of VRAM with that controller should be more than enough even for the Scorpio+ generation of games. I'm also really curious as to the effect of that memory controller in games where microstutter happens whenever the engine calls data from RAM.
Most likely, this time AMD should make their rebranded Fury X (Vega edition) faster than Nvidia's new Ti, or 100% equal in performance. Or else, RIP for good. The Fury X is an inferior card in many ways. To be honest, it beats the 980 Ti in at best 10% of games and resolutions, not to mention the upper-mid-range 1070, which is close in price and tramples everything AMD has to offer to this day. HBM is purely a gimmick, not a game changer. I would prefer a better GPU architecture with more transistors over investing in HBM2 for nothing. God help Vega's star not to fade away because of AMD's gaming style. :bang:
I'm happy if the card manages 4K better than a 1080 Ti for an OK price. I don't understand why AMD made it look like Vega was going up against Volta in one of the ads. It would be stupid of them if it can't even perform better than Pascal. But who knows. We need AMD back in the game again, even though I understand some like to be fleeced by Nvidia's marketing and pricing. Selling a mid-range card as high end, overpricing, lying, closed source; in general I don't like to support them. Still, I have a GTX 1080 because it was on sale and Pascal is the only card close to the performance I want. It's a fine card. But I hope I can choose Vega over Volta or a cheap 1080 Ti. It would be positive for us all if they don't f**k up more than they have. I'm afraid Vega is coming too late and performance is too bad to defend this. Sorry for typos, I'm on my phone...
Well, the Fury X wasn't that bad. Today it's a surprisingly decent card, and it's actually faster than the 980Ti in higher resolutions. Was it a success? No. Would I buy it with 4GB of VRAM? Nope.
"Decent" was not enough for Fury X and it certainly will not be enough for VEGA. When you are the second in a market of 2 competitors and your competition sells 3 times more than you, "decent" is not enough to make customers buy your product, specially with the game devs (obviously) favoring the more sold GPUs. Performance AND price (both at the same time) are the only factors customers will consider when the product is on shelves (and seriously reviewed) and the months long marketing HYPE BS faded.
Vega doesn't seem to have any of these issues though.
Issue 1: Price. "Leaks" (marketing 2.0) are making customers aware of, and used to, a premium price due to its fancy HBM 2.0.
I'm happy if the card manages 4K better than a 1080 Ti for an OK price. I don't understand why AMD made it look like Vega was going up against Volta in one of the ads. It would be stupid of them if it can't even perform better than Pascal. But who knows.
It's going to be funny when, like three years from now, someone at AMD says the "Poor Volta" thing was a completely unintentional addition by the marketing firm that did that ad. They probably put it in there because they were alluding to lower power or something, and everyone just assumed they were talking about Nvidia's Volta, lol.
Most likely, this time AMD should make their rebranded Fury X (Vega edition) faster than Nvidia's new Ti, or 100% equal in performance.
What's this? Is AMD going to release a 14nm Fiji shrink? I have missed this news entirely. I reckon this chip would then be used in a video card between Vega and Polaris. Do you have any links to more info?
Not at all. On the other hand, we already know that HBM2 is really expensive (even for pro Nvidia users)... not really news. But with mass production it will tend to get cheaper.
Its difficult to mass produce something nobody is buying 😉
"Decent" was not enough for Fury X and it certainly will not be enough for VEGA. When you are the second in a market of 2 competitors and your competition sells 3 times more than you, "decent" is not enough to make customers buy your product, specially with the game devs (obviously) favoring the more sold GPUs.
This is completely true. But as you mention yourself below, the only key factor for mass adoption will be price.
Performance AND price (both at the same time) are the only factors customers will consider once the product is on shelves (and seriously reviewed) and the months-long marketing hype BS has faded. Issue 1: Price. "Leaks" (marketing 2.0) are making customers aware of, and used to, a premium price due to its fancy HBM 2.0.
From the retail customer perspective, we have to wait and see if Vega does indeed deliver on price/performance. If it provides initial performance between the GTX 1080 and the 1080 Ti, and is similarly priced in between, I would get it, and it would actually fill a current hole in the market.

From the high-performance computing perspective, Vega is a godsend: 13 teraflops for a reasonable price, an actually decent open-source software stack, and under 300W. I bet that AMD will try to exhaust all of their inventory there; that's good business and it makes a ton of sense. Even if they sell it at $2,000 a pop, that card is still many times cheaper than what NVIDIA sells in that performance segment.

My personal opinion on Vega's delay for the consumer market is this: look at AMD's GPU release history. They always release beefy, good hardware that usually takes years to be properly utilized by their driver. That meant AMD had to sell expensive hardware at lower prices because initial relative performance was lower (see the 7970 and the 290X). Seeing how hard they're at work even on the Linux driver, and that a lot of parts of the architecture are completely revamped, I believe they don't want their expensive, performant hardware to be sold for peanuts due to what is essentially driver performance again. I believe they're putting retail sales on the back burner to make sure that nothing as ridiculous as what initially happened to Tahiti or Hawaii happens to Vega, and that the driver gives them at least decent performance from the start.
Its difficult to mass produce something nobody is buying 😉
NVIDIA is buying all it can, same as AMD.
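The "13 teraflops under 300W" claim a few posts up is straightforward to turn into an efficiency number. A quick sketch; both input figures are the commenter's pre-launch claims, not measured values:

```python
def gflops_per_watt(tflops, watts):
    """Convert a 'X TFLOPs at Y watts' claim into GFLOPs per watt."""
    return tflops * 1000.0 / watts

# 13 TFLOPs FP32 within a 300 W board power, as claimed in the thread:
print(round(gflops_per_watt(13, 300), 1))  # 43.3 GFLOPs/W
```

Whether that figure actually beats Pascal depends on which Pascal part and which workload you compare against, which is exactly the argument running through this thread.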
Moderator
Curious, how much did a single 4GB stack of the original HBM cost?