JEDEC Announces Publication of GDDR5X Graphics Memory Standard

This just means that Nvidia can further reduce the memory bus width on their non-high-end cards (the high-end ones will use HBM2 anyway). Just like the 970/980 got a 256-bit bus, and especially the 950/960 a strangling 128-bit one. It also means AMD can still design GPUs meant for a wider bus but cripple them with a much narrower one in the final released card, haha.
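For anyone who wants to put numbers on the bus-width trade-off: peak bandwidth is just bus width times per-pin data rate. A quick sketch (the per-pin rates are typical published figures, used here for illustration only):

```python
def peak_bandwidth_gb_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) * per-pin rate (Gb/s) / 8."""
    return bus_width_bits * pin_rate_gbps / 8

# GDDR5 at 7 Gb/s per pin on a 256-bit bus (970/980 class):
print(peak_bandwidth_gb_s(256, 7.0))   # 224.0 GB/s
# The same GDDR5 on a strangled 128-bit bus (950/960 class):
print(peak_bandwidth_gb_s(128, 7.0))   # 112.0 GB/s
# GDDR5X at 10 Gb/s per pin claws much of that back on the narrow bus:
print(peak_bandwidth_gb_s(128, 10.0))  # 160.0 GB/s
```

Which is exactly why faster memory invites narrower buses: a 128-bit GDDR5X card lands closer to a 256-bit GDDR5 one than the bus width alone suggests.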
I can see this being used in the low-to-mid segments, while HBM will probably be used in the high-end daddy cards.
HBM makes cooling a lot easier since it's right next to the core and at the same level as it too. It all comes down to the price difference, though.
And where is the future beyond that? HBM already has a future beyond HBM2.
This is actually not really news for those of us who are keeping track of development. It means lower-end products will use GDDR5X, but if AMD pushed HBM2 across the board it would mean a serious performance gain for all their products (if the architecture really is that efficient). It could set another bar for the lower end, like they did with the 4000 series. But I understand why lower-performance models will be GDDR5X-based: money. They can't mess around with their market share and stock declining.
And where is the future beyond that? HBM already has a future beyond HBM2.
The future is GDDR5X for the low/mid end and HBM2 for the high end. HBM was already outdated before the launch of Fury, due to the 4 GB limit, but HBM2 wasn't ready and AMD was in need of a new GPU... so... Also, GDDR5X seems to be pin-compatible with GDDR5... imagine how many current models could be rebranded in future GPU lines :banana:
Great, more milking of GDDR5. Where is GDDR6/7?
So we're going to see graphics cards with RAM that is twice as fast, and people want to pout about this? *confused*
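The "twice as fast" comes from GDDR5X doubling the prefetch from 8n to 16n: at the same memory-array clock, each pin delivers twice the data rate. A rough sketch (the 875 MHz array clock is an illustrative figure, roughly what 7 Gb/s GDDR5 implies):

```python
def per_pin_rate_gbps(array_clock_mhz: float, prefetch: int) -> float:
    """Per-pin data rate in Gb/s: array clock (MHz) * prefetch words / 1000."""
    return array_clock_mhz * prefetch / 1000

print(per_pin_rate_gbps(875, 8))   # GDDR5, 8n prefetch: 7.0 Gb/s
print(per_pin_rate_gbps(875, 16))  # GDDR5X, 16n prefetch: 14.0 Gb/s
```

So the memory cells don't have to run any faster; the doubled prefetch alone doubles what crosses each pin.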
GDDR5X is a fail if you ask Intel. Watch this video and you'll know why. And I have to agree with them.
Download free? I'm lost...
GDDR5X is a fail if you ask Intel. Watch this video and you'll know why.
Let's wait till we see the first product with that. Preferably one that doesn't cost as much as everything else with "Intel" printed on it. Otherwise, that video taught me absolutely nothing new.
^ Yeah, you're acting like everything they produce is overpriced into oblivion.
So we're going to see graphics cards with RAM that is twice as fast, and people want to pout about this? *confused*
Yeah, I guess people expected HBM to be everywhere for some reason. They haven't learned that's not how it works. The number of GDDR3 cards was quite high when GDDR5 hit the market, for example.
Great, more milking of GDDR5. Where is GDDR6/7?
There will be no GDDR6/7; HBM is the future. Just like GDDR, HBM will go HBM2, HBM3, etc. as future generations are improved. Same on the desktop: DDR4 is the last generation of DDR, there will be no DDR5. Desktop memory will be looking at 3D-stacked memory chips.
I can see this as both a good and a bad thing: it will probably slow down the adoption of HBM, but it will probably mean cheaper cards in the short term.
I'm probably the only one, but I'd rather see GDDR5X on future cards than HBM.
^ Yeah, you're acting like everything they produce is overpriced into oblivion.
Because it is? But of course that also means I'd be all the happier if they actually released something reasonably priced. But looking at how the i5/i7 prices have developed, I wouldn't hold my breath. Unfortunately Intel knows far too well how to run a business.
Because it is? But of course that also means I'd be all the happier if they actually released something reasonably priced. But looking at how the i5/i7 prices have developed, I wouldn't hold my breath. Unfortunately Intel knows far too well how to run a business.
They'd have to be stupid not to capitalize on the fact that AMD screwed up so badly. It's a business, not a charity. But as it stands, not everything is overpriced. The i7s are overpriced indeed, the i5s not so much. The i3s are killing AMD's octa-cores in games, so I wouldn't call those too overpriced either. Not normally priced either, though. My point is that it could be far worse. We can still get Intel 6-cores for $400 that stomp everything AMD has by quite a large margin.
I guess it also depends on what you're looking for? Parallel work across many cores? AMD vs. Xeons. Gaming? The field narrows down to some AMD CPUs and almost all Intel ones. Maybe I'm wrong here, but if I want to build a workstation, AMD's stuff is far more interesting than for a purely gaming-oriented rig, imo. For small office computers, go for AMD's APUs, mainly because of the platform price. Also, sorry for the OT, just realised this has got nothing to do with RAM.
I guess this technology was already done years ago. Sometimes I wonder why it comes right after the HBM release. Why did AMD announce FreeSync right after Nvidia's G-Sync launch? And many more questions. These technologies were already available, they just refused to release them. When the right time came, they released them. It's not like they discover a technology overnight and share it with the public the very same day.
I guess this technology was already done years ago. Sometimes I wonder why it comes right after the HBM release. Why did AMD announce FreeSync right after Nvidia's G-Sync launch? And many more questions. These technologies were already available, they just refused to release them. When the right time came, they released them. It's not like they discover a technology overnight and share it with the public the very same day.
I believe I understand what you're getting at, and it's certainly possible that companies withhold certain tech for various reasons. One example is the technologies you mentioned. Another would be the performance difference of high-end GPUs between vendors. I find it quite convenient that with each new GPU generation one manufacturer edges the other slightly in performance. It's entirely possible that I'm wrong, but it seems a bit convenient, as I said. Compare this to how it was 20 years ago, when we had more than two GPU players on the market: competition was something fierce, with significant performance differences between the manufacturers' top-of-the-line parts. I know research was far more volatile back then, but still. What you said about AMD is one of their common tactics I've noticed; it's called vaporware. For example, they started hyping up Fury X when they realized the GTX 900 series was selling like hotcakes. Or how Bulldozer was supposed to trash every i7 out there. This usually comes back and bites them in the ass. (And people blame Intel for AMD's downfall.)