AMD Radeon (big NAVI) again rumored to get 80 Compute units (5,120 shaders)
Maddness
Yes please AMD. Starting to get really excited about next gen cards from both vendors.
Martin5000
It doesn't make financial sense to target the small enthusiast market.
I have no doubt it will be slower than the competition.
FrostNixon
We've been hearing about it for two years; can't wait to see it in person.
JonasBeckman
Well, AMD would start with the full chip anyway and bin down from there, 4 or 8 units at a time to 72, 64, 56 and so on, for whatever models and specs AMD has lined up (say the 6900, 6800, and so on), using as much silicon as possible: dies that don't meet the full 80-CU criteria could still work as 72s or 64s.
Same as on the CPU side with Intel and NVIDIA and all that: a chip might not be full 3950X material but would work as a 3900X, without getting into a complicated discussion on silicon and wafers.
Cutting and scaling down uses as much of the binning process as possible for the defects and yields available, which I think gets more complicated as these die shrinks happen, but seems to be doing OK at least.
That last part depends on what the launch situation will look like: initial availability plus costs, and potential price increases from retailers as that initial supply meets demand and disappears.
The same goes for NVIDIA with Ampere, I suppose, now on a new process and die shrink, and how that situation compares to Navi here.
Plus there are additional partners, the distribution of consumer-grade chips versus workstation parts, and for AMD a portion going to Apple and their hardware.
And then there's the actual die and core GPU itself, which for NVIDIA is the A-something (A100 and up, was it?) and for AMD Navi 20 to, I think, 23 or thereabouts.
EDIT: Cost-wise, though, no clue what AMD or NVIDIA have planned. GDDR6 is a more expensive chip, though not HBM2 (or variants thereof) expensive, and then it gets paired with a 256- or 384-bit controller, with the additional costs of handling and pairing the memory modules.
The lower-end cards would get something like 128-bit, with the resulting memory and total bandwidth speeds, plus whatever the price ends up at, although I'm expecting the enthusiast-tier high-end models to be on the costlier side even with the general increase in hardware prices, or perhaps more so because of it.
After that, I guess anything could happen for the high-end, mid and low-end models following that entry price level.
I'm hoping for 600 - 800, but 1000+ Euro / US Dollar with retailer price increases on top doesn't sound too implausible either, and from the other news, if NVIDIA is sticking 16 - 24 GB of RAM onto their cards, that would probably all but confirm it.
Although for their cards the 3080 Ti is, I believe, taking the position the Titan cards had before, so that also shifts things around a bit; assuming that info was correct, it basically moves a big part of the GPU lineup up one tier in pricing.
EDIT: And then there's how AMD will respond and position their own GPU hardware, or what they can do without reducing profits given their costs: HBM would probably mean it's going to cost, but that's still unconfirmed, and I think recent info leaned more toward GDDR6, although beyond that it seems to be rumor and guesswork, at least for now.
Embra
Having a competitive top-tier card can have a psychological effect, making the whole line seem more "premium" and thus selling more mid-level GPUs.
Nvidia has benefited from this for quite some time now. Bragging rights.
AMD would benefit across their whole line of GPUs if they can go head to head against NV's top tier.
DeskStar
Twice the size of the 5700 XT. Cool, so since they're already knocking 7 nm out of the park with superior yields, one can assume they're not going to be that far off from the silicon yields of the 5700.
Half is what I'm reading, and I would go as far as saying it's going to be better than that, as fabrication has to have gotten better/more efficient.
Man, if this is truly going to be twenty to thirty percent faster than the 2080 Ti, and Nvidia is going to offer up something potentially fifty percent faster, the future of graphics cards is looking great!
Sure hope these things come with sixteen-plus gigabytes of RAM!!
That is, if they come in at a decent price point!!
Buying $1,000 graphics cards isn't something this guy wants to do anymore.... Done with that nonsense.
H83
Hope it's true, we really need the competition.
schmidtbag
cucaulay malkin
I do believe this will be released as a cut die, with AMD only releasing the full capacity once 7nm gets more mature.
I just hope big RDNA2 doesn't end up being an urban legend like big RDNA1, and brings AMD to feature parity with RT acceleration, and image reconstruction techniques better than Turing's RTX and DLSS 2.0.
Really? Is this the whole reason?
Not the performance, stability, feature set, good AIB versions, NVENC and CUDA integration, decent OC, power efficiency, resale value ...?
rm082e
As always, I wish AMD the best. I want good competition in the market to push all players to put out their best products at the best prices. However, I've been hearing "just wait, it's going to be an Nvidia killer!" since I got back into PC gaming in 2013. The 290X was immediately trumped by the 780 Ti and a price drop on the 780. The Fury X had 4GB of RAM to the 980 Ti's 6GB, and performed more like a 980. The Radeon VII was little more than vaporware and didn't compete on price or performance. The 5700 XT was rumored to be more powerful than it turned out to be at launch.
Time and time again, the rumor mill describes cards that perform much higher than what we actually see at market, and they struggle to compete on price with what they do deliver. AMD has had some great cards down in the $200-250 range, and the XT line provides an alternative to the 2070 series for people who really don't want to buy Nvidia. But at that $350 to $400 price point, they just haven't been competitive.
I really hope this changes. I hope they can do with Radeon what they were able to do with Ryzen. But until I get slapped in the face with amazing performance at competitive prices, I'm just not believing any of these rumors.
sbacchetta
Having the performance crown is important, because most journalists, YouTubers, and streamers usually use the most powerful card available. All of this is basically free advertising and helps a ton with branding.