AMD Radeon 5800 XT (big NAVI) to get 80 Compute units?

I don't know if the rumour is true or not, but I doubt it will be made in 7nm. AMD must be waiting for the refined 7nm+ for that extra juice, while refining RDNA to version 2.0 and adding Ray Tracing: that one will be interesting to see.
Pinstripe:

If it's RDNA2, shouldn't it be RX 6x00 series?
AMD already had different architectures within the same series with GCN; they could do the same with RDNA
Picolete:

AMD already had different architectures within the same series with GCN; they could do the same with RDNA
Yeah, and it caused confusion. If there's really RTRT hardware, they should call it the RX 6000 series.
They should also reduce the price of the 5700/XT cards. They are overpriced.
No point unless it has HDMI 2.1
warlord:

I have a bad feeling though. If Nvidia was afraid of Big Navi, the 3080 Ti would come before the 3070 and 3080. What do you think, people?
Nvidia is not afraid of anything here. Data centers are its field, its cards are the top performers, and its drivers work better. Many people still want to buy a fast card and want competition, exactly as happened with CPUs.
h9dlb:

No point unless it has HDMI 2.1
How is this a determining factor? Bandwidth for what kind of resolution/fps?
asturur:

How is this a determining factor? Bandwidth for what kind of resolution/fps?
4K 144 with no subsampling
mohiuddin:

They should also reduce the price of the 5700/XT cards. They are overpriced.
Everywhere I look, the 5700 XT is priced like a 2060 Super; it does not look overpriced given the current trend.
JonasBeckman:

One of the things AMD does tend to do is push the voltage, and the resulting power draw, almost to its limits. Since at least the Fury, if not before, it has been possible to nearly halve the power draw while losing at most maybe 5% performance. Maybe things will be different here, but if that makes the difference in how the card positions itself against the 2080 Ti, I expect AMD to go for it; at least it allows for very good fine-tuning should users want to balance it a bit. EDIT: Though it could also be a bit about how AMD and NVIDIA report power draw; at full load and boost they might not be that different. EDIT: About 270-290 W for the Ti it seems, but then it also performs in a class of its own. The 5700 XT has an 8-pin and a 6-pin by default and seems to be around 225 W for how it performs, but it also has spikes, short transient draws, which can almost hit 300 W I think.
I hit around 300-330 watts with my PowerColor Red Devil on air. That is with my OC, of course...
Denial:

4K 144 with no subsampling
Can't that be achieved somehow even if they can't get the 2.1 certification? Like reaching the output speed but not the other required specs? Being able to output 4K/144 and being fully 2.1 spec compliant could be two different things. I would assume the next generation will be able to output that many frames, even without being able to reach that in new 4K titles, or titles from 2018 onward with high settings (imho); it would still be bad if they can't do that.
I don't care if Big Navi's power efficiency makes me buy a new PSU. If I can get 2080 Ti performance at half the price, I am in. I have refused to buy any Nvidia RTX cards because of the gluttonous pricing. AMD has such a great chance of carving out a big piece of the enthusiast GPU pie. Just get it right this time. And if you do get it right, raise the salary of the person responsible for getting it right so he/she doesn't go to Big Blue Balls. https://media0.giphy.com/media/atvAbYGL800EM/giphy.gif
OK, this sounds interesting. I think this is the only real chance of catching up in the high-end GPU segment. They need the 5800 XT to match the 2080 Ti, they need the 5950 to at least match the 3080, and the 5950 XT to go head to head with the 3080 Ti. But as we all know, with AMD the hype train is always big. I'll believe it when I see it this time. Compared to all the past speculation on their GPUs, this sounds the most impossible to happen tbh 🙂)))
The 5800 XT sounds good, right on track with what is needed from AMD to compete in higher segments. It will be interesting to see what is even above this, considering the names AMD has prepared for release beyond this GPU. I would be all behind Big Navi going HBM again. I don't understand the continuous mention of drivers being better/worse; recent experiences with AMD drivers have been great, with less software bloat too. I do agree that release drivers really need to be up to scratch to make a great first impression on the market, though.
fantaskarsef:

Now this is intriguing. If it's not a CFX card, and works well, this might be the time to switch to 4K gaming, with a new screen that is Freesync Premium Super Edition Plus or whatever it's called these days. edit: probably the card hinted at with that VR benchmark? +20% compared to a 2080 Ti there?
I decided to go 4K a few years back; I'm on my second monitor, as the first 60Hz screen failed inside its extended warranty. I am now using what was easily the best choice for someone not on the bleeding edge of tech (top-end screens get Very Expensive!!!), which ended up being the Acer Nitro XV273K: 4K, QDot, 144Hz capable, though not qualified for Freesync 2. (I got an amazing price, several hundred cheaper than other sites, from Acer's own website thanks to an anniversary sale, plus a 4-year extended warranty for about £30.)

To achieve its 144Hz, certain settings have to be dialed back: you get chroma subsampling and lose HDR and Freesync/G-Sync... oh, and you need to use 2x DP 1.4 cables. On the other hand, you can have all options on, but then you are limited to 98Hz. 120Hz is a nice option for one-cable usage and some good quality-of-life stuff.

When I researched upcoming panels last year there was not a whole lot on the horizon, but I have noticed a lot of new monitor releases recently, so there should be some good new choices. If I were buying a screen today or in the near future, I would want a single-cable solution that requires no compromises in quality or features. Last I checked, not all features are supported over HDMI yet, so you are looking for DP2.0. DP1.4 effective bandwidth = 25.92 Gbps; DP2.0 effective bandwidth = 77.37 Gbps. I would imagine the new HDMI 2.1 will have full support of features, but you would have to check.

It's certainly a good time for PC hardware these days. Really looking forward to a proper enthusiast card from AMD again <3
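As a quick sanity check on those bandwidth figures, here is a back-of-the-envelope sketch in Python; the ~10% blanking overhead is an assumption, and exact requirements depend on the timing standard used:

```python
# Back-of-the-envelope check of the DP1.4/DP2.0 figures above.
# The 10% blanking overhead is an assumption; real CVT-R2 timings vary.

def raw_video_gbps(width, height, refresh_hz, bits_per_pixel, blanking=1.10):
    """Approximate uncompressed video data rate in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

DP14_EFFECTIVE_GBPS = 25.92  # HBR3 x4 lanes after 8b/10b encoding
DP20_EFFECTIVE_GBPS = 77.37  # UHBR20 x4 lanes after 128b/132b encoding

for bpp, label in ((24, "8-bit RGB"), (30, "10-bit RGB")):
    need = raw_video_gbps(3840, 2160, 144, bpp)
    fits = "fits" if need <= DP14_EFFECTIVE_GBPS else "does not fit"
    print(f"4K 144Hz {label}: ~{need:.1f} Gbps, {fits} in one DP1.4 link")

# ~31.5 and ~39.4 Gbps: neither fits DP1.4 (25.92), both fit DP2.0 (77.37),
# which is why this monitor needs two DP1.4 cables for full-quality 144Hz.
```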
asturur:

Can't that be achieved somehow even if they can't get the 2.1 certification? Like reaching the output speed but not the other required specs? Being able to output 4K/144 and being fully 2.1 spec compliant could be two different things. I would assume the next generation will be able to output that many frames, even without being able to reach that in new 4K titles, or titles from 2018 onward with high settings (imho); it would still be bad if they can't do that.
Yeah, it probably can, but I think people like the stamp of approval. For example, we have these monitor companies shipping 4K 144Hz monitors that on paper look fine, but when you get one and actually run it in 144Hz mode, text looks like blurry garbage because it drops to 4:2:2 to hit that. And now they are doing weird resolutions, like that 49" Samsung one, where the issue is there but it happens at 92Hz or something. Unless you research it, you have no idea what the actual max ceiling is before the image quality degrades. IMO it leads to a frustrating experience. I don't want to have to dig around the internet all day to buy a monitor. I want to see that the monitor is 2.1 supported and my GPU is 2.1 supported, and know it's going to do 4K 144 with no subsampling, without having to rely on reddit posts and whatnot.

As for the frames, no idea - I'm sure with consoles getting a bump in horsepower, games will push for better graphics, especially with RT coming out. So more than likely newer titles on the consoles will still target 1080p/60 but with higher fidelity. That being said, it is nice to have the choice to run games in 4K and turn down the graphics settings. I do play various games like League of Legends, CS, Siege, etc., where even my 1080 Ti can do 4K @ 90-140 depending on the scene. So I imagine midrange from next gen will easily be able to do that across a ton of games.
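To put rough numbers on that 4:2:2 trade-off, here is a small sketch using the same assumed link bandwidth and timing overhead as above; it is an illustration only, not a statement about any specific monitor:

```python
# Rough illustration of why monitors drop to 4:2:2 at 4K 144Hz on DP1.4:
# chroma subsampling lowers the average bits per pixel.
# Link figure and 10% timing overhead are assumptions carried over from above.

DP14_EFFECTIVE_GBPS = 25.92
PIXELS_4K = 3840 * 2160
BLANKING = 1.10

SUBSAMPLING_BPP = {   # average bits per pixel at 8 bits per sample
    "4:4:4": 24,      # full chroma: 3 samples per pixel
    "4:2:2": 16,      # chroma halved horizontally: 2 samples per pixel
    "4:2:0": 12,      # chroma quartered: 1.5 samples per pixel
}

for mode, bpp in SUBSAMPLING_BPP.items():
    max_hz = DP14_EFFECTIVE_GBPS * 1e9 / (PIXELS_4K * bpp * BLANKING)
    print(f"{mode}: ~{max_hz:.0f} Hz max at 4K over one DP1.4 link")

# 4:4:4 tops out around ~118 Hz, so hitting 144Hz forces 4:2:2 -
# exactly the blurry-text trade-off described above.
```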
Vananovion:

HBM2 might be necessary for power budget reasons - same as with Vega. Also, performance didn't scale linearly with more CUs on Vega - some CUs were doing literally nothing on the Vega 64. Hopefully AMD has worked that out and the scaling will be better. Or maybe the added CUs are meant for dedicated RT processing. Still just rumors, though. As much as I like to discuss the possibilities, nothing's certain until we get hard numbers.
Maybe not. If AMD doubles the CU count, they will have to lower clocks. I'd expect base ~1200MHz, game ~1350MHz, and boost ~1500MHz. That way they can definitely keep the TDP around 275W. Those clocks should not need more than 900mV.
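As a first-order sanity check of that claim, here is a sketch assuming power scales linearly with CU count and clock, and quadratically with voltage; the 5700 XT baseline figures are rough assumptions for illustration:

```python
# First-order sketch of the clock/voltage argument above, using the
# common approximation P ~ (CU count) x (clock) x (voltage)^2.
# Baseline numbers are rough 5700 XT figures and are assumptions.

BASE_CUS, BASE_MHZ, BASE_V, BASE_W = 40, 1755, 1.2, 225

def scaled_power_w(cus, mhz, volts):
    """Scale power linearly with CUs and clock, quadratically with voltage."""
    return BASE_W * (cus / BASE_CUS) * (mhz / BASE_MHZ) * (volts / BASE_V) ** 2

# 80 CUs at the suggested ~1350MHz game clock on 900mV:
print(f"~{scaled_power_w(80, 1350, 0.9):.0f} W")  # ~195 W

# The estimate ignores memory power and leakage, so the real figure would
# land higher, but a ~275W TDP target looks plausible on this math.
```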
I sincerely doubt Big Navi will have HBM2 (or 3). The cost is too high, and the demand for Instinct cards is high and will remain so (mainly because of PCIe 4 being leveraged in systems); too rich for consumer blood. I expect some variant of GDDR on a wide bus. The pro/sumer market will be fine with that, and the efficiencies and yields of GPU and memory will allow them to sell large numbers (by AMD standards) of Big Navi. As far as comparison to Nvidia, imho, AMD has only power efficiency left to conquer, as their designs are finally fully modern.
I am excited about Big Navi. But then there is the elephant in the room called Ampere. 2080 Ti performance for $699 would be a good day, though. Except I already went 2080 Ti for $999 last year. I would like to have an AMD GPU to pair with my AMD CPU, though.
warlord:

It can easily be 20% faster than the 2080 Ti.
It can easily be 20% slower than the 2080 Ti too 😛 "Fury X, the Titan X killer" was ~20% slower than most 980 Tis. PS: I had a Fury X as one of the few in Europe 😛