AMD Big Navi Spotted? - Updated: the Twitter photo is a confirmed fake

alanm:

Agree with Hilbert, looks fishy. Hynix has no business knowing the non-memory-related specs of the GPU. AIBs would know, not Hynix. If the final card does indeed have HBM2e, then it's likely an enterprise card.
Hynix would definitely be in the business of knowing which GPUs they are slapping HBM onto.
Astyanax:

Hynix would definitely be in the business of knowing which GPUs they are slapping HBM onto.
You're probably right; it may not be as simple as AMD specifying their exact HBM2e needs and Hynix just providing it. Still, it's unusual to think it would be tailor-made for them, which would certainly drive up the cost.
This is the upcoming MI100. It's an Arcturus (Vega-based) compute card, not a Navi-based card.
Fediuld:

This is the upcoming MI100. It's an Arcturus (Vega-based) compute card, not a Navi-based card.
Yeah, this is what I assumed. AMD stated multiple times they are going to keep Vega around for HPC as the architecture is simply better than RDNA in that regard.
BReal85:

You are just a simple liar. Check Navi's efficiency: the 5700/XT's power consumption, and the 5600 XT's. They are on par with their NV counterparts'. The 5600 XT is even more efficient than an RTX 2060: a bit faster while consuming a bit less power. Bad drivers? Do you remember when NV released a WHQL CERTIFIED driver about 2 years ago with which Watch Dogs 2 wouldn't start? Or do you remember the Chrome video playback problems? Yes, there are bad drivers with NV cards as well. You just forgot to mention that you can fry an egg on AMD cards, not to forget the mantras green-eyed people tell each other.
A WHQL driver with a game that couldn't start is really your worst Nvidia driver? Damn, you have missed so much from Nvidia. Have you at least enjoyed all the buggy WHQL drivers from AMD over the last couple of years? Maybe you should read how WHQL is "certified" these days...
Will it be priced like Nvidia's Titan?
andries:

5120 cores, very powerful indeed. I wonder if a certain number of those cores are dedicated to ray tracing.
If it doesn't have HW ray tracing, I think that would be a huge mistake.
jbscotchman:

If it doesn't have HW ray tracing, I think that would be a huge mistake.
I'd say with both the PS5 and Xbox having hardware ray tracing, this is a given. I know that for me to consider buying one, it's a must-have feature.
Crazy Serb:

A WHQL driver with a game that couldn't start is really your worst Nvidia driver? Damn, you have missed so much from Nvidia. Have you at least enjoyed all the buggy WHQL drivers from AMD over the last couple of years? Maybe you should read how WHQL is "certified" these days...
AMD and Nvidia both release buggy drivers. If they didn't, there wouldn't be a regular stream of new driver releases...
It's unfortunate that with the internet - and tech in general (Photoshop, deepfakes, autotune, etc.) - misinformation is so damn, damn easy to push out. Just as worrying is the number of folks who get so emotionally attached to believing it, primarily because it plays on their hopes. These folks need to pay more attention to the intros of these pieces, where the author (Guru/Adored/etc.) VERY CLEARLY lets you know that these aren't stated or confirmed facts. Hopefully, in the future, net source validation is taken more seriously and actually implemented.
Twitter is 99% fake, so it's no big surprise SK Hynix says this photo is fake as well.
It might be fake, but surely they are working on their 4K GPU with ray tracing, right? They need to hurry so I can get Ampere 😛 March might be an interesting month, let's wait and see.
A 5120-core RDNA card would be nice, but if it uses HBM I would think it's not for consumers, tbh. I've had a really good experience so far with the 5700 XT, better than with the Vega 64 I had. Still, I would like a high-end GPU, even though the 5700 XT is plenty for 1440p.
So... fake. The marketing machine for Big Navi is starting up at full speed - prepare for more fakes and rumors, the marketing strategies employed by unscrupulous firms. I'm sure Nvidia will do the same with Ampere - AMD just started earlier.
andries:

5120 cores, very powerful indeed. I wonder if a certain number of those cores are dedicated for ray tracing.
No. AMD's design is way different from Nvidia's dedicated blocks. (Unless they scrapped what they had in patents and started over again.) Multiple parts of the dual-CU work on DXR math. Because the transistors used are mostly shared with "traditional" rendering, the DXR addition doesn't cost that many transistors. This means that the changes that enable HW-level optimizations for ray tracing can be applied to all CUs.
Fox2232:

No. AMD's design is way different from Nvidia's dedicated blocks. (Unless they scrapped what they had in patents and started over again.) Multiple parts of the dual-CU work on DXR math. Because the transistors used are mostly shared with "traditional" rendering, the DXR addition doesn't cost that many transistors. This means that the changes that enable HW-level optimizations for ray tracing can be applied to all CUs.
Ah, OK, thanks. So if I understand correctly, rasterisation and ray-tracing work will be shared across all available cores?
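For what that split could mean in practice, here is a minimal toy sketch in Python. It is not AMD's actual hardware, driver, or API, and every type and function name in it is invented purely for illustration; it only tries to show the idea discussed above, where a small intersection test acts as the fixed-function helper while the traversal loop and hit handling are ordinary code that runs on the same general-purpose compute units as any other shader work.

from dataclasses import dataclass, field
from typing import List, Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class Node:
    lo: Vec3                                               # AABB minimum corner
    hi: Vec3                                               # AABB maximum corner
    children: List["Node"] = field(default_factory=list)   # empty list => leaf
    leaf_t: Optional[float] = None                         # toy "hit distance" stored at leaves

def intersect_box_fixed_function(node: Node, origin: Vec3, inv_dir: Vec3) -> bool:
    # Stand-in for the dedicated intersection hardware: a standard ray/AABB slab test.
    tmin, tmax = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, node.lo, node.hi):
        t0, t1 = (lo - o) * inv, (hi - o) * inv
        tmin, tmax = max(tmin, min(t0, t1)), min(tmax, max(t0, t1))
    return tmin <= tmax

def trace_ray_on_compute_unit(root: Node, origin: Vec3, direction: Vec3) -> Optional[float]:
    # The traversal loop is ordinary "shader" code: in the scheme described above it
    # would run on the same CUs/SIMDs that execute regular raster-era shaders, which
    # is why the extra transistor cost is claimed to be small.
    inv_dir = tuple(1.0 / d if d != 0.0 else float("inf") for d in direction)
    stack, closest = [root], None
    while stack:
        node = stack.pop()
        if not intersect_box_fixed_function(node, origin, inv_dir):
            continue
        if not node.children:                               # leaf: keep the nearest "hit"
            if node.leaf_t is not None and (closest is None or node.leaf_t < closest):
                closest = node.leaf_t
        else:
            stack.extend(node.children)
    return closest

# Tiny usage example: two leaf boxes under one root; a ray along +X hits the nearer one.
leaf_a = Node((1, -1, -1), (2, 1, 1), leaf_t=1.5)
leaf_b = Node((4, -1, -1), (5, 1, 1), leaf_t=4.5)
root = Node((1, -1, -1), (5, 1, 1), children=[leaf_a, leaf_b])
print(trace_ray_on_compute_unit(root, (0, 0, 0), (1, 0, 0)))  # -> 1.5

Again, this is only meant to make the "shared compute resources plus a small intersection helper" idea concrete; the real pipeline, scheduling and memory behaviour would obviously be far more involved.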
What a surprise that a leak about a card with 24 GB of HBM2e turns out to be fake... not. I don't even use 4 GB on my RX 580 at 1080p; modern games would be fine with 6 GB tops. Playing at 4K is another thing, you'd need 8 GB minimum, but why 24 GB? It's not like we have the power to run 8K displays yet.