Sapphire Technology Product Manager Spills Some Beans on AMD NAVI

Astyanax:

Guess which company's not getting initial stock now.
Sapphire still will; they're probably AMD's best board partner lol.
I really should have bought a 1080 instead of a 1070, dammit!!!
schmidtbag:

I'm not liking those price points. I have a feeling I might be waiting longer than I thought to do a 4K upgrade. But if these are Sapphire's prices on launch day, maybe it won't be so bad; they are typically more expensive than others. That doesn't make any sense... 7nm doesn't just magically make everything better. Relative to AMD's last-gen stuff, it is an improvement, but you're comparing apples to oranges here.
Yup, I would still be waiting too, but was forced into an upgrade when my Fury died... The 2080 Ti has reached 4K 60 fps levels reliably, but is a bit "Nope!" for me at that price lol. Got a great Black Friday deal on an almost-4K-ready 2080 - good enough for now. Still want one of AMD's next-gen post-GCN cards when they finally get here with their chiplet design - they will do proper 4K 🙂
Humanoid_1:

Yup, I would still be waiting too, but was forced into an upgrade when my Fury died... The 2080 Ti has reached 4K 60 fps levels reliably, but is a bit "Nope!" for me at that price lol. Got a great Black Friday deal on an almost-4K-ready 2080 - good enough for now. Still want one of AMD's next-gen post-GCN cards when they finally get here with their chiplet design - they will do proper 4K 🙂
I hear ya. I could never justify buying a 2080 Ti. A 2080 is 4K-ready if you just turn off AA. Depending on the size of your display, you probably wouldn't notice AA being off anyway, though I'm sure some purists here will vehemently disagree. I too would hope the post-GCN cards will be what we're looking for, but knowing their development rate, that's just way too far away, and that includes mature drivers. On the note of those chiplet-based GPUs, I'd like to see if that breathes new life into mGPU setups, where you can basically just keep adding as many core clusters as you want and it will scale appropriately. So to start with, you could have AIBs whose performance is basically determined by how many core clusters and how much VRAM they have. Then, there could be absurd Crossfire setups where you've got PCIe x1 or M.2 cards with only a single core cluster that act as a mini performance boost. I'm not getting my hopes up that it will happen, but I think it'd be pretty cool to be able to incrementally add performance like that.
vbetts:

Sapphire still will; they're probably AMD's best board partner lol.
Yeaaah, AMD can't afford to punish its partners.
Astyanax:

Yeaaah, AMD can't afford to punish its partners.
They can ask Sapphire to fire the person who leaked it, which is what I suspect will occur.
schmidtbag:

I hear ya. I could never justify buying a 2080 Ti. A 2080 is 4K-ready if you just turn off AA. Depending on the size of your display, you probably wouldn't notice AA being off anyway, though I'm sure some purists here will vehemently disagree. I too would hope the post-GCN cards will be what we're looking for, but knowing their development rate, that's just way too far away, and that includes mature drivers. On the note of those chiplet-based GPUs, I'd like to see if that breathes new life into mGPU setups, where you can basically just keep adding as many core clusters as you want and it will scale appropriately. So to start with, you could have AIBs whose performance is basically determined by how many core clusters and how much VRAM they have. Then, there could be absurd Crossfire setups where you've got PCIe x1 or M.2 cards with only a single core cluster that act as a mini performance boost. I'm not getting my hopes up that it will happen, but I think it'd be pretty cool to be able to incrementally add performance like that.
I don't think we'll be seeing multi-card setups outside of the industrial/research space anymore. On the other hand, it does mean we could see maybe 16 chiplets (or more) on a single board (with HBM). I look forward to our new chiplet overlords 🙂
I like pulling for AMD, but if they don't push the GPU market forward then I'm likely getting whatever Nvidia puts out on 7nm. Still holding onto my GTX 1070 and my 2K monitor, which is actually doing pretty well. I did have an ATI 9700 Pro and an ATI 5870, which at the time were better than anything Nvidia offered, albeit the 5870's dominance was very short-lived. Hopefully next week we can get some details on big Navi or whatever they will call it.
If it comes out for $500 and performs similarly to a 2070, it's dead in the water on day one. I have been holding onto my RX 580 for a while just to get a Navi, and if it's that slow, that pricey, and, as we know, HOT AS HELL - thank you, but no thank you.
JamesSneed:

I like pulling for AMD, but if they don't push the GPU market forward then I'm likely getting whatever Nvidia puts out on 7nm. Still holding onto my GTX 1070 and my 2K monitor, which is actually doing pretty well. I did have an ATI 9700 Pro and an ATI 5870, which at the time were better than anything Nvidia offered, albeit the 5870's dominance was very short-lived. Hopefully next week we can get some details on big Navi or whatever they will call it.
Haha, I'm pretty much like you; the last two ATI/AMD GPUs I owned were a 9800 Pro (flashed to XT, with a custom cooler) and a 5850. The 9800 Pro flashed to XT remains one of the best cards I ever owned. Like you said, the 5850/5870 price-to-performance "dominance" was short-lived, but it was still a very good card, and from what I can remember it was not running hot at all. I own a 1070 too, and while it's okay at 2K 144 Hz most of the time, it's a little bit on the not-powerful-enough side. I wish I could upgrade. The 2070 has come down in price, and at the current price I could justify it (the launch price was totally ridiculous and I still have not digested it), but just to make a statement with my wallet: if Navi is not good enough, I'll wait for the next Nvidia lineup; I won't touch the RTX cards.
schmidtbag:

I hear ya. I could never justify buying a 2080 Ti. A 2080 is 4K-ready if you just turn off AA. Depending on the size of your display, you probably wouldn't notice AA being off anyway, though I'm sure some purists here will vehemently disagree. I too would hope the post-GCN cards will be what we're looking for, but knowing their development rate, that's just way too far away, and that includes mature drivers. On the note of those chiplet-based GPUs, I'd like to see if that breathes new life into mGPU setups, where you can basically just keep adding as many core clusters as you want and it will scale appropriately. So to start with, you could have AIBs whose performance is basically determined by how many core clusters and how much VRAM they have. Then, there could be absurd Crossfire setups where you've got PCIe x1 or M.2 cards with only a single core cluster that act as a mini performance boost. I'm not getting my hopes up that it will happen, but I think it'd be pretty cool to be able to incrementally add performance like that.
Certainly brings some interesting possibilities; really looking forward to seeing what comes. Certainly more cores and more speed, just like the new Ryzen chips we have coming oh so soon ^^ @Evildead666 I don't know, perhaps the heavy demands of ray tracing will see a resurgence of multi-card setups? Though the chiplet design, and the reduced cost of no longer needing a monolithic GPU die, may make multi-card setups redundant for getting enough gaming performance at decent quality.
Michal Turlik 21:

We are going off-topic here... btw, I will happily give my scores here... or better, I can make a video of how Rage 2 is doing... please tell me which metrics we want to use, I mean... which FPS tool you want me to use, and maybe I will shut up the few of you still believing in Nvidia sh*t. Will give FCAT a try this evening and post the results.
What you're doing wrong is comparing apples to oranges. Stock to stock, a 1080 is around a 2060. OC both and not much changes, except maybe the 2060 eating less power.
tunejunky:

5) Given the efficiencies and power draw at 7nm, Navi may not actually need water cooling... my Radeon VII doesn't (even tho' I do have it on a custom loop). I WC the Radeon VII more for noise than the few extra fps I get.
When someone WCs a GPU to keep noise down rather than for performance's sake, you know there's an interesting story somewhere 😀. So how do you like the R7 vs the 1080 Ti?
Michal Turlik 21:

Considering that my OC'd GTX 1080 is on par with the RTX 2070, I am wondering what the sense would be in releasing something that is on par with 5-year-old technology... wait... maybe I got the point... cash.
Are you in the future? Specifically, May 27th, 2021? That's the only assumption I can come up with, given your "5-year-old technology" statement. You must be in the future, May 27th, 2021 or later, for a GTX 1080 to be 5 years old.
I expect better than this. AMD will either quickly lower pricing post-launch, or these prices are for premium Sapphire cards. You should think logically: why release products which are dead on arrival? I'm personally expecting at least a hundred euros less for both cards from typical partners like Gigabyte. Some analyses don't consider the market space at all, just assuming that maximum profit is what every company is after, when AMD cannot gain that by overpricing.
Let's just take a moment to think about how they are going to do RT via software, after seeing how well Nvidia can do it via hardware 😱 😛
Rich_Guy:

Lets just take a moment, to think how they are going to do RT via software, after seeing how well Nvidia can do it via hardware 😱 😛
Technically very true. As we see from Pascal, software RT can't compare. But I find that RT has lost most of its interest for me. Shopping right now, I'm looking at RTX because, compared to a used GTX, it's lower power and can be had new with a warranty*. Compared to anything Radeon, it can be had with higher performance and will hold such a lead for a long time. RT? I'll run some demos, but otherwise that almost doesn't matter. * A used 1080 Ti is still an excellent choice, and I'd go for that in a different use case.
Arbie:

Technically very true. As we see from Pascal, software RT can't compare. But I find that RT has lost most of its interest for me. Shopping right now, I'm looking at RTX because, compared to a used GTX, it's lower power and can be had new with a warranty. Compared to anything Radeon, it can be had with higher performance and will hold such a lead for a long time. RT? I'll run some demos, but otherwise that almost doesn't matter.
Pascal has an order of magnitude worse FP16 performance than FP32, so all those operations run as FP32. On the other hand, Vega has a 2:1 FP16-to-FP32 rate, which makes its FP16 throughput about twice the FP32 throughput of a GTX 1080 Ti. So, if AMD does it through brute force, you can expect a Vega 56 to do about twice as well as a GTX 1080 Ti. (Still not the best, but that Vega 56 is quite a bit cheaper than the cheapest RTX-enabled card, the 2060.) Check this, and then imagine potentially double the fps of the GTX 1080 Ti. [youtube=TkY-20kdXl0]
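[Editor's note: a minimal back-of-the-envelope sketch of that throughput argument. The TFLOPS figures are the commonly quoted boost-clock theoretical peaks, and the 1/64 Pascal FP16 rate is the usual figure cited for GP102; treat all of it as illustrative, not measured performance.]

```python
# Back-of-the-envelope peak-throughput comparison for the FP16 argument above.
# These are commonly quoted theoretical peaks, not measured game performance.
cards = {
    # name: (peak FP32 TFLOPS, FP16 rate relative to FP32)
    "GTX 1080 Ti (Pascal)": (11.3, 1 / 64),  # Pascal's native FP16 path runs at ~1/64 rate
    "Vega 56 (RPM)":        (10.5, 2.0),     # Rapid Packed Math: 2x-rate packed FP16
}

for name, (fp32_tflops, fp16_rate) in cards.items():
    print(f"{name}: {fp32_tflops:.1f} TFLOPS FP32, "
          f"{fp32_tflops * fp16_rate:.1f} TFLOPS FP16")

# An FP16 workload that Pascal must promote to FP32 runs at ~11.3 TFLOPS on
# the 1080 Ti, while Vega 56 can pack it at ~21 TFLOPS -- roughly the "about
# twice as well" claim in the post above.
```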
Guys, I think it's $499 Singapore dollars 😉 (w/ 7% tax): Navi 10 XT = ~330 USD, Navi 10 ("Pro") = ~280 USD. IMHO the high-end Navi starting price will be ~299€ (as Polaris was, or maybe lower? Who knows). Those are small chips, easy to manufacture with good yield, so pricing at >400€ does not do ATI/AMD any good.
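[Editor's note: a quick sketch of the conversion that comment implies. The ~0.73 USD/SGD exchange rate is an assumption for mid-2019, so the output is ballpark only.]

```python
# Rough SGD-to-USD sanity check for the leaked $499 price.
# Assumptions: Singapore's 7% GST is included in the sticker price,
# and ~0.73 USD per SGD (approximate mid-2019 rate).
price_sgd_incl_tax = 499
gst = 0.07
usd_per_sgd = 0.73

pre_tax_sgd = price_sgd_incl_tax / (1 + gst)  # ~466 SGD before tax
price_usd = pre_tax_sgd * usd_per_sgd         # ~340 USD
print(f"~{pre_tax_sgd:.0f} SGD pre-tax, ~{price_usd:.0f} USD")
# Lands in the same ballpark as the ~330 USD figure quoted above.
```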
Undying:

The 2070 is faster than the 1080 in every new game, especially ones using Vulkan. Same goes for the 2080 vs the 1080 Ti.
Here we go. I am getting tired of repeating that Nvidia is underselling the real performance of Pascal GPUs. The GTX 1080 is a little monster, especially when OC'd. It gets so close to the GTX 1080 Ti that the difference is something between 10-15%, and what's more, it kicks the RTX 2070's ass.

My rig:
Asus X470-I ROG Strix
16 GB DDR4 3200 MHz
Ryzen 2700X (stock clock)
Watercooled and OC'd GTX 1080 FE with 399.24 drivers (the last good ones for Pascal GPUs)

Rage 2 was run at 4K on ultra settings with soft vsync, TXAA+FXAA. As for the fps capturing, I used the awesome OCAT piece of software: https://github.com/GPUOpen-Tools/OCAT/releases

Here you can grab my video showing the magic - this is a Rage 2 gameplay session that I recorded for the nonbelievers 🙂 https://drive.google.com/open?id=17eg_Np8bBrgSs2VR8xjgzuuIUX3btr2r

I know, the word is: unbelievable 🙂 Undying, I cannot say anything more because I may offend you, and the last thing I want is to be banned - cheers!
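[Editor's note: since OCAT came up as the capture tool, here is a minimal sketch for turning a capture into comparable numbers. It assumes the usual PresentMon-style CSV that OCAT writes, with a MsBetweenPresents column; "capture.csv" is a placeholder file name.]

```python
import csv
import statistics

# Summarize an OCAT/PresentMon capture: average fps plus 1% low,
# computed from the per-frame present intervals in milliseconds.
with open("capture.csv", newline="") as f:
    frametimes_ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

avg_fps = 1000.0 / statistics.mean(frametimes_ms)

# 1% low: the fps implied by the 99th-percentile (slowest 1%) frametime.
# statistics.quantiles needs Python 3.8+; n=100 yields 99 cut points.
p99_ms = statistics.quantiles(frametimes_ms, n=100)[98]

print(f"frames: {len(frametimes_ms)}, "
      f"avg: {avg_fps:.1f} fps, 1% low: {1000.0 / p99_ms:.1f} fps")
```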