AMD Removed RDNA 3 Efficiency Comparison to RTX 4090 From Slide Deck

xIcarus:

To be fair, these two points are at least somewhat valid. RT performance is slightly worse, but the bigger problem is that far more titles implement DLSS compared to FSR. This hurts AMD, and is starting to feel a bit like the old Gameworks debacle.
Not to mention that Nvidia has grabbed more market share over the past two years. Lots of people have seen for themselves what RT/DLSS can do, and they clearly think those features are worth a premium, so AMD needs to price at least 20% lower on price-to-performance to compete evenly. It's only AMD fanboys who refuse to accept that fact.
Krizby:

Not to mention that Nvidia has grabbed more market share over the past two years. Lots of people have seen for themselves what RT/DLSS can do, and they clearly think those features are worth a premium, so AMD needs to price at least 20% lower on price-to-performance to compete evenly. It's only AMD fanboys who refuse to accept that fact.
Tbh I don't think AMD would sell that well even at 600€ for the 7900 XTX, compared to the 4090 for example 😀
Ryu5uzaku:

Tbh I don't think AMD would sell that well even at 600€ for the 7900 XTX, compared to the 4090 for example 😀
Maybe, maybe not. If it were 10 years ago and I didn't have a job, I would certainly have picked the 7900 XTX at $600 😀. I did pick the R9 290 in 2013, though; I thought the 780 Ti sucked, and I was right.
Krizby:

Maybe, maybe not. If it were 10 years ago and I didn't have a job, I would certainly have picked the 7900 XTX at $600 😀. I did pick the R9 290 in 2013, though; I thought the 780 Ti sucked, and I was right.
I think AMD will catch up on RT in a few years, but meh.
Krizby:

Maybe, maybe not. If it were 10 years ago and I didn't have a job, I would certainly have picked the 7900 XTX at $600 😀. I did pick the R9 290 in 2013, though; I thought the 780 Ti sucked, and I was right.
I'm still using a 290, and it's honestly amazing how much driver updates have postponed its obsolescence, especially on Linux. The lack of VRAM is really what hurts it the most for me. With textures, shadows, and AA disabled or turned down a notch, I can get playable framerates at 1440p, or 4K for some older or indie games. That's why, for me, if this nearly decade-old GPU can still mostly keep up with what I like to do, a modern mainstream GPU ought to run laps around it. I just want steady 4K @ 60 FPS raster performance for a price below $300. I'm not yet interested in RT. That being said:
Ryu5uzaku:

I think AMD will catch up on RT in a few years, but meh.
It will take AMD a few years to catch up with Nvidia, but Nvidia's results aren't good enough for me either. You basically have to spend at least $1000 for a no-sacrifices RT experience, and I do consider DLSS to be a sacrifice (just a very small one). That's a hefty price tag for something rather underwhelming that, in most cases, doesn't yield a much better experience. I'm a strong believer in RT, but like tessellation, it's going to take some time until the average person has any reason to care about it. So I'm probably going to go with the RX 7000 series, since AMD seems to understand their RT performance is sub-par, and the chiplet design ought to finally deliver good raster performance at a lower price. Worst-case scenario, the 6000 series will drop a lot in price, and a 6700 XT ought to hold me over until someone can achieve good RT performance at mainstream prices.
schmidtbag:

I'm still using a 290, and it's honestly amazing how much driver updates have postponed its obsolescence, especially on Linux. The lack of VRAM is really what hurts it the most for me. With textures, shadows, and AA disabled or turned down a notch, I can get playable framerates at 1440p, or 4K for some older or indie games. That's why, for me, if this nearly decade-old GPU can still mostly keep up with what I like to do, a modern mainstream GPU ought to run laps around it. I just want steady 4K @ 60 FPS raster performance for a price below $300. I'm not yet interested in RT. That being said: It will take AMD a few years to catch up with Nvidia, but Nvidia's results aren't good enough for me either. You basically have to spend at least $1000 for a no-sacrifices RT experience, and I do consider DLSS to be a sacrifice (just a very small one). That's a hefty price tag for something rather underwhelming that, in most cases, doesn't yield a much better experience. I'm a strong believer in RT, but like tessellation, it's going to take some time until the average person has any reason to care about it. So I'm probably going to go with the RX 7000 series, since AMD seems to understand their RT performance is sub-par, and the chiplet design ought to finally deliver good raster performance at a lower price. Worst-case scenario, the 6000 series will drop a lot in price, and a 6700 XT ought to hold me over until someone can achieve good RT performance at mainstream prices.
I would buy AMD anyway, even if they have subpar RT performance, just to keep some competition going, and I like their driver suite more lol. I might get a 7900 XTX next year, or wait for the 8000 series. I love my 6800 XT; it's been an amazing buy, better than most GPU purchases I've made. I got a 3060 Ti for a bit and regretted it hard 😀; I sold it on at a 50€ profit and got a 6800 XT for about 300 more back then.
schmidtbag:

It will take AMD a few years to catch up with Nvidia, but Nvidia's results aren't good enough for me either. You basically have to spend at least $1000 for a no-sacrifices RT experience, and I do consider DLSS to be a sacrifice (just a very small one). That's a hefty price tag for something rather underwhelming that, in most cases, doesn't yield a much better experience. I'm a strong believer in RT, but like tessellation, it's going to take some time until the average person has any reason to care about it. So I'm probably going to go with the RX 7000 series, since AMD seems to understand their RT performance is sub-par, and the chiplet design ought to finally deliver good raster performance at a lower price. Worst-case scenario, the 6000 series will drop a lot in price, and a 6700 XT ought to hold me over until someone can achieve good RT performance at mainstream prices.
Lol, you can ask @Undying how much of a sacrifice DLSS is at 1440p. You can play with rasterization only and still use DLSS; Turing owners have no problem playing the latest AAA games at ultra settings + DLSS. At 1440p, DLSS is noticeably better than FSR 2.0. Anyways, if you prefer AMD, no amount of features or bang-for-buck would persuade you to look the other way. Though it's a shame to upgrade a PC only to look at rasterization, which has looked almost the same for the past 8 years (Witcher 3 still looks comparable to today's games). It's better to hold out a few more months; the FTX collapse has forced miners to fully offload their used GPUs.
Ryu5uzaku:

I would buy AMD anyway, even if they have subpar RT performance, just to keep some competition going, and I like their driver suite more lol. I might get a 7900 XTX next year, or wait for the 8000 series. I love my 6800 XT; it's been an amazing buy, better than most GPU purchases I've made. I got a 3060 Ti for a bit and regretted it hard 😀; I sold it on at a 50€ profit and got a 6800 XT for about 300 more back then.
Lol, the 3060 Ti has better bang for the buck than the 6800 XT, yet it didn't matter to you, and then you complain about 4090/4080 price-to-performance. Sounds kinda hypocritical to me.
Krizby:

Lol, you can ask @Undying how much of a sacrifice DLSS is at 1440p. You can play with rasterization only and still use DLSS; Turing owners have no problem playing the latest AAA games at ultra settings + DLSS. At 1440p, DLSS is noticeably better than FSR 2.0. Anyways, if you prefer AMD, no amount of features or bang-for-buck would persuade you to look the other way. Though it's a shame to upgrade a PC only to look at rasterization, which has looked almost the same for the past 8 years (Witcher 3 still looks comparable to today's games). It's better to hold out a few more months; the FTX collapse has forced miners to fully offload their used GPUs. Lol, the 3060 Ti has better bang for the buck than the 6800 XT, yet it didn't matter to you, and then you complain about 4090/4080 price-to-performance. Sounds kinda hypocritical to me.
It was way slower at my resolution, costing 600€ vs 800€-ish, so yeah, I had to go with the 6800 XT. To me the prices were horrendous, and they still are; something like a 3060 Ti is just a no-go at 4K. I wasn't happy paying that much, but alas, it's what we got, and I'm not happy about most likely paying more in the future either. But it has served me well, and it was actually available at that time. It was either suffer low fps at my res or get a 6800 XT or 6900 XT, since that's what was available, and my old GPU wasn't exactly cutting it anymore after the display upgrade. And yeah, the performance per dollar wasn't better, but interestingly, because prices were what they were, it was roughly the same, considering the 6800 XT is on average around 40-45% faster at 4K. Good value it was not, but it's still my longest-lasting GPU buy so far without sudden horrible hitches.
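For what it's worth, the rough price-to-performance math in that post checks out. A quick sketch using the poster's own figures (600€ vs. 800€, 6800 XT roughly 40-45% faster at 4K; these are forum estimates, not measured benchmarks):

```python
# Price-to-performance comparison using the figures quoted above:
# 3060 Ti at ~600 EUR, 6800 XT at ~800 EUR, with the 6800 XT roughly
# 40-45% faster at 4K (the poster's estimate, not a measured benchmark).
def perf_per_euro(relative_perf, price_eur):
    """Relative performance delivered per euro spent."""
    return relative_perf / price_eur

r3060ti = perf_per_euro(1.00, 600)       # baseline
r6800xt_low = perf_per_euro(1.40, 800)   # +40% perf case
r6800xt_high = perf_per_euro(1.45, 800)  # +45% perf case

# At those prices the 6800 XT actually comes out about 5-9% ahead in
# perf/EUR, i.e. "roughly the same" as the poster claims.
print(f"3060 Ti : {r3060ti:.6f} perf/EUR")
print(f"6800 XT : {r6800xt_low:.6f} - {r6800xt_high:.6f} perf/EUR")
print(f"ratio   : {r6800xt_low / r3060ti:.2f}x - {r6800xt_high / r3060ti:.2f}x")
```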
Krizby:

Lol, you can ask @Undying how much of a sacrifice DLSS is at 1440p. You can play with rasterization only and still use DLSS; Turing owners have no problem playing the latest AAA games at ultra settings + DLSS. At 1440p, DLSS is noticeably better than FSR 2.0.
I don't need to ask; I've seen the results. In some cases, the fidelity loss of DLSS is negligible and well worth the performance gain. In other cases, I'd rather just lower the detail settings of something else. It's a crapshoot which titles offer amazing DLSS results, but since I'm not interested in the vast majority of games that support DLSS, the feature is virtually irrelevant to me. I've bought into a niche exclusive Nvidia tech before (PhysX) and regretted it. DLSS is not at all worth it to me. I'm aware you can do raster only with DLSS; that doesn't really change my point. Also, I have little to no interest in FSR, since that's even worse for fidelity loss, but at least I don't have to depend on developers to implement it if I want to use it (which I probably won't).
Anyways, if you prefer AMD, no amount of features or bang-for-buck would persuade you to look the other way. Though it's a shame to upgrade a PC only to look at rasterization, which has looked almost the same for the past 8 years (Witcher 3 still looks comparable to today's games).
First of all, get bent with your shame. Don't pity people because you feel the need to justify the amount of money you burnt on a frivolous machine that will be obsolete in 2 or 3 years. Secondly, that's not even true. I was pretty hopeful about switching to Arc, only to find that even the flagship isn't good enough for my needs on paper, let alone in what it can actually do. As I've made clear before, I'm a believer in RT. I was there from the beginning rooting for it, and I lauded Nvidia for what they've tried to do, partly because Nvidia didn't make it an exclusive technology. Lastly, like I said, you need to spend 4 figures if you want a playable RT experience without sacrifices. Unlike you, I don't find that a worthwhile investment at all. I have the money to buy such a GPU, but why would I spend ~$650 extra just to subtly improve the detail level with practically no improvement in how fun the game is? There are so many more ways $650 could improve my gaming experience more than RT will. So I opt against Nvidia because they have nothing to offer me. Their affordable GPUs don't have RT performance worth anyone's attention, so if I'm going to surrender RT for now, I might as well spend less money and get better raster performance with AMD. Add to that the fact that I game on Linux, where AMD's drivers are much better. Unlike you, my decision isn't based on vendor preferences; it's logical. AMD gives me the best experience for the amount of money I'm willing to spend, which isn't saying much, because even with today's sales I find AMD to be overpriced.
schmidtbag:

I don't need to ask; I've seen the results. In some cases, the fidelity loss of DLSS is negligible and well worth the performance gain. In other cases, I'd rather just lower the detail settings of something else. It's a crapshoot which titles offer amazing DLSS results, but since I'm not interested in the vast majority of games that support DLSS, the feature is virtually irrelevant to me. I've bought into a niche exclusive Nvidia tech before (PhysX) and regretted it. DLSS is not at all worth it to me. I'm aware you can do raster only with DLSS; that doesn't really change my point. Also, I have little to no interest in FSR, since that's even worse for fidelity loss, but at least I don't have to depend on developers to implement it if I want to use it (which I probably won't). First of all, get bent with your shame. Don't pity people because you feel the need to justify the amount of money you burnt on a frivolous machine that will be obsolete in 2 or 3 years. Secondly, that's not even true. I was pretty hopeful about switching to Arc, only to find that even the flagship isn't good enough for my needs on paper, let alone in what it can actually do. As I've made clear before, I'm a believer in RT. I was there from the beginning rooting for it, and I lauded Nvidia for what they've tried to do, partly because Nvidia didn't make it an exclusive technology. Lastly, like I said, you need to spend 4 figures if you want a playable RT experience without sacrifices. Unlike you, I don't find that a worthwhile investment at all. I have the money to buy such a GPU, but why would I spend ~$650 extra just to subtly improve the detail level with practically no improvement in how fun the game is? There are so many more ways $650 could improve my gaming experience more than RT will. So I opt against Nvidia because they have nothing to offer me. Their affordable GPUs don't have RT performance worth anyone's attention, so if I'm going to surrender RT for now, I might as well spend less money and get better raster performance with AMD. Add to that the fact that I game on Linux, where AMD's drivers are much better. Unlike you, my decision isn't based on preferences; it's logical. AMD gives me the best experience for the amount of money I'm willing to spend, which isn't saying much, because even with today's sales I find AMD to be overpriced.
Lol, you are just deluding yourself. You can enjoy games even with Intel Arc; just lower the settings to get a playable framerate. Reviewers benchmark at ultra settings, but that doesn't mean you need ultra settings to enjoy games; High or Medium are often good enough. DLSS/XeSS/FSR are the way of the future; if you don't use them, you will need to spend a lot more money for playable FPS, even with raster only. The more prerequisites you add to your choice of GPU, the more clearly it shows your preference.
Krizby:

Lol, you are just deluding yourself. You can enjoy games even with Intel Arc; just lower the settings to get a playable framerate. Reviewers benchmark at ultra settings, but that doesn't mean you need ultra settings to enjoy games; High or Medium are often good enough.
Uh huh, says the one who is constantly trying to justify his purchase and has to convince himself that, despite the expense, DLSS isn't a sacrifice. Why would I wait 8 years to upgrade my GPU just to buy something where I have to lower detail settings all over again? I agree ultra settings aren't necessary, and not what I expect to use anyway, but I don't want something that can barely keep up today if it's going to be another few years until acceptable RT performance makes its way down to mainstream GPUs. Also, it seems rather hypocritical of you to say "high or medium are often good enough" when you also say dismissing RT is a shame.
The more prerequisites you add to your choice of GPU, the more clearly it shows your preference.
Except I don't have that many: I just want something that can reliably do 4K@60FPS raster for $300 and under 300W (preferably under 250W). Maybe AV1 decode too, but that's not a dealbreaker. That's it; everything else is nice-to-have. Nvidia doesn't offer what I want.
schmidtbag:

Uh huh, says the one who is constantly trying to justify his purchase and has to convince himself that, despite the expense, DLSS isn't a sacrifice. Why would I wait 8 years to upgrade my GPU just to buy something where I have to lower detail settings all over again? I agree ultra settings aren't necessary, and not what I expect to use anyway, but I don't want something that can barely keep up today if it's going to be another few years until acceptable RT performance makes its way down to mainstream GPUs. Also, it seems rather hypocritical of you to say "high or medium are often good enough" when you also say dismissing RT is a shame. Except I don't have that many: I just want something that can reliably do 4K@60FPS raster for $300 and under 300W (preferably under 250W). Maybe AV1 decode too, but that's not a dealbreaker. That's it; everything else is nice-to-have. Nvidia doesn't offer what I want.
Lol, I'm trying to look at it from the perspective of someone who says they care about price-to-perf, yet refuses to lower some useless raster settings or to enable DLSS/XeSS/FSR for 4K60 gaming. Sounds very hypocritical to me. BTW, I had a Titan X Maxwell, 1080 Ti, 2080 Ti, and 3090 before and still had to lower some raster settings in some games; it's kinda entitled or wishful of you to demand 4K60 Ultra for 300€. Why do I need justification when, whatever settings I use, I still have the highest FPS money can buy anyway, LOL. BTW, I use 4K DLAA+RT in the games that support them now; the visuals are off the charts haha. Anyways, I don't have to tell my wife how much I spend on my PC, so I definitely don't need to justify my purchase to some strangers 😉
Krizby:

Lol, I'm trying to look at it from the perspective of someone who says they care about price-to-perf, yet refuses to lower some useless raster settings or to enable DLSS/XeSS/FSR for 4K60 gaming. Sounds very hypocritical to me.
I've waited so many years because I was waiting for the price-to-performance to make sense for 4K, and the time has finally come. I don't mind at all lowering settings when my GPU is simply getting old, but I would care if I bought something new and it was struggling to keep up with games from a few years ago. That's what Arc is. I'm not necessarily against DLSS/XeSS/FSR, but they're not selling points to me.
BTW, I had a Titan X Maxwell, 1080 Ti, 2080 Ti, and 3090 before and still had to lower some raster settings in some games; it's kinda entitled or wishful of you to demand 4K60 at 300€.
Yeah, and other than the 1080 Ti, people rightfully ridiculed those GPUs for being poor values. What I want isn't wishful thinking; it's almost already happening. A 6700 XT can offer 4K@60, and in some stores it's getting pretty close to $350. You can get it for $300 used. That being said, I'm measuring in USD, not euros. Not only am I not paying the same VAT as you, but in my state I don't have to pay any sales tax at all. Add to that the fact that many stores offer free shipping, and whatever the price tag is, that's what I'm paying. So it's a lot more realistic for me than you think.
Why do I need justification when, whatever settings I use, I still have the highest FPS money can buy anyway, LOL. BTW, I use 4K DLAA+RT in the games that support them now; the visuals are off the charts haha.
Because up until the 4090, the best was still never good enough. It doesn't make sense to spend double or triple when you're not getting an experience that is 2x or 3x better. If you've got money to burn, fine, you do you. But ultimately my point is: don't challenge other people's choices when there are good reasons for them.
Anyways, I don't have to tell my wife how much I spend on my PC, so I definitely don't need to justify my purchase to some strangers 😉
Yet you do anyway...
I've only bought the best of the best from one manufacturer twice: the 5950 Ultra and the 290X... Of those, the 290X lasted basically forever. Since then I've just gotten a 980 Ti, 5700 XT, and 6800 XT. Before that I had a 5870, 4870, 4200 Ti, 7800 GTX, 1950 XTX, GeForce 256, and a Riva TNT... I've made a few poor decisions there, but that happens.
schmidtbag:

Here's my theory: behind the scenes, AMD is targeting the 4080 in overall performance. This has backfired twice. First, the 4080 12GB got downgraded, which means AMD only has the 16GB to compare against, and they might have to bump clocks to be more competitive. Meanwhile, the 4080 16GB was a bit strange in that it consumed less power overall than advertised. This was probably a curveball to AMD, where the 4080's better-than-advertised performance-per-watt meant they could no longer reliably tout the efficiency of RDNA 3.
Possibly. I am seriously considering getting a 4080 and power-limiting it to 225W, because the performance drop-off is only around 10%. It would make one heck of a near-silent card. That is only possible because of how efficient the GPU in the 4080 is. Both the 4080 and the 4090 are super-efficient when pushed less; heck, the 4090 only loses 10% performance once you reduce power by 40%. I'm sure AMD has done a great job, but the competition here is very tough.
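The efficiency claim above is easy to sanity-check. A quick sketch using the poster's rough numbers (a ~10% performance loss for a 40% power cut on the 4090, and a 225 W limit on the 4080 assuming its 320 W stock power limit; these are forum estimates, not measured data):

```python
# Relative performance-per-watt change when power limiting, using the
# rough numbers from the post above (forum estimates, not measurements).
def perf_per_watt_gain(perf_loss, power_cut):
    """Perf/W after limiting relative to stock (1.0 = unchanged)."""
    return (1.0 - perf_loss) / (1.0 - power_cut)

# 4090: ~10% performance lost for a 40% power reduction.
gain_4090 = perf_per_watt_gain(perf_loss=0.10, power_cut=0.40)
print(f"4090 at -40% power: {gain_4090:.2f}x perf/W")  # 0.90 / 0.60 = 1.5x

# 4080: ~10% performance lost going from an assumed 320 W stock limit
# down to 225 W.
gain_4080 = perf_per_watt_gain(perf_loss=0.10, power_cut=1 - 225 / 320)
print(f"4080 at 225 W: {gain_4080:.2f}x perf/W")
```

In other words, if those numbers hold, the power-limited cards deliver roughly 1.3-1.5x the perf/W of their stock configurations, which is why the near-silent-card idea is plausible.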
schmidtbag:

I've waited so many years because I was waiting for the price-to-performance to make sense for 4K, and the time has finally come. I don't mind at all lowering settings when my GPU is simply getting old, but I would care if I bought something new and it was struggling to keep up with games from a few years ago. That's what Arc is. I'm not necessarily against DLSS/XeSS/FSR, but they're not selling points to me. Yeah, and other than the 1080 Ti, people rightfully ridiculed those GPUs for being poor values. What I want isn't wishful thinking; it's almost already happening. A 6700 XT can offer 4K@60, and in some stores it's getting pretty close to $350. You can get it for $300 used. That being said, I'm measuring in USD, not euros. Not only am I not paying the same VAT as you, but in my state I don't have to pay any sales tax at all. Add to that the fact that many stores offer free shipping, and whatever the price tag is, that's what I'm paying. So it's a lot more realistic for me than you think. Because up until the 4090, the best was still never good enough. It doesn't make sense to spend double or triple when you're not getting an experience that is 2x or 3x better. If you've got money to burn, fine, you do you. But ultimately my point is: don't challenge other people's choices when there are good reasons for them. Yet you do anyway...
Looks like you are dead set against any GPU other than the 6700 XT already, though I agree it's a nice GPU at $350 😉. It would probably be better if you got it new and got the 2 bundled games. But as for the part where the 6700 XT can offer 4K60 Ultra, nah, it's pretty far from it even in pre-2020 games; going forward you would definitely need FSR for 4K60.
[Attached chart: average FPS at 3840×2160]
Krizby:

Lol you can ask @Undying how much of a sacrifice DLSS is at 1440p.
Honestly, I've been pleasantly surprised with DLSS. 1440p DLSS Quality is very usable, maybe even more than I want to admit. I will be missing it when I go back to AMD.
Undying:

I will be missing it when I go back to AMD.
Did AMD let you go and explore the world for a little while, and now you have to return to the collective?
Krizby:

Lol, you are just deluding yourself. You can enjoy games even with Intel Arc; just lower the settings to get a playable framerate. Reviewers benchmark at ultra settings, but that doesn't mean you need ultra settings to enjoy games; High or Medium are often good enough. DLSS/XeSS/FSR are the way of the future; if you don't use them, you will need to spend a lot more money for playable FPS, even with raster only. The more prerequisites you add to your choice of GPU, the more clearly it shows your preference.
That's just it: my 3090 Ti does great at 4K on high-to-ultra settings. I also disable the bullshit settings that really make no difference, then enable DLSS or AMD's equivalent, and I can get 140-160 fps in most of my games. Hell, you can hardly see the difference between Low and High in WZ2. Games these days are just terribly optimized; they add settings that don't look much different but take a big performance hit. *shrugs*
Krizby:

Looks like you are dead set against any GPU other than the 6700 XT already, though I agree it's a nice GPU at $350 😉. It would probably be better if you got it new and got the 2 bundled games. But as for the part where the 6700 XT can offer 4K60 Ultra, nah, it's pretty far from it even in pre-2020 games; going forward you would definitely need FSR for 4K60.
I'm not, actually; I'd rather get a 7700 [XT], since it will probably be faster and cheaper. The 6700 XT is just my "control group": whatever I get will be compared against it. I have no intention of buying it used, not for $300 anyway. Of the benchmarks you showed where the 6700 XT falls behind at 4K, at least 3 of them are known to be poorly optimized no matter what you use, so I have to draw the line somewhere; I'm not waiting several more years to play a game that's barely playable on a $700 GPU. As stated before, I'm using Linux, where AMD does much better. For example, Hitman 3 with ultra settings at 4K yields a whopping 81 FPS on the 6700 XT. That's nearly double what it gets in Windows. Bear in mind, there is no native binary for that game, which means no driver optimizations for it, and yet it runs much better. There's a reason I rarely ever recommend AMD to Windows users, and rarely ever recommend Nvidia to Linux users.