AMD Radeon RX 6700 XT could be slower than GeForce RTX 3060 Ti

cucaulay malkin:

I'm saying the way they do things is so broad that they always leave themselves plenty of wiggle room for whatever opinion they want to present. Look at this one: they took two games for a whole review. Does that mean the problem is two games out of the possible hundreds of DX12 titles? Or is it just the two they want to base their whole opinion on? They paired three $100 budget CPUs with $1,000/$1,400 cards; there is no in-between CPU like a 3600, 10400 or 3700X. Then they arrived at their conclusion. Fine, but how do you judge the extent of the issue from this?
They didn't test just two games; you need to watch the whole video and read the description. All games show similar behavior. Watch Dogs is a good example because it is an Nvidia game 🙂
cucaulay malkin:

I'm seeing Horizon and WD Legion.
Yes, this is what they showed, but they tested more games. "Some additional information: If you want to learn more about this topic I highly recommend watching this video by NerdTech: [youtube=nIoZB-cnjc0] You can also follow him on Twitter for some great insights into how PC hardware works: https://twitter.com/nerdtechgasm As for the video, please note I also tried testing with the Windows 10 Hardware-Accelerated GPU Scheduling feature enabled and it didn't change the results beyond the margin of error. This GeForce overhead issue wasn't just seen in Watch Dogs Legion and Horizon Zero Dawn; as far as we can tell this issue will be seen in all DX12 and Vulkan games when CPU limited, likely all DX11 games as well. We've tested many more titles such as Rainbow Six Siege, Assassin's Creed Valhalla, Cyberpunk 2077, Shadow of the Tomb Raider, and more."
kapu:

Yes, this is what they showed, but they tested more games. "Some additional information: If you want to learn more about this topic I highly recommend watching this video by NerdTech: [youtube=nIoZB-cnjc0] You can also follow him on Twitter for some great insights into how PC hardware works: https://twitter.com/nerdtechgasm As for the video, please note I also tried testing with the Windows 10 Hardware-Accelerated GPU Scheduling feature enabled and it didn't change the results beyond the margin of error. This GeForce overhead issue wasn't just seen in Watch Dogs Legion and Horizon Zero Dawn; as far as we can tell this issue will be seen in all DX12 and Vulkan games when CPU limited, likely all DX11 games as well. We've tested many more titles such as Rainbow Six Siege, Assassin's Creed Valhalla, Cyberpunk 2077, Shadow of the Tomb Raider, and more."
Where are the results then? Why is he saying that? I don't have a problem with their conclusion; like I said, I've seen this long ago already. I'm saying that with this review it's absolutely impossible to judge anything other than how a high-end card will perform on a $100 CPU in, literally, a couple of DX12 games. No Vulkan, no DX11, no CPU in between the worst and best single-thread performance you can get.
I see people again having to cherry-pick reviews in order to support their point. And others will pick and choose their counter-arguments. Fine. Can we have a list of mutually agreed-on, mutually respected review authorities? This discrediting and gaslighting has to stop. If not, we'll stop being a community, and we're essentially sailing blind, having to do all our reviews ourselves. We have neither the time, energy, money, nor resources, never mind the moral standing, to do so. Learn to agree to disagree. P.S. Yes, you can dig into my post history and see I'm being hypocritical. Doesn't mean I won't report you.
I didn't want to stir up the discussion around the YT channel HU, but my opinion is that they are not biased. They have recommended (or not) Intel/Nvidia/AMD where applicable. Currently they trash AMD for not having a DLSS option and their RT performance being subpar. Their test system recently changed to AMD because it is the fastest processor and 16 cores will never bottleneck a GPU. They trash both companies, AMD especially, for the lack of GPU availability. Only Nvidia has tried to censor them, for no reason at all, as they cover RT and DLSS extensively on their channel. You just can't hide the fact that the 3090 is absolute shit at that price and a money grab, and the 3080 has too little memory, although it's a good card. AMD only failed on availability; the 6000 series is spot on performance-wise. But we need 10x more cards than what they are producing right now...
Silva:

Currently they trash AMD for not having a DLSS option and their RT performance being subpar.
Didn't they have a spat with Nvidia about doing the exact opposite? I like them, but more as a publicity-stunt example to follow if you wanna make a profitable YT tech channel. This is a 6700 XT thread, though.
cucaulay malkin:

Where are the results then? Why is he saying that? I don't have a problem with their conclusion; like I said, I've seen this long ago already. I'm saying that with this review it's absolutely impossible to judge anything other than how a high-end card will perform on a $100 CPU in, literally, a couple of DX12 games. No Vulkan, no DX11, no CPU in between the worst and best single-thread performance you can get.
I take his word for it; I've never had any doubts about their testing. It shows that an Nvidia GPU will have much lower performance on a $100 CPU than an AMD GPU when CPU-bottlenecked, and it is true. Maybe Nvidia can fix it and users can benefit. They did a good job.
kapu:

And HU is trusted with methodology so
No, they aren't. HU will ignore the logical factors involved in why something is different and report it as a defect or flaw. Two things are at play in this video:
1> Nvidia is not forcing the CPU to prepare as many frames in advance (input latency win).
2> Nvidia uses advanced power gating to shut down chip regions under lower resolution and load (performance down, power savings up).
If the CPU is weaker, in a static flip queue fewer frames get sent to the GPU, and this combined with SM gating returns lower FPS. The Radeon is using more CPU to achieve its output.
1> Has the input lag prior to GPU render time suffered? (Not tested.)
2> Do Radeons use power gating the way Nvidia does? (No, they don't.)
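Here's a rough toy model of the flip-queue side of this (made-up numbers, and not how either driver actually schedules work): a CPU with frame-time spikes feeding a fixed-speed GPU through a bounded queue of prepared frames. A shallower queue leaves the GPU idle more often when the CPU hiccups (lower FPS), but each frame is fresher when the GPU picks it up (lower latency).
[CODE]
import random

def simulate(cpu_ms_avg, cpu_ms_jitter, gpu_ms, queue_depth, frames=5000, seed=1):
    """Bounded producer/consumer pipeline: the CPU may prepare at most
    `queue_depth` frames ahead of the one the GPU is currently rendering."""
    random.seed(seed)
    gpu_start = [0.0] * frames
    prev_cpu_done = prev_gpu_done = 0.0
    latencies = []
    for i in range(frames):
        # The CPU can't run more than queue_depth frames ahead of the GPU.
        not_before = gpu_start[i - queue_depth] if i >= queue_depth else 0.0
        cpu_start = max(prev_cpu_done, not_before)
        prep = max(1.0, random.gauss(cpu_ms_avg, cpu_ms_jitter))  # spiky CPU frame prep
        prev_cpu_done = cpu_start + prep
        gpu_start[i] = max(prev_cpu_done, prev_gpu_done)           # GPU waits for a prepared frame
        prev_gpu_done = gpu_start[i] + gpu_ms
        latencies.append(prev_gpu_done - cpu_start)                # prep start -> frame finished
    fps = frames / (prev_gpu_done / 1000.0)
    return fps, sum(latencies) / frames

for depth in (1, 3):
    fps, lat = simulate(cpu_ms_avg=6.0, cpu_ms_jitter=5.0, gpu_ms=7.5, queue_depth=depth)
    print(f"flip queue {depth}: {fps:6.1f} fps, {lat:5.1f} ms avg latency")
[/CODE]
Purely illustrative, but it shows how a lower FPS number in a CPU-bound test can come out of a latency/throughput trade-off rather than a plain defect.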
kapu:

I take his word for it; I've never had any doubts about their testing. It shows that an Nvidia GPU will have much lower performance on a $100 CPU than an AMD GPU when CPU-bottlenecked, and it is true. Maybe Nvidia can fix it and users can benefit. They did a good job.
3090/6900 XT and 1600X/10100, specifically. Fair point, but the testing used to form a thesis like that is too narrow. How often will I see that? 2/2 games is 100% according to HUB. I always had a feeling their tests are made intentionally vague.
Review in a week. It will beat even the 3070. People will be surprised.
itpro:

Leave HU alone. They are doing a great job. They do not lick Nvidia's/Intel's boots like most tech people do.
Yep, many lick the boots of Nvidia when told to do so; HU did not. Gotta give 'em props for that!
cucaulay malkin:

Except that only happens when the 5700 XT performs badly due to AMD cards' well-known flaws, like DX11 performance and such. When ported to DX12/Vulkan they get a performance increase. So NO, it's not Nvidia performing great, it's Nvidia performing more solidly where Radeon is lacking. I have everything set correctly, don't worry; it's you who's trying to spin everything. I wonder what if I asked you the same about the 980 Ti in early DX12 titles. Well, it's doing fine, you just don't have the baseline set correctly for Maxwell. It certainly doesn't lack proper DX12 support 😛 This must be the baseline performance for the 5700 then [SPOILER] https://i.imgur.com/jdloM8U.jpeg[/SPOILER] Taking extremities as the baseline is conceptually wrong. The baseline for a GPU is the average of all engines, APIs, and resolutions. That's why I think pcgh's method is the most objective way to answer the question "how fast is X compared to Y". It might not be true for an individual, but it's true as a concept. 20 games, all APIs, the most prominent game engines, four resolutions, averaged: https://www.pcgameshardware.de/Grafikkarten-Grafikkarte-97980/Specials/Rangliste-GPU-Grafikchip-Benchmark-1174201/ Take the baselines for Navi and Turing, then you can compare the two.
You are right. Wrong software picking is just wrong. Valheim popped into existence last month, and it came out as Early Access. It is like hyperbole for wrong picks. 🙂
Fox2232:

You are right. Wrong software picking is just wrong. Valheim popped into existence last month, and it came out as Early Access. It is like hyperbole for wrong picks. 🙂
Dude, there are games that are 5 years old and are broken. Meanwhile, Valheim in Early Access runs flawlessly for me. I guess it matters for however many people bought it, too.
Undying:

Review in a week. It will beat even the 3070. People will be surprised.
In raster at 1080p/1440p I think it will trade blows 🙂 In RT there isn't any debate.
kapu:

In raster at 1080p/1440p I think it will trade blows 🙂 In RT there isn't any debate.
AMD's slides are not gonna be 100% representative, but this Geekbench result is just completely bogus in comparison.
cucaulay malkin:

Dude, there are games that are 5 years old and are broken. Meanwhile, Valheim in Early Access runs flawlessly for me. I guess it matters for however many people bought it, too.
And are the results so surprising, since Valheim is a DX11 title and is on GeForce NOW? Not really. And it does not change the fact that saying one brand is the baseline and the other is all over the place is just wrong, because the moment you put AMD's cards as the baseline, Nvidia's performance will be all over the place too. That's just a basic mathematical fact. Same way that using OpenCL for estimating gaming comparisons is wrong. Like, the RX 5700 XT is in general very close to the Radeon VII in gaming, but their compute performance is quite different. Each architecture has a quite different ratio between compute and rendering performance. I am just pointing out wrong approaches without disputing the actual numerical results. A wrong baseline is like saying that people 190 cm tall are the baseline, 210 cm are tall, and 170 cm are short. Even if the global average was used as the baseline, it would still be wrong, because there is a very smart saying: right tool for the right job.
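To make that mathematical fact concrete with made-up numbers: whichever card you call the baseline, the other one's per-game results scatter by exactly the same factor, because the per-game ratios are just reciprocals of each other.
[CODE]
# Fictional fps numbers for two cards across three games.
fps_a = {"Game 1": 100, "Game 2": 80, "Game 3": 140}
fps_b = {"Game 1": 110, "Game 2": 64, "Game 3": 150}

b_vs_a = {g: fps_b[g] / fps_a[g] for g in fps_a}   # card A as the baseline
a_vs_b = {g: fps_a[g] / fps_b[g] for g in fps_b}   # card B as the baseline
print("B relative to A:", {g: round(r, 2) for g, r in b_vs_a.items()})
print("A relative to B:", {g: round(r, 2) for g, r in a_vs_b.items()})
# The spread (max ratio / min ratio) is identical either way:
print(max(b_vs_a.values()) / min(b_vs_a.values()),
      max(a_vs_b.values()) / min(a_vs_b.values()))
[/CODE]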
cucaulay malkin:

AMD's slides are not gonna be 100% representative, but this Geekbench result is just completely bogus in comparison.
For the last few months, I have never found Geekbench to be even near the truth.
kapu:

For the last few months, I have never found Geekbench to be even near the truth.
I only know this name from headlines.
Fox2232:

And are the results so surprising, since Valheim is a DX11 title and is on GeForce NOW? Not really. And it does not change the fact that saying one brand is the baseline and the other is all over the place is just wrong, because the moment you put AMD's cards as the baseline, Nvidia's performance will be all over the place too. That's just a basic mathematical fact. Same way that using OpenCL for estimating gaming comparisons is wrong. Like, the RX 5700 XT is in general very close to the Radeon VII in gaming, but their compute performance is quite different. Each architecture has a quite different ratio between compute and rendering performance. I am just pointing out wrong approaches without disputing the actual numerical results. A wrong baseline is like saying that people 190 cm tall are the baseline, 210 cm are tall, and 170 cm are short. Even if the global average was used as the baseline, it would still be wrong, because there is a very smart saying: right tool for the right job.
It's Vulkan. And that's not what I said: I said Nvidia and AMD have their own baselines, not that one is the baseline for the other.
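To be clear, the pcgh-style "own baseline" I mean works roughly like this sketch: normalize each card per test, then average across all the games/APIs/resolutions to get one index per card. The numbers and card names below are made up, and pcgh's actual weighting is more involved than a plain geometric mean.
[CODE]
from math import prod

# fps per (game, api, resolution) for two hypothetical cards -- fictional numbers
tests = {
    ("Game A", "DX12",   "1080p"): {"card_x": 120, "card_y": 100},
    ("Game B", "DX11",   "1440p"): {"card_x":  90, "card_y": 105},
    ("Game C", "Vulkan", "1440p"): {"card_x":  75, "card_y":  70},
    ("Game D", "DX12",   "4K"):    {"card_x":  48, "card_y":  52},
}

def index(card):
    """Geometric mean of the card's fps, normalized to the fastest card in each test."""
    ratios = [fps[card] / max(fps.values()) for fps in tests.values()]
    return prod(ratios) ** (1 / len(ratios))

for card in ("card_x", "card_y"):
    print(card, f"{index(card) * 100:.1f}%")  # 100% = fastest card in every test
[/CODE]
That way neither card is "the" baseline: each gets its own average, and you compare the two indices.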
cucaulay malkin:

Dude, there are games that are 5 years old and are broken. Meanwhile, Valheim in Early Access runs flawlessly for me.
You clearly haven't done any terraforming then.