Too bad they are bad at RT; once you turn that on, it chokes them out. I must say, though, the improvement from the previous cards to the new ones is pretty impressive for an incremental refresh, if this is true.
Either these results are wrong, or there is more to this refresh than a small increase in core clock and memory speed.
An increase of 2 Gbps on the memory and a few extra MHz on the core can't account for a 20% improvement.
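For what it's worth, the memory side of that argument is easy to sanity-check. A minimal sketch, assuming the "2 Gbps" bump means 16 Gbps → 18 Gbps effective memory speed (my assumption; the post only gives the delta):

```python
# Effective memory speed per pin, in Gbps.
# 16 -> 18 is an assumption consistent with the "2 Gbps" bump above.
old_mem_gbps, new_mem_gbps = 16, 18

bandwidth_gain = new_mem_gbps / old_mem_gbps - 1
print(f"Memory bandwidth gain: {bandwidth_gain:.1%}")  # 12.5%
```

Even in a game that is purely bandwidth-bound, that tops out at +12.5%, so a 20% overall uplift would indeed need more than the memory bump and a small core clock increase.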
Too bad they are bad at RT; once you turn that on, it chokes them out. I must say, though, the improvement from the previous cards to the new ones is pretty impressive for an incremental refresh, if this is true.
6800 with MPT; I'm sure if a 6900 XT fell into my hands, its numbers would probably land between a 3080 and a 3070 Ti.
Either these results are wrong, or there is more to this refresh than a small increase in core clock and memory speed.
An increase of 2 Gbps on the memory and a few extra MHz on the core can't account for a 20% improvement.
When the card's BIOS gets uploaded to TechPowerUp, we will see if the "hidden" frequencies have also increased. E.g., if the Fclk goes from 1940 MHz to 2.3 GHz, such an increase can bring 10% more performance.
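Those numbers can be sanity-checked too. A rough sketch; the 0.5 clock-to-performance scaling factor is my assumption, not something from the post:

```python
# Fabric clock figures from the post above, in MHz.
fclk_old_mhz, fclk_new_mhz = 1940, 2300

clock_gain = fclk_new_mhz / fclk_old_mhz - 1   # ~18.6% higher Fclk
perf_per_clock = 0.5                           # assumed: fps rarely scales 1:1 with Fclk
est_perf_gain = clock_gain * perf_per_clock    # ~9%, in line with the ~10% claimed

print(f"Fclk uplift: {clock_gain:.1%} -> estimated perf gain: {est_perf_gain:.1%}")
```

So an Fclk jump of that size producing roughly 10% more performance is at least plausible under a modest scaling assumption.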
6800 with MPT; I'm sure if a 6900 XT fell into my hands, its numbers would probably land between a 3080 and a 3070 Ti.
different versions of the game
here's the same; more like between a 3060 Ti and a 3070 Ti for the 6900 XT, and a 2070 Super for the 6800
it's playable; I ran RT at 1440p on a 3060 Ti, but the 6900 is a $1000 MSRP / $1400 retail card
it's a massive gap even to the 3080 non-Ti
Either these results are wrong, or there is more to this refresh than a small increase in core clock and memory speed.
An increase of 2 Gbps on the memory and a few extra MHz on the core can't account for a 20% improvement.
Agreed. This performance increase is huge. Though it also depends on how old the results for the non-50 cards are. Drivers alone could account for a 10% increase.
Agreed. This performance increase is huge. Though it also depends on how old the results for the non-50 cards are. Drivers alone could account for a 10% increase.
Wouldn't the drivers affect the older models too, if true? I mean, I'm not one to doubt that AMD driver optimizations could help, but the older cards should then reflect a similar increase, percentage-wise.
Agreed. This performance increase is huge. Though it also depends on how old the results for the non-50 cards are. Drivers alone could account for a 10% increase.
But a new driver would improve both the older cards and the refresh.
Unless there are new units or instructions in this refresh, a bit like what happened with the 9700 to the 9800.
Wouldn't the drivers affect the older models too, if true? I mean, I'm not one to doubt that AMD driver optimizations could help, but the older cards should then reflect a similar increase, percentage-wise.
Horus-Anhur:
But a new driver would improve both the older cards and the refresh.
Unless there are new units or instructions in this refresh, a bit like what happened with the 9700 to the 9800.
That's why I said it "depends on how old the results for the non-50 cards are". If they weren't re-tested with updated drivers, then the driver updates would make a big difference.
Too bad they are bad at RT; once you turn that on, it chokes them out. I must say, though, the improvement from the previous cards to the new ones is pretty impressive for an incremental refresh, if this is true.
On the other hand, even the RTX 3090 Ti is still bad at RT...
Really good RT is still not for this generation.
No way the 6700 XT scores only 12,000 in Time Spy. I have this GPU and it scores ~13,000 at default settings; overclocked, it can do around 14,200 points (at least my card can).
So if the leaked scores are legit, the 6750 XT is only around 7% faster than the 6700 XT.
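That gap between the leaked chart and a real card is easy to quantify. A minimal sketch using only the two scores mentioned here (12,000 from the chart, ~13,000 stock):

```python
def uplift(new_score: float, old_score: float) -> float:
    """Fraction by which new_score exceeds old_score."""
    return new_score / old_score - 1

chart_6700xt = 12000   # 6700 XT Time Spy score in the leaked chart
stock_6700xt = 13000   # typical stock result reported above

print(f"Chart undershoots a stock 6700 XT by {uplift(stock_6700xt, chart_6700xt):.1%}")  # 8.3%
```

Rebasing the leaked 6750 XT result against a realistic ~13,000-point 6700 XT instead of the chart's 12,000 is what shrinks the apparent uplift to roughly 7%.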
No way the 6700 XT scores only 12,000 in Time Spy. I have this GPU and it scores ~13,000 at default settings; overclocked, it can do around 14,200 points (at least my card can).
So if the leaked scores are legit, the 6750 XT is only around 7% faster than the 6700 XT.
Same, my 6700 XT also puts up better numbers than those in the chart. Maybe the benchmark system is weak or badly configured? Wccftech is kind of trash in general, so that could also explain the discrepancy 😛.
No way the 6700 XT scores only 12,000 in Time Spy. I have this GPU and it scores ~13,000 at default settings; overclocked, it can do around 14,200 points (at least my card can).
So if the leaked scores are legit, the 6750 XT is only around 7% faster than the 6700 XT.
SamuelL421:
Same, my 6700 XT also puts up better numbers than those in the chart. Maybe the benchmark system is weak or badly configured? Wccftech is kind of trash in general, so that could also explain the discrepancy 😛.
Ok, 7% makes much more sense for just increasing the memory speed!
Cyberpunk 2077 is still a simple version of RT; I do not see any shadows from car headlights or other non-static lights on my 6900 XT, and it should be the same on a 3090 Ti.
I would raise the other settings before activating RT, even if I had a 3090 Ti.
100 fps with DLSS and 73 fps at native 4K is still with lowered settings; at top settings it is below 50 fps.
Look at something like Minecraft, which is a more advanced form of RT, and it runs like crap compared to how simple the game is.
100 fps with DLSS and 73 fps at native 4K is still with lowered settings
no it isn't
TLD LARS:
Look at something like Minecraft, which is a more advanced form of RT
it's full path tracing
does it run like crap? it runs at 90 fps
the only portion that drops below 50 is where the GPU is at 50% load
[SPOILER][youtube=qH72R-Dqf9A][/SPOILER]
it's fine when 6900 XT owners get a $1000+ GPU for its rasterization and 1440p performance.
it gets tiring when they post incorrect information.
a 3090 Ti bad at ray tracing? yeah, right.
how simple, compared to RT off?