AMD Radeon RX 6950 XT, RX 6750 XT, and RX 6650 XT 3DMark Scores leak

cucaulay malkin:

How simple compared to RT off? No, it isn't; it's full path tracing. Does it run like crap? It runs at 90 fps, and the only portion that drops below 50 is because the GPU is at 50% load. [SPOILER][youtube=qH72R-Dqf9A][/SPOILER] It's fine when 6900 XT owners get a $1000+ GPU because of rasterization and 1440p performance, but it gets tiring when they post incorrect information. The 3090 Ti bad at ray tracing? Yeah, right.
Don't mistake "runs at 100 fps" for the quality of RT... they're two different things. And DLSS... it's Buitoni pizza compared to a ***** restaurant when you look closely at the textures... Right now, no green or red GPU's RT capability makes me think "wow, I need this GPU so bad". And most of the time RT in games looks bad (a bit like ray tracing 10 years ago): too reflective, too shiny, or too dark. These are just the first steps of RT in games, and there are still more steps before it's good enough. The only sure thing is that one day we will all have really good RT cards... just a question of time.
The interesting thing about Cyberpunk, though, is that DLSS is done very well and adds a sizable amount of FPS with RT on. So even DLSS alone is a factor that makes AMD cards less useful. Though they are attempting to do the same with FidelityFX, I suppose.
rl66:

Don't mistake "runs at 100 fps" for the quality of RT... they're two different things. And DLSS... it's Buitoni pizza compared to a ***** restaurant when you look closely at the textures... Right now, no green or red GPU's RT capability makes me think "wow, I need this GPU so bad". And most of the time RT in games looks bad (a bit like ray tracing 10 years ago): too reflective, too shiny, or too dark. These are just the first steps of RT in games, and there are still more steps before it's good enough. The only sure thing is that one day we will all have really good RT cards... just a question of time.
Nah, it looks fantastic. Once you get used to RTGI, you'll see the difference when you go back to traditional lighting. I find it more of a difference when you get accustomed to ray-traced GI and AO and then switch it off, rather than just comparing odd screenshots. Keep telling yourself DLSS at this point is the same as two or three years ago; fine by me, I don't care.
cucaulay malkin:

It's full path tracing. Does it run like crap? It runs at 90 fps, and the only portion that drops below 50 is because the GPU is at 50% load. It's fine when 6900 XT owners get a $1000+ GPU because of rasterization and 1440p performance, but it gets tiring when they post incorrect information. The 3090 Ti bad at ray tracing? Yeah, right.
Look at the lows in a game made out of 1 m × 1 m blocks: they are between 10 and 50 FPS. For a card costing more than €2000, the performance is still pretty bad in my opinion.
cucaulay malkin:

Keep telling yourself DLSS at this point is the same as two or three years ago; fine by me, I don't care.
Nowhere did he write that. EDIT: I can see that Cyberpunk 1.5's DLSS still has problems with grids and fences being rendered a bit oddly and sometimes flickering.
cucaulay malkin:

The 3090 Ti bad at ray tracing? Yeah, right.
Cool, let's get everyone buying 3090 Ti cards so they can enjoy this 'reasonable' RT experience.
beedoo:

Cool, let's get everyone buying 3090 Ti cards so they can enjoy this 'reasonable' RT experience.
There are ten other options between the 3090 Ti and 6900 XT for RT. 6900 XT owners nitpicking at 3090 RT or DLSS is hilarious when their $1000+ card doesn't offer anything comparable.
The performance hit, in exchange for some perceived improvement in reflections and such, is just atrocious for me, when you can get a much bigger image-quality uplift from so many other things, like texture quality, resolution scaling, etc. Of course, you bring logic and perception to the table for this. But for the guys defending RT's necessity: could you ask yourselves and give an honest answer? Isn't all of this just a justification for owning a graphics card with a green logo stuck to it?
rl66:

On the other hand, even the RTX 3090 Ti is still bad at RT... Really good RT is still not for this gen.
You do realize you're talking about the fastest gaming graphics card on planet Earth? Just saying.
Airbud:

You do realize you're talking about the fastest gaming graphics card on planet Earth? Just saying.
For me the point is not that it is the fastest gaming card on the planet; the point is that when turning ray tracing on, something else needs to be turned off, and DLSS mostly needs to be used to make up for the high performance hit.
TLD LARS:

For me the point is not that it is the fastest gaming card on the planet; the point is that when turning ray tracing on, something else needs to be turned off, and DLSS mostly needs to be used to make up for the high performance hit.
No RT is fine, as long as the card that can't run RT costs a lot less.
I'll believe it when the reviews come out, and even then it's not 100%. I get 22634 graphics, 20384 CPU, and 22265 overall in Time Spy, and a little over 15200 in Port Royal with my little 12900KS/3090 Ti DDR5 setup, and that isn't my best run; it's just the only one I had handy to compare against. So I'm not really impressed with the XT or the X3D.
Horus-Anhur:

Either these results are wrong, or there is more to this refresh than a small increase in core clock and memory. An increase of 2 Gbps in memory and a few extra MHz on the core can't account for an improvement of 20%.
Sure it can. If the card is limited by memory bandwidth at 4K (which it is, with only a 256-bit bus), then a 20% increase in memory bandwidth would give exactly that.
Dragam1337:

Sure it can. If the card is limited by memory bandwidth at 4K (which it is, with only a 256-bit bus), then a 20% increase in memory bandwidth would give exactly that.
People have done a bit of memory OC on the 6000-series cards, and they never got a linear performance increase with memory clock.
Horus-Anhur:

People have done a bit of memory OC on the 6000-series cards, and they never got a linear performance increase with memory clock.
I don't get nearly as big an improvement in performance by OCing my memory as the 3090 Ti gets from its stock higher-clocked memory either. Error correction will prevent you from getting a linear increase from anything more than a very small bump in frequency... but higher-quality chips that run a lot faster at stock won't have the same issue. It will likely be a combination of different things giving the 20% performance increase, but a substantial increase in memory bandwidth will surely be a big part of it.
Dragam1337:

I don't get nearly as big an improvement in performance by OCing my memory as the 3090 Ti gets from its stock higher-clocked memory either. Error correction will prevent you from getting a linear increase from anything more than a very small bump in frequency... but higher-quality chips that run a lot faster at stock won't have the same issue. It will likely be a combination of different things giving the 20% performance increase, but a substantial increase in memory bandwidth will surely be a big part of it.
I doubt that memory is the bulk of that 20% performance increase. It's just a 12.5% increase in memory bandwidth. I have never seen GPU performance scale linearly with memory clocks. We'll be lucky to get a 6% performance increase by going from 16 to 18 Gbps. If those numbers are correct and there is a 20% performance increase, there has to be something more to these 6x50 cards.
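For reference, the bandwidth figures being argued over can be checked with quick back-of-the-envelope arithmetic. This sketch assumes the publicly reported specs (a 256-bit bus, 16 Gbps per pin on the 6900 XT, 18 Gbps in the rumored refresh); everything else is just unit conversion:

```python
def gddr6_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) x per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# RX 6900 XT: 256-bit bus at 16 Gbps per pin
old = gddr6_bandwidth_gbs(256, 16.0)
# Rumored RX 6950 XT: same 256-bit bus at 18 Gbps per pin
new = gddr6_bandwidth_gbs(256, 18.0)

print(f"{old:.0f} GB/s -> {new:.0f} GB/s, +{(new / old - 1) * 100:.1f}%")
# prints: 512 GB/s -> 576 GB/s, +12.5%
```

That matches the 12.5% figure quoted above; a full 20% bandwidth increase on the same bus would need roughly 19.2 Gbps memory.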
cucaulay malkin:

I did, very much, even on a 3060 Ti; I wouldn't call 55 fps that cinematic. I guess it'll have to do on a mid-range card.
55 fps on average is quite low, in my humble opinion, even for a single-player offline game.
MonstroMart:

55 fps on average is quite low, in my humble opinion, even for a single-player offline game.
55 fps feels the same as 60 with a G-Sync monitor. For a single-player game, that's more than fine, I would say.
Undying:

55 fps feels the same as 60 with a G-Sync monitor. For a single-player game, that's more than fine, I would say.
Depends on the lows. With 55 fps on average, there's a good chance the most demanding parts of the game don't feel smooth.
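The average-versus-lows point is easy to see in frame times. The conversion below is simple arithmetic; the 30 fps "low" is a made-up illustration, not a measurement from any specific game:

```python
def fps_to_frametime_ms(fps: float) -> float:
    """Convert a frame rate to the time each frame stays on screen, in ms."""
    return 1000.0 / fps

# A steady 55 fps is barely distinguishable from 60 fps:
print(f"{fps_to_frametime_ms(60):.1f} ms")  # 16.7 ms
print(f"{fps_to_frametime_ms(55):.1f} ms")  # 18.2 ms

# ...but a 55 fps *average* with lows around 30 fps means occasional
# ~33 ms frames, which stand out as visible stutter:
print(f"{fps_to_frametime_ms(30):.1f} ms")  # 33.3 ms
```

So whether "55 feels like 60" depends on how tight the frame-time distribution is, not just the average.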
Undying:

55 fps feels the same as 60 with a G-Sync monitor. For a single-player game, that's more than fine, I would say.
It's better than 35.