Benchmark Results: Radeon RX 6800 XT Shows Good RT Scores and Excellent Time Spy GPU Score

If AMD RT performance matches Nvidia + DLSS performance, then I would say that is already pushing towards a win in their first gen RT implementation.
Now we just need to know when and how the "super resolution" tech will work, which supposedly offers results similar to DLSS. Hopefully.
Am sold already, the 6800 XT it is. Just hope the water blocks are out nice and early as well.
The second benchmark screenshot shows GPU utilization at 73% - a CPU bottleneck at 1440p? As said in the article, the 3500 is not much of an enthusiast CPU, but still, a bottleneck at 1440p...
I really hope these are great cards. That would be the best scenario for all of us. Imagine AMD pushing Nvidia into a price war. Bring it on.
If the performance is near the 3080 without DLSS, then it's quite respectable, I must say. We'll have to wait and see how good it looks though. AMD seems to have exceeded my expectations. Of course, I'm not sold yet; I'll wait for the independent reviews.
moo100times:

If AMD RT performance matches Nvidia + DLSS performance, then I would say that is already pushing towards a win in their first gen RT implementation.
Who actually cares about facking raytracing though? The vast majority of popular games don't support raytracing, and good riddance...
Will have to also wait for real prices; the 3000 series aren't close to MSRP at the places I buy things. Paper pre-orders, obviously, lol, but still. As I said weeks ago, it's still 22nd November for main stock incomings, so AMD will need to have a decent launch if Nvidia has stock then, which seems to be the case. 2080 Ti level of RT is fine for the 6800 XT; it's that DLSS which will make it worth it or not to most buyers on the fence. I'm already convinced that RT will be dictated by the consoles, and because of that AMD should be fine. You'll get a couple of games which go full RT crazy, but it's not like you're forced to switch it on and have no FPS.
Hmm not bad, not bad at all...
Ray tracing isn't high on my priority list when it comes to a GPU but it's good to see that all current offerings can do it in some way, shape, or form without looking like a slideshow.
For once I didn't upgrade my GPU to the next gen, and like many others (remember, even Jensen talked to us directly) the jump from a 1080 (Ti) to the 3000 or 6000 series seems very exciting 😀 My 1080 Ti is currently at 10391, with 66.17 fps in graphics test 1 and 60.84 fps in graphics test 2, so that 6800 XT is +77% on graphics test 1 and +55% on graphics test 2. I totally don't care about RT. I checked Watch Dogs: Legion RT and, well, it's typically the kind of detail you only see if you stop, stand still and watch the world... which I often do, but not a good enough reason to lose massive fps.
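(Editor's note: the uplift figures in the comment above are easy to reproduce. Here is a minimal Python sketch; the 1080 Ti fps values come from the post itself, while the 6800 XT values are back-calculated from the quoted +77% / +55% deltas, so they are illustrative assumptions rather than measured results.)

```python
# Reproduce the uplift arithmetic from the comment above.
def uplift(old_fps: float, new_fps: float) -> float:
    """Percentage improvement of new_fps over old_fps."""
    return (new_fps / old_fps - 1.0) * 100.0

# 1080 Ti figures are from the post; the 6800 XT figures are assumptions,
# back-calculated from the quoted +77% / +55% uplifts for illustration.
gtx_1080_ti = {"graphics test 1": 66.17, "graphics test 2": 60.84}
rx_6800_xt = {"graphics test 1": 117.1, "graphics test 2": 94.3}

for test, old_fps in gtx_1080_ti.items():
    print(f"{test}: {uplift(old_fps, rx_6800_xt[test]):+.0f}%")
# graphics test 1: +77%
# graphics test 2: +55%
```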
Vananovion:

The second benchmark screenshot shows GPU utilization at 73% - a CPU bottleneck at 1440p? As said in the article, the 3500 is not much of an enthusiast CPU, but still, a bottleneck at 1440p...
It's what I want to see change with the 5000 series. My 3960X TR: 76% GPU bound at 1080p. My 9900K: 99% GPU bound at 1080p. I don't know about the more gaming-oriented AMD CPUs, but are they at 99% at 1080p? (I quickly checked and they seem worse than my Threadripper, with a 3900X in the 30-40% range paired with a 2080 Ti, meaning the CPU is dragging the GPU down.)
Vananovion:

The second benchmark screenshot shows GPU utilization at 73% - a CPU bottleneck at 1440p? As said in the article, the 3500 is not much of an enthusiast CPU, but still, a bottleneck at 1440p...
Another possibility is that the fixed-function Ray Accelerator engine is the limiting factor. Possibly we won't see full GPU utilisation in ray-traced games, but we'll see soon, I guess.
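(Editor's note: a rough way to see why 73% utilisation points to a CPU limit: in a simple pipeline, each frame takes as long as the slower of the CPU and the GPU, so reported GPU utilisation is roughly the GPU's frame time divided by the total frame time. The toy Python model below uses hypothetical fps caps, chosen only to reproduce the screenshot's figure, not measurements from the article.)

```python
# Toy bottleneck model: each frame takes as long as the slower of the
# CPU and GPU, so reported GPU utilisation is roughly the ratio of GPU
# frame time to total frame time. All fps caps here are hypothetical.
def gpu_utilisation(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """Approximate GPU busy fraction given separate CPU and GPU fps caps."""
    frame_time = max(1.0 / cpu_fps_cap, 1.0 / gpu_fps_cap)
    return (1.0 / gpu_fps_cap) / frame_time

# A CPU topping out near 110 fps feeding a GPU capable of ~150 fps would
# show roughly the 73% utilisation seen in the second screenshot.
print(f"{gpu_utilisation(cpu_fps_cap=110, gpu_fps_cap=150):.0%}")  # -> 73%
```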
Leaning more and more towards team red, can't wait for HH's review.
Dragam1337:

Who actually cares about facking raytracing though? The vast majority of popular games don't support raytracing, and good riddance...
That same vast majority doesn't need a brand new $500+ GPU to run either, and runs fine on all sorts of old cards, so why buy some expensive new GPU for them? Really, it's ray tracing that's pushing graphics, and hence it's the reason you're gonna need to spend money on a new GPU.
Early benchmarks show that AMD is indeed faster in RT than Nvidia without DLSS.
Dribble:

That same vast majority doesn't need a brand new $500+ GPU to run either, and runs fine on all sorts of old cards, so why buy some expensive new GPU for them? Really, it's ray tracing that's pushing graphics, and hence it's the reason you're gonna need to spend money on a new GPU.
No, no and no. Resolution and Hz are pushing the need for more GPU power, not raytracing. I don't know anyone who actually uses raytracing, as it's just a waste of GPU power. But I know a lot of people who need faster GPUs to push 240 Hz or 4K.
Just to chime in, I also couldn't care less about raytracing abilities (or DLSS, for that matter). All things being equal, sure, I'll take it. When buying, RT is so far down the list of things I might care about on a GPU that it's rendered completely irrelevant to my GPU purchase decision. It keeps getting brought up and I keep not caring. It would affect maybe 5% of my playtime, if even that much. Nothing I've seen (so far) makes its functionality a deal breaker for me. Nvidia wants me to care but I just don't. Maybe I'll care in 2025, but not now.