Radeon RX 6600 XT 12GB, RX 6700 6GB and RX 6700 XT 12GB Graphics Cards Are Listed

cucaulay malkin:

excuse me, wtf, why is the 5700's successor a 6GB card?
6/12GB for the 6700? Isn't that the same as what Nvidia is doing with the 3060?
Undying:

6/12GB for the 6700? Isn't that the same as what Nvidia is doing with the 3060?
Well, it's not what the title says.
The non-X variant, the RX 6700 (RX 6700 CLD 6GO, Challenger D OC), is listed at 6 GB of video memory, half that of the XT variant.
Radeon RX 6700 XT is based on a NAVI 22 XT GPU, likely 40 CUs / 2560 SPs. The memory subsystem will be 12 GB on a 192-bit bus. As you can see in the listing, the RX 6700 then has 6 GB, fitted with NAVI 22 XL. The lower-spec RX 6600 XT then again is accompanied by 12 GB of GDDR6 memory. A bit confusing, yes.
The 6700 XT is 12 GB, the 6700 is 6 GB. Nvidia is doing 12 GB for a lower-tier card than the 6700, and so is AMD with the 6600 XT.
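For what it's worth, a 6 GB / 12 GB split on the same 192-bit bus follows directly from GDDR6 chip density: each GDDR6 chip sits on a 32-bit channel, so a 192-bit bus hosts six chips, and 1 GB vs 2 GB chips give 6 GB vs 12 GB totals. A quick sketch of that arithmetic (the per-chip densities are assumptions for illustration, not taken from the listing):

```python
# Illustrative sketch: VRAM capacity from bus width and GDDR6 chip density.
# Assumes standard GDDR6 with one 32-bit channel per memory chip.
def vram_gb(bus_width_bits: int, chip_density_gb: int) -> int:
    """Total VRAM in GB for a given bus width and per-chip density."""
    chips = bus_width_bits // 32  # number of 32-bit GDDR6 chips on the bus
    return chips * chip_density_gb

print(vram_gb(192, 2))  # 12 -> matches the rumored RX 6700 XT (192-bit, 2 GB chips)
print(vram_gb(192, 1))  # 6  -> matches the rumored RX 6700 (192-bit, 1 GB chips)
```

So the two capacities don't require different buses, just different-density chips on the same 192-bit interface.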
TimmyP:

Yes. You are missing the denial.
The denial of what exactly, friend? ^^
Chess:

Controversial, but I don't understand people wanting RTX when not using HDR and 10-bit, colour-correct capable monitors.
As if you needed 10-bit and HDR to see dynamic reflections, more natural light, and shadows/AO. Your denial is strong. Tell me what part of a ray-traced image you can't see on an 8-bit SDR display that you can on 10-bit HDR? If anything, most HDR monitors are gimmicks; real HDR is HDR10, not some 400/600-nit brightness lie with an HDR sticker on it.
Having both a 3080 RTX and a 6900 XT, I can say that RT and DLSS are still shit this generation. A big improvement over the last generation, but still shit: unless you're playing at 1080p and lower your settings, there's no way you'll get decent FPS. Once you go 144 Hz+ you'll never want to play anything lower, and most games with RT + DLSS still give you garbage frame rates.

I'll tell you something about DLSS: even though big YouTube reviewers have done frame-by-frame analyses of DLSS and said there's very little quality loss, there's still a noticeable difference when DLSS is enabled and in motion. When I enable DLSS, my frames start to come close to or beat my 6900 XT, but there is a significant quality loss. I can sit down, start playing, and I know it's enabled because the picture in motion just doesn't seem right. The visual fidelity on the 6900 XT absolutely demolishes the 3080, even with DLSS on "Quality". I can achieve the same thing by lowering the in-game quality on my 6900 XT. In my opinion, I don't see the point of purchasing a 3080, or even a 3070, and playing with DLSS enabled, as you're paying top dollar for a premium product just to play at an inferior quality. Let's see what DLSS 3.0 does.
Daytona675:

Having both a 3080 RTX and a 6900 XT, I can say that RT and DLSS are still crap this generation. A big improvement over the last generation, but still crap.
This and last generation use the same DLSS. And why exactly do you need a 6900 XT to compare that?
cucaulay malkin:

This and last generation use the same DLSS. And why exactly do you need a 6900 XT to compare that?
I was referring to DLSS 1.0 and 2.0 -- they're both flagship cards that I own, so I'm able to give an unbiased, non-fanboy opinion of what I experienced.
cucaulay malkin:

As if you needed 10-bit and HDR to see dynamic reflections, more natural light, and shadows/AO. Your denial is strong.
Perhaps I wasn't very clear: people have had the option of superior image quality for years, through HDR10 and full-range colour, and barely used it, yet RTX is all the hype? (Yes, hype.) You DO want RTX, but don't care for deep blacks, true colours and clear whites? Weird, from my point of view. I'm not in denial; I'm trying to look at it from a different angle, in this place and time. I'll fully go for RTX the moment I can get a consistent 60+ fps at 4K, without selling a kidney.
cucaulay malkin:

Tell me what part of a ray-traced image you can't see on an 8-bit SDR display that you can on 10-bit HDR?
This has nothing to do with my statement. I'm not dissing the quality of RTX, I'm saying it's not mature enough...yet.
cucaulay malkin:

If anything, most HDR monitors are gimmicks; real HDR is HDR10, not some 400/600-nit brightness lie with an HDR sticker on it.
HDR is HDR indeed. Try and go full range or don't bother. I never said anything about it. RTX has the same problem. Take a look at this: [youtube=ShHQkuU9W2A] Around 9 minutes in. Now, this is a very limited case as he benches CP2077 only, but RTX brings some nice, accurate reflections; the game is gorgeous without RTX, though. So eh, I'd like my 50+ FPS performance hit back. It's not worth it at this time (alright, for CP2077).

I'm being indirectly painted as an AMD fanboi with this 'denial' thing. I do something true fanbois can't even fathom: I buy what suits my needs at the time. Check my GTX 1080 Ti. But I understand some people can't stand another opinion and will try to shut that person up. Just like religious fanatics.
Chess:

I'm being indirectly painted as an AMD fanboi with this 'denial' thing.
Not to me. But I don't know how one quality enhancement excludes the other. Not everyone wants to, or can, get an HDR10 display. I wanted a 24" 1440p with ULMB; that doesn't mean I should just ignore RT. I played Control with a 2070S and it looked ridiculously good, though what I liked most in that game was the environmental physics. I did play Metro with RT GI on, and if anything it was an interesting experience to see GI done correctly, not faked with local lights. It's not about whether one or the other looks nicer; for a PC gaming enthusiast, seeing ray-traced GI, reflections, shadows and ambient occlusion is just intriguing, at least to me.

Of course there is hype. It's ray tracing in games. Did you think it was going to go unnoticed? Your remarks don't seem fanboyish to me, but they do seem completely out of touch, including CP2077 RT not being worth it.

https://www.purepc.pl/misc/img_compare2.php?f1=/image/artykul/2020/12/15_cyberpunk_2077_pc_test_wydajnosci_kart_graficznych_w_ray_tracing_i_dlss_czego_potrzeba_do_grania_na_ultra_nc33_b.jpg&f2=/image/artykul/2020/12/15_cyberpunk_2077_pc_test_wydajnosci_kart_graficznych_w_ray_tracing_i_dlss_czego_potrzeba_do_grania_na_ultra_nc29_b.jpg
https://www.purepc.pl/misc/img_compare2.php?f1=/image/artykul/2020/12/15_cyberpunk_2077_pc_test_wydajnosci_kart_graficznych_w_ray_tracing_i_dlss_czego_potrzeba_do_grania_na_ultra_nc18_b.jpg&f2=/image/artykul/2020/12/15_cyberpunk_2077_pc_test_wydajnosci_kart_graficznych_w_ray_tracing_i_dlss_czego_potrzeba_do_grania_na_ultra_nc16_b.jpg

Of course non-ray-traced looks good. It's 2020 and we've had ways to make games look spectacular for a decade. But never have I seen ambient occlusion, contact shadows, dynamic shadows and reflections behave the way they do with RT when it's not faked one way or another.
Personally, I don't care about RT, unless it were a feature consuming from 1% to 5% of the total GPU usage. (Yes, 5% could be a lot, but I would probably still sacrifice it for RT, unless it's a game like Cyberpunk, where it's difficult to get ~80 FPS in some heavy areas and I wouldn't trade limited FPS for that.) I understand that for a lot of people RT is one of the most important things for the eye-candy level. But there are also a lot of people who think it's just an optional, very power-hungry bonus.
itpro:

Before people continue to buy AMD cards, they should demand an answer to DLSS. I am afraid to buy a red GPU if I must lower settings for viable performance.
You are a clown.