ASUS ROG Strix and TUF Gaming AMD Radeon RX 6800 Series Graphics Cards incl LCS


OK, this is exactly what I was looking for... in the RTX 3000 series! ASUS, wtf. This time around I would gladly switch to AMD, but the G-Sync module in my monitor says otherwise.
shalafi:

OK, this is exactly what I was looking for... in the RTX 3000 series! ASUS, wtf. This time around I would gladly switch to AMD, but the G-Sync module in my monitor says otherwise.
If you're not aware, many G-Sync monitors can be properly used by AMD GPUs. You might want to check the compatibility of yours.

LOL, is this because of the POSCAP / MLCC capacitor discussion?
Oh man, the pricing on that Strix LC is going to be insanely high. Strix cards are already one of the most expensive models on the market and they added liquid cooling on top of that?
schmidtbag:

If you're not aware, many G-Sync monitors can be properly used by AMD GPUs. You might want to check the compatibility of yours.
Sorry, should've probably mentioned that in the initial post - AW3418DW, G-Sync v1 module, Dell support stated they will NOT be upgrading the firmware on this one to support FreeSync as to not encroach on sales of the newer monitors (like the AW3420DW, which does support FreeSync as well). Yes, they really said it like that, f**kers.
Huggi:

Oh man, the pricing on that Strix LC is going to be insanely high. Strix cards are already one of the most expensive models on the market and they added liquid cooling on top of that?
Probably going to be the fastest card in that range, though.
shalafi:

OK, this is exactly what I was looking for... in the RTX 3000 series! ASUS, wtf. This time around I would gladly switch to AMD, but the G-Sync module in my monitor says otherwise.
This is the thing about Nvidia: they always try to artificially lock you into their stuff. Any Nvidia card can support AMD FreeSync because it's open source. Any Nvidia card can support the Vulkan API, which started out as AMD Mantle before it went open source. And now AMD is officially working on a cross-platform, free AI upscaler, while Nvidia keeps trying to enforce loyalty with their proprietary stuff (RTX, DLSS, G-Sync, etc.).
shalafi:

Sorry, should've probably mentioned that in the initial post - AW3418DW, G-Sync v1 module, Dell support stated they will NOT be upgrading the firmware on this one to support FreeSync as to not encroach on sales of the newer monitors (like the AW3420DW, which does support FreeSync as well). Yes, they really said it like that, f**kers.
Damn, that's low. But at the same time, it's surprisingly honest. They could have lied and said internal testing resulted in problems and thus they can't go forward with a new firmware.
This is on the right track. The best move with RDNA 2 is keeping two-slot capability. All other things being equal, lower power, less heat, and fitting ITX cases is a huge win, and that's just for the air-cooled models. I love the Nvidia uArch, but as I said for Turing, they are putting this one (Ampere) out at the wrong node: new power connectors, higher power, more heat, and it will ONLY fit m-ATX and ATX. I want an RX 6800 XT with a factory-supplied block (a la the old Poseidon), and if those won't come out (yet), I know all of the usual suspects will be supplying them. One slot FTW!
Aw man, as an owner of a pre-2019 G-Sync monitor, I'd love to see ASUS push out an RTX 3080 with that smexy AiO solution. =/
@Huggi/k3vst3r It's only a "crappy" (Asetek-based) CPU cooler converted for use on a GPU, so any card with similar hardware and a full block will perform better (my 2080 non-S with a block performed better than the 2080S with an AIO when overclocked). @tunejunky The connector is a non-issue, just as it was in the past when we switched from Molex, and I doubt you would say a word if it were on all brands/cards regardless of the make. And still some people ignore that more "power" doesn't equal more heat; besides, a 3070 matches a 2080 Ti while only needing a 550 W PSU (vs. the Ti). Over 90% of the market doesn't care about anything outside mATX/ATX, and it has nothing to do with Nvidia if manufacturers don't make one (and a small AIO or full block usually isn't a 3-slot design either).
Kaarme:

Damn, that's low. But at the same time, it's surprisingly honest. They could have lied and said internal testing resulted in problems and thus they can't go forward with a new firmware.
He obviously got through by accident to the "autistically honest" engineering dept and managed to bypass the filtered, lying-bastard marketing dept.
This is not a 3090, OK, but still: as Linus demonstrated, and as I know from my own custom loop, I wouldn't use less than a 360 mm radiator for a 300+ W GPU. Let's be serious: for a CPU putting out 2/3 at worst, more like 1/2, of those watts, we need a 360 mm or a 280 mm (running slightly hotter), but "magically" for a 300 W GPU a 240 mm is "fine"? Nope. An AIO GPU that still runs high-ish temps has no point. Edit: attached a screenshot so you know who's talking: 3DMark Time Spy Extreme, watercooled 1080 Ti, max GPU temp 41°C, and that's with QL120 fans on the radiators, some of the worst static-pressure fans out there (running at max speed here, which is ±1700 rpm, but making the noise of 1300 rpm "static pressure fans"). Ambient temp was 25°C ±1°C, so higher than in most Guru3D reviews and from most YouTubers too (no aircon and a real-life living room, not a giant storage hangar).
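The radiator-sizing argument above can be sketched as a quick back-of-envelope calculation. This assumes the common enthusiast rule of thumb of roughly 100 W of heat per 120 mm radiator section for quiet operation; that figure is an assumption, not a measured spec, and real results depend on fin density, fan speed, and ambient temperature.

```python
def sections_needed(heat_watts, watts_per_section=100):
    """Number of 120 mm radiator sections for a given heat load.

    Assumes ~100 W dissipated per 120 mm section at quiet fan speeds
    (a rule-of-thumb figure, not a manufacturer specification).
    """
    return -(-heat_watts // watts_per_section)  # ceiling division

# A ~300 W GPU calls for 3 sections (360 mm) by this rule, which matches
# the poster's point that a 240 mm AIO leaves a 300 W card running warm.
print(sections_needed(300) * 120)  # 360 (mm of radiator)

# A ~150 W CPU gets by with 2 sections (240 mm), or 280 mm running cooler.
print(sections_needed(150) * 120)  # 240 (mm of radiator)
```

By this estimate the 240 mm AIO on the Strix LC is undersized for a fully loaded RX 6800 XT, which is exactly the poster's complaint.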
Alexraptor:

Aw man, as an owner of a pre-2019 G-Sync monitor, I'd love to see ASUS push out an RTX 3080 with that smexy AiO solution. =/
I suspect that when Asus saw the availability of the RTX 3080, or the lack thereof, they gave up on such custom or extreme cooling endeavors for the RTX 3080. I have a good feeling that when Asus saw that AMD doesn't have this issue, they moved forward with an LCS line of 6800 XTs.