Gigabyte Aorus GeForce RTX 3080 XTREME review


I'd consider one in March, any 3080. (Looking for a smaller size though; this looks ideal.)
Oh my, 400W! Thicc hungry boy.
It does have Xtreme in its title.
My current card pulls 380w.
Oof, 400 watts 😱 I mean, I knew it was going to be high, since my slightly overclocked 1080 Ti already reaches ±300 watts, but still, reading four hundred is heavy, and it only pushes me further towards a water-cooling block or a prebuilt (like EK or Aorus). I'm not exhausting 400 watts into the center of my case, and definitely not onto a TRX40 or X570 chipset; they don't need the added heat ><

Edit: to be perfectly clear, I do not expect performance to come for free. I've tuned cars, so I know power comes at a cost (way cheaper to tune PCs, by the way; I was basically throwing away a high-end PC config every year in maintenance and parts).
Look Toto, it's the size of Kansas! ;)
Is this Gigabyte Aorus Xtreme 10GB already discontinued? According to one UK site I saw today, it is. Maybe Gigabyte have already stopped making it and are now making the Aorus Xtreme 3080 Ti version instead...
geogan:

Is this Gigabyte Aorus Xtreme 10GB already discontinued? According to one UK site I saw today it is.
Nah, it's just out of stock. They sold all 13 of them. 😀
Monitoring this in GPU-Z, I've seen it go as high as 450 watts with the new updated BIOS from Gigabyte. That's boosting to 2100 MHz without touching the clocks. It's a power hog for sure, but it runs nicely.
I've finally got a new gaming computer (after 7 years of EVGA 780s in SLI) and I'm delighted it includes an ASUS ROG Strix 3080 OC, as it's far better looking than this Gigabyte lump of dull grey. The ASUS card has a Performance/Quiet switch on the upper edge of the card, and if you download the software from the ASUS website, OC Mode gives you a 1935 MHz boost (Performance is 1905 MHz).

And thanks to the Amazon 3-week Black Friday sale, I've finally updated my 10-year-old Dell U2711 1440p monitor to an ASUS ROG Swift PG35VQ 35" 3440 x 1440 with 200 Hz refresh, G-Sync Ultimate & HDR1000.

Gaming computer: Cooler Master H500M mesh front, i7 10700K at 5 GHz, 32GB Corsair Dominator RGB 3200, ASUS 3080 OC, ASUS Thor 850W Platinum with OLED screen & RGB, Hero XII mobo, Corsair AIO 360 Pro XT.

Sorry, after 10 years with the Dell and 7 years with the previous gaming computer, I'm a bit overwhelmed with the excitement of it all. It took me all of 10 seconds to adapt to an ultrawide monitor lol.
I dunno if it's even worth the premium for these factory-overclocked cards. I got a standard Palit Gaming Pro 3080 that clocks way over 2100; I hit 2220 running Port Royal before I started seeing crashes. I am watercooling, though.
ChisChas:

updated my 10-year-old Dell U2711 1440p monitor to an ASUS ROG Swift PG35VQ 35" 3440 x 1440 with 200 Hz refresh, G-Sync Ultimate & HDR1000.
Nice! I just recently upgraded my Dell U2715 to a new ASUS TUF Gaming VG27AQL1A monitor, which has G-Sync, 144 Hz, and HDR, and I definitely think the Windows HDR mode in games is the single biggest improvement I have seen in years. Things like Doom Eternal and especially Cyberpunk look amazing in HDR... it really makes a huge difference to the games compared to flat SDR. And mine only goes up to about 400 nits... can't imagine what 1000 nits would be like on those ultimate displays! I thought 400 nits would be rubbish, but it's not.

One other unrelated thing about the review... I've asked this before and never got a reply, so I'll ask it again. The Time Spy results page scores make no sense. In the "Graphics Score" graph, the 3090 is way higher than the Aorus:

3090: 20144
AORUS: 18127

But yet in the overall graph (I presume this is the overall score), the 3090 scored LOWER than the Aorus:

3090: 16796
AORUS: 17086

The only way the 3090 could be lower overall in this case would be if the CPU sub-score (not shown) was WAY lower during that run. But the CPU/motherboard used should be identical in both cases. Now I know for a fact that 3DMark weighs the GPU result much higher than the CPU result to generate the overall score, so there is no way the CPU difference could account for this. So what's going on here? Am I missing something?
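For what it's worth, you can back-solve the CPU sub-score those numbers would imply. This sketch assumes the weighting published in 3DMark's technical guide (the Time Spy overall score is a weighted harmonic mean of the graphics and CPU sub-scores, with weights of roughly 0.85 and 0.15; treat the exact weights as an assumption here):

```python
# Back-solve the CPU sub-score implied by a Time Spy overall score.
# Assumed formula (weighted harmonic mean, per the 3DMark technical guide):
#   1 / overall = w_gpu / graphics + w_cpu / cpu

def implied_cpu_score(overall: float, graphics: float,
                      w_gpu: float = 0.85, w_cpu: float = 0.15) -> float:
    """Solve the harmonic-mean formula for the CPU sub-score."""
    return w_cpu / (1.0 / overall - w_gpu / graphics)

# Scores quoted in the post above
print(round(implied_cpu_score(17086, 18127)))  # AORUS run
print(round(implied_cpu_score(16796, 20144)))  # 3090 run
```

With those weights, the AORUS run implies a CPU sub-score around 12,900 while the 3090 run implies only about 8,650, so the published numbers are at least internally consistent with a much weaker CPU result during the 3090 run, whatever caused it.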
kakiharaFRS:

oof 400 watts 😱 I mean I knew it was going to be high since my 1080ti slightly overclocked already reaches +-300watts but still reading four hundred that's heavy and only comforting me into going with a watercooling block/prebuilt (like EK or Aorus) I'm not exhausting 400watts at the center of my case and definitely not onto a TRX40 or X570 chipset they don't need the added heat >< edit : to be perfectly clear I do not expect performance to come for free, I've tuned cars I know power comes at a cost (way cheaper to tune pcs btw I was basically throwing away a high-end PC config every year in maintenance and parts)
You could stick some fans in your case 😛
Yikes. 400W is nothing to sneeze at... My current 1080 AORUS is great and quite silent. I will probably go for a 3070 (or Ti if it happens) at most, though. At least it seems to exhibit less coil whine than what seems to be the "standard" these days...
Don't think I've ever seen a GPU this thicc in recent memory. Looks great; I really like the Aorus from Gigabyte and the Strix from Asus this generation. It's either one of these or maybe a TUF that I'll go for when I finally manage to buy one...
I think we both bought ourselves an early Christmas present (or two or three, in my instance). Amazon saved me £700, so thanks Jeff...
My ASUS ROG Strix 3080 OC is billed as 2.9 slots, so yes, the highest-performing air-cooled 3080 cards are thick.
DannyD:

I'd consider one in March, any 3080. ( looking for smaller size though, this looks ideal)
You should.
Freeman:

You should.
Oh, for sure, especially seeing as CP2077 uses just 8 GB at 1440p.
AlmondMan:

You could stick some fans in your case 😛
Ahah, I guess you could think that, except I already have 10x 120mm fans and a triple-radiator CPU+GPU custom loop now (upgraded from bottom+front rads with an air-cooled GPU after seeing a block on sale).

I have one thing to add, though, about my original comment that is relevant to this video card review. I checked the EK/ASUS and Aorus prebuilt water-cooled models, and something struck me: both brands, even on the 3090s, only have 2 PCIe connectors o_O Also read somewhere that the blocks used on the EK/ASUS apparently had an older design (re-purposed RTX 2000 stock?). Seems really weird when you see this Aorus reaching 401 watts (above what 2 PCIe connectors should be seeing), and we know the 3090 can reach 400+.

Removed all waterblocked cards from my wishlists. Clearly, buying an "extreme" card like this one and adding a water block yourself (or leaving it air-cooled) is the way to go if you want no-limits performance.
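For context on why two connectors looks tight against a 401 W reading: the PCI Express electromechanical spec rates the x16 slot at 75 W and each 8-pin auxiliary connector at 150 W, so a two-connector card is only rated for 375 W in total (these are spec ratings, not hard electrical limits). A quick sketch of that arithmetic:

```python
# Rated power budget for a PCIe graphics card, per the PCI Express
# electromechanical spec: 75 W from the x16 slot, 150 W per 8-pin connector.
# (Spec ratings, not hard electrical limits.)

PCIE_SLOT_W = 75
EIGHT_PIN_W = 150

def rated_budget(n_8pin: int) -> int:
    """Total rated draw for a card with n_8pin 8-pin PEG connectors."""
    return PCIE_SLOT_W + n_8pin * EIGHT_PIN_W

print(rated_budget(2))  # 375 W: below the 401 W peak measured in this review
print(rated_budget(3))  # 525 W: what a triple-connector card is wired for
```

Cards routinely exceed the 8-pin rating in practice since the connectors have electrical headroom, but it explains the poster's unease about two-connector water-cooled models.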