ASUS GeForce RTX 3090 STRIX OC Review


Just a question: do you guys ever review video cards negatively? I've been coming here for more than a decade, and everything is "recommended" and "top pick". Seems quite unrealistic, especially for products with such a horrible performance/price ratio.
Now we know what JHH meant by his "starting at" $1499 price quote... as this one is $1799, apparently. Nope, not interested. Goodbye. ..click... dial tone.... I'll wager an Nvidia AIB will hit with a $2K version at some point. Unless this thing comes with a crapload of professional CAD & 3D rendering software, I can't see the value here! But it is far better looking than the cheaper (maybe) FE version!
LOL at these prices for Nvidia. And it's OK if Nvidia has high power consumption, but when AMD does it, there's massive shit talking and they are trash.
Agonist:

LOL at these prices for Nvidia. And it's OK if Nvidia has high power consumption, but when AMD does it, there's massive crap talking and they are trash.
People tend to be more tolerant of ovens in their PCs if those ovens actually provide the best performance, which cards like the Vega 64 or the Fury X did not. And Nvidia was trashed really hard for its own hot cards at least twice; the FX 5800 and GTX 480 come to mind.
toyo:

Just a question: do you guys ever review video cards negatively? I've been coming here for more than a decade, and everything is "recommended" and "top pick". Seems quite unrealistic, especially for products with such a horrible performance/price ratio.
First, let me say this was an excellent review, Hilbert! With that said, if I may, I want to address what you (toyo) have asked. When Guru3D reviews a product (such as this ASUS Strix graphics card), they look at what the manufacturer promotes as its capabilities. The testing process that follows reveals whether the product can perform at the expected level. If the tests show that the product meets or exceeds the specified performance level, then it certainly deserves the "top pick" designation. When Guru3D gives the ASUS GeForce RTX 3090 STRIX OC graphics card a "top pick" award, they are not telling everyone to go out and buy one. They are saying that if you are interested in it, you will not be disappointed. In other words, the card (based on the testing results) can truly perform at the level ASUS says it can.
Are these GPUs with RGB lighting compatible with iCUE, for example?
toyo:

Just a question: do you guys ever review video cards negatively? I've been coming here for more than a decade, and everything is "recommended" and "top pick". Seems quite unrealistic, especially for products with such a horrible performance/price ratio.
This product is not about performance/price ratio. This is the product you buy if you know what it is 😉 It's not for the average pleb 🙂
Agonist:

... And it's OK if Nvidia has high power consumption, but when AMD does it, there's massive crap talking and they are trash.
Problem is, when AMD came out with the Vega 64, its horrible power consumption had nothing to show for it. Performance was on par with the older GTX 1080, and that was a 180 W card vs. the Vega 64's 300 W+. If the Vega 64 had commensurate performance, we would be cheering it on. https://www.guru3d.com/articles_pages/amd_radeon_rx_vega_64_8gb_review,8.html Also, the 3090 is more of a luxury or 'statement' card. If money is a concern, the 3080 fills that spot at nearly the same performance. Amazingly, despite the 3090's power draw, it's the best perf/watt card ever made! [spoiler] https://tpucdn.com/review/asus-geforce-rtx-3090-strix-oc/images/performance-per-watt_3840-2160.png [/spoiler]
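For anyone wondering how a perf/watt chart like that is derived: it's just average frame rate divided by board power. A quick sketch of the arithmetic; the fps figures below are placeholder numbers picked to illustrate the Vega 64 vs. GTX 1080 point above, not measured benchmark data:

```python
# Illustrative performance-per-watt comparison.
# The fps values are made-up placeholders; the power figures echo
# the rough 180 W vs. 300 W+ numbers mentioned in the thread.
cards = {
    "GTX 1080":   {"avg_fps_4k": 60.0,  "board_power_w": 180.0},
    "RX Vega 64": {"avg_fps_4k": 60.0,  "board_power_w": 300.0},
    "RTX 3090":   {"avg_fps_4k": 110.0, "board_power_w": 350.0},
}

for name, c in cards.items():
    perf_per_watt = c["avg_fps_4k"] / c["board_power_w"]
    print(f"{name}: {perf_per_watt:.3f} fps/W")
```

With equal frame rates, the card drawing 300 W instead of 180 W scores proportionally worse on this metric, which is the whole Vega 64 complaint in one division.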
Thank you for the nice review. What really impresses me about this card is the insane compute performance: 2x faster in V-Ray and Blender, and even approx. 1/3 faster in Indigo compared to a 2080 Ti. That's a big step, but no wonder if you look at the new chip design and memory.

For gaming only, I would never pay double for this little fps gain compared to a 3080. If you have the money, I would consider buying 2x 3080s instead of one 3090, and if you really don't have to care, buy even two 3090s if you can't wait for an RTX 6000, which is even more expensive than 2 or 3 3090s. But then you are in a completely different domain... Nevertheless, whatever you want or are looking forward to, you simply can't get it, because there is no stock; it's just crazy, and that's another story...

What I'm missing a bit in the review, especially for this kind of card, is the software support/driver compatibility for developers/creators who can't afford, or just don't want to buy, a Quadro setup or the upcoming RTX 6000. But maybe that would be a topic for another kind of review, aimed at non- or occasional gamers.

From the teardown pictures, I think ASUS did a great job on the board, especially for OC: good layout, phase design and components, and easy-to-reach points for voltage checking or connecting an ElmorLabs EVC. If you have fun with OC, this is a card to go for; the fact of having 3x 8-pin power connectors alone is just crazy. Not that I would do this heavy modding to a new, utterly overpriced card during the warranty period, but maybe in two years, when the RTX 3090 is old and gray? Maybe then there will be stock, or you can buy one used for half price, just like a 2080 Ti is supposed to be now.

edit: my 2x R9 280X are reaching merely 13,000 in Indigo Supercar
edit2: okay, to be precise: 13,113 after a second bench just now
Agonist:

LOL at these prices for Nvidia. And it's OK if Nvidia has high power consumption, but when AMD does it, there's massive crap talking and they are trash.
Because they say AMD users are average plebs. That's why. They have the authority to trash-talk us and AMD.
itpro:

Because they say AMD users are average plebs. That's why. They have the authority to trash-talk us and AMD.
You don't like being an average pleb? 😉
Is there a huge gap in performance between the 3090 Strix and the 3080 Strix? I've pre-ordered a 3090, but I doubt I'll get it for a long time, and I was thinking about changing my order, but I can't find any comparison between the two ASUS cards.
Can't believe this card has so little OC headroom out of the box. This card generation is a whole different paradigm for Nvidia cards.
asryan:

Is there a huge gap in performance between the 3090 Strix and the 3080 Strix? I've pre-ordered a 3090, but I doubt I'll get it for a long time, and I was thinking about changing my order, but I can't find any comparison between the two ASUS cards.
You are more likely to get a 3090 quickly (though not the 3090 Strix, as that is the one 90% of 3090 buyers are going for) than a 3080... there is a huge number of backorders on the 3080.
I'm pretty sure these video cards don't exist.
waltc3:

Now we know what JHH meant by his "starting at" $1499 price quote... as this one is $1799, apparently. Nope, not interested. Goodbye. ..click... dial tone.... I'll wager an Nvidia AIB will hit with a $2K version at some point. Unless this thing comes with a crapload of professional CAD & 3D rendering software, I can't see the value here! But it is far better looking than the cheaper (maybe) FE version!
https://i.imgur.com/OdNJ6TB.jpg https://i.imgur.com/Kda7F8l.gif
I don't care if it has four 8-pin connectors. I want it. In fact, I want two of them. And I want to play extreme solitaire with forced SLI and rainbow RGB just to up the intensity of EXTREME solitaire! **slams fist on desk**
brogadget:

... If you have the money, I would consider buying 2x 3080s instead of one 3090, and if you really don't have to care, buy even two 3090s if you can't wait for an RTX 6000, which is even more expensive than 2 or 3 3090s. ...
Bro, you can't run 2x 3080s in SLI; it is not supported by Nvidia on the 3080, only on the 3090. Which does not make sense, as if SLI is now totally exclusive or something. But SLI is dead soon anyway. Better to stick with either a 3080 for gaming or a 3090 for compute/rendering/gaming. That is, if you ever find one at the right price. Availability on either card is very slim; I don't think we will get more stock this year, I presume Q1-Q2 next year. I was just lucky enough to get my 3080. So far very happy with it (gaming-wise at 4K, that is).
BReal85:

Wut? The Strix model doesn't work passively in idle mode? :O
Worst idea ever, "passive cards"; more like pre-heated cards. Fans at the lowest RPMs make zero noise and can easily keep your GPU at <30°C while keeping everything on the card cool. Passive cards end up at a 40-50°C base temp while drawing more power, because hotter = more watts. During the summer my 1080 Ti was around 28-30°C with fans at 40% (minimum), around 1000 RPM; when autumn started it was down to 26°C, and now 22°C at 12 watts, lol. But well, I opened the windows, that's cheating 😛 Anyway, you get the idea: fans at low RPM aren't a problem. My crazy custom loop with 3 rads and 10 fans makes almost zero noise; unless you stand still in the middle of the night, you won't hear it.
kakiharaFRS:

Worst idea ever, "passive cards"; more like pre-heated cards. Fans at the lowest RPMs make zero noise and can easily keep your GPU at <30°C while keeping everything on the card cool. Passive cards end up at a 40-50°C base temp while drawing more power, because hotter = more watts.
Not true. They will draw "less" power because the fan isn't running. Any temp increase is due to less airflow, not because more power is being used. And these cards won't idle at 40 to 50°C unless your case airflow is terrible. Advantages of no fan at idle: lower noise, less fan wear, and less dust build-up on the fan and heatsink.
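For what it's worth, the zero-RPM idle mode being debated here is typically implemented as a fan curve with hysteresis: the fans stay off until the GPU crosses an upper threshold and only stop again once it cools well below it, so they don't flutter on and off around a single trigger point. A minimal sketch of that logic; the 45/55 °C thresholds are made-up illustration values, not ASUS's actual firmware settings:

```python
# Sketch of a zero-RPM ("fan stop") controller with hysteresis.
# Thresholds are illustrative, not the Strix firmware's real values.
FAN_START_C = 55  # fans spin up at or above this temperature
FAN_STOP_C = 45   # fans turn back off once the GPU cools below this

def next_fan_state(temp_c: float, fans_on: bool) -> bool:
    """Return whether the fans should be spinning after this sample."""
    if fans_on:
        # Keep spinning until the temperature drops below the stop point.
        return temp_c >= FAN_STOP_C
    # Stay off until the higher start threshold is reached.
    return temp_c >= FAN_START_C

# Walking a temperature trace through the controller:
state = False
trace = [40, 50, 56, 52, 47, 44, 50]
states = []
for t in trace:
    state = next_fan_state(t, state)
    states.append(state)
# states -> [False, False, True, True, True, False, False]
```

The gap between the two thresholds is the whole point: a card idling at 50 °C with this curve keeps its fans off, which is the "pre-heated card" trade-off the posts above are arguing about.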
Mufflore:

Not true. They will draw "less" power because the fan isn't running. Any temp increase is due to less airflow, not because more power is being used. And these cards won't idle at 40 to 50°C unless your case airflow is terrible. Advantages of no fan at idle: lower noise, less fan wear, and less dust build-up on the fan and heatsink.
I agree, but some cards do idle higher than others. Even idling at 45-50°C, I would still find that preferable to the fan always being on, for the reasons you mentioned, especially less dust build-up.