GALAX GeForce RTX 3090 Hall Of Fame (HOF) (updated)

Seems a waste to use all the best components and then ruin it with supposedly inferior POSCAPs. Having said that, I've had zero issues with mine at all.
$2499 + scalper fee confirmed 😎 26 phases... such a waste of circuitry components. Wake me up when high-end cards only consume 250 W again, and not 400 W out of the box.
Devid:

$2499 + scalper fee confirmed 😎 26 phases... such a waste of circuitry components. Wake me up when high-end cards only consume 250 W again, and not 400 W out of the box.
Nice peasant mentality.
Agonist:

Nice peasant mentality.
Yes, sure; since when are we OK with 400 W+ TDP/TGP? Also, with the 'quality' of your comment, I suggest you go to Wccftech, mate.
Quick review of the KFA2 (which is Galax) base 3090:
- Annoyingly noisy fans.
- 4 fat + 2 clusters of capacitors (same as the HOF).
- Low-quality, super rough surface on the GPU plate; the cooler is too cheap for the price range, but it "does its job".
- The thermal pads seem higher quality than previous gens; they're an in-between of paste and pads (a pain to remove if you plan to liquid cool, so I strongly suggest you buy a pre-built card).
- It's constantly at the 345-355 W limit if you try 4K or max details at 1080p.
- Benchmarks are +60-100% over a 1080 Ti. The performance gap 😱 😱 😱 I went from 120 fps on ultra in Borderlands 3 to 260 fps on Badass; it's insane. I limited the fps to 120 so my PC doesn't become an oven (turns out rendering 260 fps generates more heat than 120, who would have guessed, lol).
- The card is so powerful that it's almost running at idle frequency for older or indie games. 600-800 MHz? That's enough, apparently.
- You have so many fps that the cost of NVENC streaming or encoding is irrelevant; I wouldn't know if I had an fps loss at 1080p/1440p.

Mine's limited to 350 W, which I thought would suck... (EK-ASUS and Aorus only had 2 connectors at the time I bought mine; this has now partially changed.) In the end I don't mind it at all, because it's too much already. I get 49°C max with an acceptable fan/noise ratio (still lower than an air-cooled card would produce alone), but I don't like the amount of heat it puts into the loop; I wouldn't want more. If you buy a 400+ W air-cooled card, it's going to be hell inside your case. I held my fingers in the air 2-3 centimeters from the backplate of the air-cooled 350 W KFA2, around the lower part of the memory chips, and had to pull them away; I would have been burned by the boiling hot air. I had ESD gloves on, btw... not the best insulators, but it would have been worse without them. I wouldn't be surprised if the memory chips reach 80°C. People who don't want, or are afraid of, the "too many watts" are right to be.
kakiharaFRS:

quick review of KFA2 (which is galax) base 3090 : […]
Certainly toasty on air. My case is huge with lots of fans, but it made it very warm in there. It really needs to be liquid cooled to keep things sane, unless it's in an open case.
Devid:

Yes, sure; since when are we OK with 400 W+ TDP/TGP? Also, with the 'quality' of your comment, I suggest you go to Wccftech, mate.
If you're crying about anything above 250 W, high-end GPUs are not for you. Maybe stick to mid-range GPUs or consoles. So by your logic, I should be upset my overclocked 5700 XT pushes 250 W for $389. The quality of my comment is higher than the quality of your reasoning. Love a new softy telling someone who's been on here for 13 years where to go.
Agonist:

If you're crying about anything above 250 W, high-end GPUs are not for you. Maybe stick to mid-range GPUs or consoles. So by your logic, I should be upset my overclocked 5700 XT pushes 250 W for $389. The quality of my comment is higher than the quality of your reasoning. Love a new softy telling someone who's been on here for 13 years where to go.
I own an RTX 2080 Ti, and that has a 250 W TDP; the GTX 1080 Ti I had before was 250 W again, and I could give a nice list of cards I've owned over the last 22-23 years. Ampere draws loads of unnecessary power due to its bad design and/or the Samsung 8 nm process. That's why the OC-class cards are fully packed with VRM phases and a lot more, just to feed these GPUs. Like it or not, high-end cards are not supposed to draw this much power out of the box the way Ampere does. And that's just one part of it: the TGP is 320 W on them, but in reality they consume a lot more, with peaks above 400 W. I do understand that Kingpin and HOF etc. cards are usually over-engineered because they're all made for OC records and such, but the basic problem here is the Ampere architecture. To be honest, the fact that you've been here a long time shouldn't give you license to talk to people the way you did to me.
Devid:

I own an RTX 2080 Ti, and that has a 250 W TDP; the GTX 1080 Ti I had before was 250 W again, and I could give a nice list of cards I've owned over the last 22-23 years.
I can't agree when you say it "shouldn't use that much power"; you don't realize how much of a leap this generation is, probably because you're coming from a 2080 Ti, where it's less obvious?! For me it's very obvious 😱
- My MSI 1080 Ti Gaming X 11G was, at stock, more in the 280 W range than the announced 250 W, and when overclocked it easily reached 320 W (peak). Only reference cards follow the TDP; custom ones always went way above it.
- My Galax/KFA2 3090 (reference design), power limited with 2 connectors, reaches 345-355 W (not peak this time...).
- I get +100% fps in most games for around +24% more power. Is that really so bad? You're putting too much faith in node size; it's not magic. In my attached screenshots, max power was 267 W on the 1080 Ti vs 352 W on the 3090.
- Don't only check the resulting fps; check the little yellow boxes showing the GPU fps too, because the game is CPU-limited at this stage. The 3090 doubles the benchmark scores and the actual fps in most games I play, unless the CPU limits them (my modded Skyrim gained maybe 10-20 fps, but it has a billion scripts running in the background).

You reminded me that I completely forgot to check the power graph when benchmarking. I sadly don't have any from the 1080 Ti to compare to, but I think I know what I'm going to see: a horizontal 350 W line, lol. (Like I said earlier, I'm happy the card is power limited for once, and I don't even want to try to overclock it.)
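The fps-per-watt claim above can be sanity-checked with a few lines of arithmetic. This is just a sketch plugging in round numbers from the thread (roughly 280 W observed on the stock 1080 Ti, the ~350 W limit on the 3090, and about 2x the fps); the helper function is made up for illustration, not from any tool mentioned here:

```python
# Rough perf-per-watt comparison using figures quoted in the thread
# (assumed: ~280 W observed on a stock 1080 Ti, ~350 W limit on the 3090, ~2x fps).
def perf_per_watt_gain(fps_ratio, old_watts, new_watts):
    """Return the relative perf/W improvement, e.g. 0.6 means +60%."""
    power_ratio = new_watts / old_watts
    return fps_ratio / power_ratio - 1.0

gain = perf_per_watt_gain(fps_ratio=2.0, old_watts=280, new_watts=350)
print(f"power increase: {350 / 280 - 1:.0%}, perf/W gain: {gain:.0%}")
# prints: power increase: 25%, perf/W gain: 60%
```

Under these assumed numbers, doubling the fps for a quarter more power works out to roughly 60% better efficiency, which is the poster's point: absolute draw went up, but perf/W still improved.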
Devid:

$2499 + scalper fee confirmed 😎 26 phases... such a waste of circuitry components. Wake me up when high-end cards only consume 250 W again, and not 400 W out of the box.
Guess you should have invested in a better PSU?? Maybe.....
suty455:

Seems a waste to use all the best components and then ruin it with supposedly inferior POSCAPs. Having said that, I've had zero issues with mine at all.
Good thing that was never the real problem, but a driver issue.
kakiharaFRS:

quick review of KFA2 (which is galax) base 3090 : […]
I just recently built a loop for a friend who's using the same KFA2 RTX 3090 SG you probably have. I never tried the stock cooler on the GPU, so I literally can't comment on the fan noise. If you're hitting the power limit, my recommendation is the KFA2 390 W BIOS, which I used on my friend's build. Temperatures with a 360 mm slim radiator plus a 240 mm radiator on the bottom, and a 10900K OC'd to 5.3 GHz, are 42-46°C with the fans spinning at just 800 RPM. We used a Bykski RTX 3090 waterblock, and I must say I'm very impressed by it; the same block on my Palit RTX 3090 GamingPro reaches 36-38°C max with two other RTX 2080 Tis in the loop, and power draw while rendering is around 1200-1350 W. If you think 350-375 W is too much, then I'm sure the XOC 1000 W BIOS would scare you, haha. People use that BIOS on their air-cooled GPUs with no issues, although I would be very cautious with it; I only use it for benchmarks and some rendering. Regarding the KFA2 RTX 3090 SG: they're not bad if you put them under water. Hope this helps. Thanks, Jura
DeskStar:

Guess you should have invested in a better PSU?? Maybe.....
Yes, mate, definitely; I own an EVGA SuperNOVA 850 W P2. I'll throw it in a bin just for your sake 😎 Let's have a word about enjoying gaming in the middle of summer with these space heaters, then. o_O I know, I know, air con; I've got that too, before anyone asks.
kakiharaFRS:

cannot agree when you say "shouldn't use that much power" […]
Hold on a sec! You're comparing your Ampere RTX 3090 (a Titan-class card) to an old Pascal GTX 1080 Ti. Of course there's a big leap; you skipped a generation. And you're comparing an AIB card to a reference one and back and forth. I was only talking about reference cards in general and their real-life/game/peak power consumption, and this is exactly where Ampere loses.
Devid:

Yes, mate, definitely; I own an EVGA SuperNOVA 850 W P2. I'll throw it in a bin just for your sake 😎 Let's have a word about enjoying gaming in the middle of summer with these space heaters, then. o_O I know, I know, air con; I've got that too, before anyone asks.
Right on the cusp of not being good enough. 1 kW+ PSUs for me for life. Shoot, I have three 1.5 kW PSUs now and was pissed I had to save some scratch on a 1 kW recently. Need a bigger PSU...
TheDeeGee:

Good thing that was never the real problem, but a driver issue.
That driver lowered the clocks.
Agonist:

Nice peasant mentality.
So wanting GPUs to draw less power instead of acting as secondary room heaters is peasant mentality now. What an interesting line of thought, especially coming from someone who seems to think that seniority on this forum means newcomers aren't allowed to express their opinions anymore. How about toning the gatekeeping down a notch, eh? Nobody benefits from it, aside from you, apparently.
For me... any hardware HOF means "really Hard tO Find"...
The advertised frequency seems low for an HOF "all-out" card... my basic KFA2 3090 runs 1900+ MHz. Okay, I watercooled it, but the power limitation is real: I played for 4 hours and it was a horizontal line at around 350 W on the AIDA graph.
Even with all the climate-propaganda-induced extra we pay for electricity, it's still cheap enough for me to ask not for 400 W cards but for 1 kW+ ones. Bring it on!