MSI GeForce RTX 3080 GAMING X TRIO review


CPC_RedDawn:

Would this card (or any 3080 for that matter) that uses 3x 8pin connectors still boot and work with just two connectors attached? I remember years ago a graphics card that had 3x 8pin and you didn't need the extra 8pin as it was meant for overclocking the card on like LN2 or something. Can't remember what card it was though....
It was an R9 290X Lightning, with 2x 8-pin and 1x 6-pin, and yeah, it booted just fine with only the first two connected.
So what's the actual max power limit on this thing? Is 340W as high as it goes?
That's pretty weak for a 3-connector card. I assumed it would be at least 375W.
ttnuagmada:

That's pretty weak for a 3-connector card. I assumed it would be at least 375W.
Push the power limit up and I'm sure it'll eat more.
schmidtbag:

Although I doubt this card actually needs all 3 connectors (especially if you intend to undervolt), I doubt it'll work without all of them. You might be able to trick it, though, by bridging the sense pin.
These guys show it pulling 425W in Furmark, and their gaming power draw is very close to Guru3D's. Each 8-pin PCIe connector is rated for 150W, with 75W for the PCIe slot, so with two connectors you are only at 375W max. So yeah, it looks like the third 8-pin power connector is needed. I do realize Furmark is a really tough load, but it blew past the two 8-pins by 50W, so I'm expecting certain other workloads to get close to or go over 375W. https://www.techpowerup.com/review/msi-geforce-rtx-3080-gaming-x-trio/29.html
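[Editor's note: a quick sketch of the power-budget arithmetic in the post above, using the connector ratings from the PCIe spec (150W per 8-pin, 75W from the slot) and the 425W Furmark reading from the linked TechPowerUp review:]

```python
# Back-of-the-envelope PCIe power-budget arithmetic for the figures above.
PCIE_SLOT_W = 75   # PCIe x16 slot, per spec
EIGHT_PIN_W = 150  # each 8-pin PCIe auxiliary connector, per spec

def board_power_budget(num_8pin_connectors: int) -> int:
    """Maximum in-spec board power for a card with N 8-pin connectors."""
    return PCIE_SLOT_W + num_8pin_connectors * EIGHT_PIN_W

two_connector_budget = board_power_budget(2)    # 375 W
three_connector_budget = board_power_budget(3)  # 525 W

furmark_draw_w = 425  # TechPowerUp's Furmark figure for this card
print(two_connector_budget, three_connector_budget)
# 425 W exceeds the 375 W two-connector budget, hence the third 8-pin.
print(furmark_draw_w > two_connector_budget)
```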
I think from Ampere on, we can kiss OC'ing goodbye. It's just not worth the measly 2-3 fps for the added power draw. I suspect the same will be true for AMD. The heightened competition between them has squeezed out any headroom for OC'ing.
JamesSneed:

These guys show it pulling 425W in Furmark, and their gaming power draw is very close to Guru3D's. Each 8-pin PCIe connector is rated for 150W, with 75W for the PCIe slot, so with two connectors you are only at 375W max. So yeah, it looks like the third 8-pin power connector is needed. I do realize Furmark is a really tough load, but it blew past the two 8-pins by 50W, so I'm expecting certain other workloads to get close to or go over 375W. https://www.techpowerup.com/review/msi-geforce-rtx-3080-gaming-x-trio/29.html
Yeah, I guess so. Though under most real-world tests, I don't think it'd be so high. They definitely could've got away with 2x 8-pin and 1x 6-pin. I was hesitant about getting my R9 290 because I was wary of its high max power draw (I had a different, weaker PSU at the time), but ignoring the synthetic loads and only looking at power consumption in games, its wattage was actually relatively low compared to the GTX 970's. I undervolted a bit just to be on the safe side, too. I never had any problems, despite my less-than-ideal PSU situation at the time. So I figure for anyone who keeps v-sync on, lowers the voltage, and doesn't run burn tests, the 3080 could probably run on 2x 8-pin. I wouldn't advise it, but I'm sure you could do it if you were in a pinch.
Think there's any big difference between the X Trio and the Ventus? I only managed to secure a Ventus, and I'm not sure if I should regret it and wait for a better version. Silence really is everything to me.
Not sure if I missed this somewhere in the review or this thread, but was the 359W power draw with an overclock, or stock?
Olfert Hansen:

Think there's any big difference between the X Trio and the Ventus? I only managed to secure a Ventus, and I'm not sure if I should regret it and wait for a better version. Silence really is everything to me.
My previous card was a Ventus (I currently have an X Trio), a cheap-looking 2070 Super, but the 30-series Ventus cards are totally different. In the 20-series there was indeed a decent difference between the cards: the Ventus was capped at a 100% power limit, you couldn't push it any higher, and the fans were poor. The 30-series Ventus is a totally different animal, though.
I hope they release a 3070 with the same cooler; that might be what I go for eventually, in terms of quietness. I'm so, so picky when it comes to noise.
ttnuagmada:

That's pretty weak for a 3-connector card. I assumed it would be at least 375W.
Very sorry, I misread; it's 340W according to the spec, untuned. The max is probably higher by adjusting the power limit, like DannyD said ^^
TheDeeGee:

I hope they release a 3070 with the same cooler; that might be what I go for eventually, in terms of quietness. I'm so, so picky when it comes to noise.
Looks like the MSI 3070 Gaming X Trio has the same cooler ^^ https://www.msi.com/Graphics-card/GeForce-RTX-3070-GAMING-X-TRIO/Overview
DannyD:

Push the power limit up and I'm sure it'll eat more.
Right, and that was my question: what is the max power limit?
TheDeeGee:

I hope they release a 3070 with the same cooler; that might be what I go for eventually, in terms of quietness. I'm so, so picky when it comes to noise.
Same. I really want my GPUs to stay under 250W for that reason as well. The 3070 will get a look from me, along with whatever the heck AMD puts out. These 3080s are just pulling too many watts, even under normal gaming loads, for my taste. I felt the same about Vega and, going back, the GTX 480 as well.
alanm:

I think from Ampere on, we can kiss OC'ing goodbye. It's just not worth the measly 2-3 fps for the added power draw. I suspect the same will be true for AMD. The heightened competition between them has squeezed out any headroom for OC'ing.
On the other hand, I'm glad the hardware is already squeezed from the factory without needing to OC and rely on silicon luck. This is good for the vast majority of users.
What does this mean for power, then?! This is Gigabyte's 3090 offering, with what looks like dual 12-pin hookups???
schmidtbag:

Sounds about right. The new 12-pin connector is, I think, supposed to carry roughly 650W (varying sources say different things, but that seems to be the "safe number" based on the per-pin specs). I don't think the 3090 is supposed to get anywhere near that high; the power cable was developed for future needs. I wouldn't be surprised if the 3090 does actually warrant the 3x 8-pins, at least when overclocked. Personally, I don't like how the new 12-pin increases the wire gauge specs. Most 12V wires are supposed to handle 5A, while this connector is spec'd to carry 9.5A per pin. Not only does this complicate the manufacturing process of PSUs (thereby making them more expensive), but it basically gives GPU manufacturers the freedom to give efficiency a lower priority. I don't want triple-slot GPUs to become the norm.
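[Editor's note: to put numbers on the per-pin figure quoted above — assuming the 12-pin connector has six 12V supply pins and taking the 9.5A per-pin rating from the post, the theoretical ceiling works out like this:]

```python
# Back-of-the-envelope capacity of the 12-pin connector from per-pin current.
SUPPLY_PINS = 6      # assumed: six 12V supply pins (the other six are ground)
AMPS_PER_PIN = 9.5   # per-pin current rating quoted in the post above
VOLTS = 12.0

max_watts = SUPPLY_PINS * AMPS_PER_PIN * VOLTS  # 684.0 W theoretical ceiling
print(max_watts)
# De-rated for safety margin, this lines up with the ~650 W "safe number"
# mentioned above; sources differ on the exact rating.
```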
Man, the 24th can't come soon enough!! I'm stoked at the idea of a 3090... I had an MSI 3080 in the cart this morning, but didn't want to bite that bullet with the 3090 coming in a week or so. (*Edit...* "I want to blood that bad boy!!!" Not sure exactly what I was on about at the time...??!)
Hmm, my PSU only has two 8-pin cables left. With a 359W power draw, shouldn't it still be possible with a split setup?
1x 8-pin directly attached = 150W
1x 8-pin split into 2x 8-pin = 150W
PCIe slot = 75W
That would give the card 375W to draw from. Or did I fail electricity class? Buying a new PSU just to get another 8-pin seems like such a waste. Using a Corsair RM750X.
Been looking forward to your review of the Gaming X Trio, Hilbert, and it doesn't disappoint. 🙂 A cracker of a card, as expected; can't wait to get my mitts on one 😀