ZOTAC GeForce RTX 3090 and RTX 3080 Trinity Holo Renders (Update)

@Supertribble I have 10 years of SLI experience... Here's a related story for you from the car world: I had someone arguing that all the problems I had on my race-ready Mitsubishi Lancer Evo IX were mine alone. He sold his car with 4,000 miles on it after a year; I had mine for 6 years and drove it 80,000 miles, including on most European race tracks. Know that I have several Steam games above 1,000 hours played, one at 8,000 hrs, and I still get "I know better than you, dude" comments on it 🙄 And while we're at big numbers: https://www.speedtest.net/result/d/02366486-268d-4eac-a41b-077b116eeecc
Jeez, with 3x 8-pin that's an incredible amount of wattage: 450 + 75 = 525 W. Wow, just crazy power delivery. I'm already put off these new cards before they're even released. Going from a 1400 MHz GTX 980 G1 which uses 175 W to a 3080 with 300+ W is a no-no for me. Gutted. I really wanted a 3080 too, but not if it's chewing through electricity like a kettle.
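For what it's worth, that 525 W is the spec ceiling implied by the connector layout rather than a measured draw; a minimal sketch of the arithmetic, assuming the usual PCIe limits of 75 W from the slot and 150 W per 8-pin connector:

```python
# Theoretical ceiling implied by the connector layout, using PCIe spec
# limits (75 W from the slot, 150 W per 8-pin, 75 W per 6-pin) rather
# than any measured draw.
SPEC_WATTS = {"slot": 75, "6pin": 75, "8pin": 150}

def max_board_power(aux_connectors):
    """Slot limit plus the spec limit of each auxiliary connector."""
    return SPEC_WATTS["slot"] + sum(SPEC_WATTS[c] for c in aux_connectors)

print(max_board_power(["8pin"] * 3))  # 3x 8-pin: 450 + 75 = 525 W
print(max_board_power(["8pin"] * 2))  # 2x 8-pin: 300 + 75 = 375 W
```

What the card actually pulls under load is a separate question from what the connectors are rated to deliver.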
These 3x 8-pins, jeez. You might have heard of it, but seeing it is something else. 😱
Okay then; Zotac is using three 8-pin power plugs. They also have the traditional double-slot configuration; this appears true for the Gainward pics too.
Reddoguk:

Jeez, with 3x 8-pin that's an incredible amount of wattage: 450 + 75 = 525 W. Wow, just crazy power delivery.
@K.S. If vendors are using 3x 8-pin, the 12-pin plug might be pulling 6 or 7 A per 12 V pin 😱
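A rough sanity check on that per-pin figure, assuming the 12-pin connector carries six 12 V pins (plus six grounds) and, for simplicity, that all of the board power goes through it rather than partly through the slot:

```python
# Rough amps-per-pin estimate for a 12-pin connector with six 12 V pins,
# assuming the whole board power goes through the connector.
def amps_per_12v_pin(watts, volts=12.0, live_pins=6):
    return watts / volts / live_pins

for watts in (375, 450, 525):
    print(f"{watts} W -> {amps_per_12v_pin(watts):.1f} A per 12 V pin")
```

Around 450-525 W that works out to roughly 6-7 A per pin, which is where the 😱 comes from.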
Reddoguk:

Jeez, with 3x 8-pin that's an incredible amount of wattage: 450 + 75 = 525 W. Wow, just crazy power delivery. I'm already put off these new cards before they're even released. Going from a 1400 MHz GTX 980 G1 which uses 175 W to a 3080 with 300+ W is a no-no for me. Gutted. I really wanted a 3080 too, but not if it's chewing through electricity like a kettle.
Thunk_It:

Okay then; Zotac is using three 8-pin power plugs. They also have the traditional double-slot configuration; this appears true for the Gainward pics too.
Astyanax:

@K.S. If vendors are using 3x 8-pin, the 12-pin plug might be pulling 6 or 7 A per 12 V pin 😱
The use of 3x 8-pin is not the standard setup, nor does it reflect the actual power requirement of the 12-pin. It's said to be a 2x 8-pin conversion, but because weaker PSUs have trouble delivering the full rated power of each connector, many cards will use a 3x 8-pin adapter to spread the load; we already know a good-quality PSU can power it with 2x 8-pin. Also, the power requirement is the same 650 W as the 2080 Ti (confirmed by MSI's announcement of a 650 W unit that's "fully compatible with the next generation of GeForce GPUs"). A higher TDP doesn't mean it will absolutely be using insane power, and anyone with a decent 650 W or higher PSU should be just fine.

I get so tired of people complaining about this stuff. If you're shopping for an $800-1400 GPU, I would hope you spent more than $80 on a middle-of-the-road basic PSU. If the idea of a GPU using somewhat more power (within the same requirements as previous cards, mind you) "guts" your interest in it, then go get an AMD card and enjoy your most certainly lower performance for some marginal power benefits. To me, if you're always delivering the best performance and beating the competition to certain performance levels by literal years, then if you need more power to offer me the best, I'm obviously willing to spend a bit to have it (both money and efficiency). You can complain about power usage when it's like what AMD has done in the past: jacking power through the roof while also not delivering the absolute peak performance.
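To put numbers on the "spread the load" point: a quick sketch under an assumed 320 W board power (an illustrative figure, not a confirmed spec for any Ampere card), with roughly 75 W taken through the slot:

```python
# How an assumed board power splits across 2x vs. 3x 8-pin leads,
# with ~75 W drawn through the slot. The 320 W is illustrative only.
PIN8_SPEC = 150   # W per 8-pin connector per the PCIe spec
SLOT = 75         # W available from the slot

def per_connector_load(board_power, n_connectors):
    return (board_power - SLOT) / n_connectors

for n in (2, 3):
    load = per_connector_load(320, n)
    print(f"{n}x 8-pin: ~{load:.0f} W per lead "
          f"({PIN8_SPEC - load:.0f} W under the 150 W spec)")
```

Either way each lead stays well inside the 150 W connector spec; the third lead mainly buys headroom on power supplies with weaker individual rails.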
All we are saying is that with new tech that's had a die shrink, you'd expect power requirements to be lower than previous gens, like CPUs going from 95 W down to 65 W. Turing on 12nm isn't too bad, but it's looking like Ampere on 7nm needs either a new 12-pin or 3x 8-pin. Maybe we are wrong and a 3080 uses 200 W or less, but I doubt it right now. We'll soon see when HH gets un-NDA'd. But "MSI launches its first PSUs and hints at very thirsty Nvidia Ampere compatibility". And btw, yes, we all have top-of-the-line PSUs here, because we ain't daft.
Everyone's talking about power pins, and meanwhile Nvidia has casually killed USB-C VirtualLink.
Isn't the 12-pin plug being introduced on the FE models?
Reddoguk:

All we are saying is that with new tech that's had a die shrink, you'd expect power requirements to be lower than previous gens.
I don't understand where this way of thinking comes from. Yes, a die shrink should allow lower power consumption at the same performance, but it has absolutely nothing to do with the ability to make high-power parts or not. Take CPUs, since you bring them up:

Zen (14nm): Ryzen max TDP of 95 W, Threadripper max TDP of 180 W
Zen+ (12nm): Ryzen max TDP of 105 W, Threadripper max TDP of 250 W
Zen 2 (7nm): Ryzen max TDP of 105 W, Threadripper max TDP of 280 W

And before you say "Well, they added more cores!": yes, that's exactly my point. The max wattage of a product range has nothing to do with a die shrink.
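A quick sketch of that point using the Threadripper TDPs above; the model names and core counts (16/32/64) are my own additions from memory, so treat them as approximate:

```python
# Watts per core for the top Threadripper on each node, using the TDPs
# quoted above; model names and core counts are added for illustration.
parts = [
    ("Zen   / 14nm / 1950X ", 180, 16),
    ("Zen+  / 12nm / 2990WX", 250, 32),
    ("Zen 2 /  7nm / 3990X ", 280, 64),
]
for name, tdp_w, cores in parts:
    print(f"{name}: {tdp_w} W total, ~{tdp_w / cores:.1f} W per core")
```

Per-core power keeps falling with each shrink even as the package total climbs, which is exactly the difference between efficiency and the maximum power a product line ships at.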
oli3:

Everyone's talking about power pins, and meanwhile Nvidia has casually killed USB-C VirtualLink.
You seem to be on a kick about this, so I guess I'll just say: if that's the case, deal with it. It wasn't adopted; that's not Nvidia's fault.
Supertribble:

Scalability aside, I keep hearing SLI support is limited these days. I wanted to try out 2070S SLI, just to play around with really, but yeah, folks say it's pointless, while others swear by it. I'm confused. 😕
mGPU setups were never that terrible in my experience; at least the last time I did Crossfire (about 6 years ago) it wasn't that bad. Unlike what a lot of people claim, you could force-enable it on unsupported games, and if you had major issues, all it took was some manual tweaking. I'd be shocked if it was worse on Nvidia. People just want to plug something in and have it work without doing anything, and that's not gonna happen when you're using a technological niche. Regardless... I would not recommend mGPU setups for gaming. I suppose if you have an 8K display and upgrade your system every couple of years, go for it. But otherwise, even with the manual tweaking, it's not worth the hassle. Despite my overall positive experience, I won't be doing it again.