Is this the GeForce RTX 2080 Ti Founders model?

If so, it'd be good to see they've sorted out the abysmal blower to an extent.
Wait and see. 8h10m to go.
Just magnificent. I am salivating... 8 hours to go. Hours are looooooonnnnggggggggg
B-but the GEFORCE RTX logo on the side is upside down? Fake news?
Wow this shot is pretty.
Administrator
strike:

B-but the GEFORCE RTX logo on the side is upside down? Fake news?
For promotion & marketing, I'm pretty sure the artwork designers know how to rotate a logo so people can actually read it.
Looks nice, but why would they go with a dual-fan design when the Titan V and Quadro don't require two fans to cool those cards adequately?
For me, it's now confirmed that everything shown so far was fake. If the coolers are actually positioned this way (I've wondered my entire life why they weren't facing up by default, ever since the 6800 GT era), then all we've seen was mere speculation. This cooler orientation means the chip and all the other components are on the "backside" of the PCB. Since GPUs became a decorative piece in gaming PCs, I have always tried to turn mine so I can actually show it off at first sight. Looks like Nvidia was thinking much like me.
BTW... This is what I call "reverse engineering" 😀
Wait, it got me thinking now... For AMD systems, no problem, but I can't really say the same for most Intel boards. On AMD boards the first PCIe slot is a x1, with the x16 coming right below it. On most Intel boards you'd be forced to use the second x16 slot, which in the vast majority of cases is limited to x8.
insp1re2600:

If so, it'd be good to see they've sorted out the abysmal blower to an extent.
Don't keep your hopes high. I've seen dual-fan designs that look and feel like the high-end type but actually perform as badly as the bottom-of-the-trashcan blower type Nvidia makes (looking at you, Gigabyte). When it comes to reference-design cards from either company, I always expect the worst, because that's literally always been the case. If this one time it's any better than a leaf blower glued onto an aluminum foil sheet from Walmart, then it's just a pleasant surprise in my book.
igorfiuza:

Wait, it got me thinking now... For AMD systems, no problem, but I can't really say the same for most Intel boards. On AMD boards the first PCIe slot is a x1, with the x16 coming right below it. On most Intel boards you'd be forced to use the second x16 slot, which in the vast majority of cases is limited to x8.
What are you talking about? The last three Intel motherboards I've owned have had a x1 slot first and then the first PCIe x16 slot.
RavenMaster:

Looks nice, but why would they go with a dual-fan design when the Titan V and Quadro don't require two fans to cool those cards adequately?
It's a combination of wanting to run clocks high enough for a decent performance gain when the chip is just idiotically big, running tensor cores on the side, and plain old binning pushing up the (gaming) TDP. Suddenly you find yourself in need of a bigger cooler... and apparently Intel wouldn't pick up the phone...
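As a rough illustration of the point above: at steady state the GPU temperature is roughly ambient plus TDP times the cooler's thermal resistance, so a higher TDP forces a lower-resistance (i.e. bigger or louder) cooler to hold the same temperature. A minimal sketch; every number in it is a made-up illustrative value, not an NVIDIA spec.

```python
# Back-of-the-envelope steady-state model: T_gpu ~ T_ambient + TDP * R_cooler.
# The thermal resistances below are hypothetical, purely for illustration.

def gpu_temp(tdp_w: float, r_cooler_c_per_w: float, ambient_c: float = 25.0) -> float:
    """Approximate steady-state GPU temperature in degrees C."""
    return ambient_c + tdp_w * r_cooler_c_per_w

# A blower typically has a higher thermal resistance than a dual-fan
# open-air cooler, so at the same 250W board power it runs hotter:
print(gpu_temp(250, 0.24))  # 85.0
print(gpu_temp(250, 0.18))  # 70.0
```

The same model also shows the other way out: keep the blower (higher resistance) and drop the TDP, which is exactly the trade-off a bigger die makes painful.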
igorfiuza:

Wait. It got me thinking now... For AMD systems, no problem. Can't really say the same for most Intel boards. AMD's first PCIx slots are x1, the x16 coming right below. On most Intel boards you'll be forced to use the second x16 slot, which in the vast majority, is limited to X8.
PCI Express cards are a standard. They are all built the same way for a very good reason (hint: compatibility). Deciding to go against every other manufacturer out there and put all of your chips and cooling on the backside would not only cost you your PCI Express certification, you'd also automatically lose a slot on your motherboard.

Also, if you hadn't realized by now, it's up to the motherboard manufacturers to decide which slots go where. Any motherboard could have a x1, a x16, or anything in between as its first slot. If all the AMD boards you've seen have a x1 slot first, that's a coincidence. 😉

I'm sorry to sound all negative, but the "negative knowledge" you just posted should not have been posted. Seriously, read up a bit more on PCI Express and motherboard technologies. Teach yourself 🙂 You'll learn better: https://en.wikipedia.org/wiki/PCI_Express For motherboards, I would go through the history, starting with AT. Just see how far we have come since Socket 5 and Super Socket 7 (Yaaaaaaaay), up to today's sockets.

edit: Was being too harsh, changed "crap" to "negative knowledge".
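On the x8/x16 question being argued here: on Linux you don't have to guess what width a slot negotiated, since sysfs exposes `max_link_width` and `current_link_width` per PCI device. A minimal sketch, assuming a Linux host; the helper function and its output format are my own, not from any post in this thread.

```python
from pathlib import Path

def describe_link(max_width: str, current_width: str) -> str:
    """Summarize a PCIe link, e.g. 'x16 slot running at x8'."""
    m, c = max_width.strip(), current_width.strip()
    if m == c:
        return f"x{m} slot running at full width"
    return f"x{m} slot running at x{c}"

def scan_pci(root: str = "/sys/bus/pci/devices") -> None:
    """Print the negotiated link width for every PCIe device (Linux only)."""
    for dev in sorted(Path(root).glob("*")):
        try:
            m = (dev / "max_link_width").read_text()
            c = (dev / "current_link_width").read_text()
        except OSError:
            continue  # device exposes no PCIe link attributes
        print(dev.name, describe_link(m, c))
```

A x16 card in an electrically-x8 slot would show up here as "x16 slot running at x8", which settles the kind of disagreement above in one command.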
Evildead666:

PCI Express cards are a standard. They are all built the same way for a very good reason (hint: compatibility).
I think he was just referring to that render of the card (it looks like it's facing upwards rather than the standard downward orientation). Anyway, I think the render (if real) is made this way just for marketing. It would be weird if they turned it upside down. Let's see what they bring to the table in a few hours 🙂
JulianBr:

I think he was just referring to that render of the card (it looks like it's facing upwards rather than the standard downward orientation). Anyway, I think the render (if real) is made this way just for marketing. It would be weird if they turned it upside down. Let's see what they bring to the table in a few hours 🙂
I understand that the image may be misleading, but that is a far cry from saying they could put the GPU and cooling on the backside of the card, because they just can't!! 🙂 I think the main reason cards are "face down" (in a tower case setup) is so that dust doesn't naturally fall into the fans; the air is sucked upwards into the heatsink.

On topic: if this is a Founders card, the dual-fan setup might be because the TDP wouldn't allow a blower-style cooler without it being too loud. The TDPs have gone up with respect to the 1080 Ti, and especially the 1080 (if this preliminary info is correct). I mean, ~250W for the 2080 Ti, and what looks like 220-230W for the 2080? Still faaaaaaaaaaaaaaaaaaaaaaaaar away from Vega TDPs, but it will be interesting to see the reviews (hopefully this afternoon/evening?)
People on the Nvidia sub were saying the TDP is going to be 280W because you get +30W from the VR Link connector. It doesn't make sense to me that that would affect the thermal design power of the GPU, but I figured I'd toss it out there.
Denial:

People on the Nvidia sub were saying the TDP is going to be 280W because you get +30W from the VR Link connector. It doesn't make sense to me that that would affect the thermal design power of the GPU, but I figured I'd toss it out there.
I'd heard that the stated TDP included the 30W for the VR Link. 🙂 So ~220W for the Ti, and ~190W for the 2080. TDP should cover the entire board power, not just the dissipated heat.

Seeing the size of these chips, though, it could be that Nvidia is taking a page from Intel and AMD and only stating the TDP at base clocks, not at boost... They might overclock, but a chip that large will heat up very quickly. I also think the fan blades on that render look very steep, as if they needed to move a lot of air? Or is that just me?
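If the rumour discussed above is right and the stated board TDP folds in a 30W VirtualLink budget, the arithmetic is a simple subtraction. A tiny sketch; the 30W figure is the thread's rumour, not a confirmed spec.

```python
VIRTUALLINK_W = 30  # rumoured power budget for the USB-C VR Link connector

def gpu_share(stated_tdp_w: int, includes_virtualink: bool = True) -> int:
    """Board power left for the GPU after removing the VirtualLink budget."""
    return stated_tdp_w - VIRTUALLINK_W if includes_virtualink else stated_tdp_w

print(gpu_share(250))  # 220
print(gpu_share(220))  # 190
```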
Hmm... if the Founders Edition has dual fans, it means it's hot, which will probably be "hot-fixed" by Nvidia next year, so... coming from a 1070 FE, I think it's safe to skip this series and wait for the next one.
FordLynx07:

Hmm... if the Founders Edition has dual fans, it means it's hot, which will probably be "hot-fixed" by Nvidia next year, so... coming from a 1070 FE, I think it's safe to skip this series and wait for the next one.
The Pascal FE series would throttle after like 5 minutes of gaming. Perhaps they just wanted to avoid that for good?