Quad-slot NVIDIA GeForce RTX 4090Ti/TITAN 800W graphics card / cooler caught on camera

Sounds cool. I predict $5000.00
Comes with two 12VHPWR connectors, for extra crit chance. Probably still uses 600W, but lowers clocks to make it happen. Maybe it's just noob me, but nobody wants that card with air cooling like those (already old?) leaked screenshots show us; to me this looks AIO / custom loop mandatory. If this ever even sees the light of store shelves.
Moonbogg:

Sounds cool. I predict $5000.00
Honestly, enough with the mocking; everyone knows nobody's gonna buy it at that price, even Mr. Leather Jacket knows that! But priced at a reasonable $4,999, Mr. Leather Jacket believes it'd be a steal!!! ;):D
mikeysg:

But priced at a reasonable $4,999, Mr. Leather Jacket believes it'd be a steal!!! ;):D
Only if you buy more, to save more
How about some somewhat affordable mid-range GPUs??
Big is back, because bigger is better...
My only concern is I can't 2-way SLI them, let alone 4-way. 🙁
An 800W video card?! That's just wrong...
Solfaur:

My only concern is I can't 2-way SLI them, let alone 4-way. 🙁
You just have to either get PCIe riser cables and custom blacksmithed supports for both cards, or go with a 4-way custom watercooling loop with waterblocks and risers, for the price of more than one organ in central Asia. o_O
And yet, even without miners or scalpers, I'm sure these would all sell out immediately. Why make anything affordable to the masses when it seems, somehow, everyone's got enough money to buy these behemoths? The market trends sure are getting suspicious to me.
schmidtbag:

And yet, even without miners or scalpers, I'm sure these would all sell out immediately. Why make anything affordable to the masses when it seems, somehow, everyone's got enough money to buy these behemoths? The market trends sure are getting suspicious to me.
Embra:

How about some somewhat affordable mid-range GPUs??
A friend mentioned it to me and at first I was like "eh", but now I think he might be right: I think Nvidia is going to push GeForce Now as the future for the midrange/low end. No one is going to want to buy a $700 midrange card; might as well just subscribe for $10 a month at that point. I think it also makes sense given the general cost of design: https://www.researchgate.net/publication/340843129/figure/fig1/AS:883120060506112@1587563638610/Chip-Design-and-Manufacturing-Cost-under-Different-Process-Nodes-Data-Source-from-IBS.png

The cost of building a modern chip is increasing exponentially, while transistor gate cost has essentially stalled: http://img.deusm.com/eetimes/2016/06/1329887/Handel1.png

So what exactly do you do? You can delay it by designing for cost instead, like AMD is doing, but at best that buys you a generation or two. I think at some point you just say fuck it: cheap GPUs are not a thing. Build them massive, run them in the cloud, and scale gaming instances across them to massively increase the efficiency of the hardware. You can still sell them to users who want a dedicated card; everyone else pays $10-20 a month to run one of those instances, depending on what experience they want.

It's far fewer GPUs being manufactured, fewer SKUs you need to build, and less headache with inventory and everything else. It gives Nvidia a constant revenue stream, which companies love. It arguably gives them more benefit from value-add features like RTX, since everyone gets them as soon as the server is upgraded, which gives developers greater incentive to use Nvidia-specific technologies.

I think from Nvidia's viewpoint it makes perfect sense to do this, and from a customer perspective it might actually be the best scenario as well. Don't get me wrong, I love dedicated hardware, but it's possible that at some point it's just not doable anymore.
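The buy-versus-subscribe math above is easy to sanity-check. Here's a minimal back-of-envelope sketch using the $700 card and $10-20/month figures from the post; the cloud-side numbers (board cost, instances per board, users per slot) are purely illustrative assumptions, not anything Nvidia has published:

```python
# Back-of-envelope math for the buy-vs-stream argument above.
# Every number is an assumption pulled from the post, not real pricing.

CARD_PRICE = 700            # hypothetical $700 midrange card
SUBSCRIPTIONS = (10, 20)    # hypothetical $10-20/month tiers

for sub in SUBSCRIPTIONS:
    months = CARD_PRICE / sub
    print(f"${sub}/mo matches a ${CARD_PRICE} card after "
          f"{months:.0f} months (~{months / 12:.1f} years)")

# The multiplexing angle: one big cloud board time-shared across users
# who don't all play at once. Purely illustrative numbers.
BIG_GPU_COST = 3000   # assumed datacenter board cost
INSTANCES = 4         # assumed concurrent gaming instances per board
USERS_PER_SLOT = 3    # assumed subscribers time-sharing each instance

users_served = INSTANCES * USERS_PER_SLOT
print(f"Hardware cost per served user: ${BIG_GPU_COST / users_served:.0f} "
      f"vs ${CARD_PRICE} for a dedicated card")
```

With those made-up cloud numbers, the hardware cost per served user comes out around $250 versus $700 for a dedicated card, which is the whole efficiency argument in one line; change the assumptions and the gap moves accordingly.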
Probably true, but if I have to do my gaming through the cloud, I'll just stick to old stuff. Honestly, I don't buy that it costs them even half of what they're charging per card to make the cards. It just really dawned on them that they can charge an arm and a leg; they could make half, if not a quarter, of that and still keep their profit. And literally, if that happens there isn't much need for a PC anymore, seeing as TVs can just stream games, and so can our phones and tablets.

The idea of streaming my games, or renting hardware to stream them, is not appealing to me. I like to physically own my hardware, and my console games on disc, for a reason. Never mind the fact that streaming would mean latency issues, and the fact that in the States you can't touch internet for less than $50 a month in most cases, with the normal price double that or more, whereas Europeans seem to pay a whole lot less. You'd be screwed trying to stream on that. They could sell their cards at decent prices and still make a profit; they just don't want to or care to.

If prices keep going the way they're going, sooner or later they won't sell at all. In three generations the xx70 series doubled to $799, and in some cases $1,000. If that keeps happening, in three more generations we'll see a $1,499 xx70 card. That would just be insane, and would be the death of PC gaming, and consoles for that matter, for a lot of people. It is what it is, and what happens will happen. Cloud/streaming gaming is not something I want, nor something I would pay monthly for.
tsunami231:

The idea of streaming my games, or renting hardware to stream them, is not appealing to me. I like to physically own my hardware, and my console games on disc, for a reason. Never mind the fact that streaming would mean latency issues, and the fact that in the States you can't touch internet for less than $50 a month in most cases, with the normal price double that or more, whereas Europeans seem to pay a whole lot less. You'd be screwed trying to stream on that. They could sell their cards at decent prices and still make a profit; they just don't want to or care to.
Likewise, and we have to consider that we are both in places where we have access to high-speed and relatively low-latency internet. There are a lot of people for whom this simply cannot work.

There's the whole dichotomy of demand going up while hardware capabilities hit an upper limit, which is really more a matter of how we need to reflect upon development practices. That's probably one of the reasons why Nvidia started investing in DLSS when they did: they knew they couldn't depend on developers optimizing their code, so they needed a head start on AI-assisted performance improvements in the likely event that they couldn't keep up with modern demands. Unless there's some major breakthrough, that day will come, and I figure it may come within a decade. If the rumors about the 4090 Ti are true, I would say that day has already come. It doesn't matter how fast a GPU is if it becomes impractical to operate, let alone buy.

I think what matters more is finding ways to further optimize games, without the need of AI (but AI is fine too). There are plenty of examples of games that run poorly compared to others of similar (or maybe even better) detail level. Sometimes these games improve in updates, whether from the drivers or the game itself. The thing is, even the games that run well most likely have a lot of room for improvement.

For too long, we've been spoiled by getting more specs for the same reasonable price. Developers didn't have to make anything efficient or optimized because they could just throw more cores or RAM at the problem. It was reasonable for them to assume consumers would just go ahead and upgrade, because it was affordable to do so. Well, now we're left with nearly 20 years of poorly optimized code, and hardware has reached a point where it's either too expensive or you're getting the same level of performance for the same price. That's one of the reasons I like using Linux and think iOS is well-designed: you can get an overall better user experience for a fraction of the system resources.

Something needs to change, or else PC gaming is going to become a hobby for the rich. Nvidia and AMD can only do so much to squeeze in more performance for a reasonable wattage or price if they want to show any sign of progress, which is hard to do when it's getting so hard to shrink transistors.
I will probably get castrated for this, but Windows is horribly inefficient compared to iOS. More often than not, iOS is snappy and fast on mobile hardware of lesser design, while Windows runs like trash on mobile tech; try to run Windows on iPhone-class hardware with that little RAM and it will choke. And if you have anything less than 4GB or even 8GB of RAM on Win 10, God help you. I could never run Windows with that kind of RAM, meanwhile iOS is snappy and fast with 2-4GB and mobile silicon running on, what, 25 watts if that?

I want my 1070 Ti replaced with something new, but for the price of something in the same series, or even the xx60 series, I could literally build a new PC with a 12700K + Z690 + 32GB 5400 RAM + PSU + HSF + the case. Well, I could if I get the CPU/motherboard from Microcenter, which is what I've been doing for quite some time.

I could buy a GPU instead and start playing newer games on PC, just to find out how much of a bottleneck my 6700K and that 16GB of 2400MHz RAM are. Game-wise, for what I play, the 6700K isn't really a bottleneck, but I'm sure that 2400MHz RAM is.

Shrinking the die just makes things more expensive. They need to go back to the drawing board and design chips that are super efficient, needing a WHOLE lot fewer CUDA cores, transistors, and whatnot to produce that performance. But I'm sure it's not that easy, and they're not going to want to spend the money when they can just jack up the power, cores, and CUDA counts and call it a day.

I'm sorry, but if 800W TDP becomes the norm, I want nothing to do with it. 200W TDP is already way more heat generation than I like. In winter it's fine, I don't use my heater; in summer I'd need to jack up the AC just to keep the heat in check. I don't even want to imagine how bad it would be with a 300W or even 600W TDP, seeing as I already think it's too much. And the cost of electricity is skyrocketing in a lot of places, like a lot of other things.

You know all those movies we've seen over the decades where an alien race says "this chip is 1000x more powerful than every computer on the planet and uses next to no energy"? Yeah, we need that, which will probably never happen. Even if it did, only 1% of the 1% would be able to afford it.

I love PC gaming, but these prices are going to kill it off for a lot of people. At this point I would not be surprised if next-gen consoles are $899-1,000, which would kill console gaming for me and a lot of people I know, and would pave the path to cloud/streaming gaming. Most people I know now won't even touch a PC for gaming, or any console newer than the PS2, because everything after has been a push toward digital and a permanent need for internet.

I remember when all I did was pop a cartridge into the console and in 5 seconds I was playing the game, and games were released in working order, no broken states. Those were the days.
Looking at my 5600 XT and reading what the "new" midrange pricing could sound like: please don't die. Just don't.
tsunami231:

I will probably get castrated for this, but Windows is horribly inefficient compared to iOS. More often than not, iOS is snappy and fast on mobile hardware of lesser design, while Windows runs like trash on mobile tech; try to run Windows on iPhone-class hardware with that little RAM and it will choke. And if you have anything less than 4GB or even 8GB of RAM on Win 10, God help you. I could never run Windows with that kind of RAM, meanwhile iOS is snappy and fast with 2-4GB and mobile silicon running on, what, 25 watts if that?
You're preaching to the right person haha. I've been off Windows on all of my PCs for almost a decade, including my gaming PC. There are many reasons for this, but its inefficiency was one of the main ones. Android was pretty inefficient for a long while too, but it does seem to have improved, a little. Companies keep throwing more background processes into the mix, which undermines the overall improvements to the OS.
I want my 1070 Ti replaced with something new, but for the price of something in the same series, or even the xx60 series, I could literally build a new PC with a 12700K + Z690 + 32GB 5400 RAM + PSU + HSF + the case. Well, I could if I get the CPU/motherboard from Microcenter, which is what I've been doing for quite some time.
I share the same mentality. That's why I'm still stuck with my R9 290. Thankfully, it still receives updates on Linux, so it's been more playable.
I could buy a GPU instead and start playing newer games on PC, just to find out how much of a bottleneck my 6700K and that 16GB of 2400MHz RAM are. Game-wise, for what I play, the 6700K isn't really a bottleneck, but I'm sure that 2400MHz RAM is.
Honestly, I predict it'd be less of a bottleneck than you think it'll be. Keep your background processes to a minimum, maybe try to overclock a little higher if possible, and you might be ok for a little while longer. Games aren't that CPU dependent anymore, and 16GB is plenty when you keep your system lean.
Shrinking the die just makes things more expensive. They need to go back to the drawing board and design chips that are super efficient, needing a WHOLE lot fewer CUDA cores, transistors, and whatnot to produce that performance. But I'm sure it's not that easy, and they're not going to want to spend the money when they can just jack up the power, cores, and CUDA counts and call it a day.
The CUDA cores are used for gaming, so they ought to stay. This is a lot more difficult than you think it is though - there's no one-size-fits-all solution to how the transistors are arranged. There are solutions that are more optimal than others, but even then, that depends on modern workloads. Nvidia could start from scratch but honestly, I don't think they'd see that big of a difference. With the way modern APIs are designed and the preferences toward easier (rather than optimized) game development, Nvidia doesn't have much room to improve. What really needs to happen is a shift in software development. It's much more practical to make software take advantage of the ISA than it is to go the other way around, but the problem is convincing developers to agree to that.
I'm sorry, but if 800W TDP becomes the norm, I want nothing to do with it. 200W TDP is already way more heat generation than I like. In winter it's fine, I don't use my heater; in summer I'd need to jack up the AC just to keep the heat in check. I don't even want to imagine how bad it would be with a 300W or even 600W TDP, seeing as I already think it's too much. And the cost of electricity is skyrocketing in a lot of places, like a lot of other things.
Totally agree, though my upper limit is 300W. I've mentioned before that even a 400W GPU risks tripping circuit breakers when you have a powerful CPU, a good sound system, and your AC connected to the same breaker. No idea how they think 800W is supposed to be feasible.
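For a rough sense of those numbers, here's a quick sketch assuming a common US 15 A / 120 V circuit and the usual 80% rule for continuous loads; the component wattages are illustrative guesses, not measurements:

```python
# Rough check of the tripped-breaker worry, assuming a common US
# 15 A / 120 V circuit and the usual 80% rule for continuous loads.
# Component wattages are illustrative guesses, not measurements.

BREAKER_AMPS = 15
VOLTS = 120
budget = BREAKER_AMPS * VOLTS * 0.8   # ~1440 W continuous budget

loads = {
    "rumored 800W GPU": 800,
    "high-end CPU": 250,
    "rest of system + monitor": 150,
    "sound system": 100,
    "window AC unit": 600,
}

total = sum(loads.values())
print(f"Total draw: {total} W against a {budget:.0f} W continuous budget")
print("Breaker trips" if total > budget else "Breaker holds")
```

With those guessed loads it lands at 1900 W against a ~1440 W budget, so the worry isn't hypothetical once an AC unit shares the circuit.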
I remember when all I did was pop a cartridge into the console and in 5 seconds I was playing the game, and games were released in working order, no broken states. Those were the days.
That's often why I resort to indie games, because being fun is often their #1 priority. AAA games are very pretty but more often than not, they're annoying to play. That's why I'm fine with getting something like a 6700XT, because that ought to be powerful enough to play what I want to play in 4K @ 60FPS+, but... I'm not willing to spend $350 on a mid tier GPU that's approaching 2 years old.
@schmidtbag Honestly, I'm toying with the notion of getting a 12700K + Z690 ASUS TUF Plus + 32GB 5600MHz CL36 Corsair RAM + Deepcool AK620 HSF from Microcenter before going back to NC, as that would cost $580 before taxes. I really want the (non-K) models, but they only knock $70 off the motherboard versus the normal price; going with a 12700 or 13700 and the same motherboard would cost $200 more. Honestly, this seems like the better deal, though I really should add a new PSU to this, because my current 750W is now 5+ years old and has been in my 920 and this 6700K. The new build won't help me with wanting to run everything at 1440p in newer games, but at least the CPU and RAM will be of little concern going forward. And maybe if the 4070 and 4060 Ti are priced right I'll get one some time down the road, but more likely the 5xxx series will be out before I bother, if I do this. Or that 1070 Ti will die before I replace it. GPU prices are ridiculous, and so are the TDPs, IMO.

[image: rtx4090ti.JPG]
meme takes up only 4 PCI slots too :V
I still don't think that a GPU in the cloud has better ROI than just selling it, and despite AMD getting the ridicule they deserve for their Windows software and driver quality, I really think they have the right idea with the chiplets.
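Whether cloud ROI beats a one-time sale really hinges on utilization and opex. A quick sketch, with every figure assumed purely for illustration (none of this comes from Nvidia, AMD, or any real service):

```python
# Sanity-checking the cloud-ROI claim. Every figure is assumed for
# illustration; none of this comes from any vendor or real service.

SALE_PRICE = 1600      # assumed flagship retail price
BUILD_COST = 700       # assumed cost to build the board
SUB_PER_USER = 15      # assumed monthly subscription per user
USERS_PER_GPU = 6      # assumed time-shared users per cloud board
OPEX_PER_MONTH = 40    # assumed power/hosting per board per month
SERVICE_LIFE = 36      # assumed months before the board is retired

sell_once = SALE_PRICE - BUILD_COST
run_in_cloud = (SUB_PER_USER * USERS_PER_GPU - OPEX_PER_MONTH) * SERVICE_LIFE - BUILD_COST
print(f"Sell it once:    ${sell_once}")
print(f"Run it in cloud: ${run_in_cloud} over {SERVICE_LIFE} months")
```

With six time-shared users per board the cloud edges out the one-time sale; drop USERS_PER_GPU to 3 and it flips to a loss, which is exactly the utilization bet the whole argument rests on.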