NVIDIA GeForce RTX 4090 Ti Possibly Canceled

https://forums.guru3d.com/data/avatars/m/79/79740.jpg
Zero point in such a card. What would it compete with, when the competition can't even touch the regular 4090?
https://forums.guru3d.com/data/avatars/m/268/268248.jpg
alanm:

Zero point in such a card. What would it compete with, when the competition can't even touch the regular 4090?
Yeah, but that did not stop them in the past, really.
https://forums.guru3d.com/data/avatars/m/246/246564.jpg
They would either have to discount the 4090 or price the 4090 Ti out of reach of almost everyone. They don't need bragging rights in this market.
https://forums.guru3d.com/data/avatars/m/79/79740.jpg
Venix:

Yeah, but that did not stop them in the past, really.
Like the 3090 Ti? How well did that do? Bet it figured in their decision not to repeat the same mistake.
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
And nothing of value was lost. It's not like it'd replace the 4090; the delusional leather jacket probably wanted $3K for one, which wouldn't sell enough to justify its existence.
https://forums.guru3d.com/data/avatars/m/220/220214.jpg
Isn't every full die AD102 chip being used and sold for much more money in data-center products? So why would they redirect any of these full chips to much lower-profit consumer gaming products? The only reason they use the "fewer cores" versions in any consumer products is that they were "defective" (one or more blocks broken) and couldn't be used as full dies for data-center products. The "all blocks working" AD102 dies would always be kept for the high-profit data-center products.
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
geogan:

Isn't every full die AD102 chip being used and sold for much more money in data-center products?
This, nothing else.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
4090s are still burning despite modded cables and updated power connectors. The 4090 Ti should be canceled.
https://forums.guru3d.com/data/avatars/m/216/216349.jpg
Makes sense, due to all the reasons already written before. The only problem is that I was planning on buying two of them for my rig, oh well... :p
https://forums.guru3d.com/data/avatars/m/99/99478.jpg
I would buy one. 😎
https://forums.guru3d.com/data/avatars/m/268/268248.jpg
alanm:

Like the 3090 Ti? How well did that do? Bet it figured in their decision not to repeat the same mistake.
The 2080 Ti as well; AMD only had the 5700 that gen. Also in that gen the Titan V, the 1080 Ti and then the Titan Xp... If we go back, the 8800 GTX, then the 8800 Ultra and the bargain 8800 GT, while all ATI had was the HD 2900 XT and later the HD 3870, which was slower than the GT... In general, Nvidia never seemed willing to slow down releasing a faster card when they could, with or without competition, really.
https://forums.guru3d.com/data/avatars/m/232/232189.jpg
This literally doesn't matter. The 4090 is so powerful you really don't need anything else. They should just release the 5090 when it's ready.
data/avatar/default/avatar01.webp
I doubt there was ever going to be a 4090 Ti.
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
geogan:

Isn't every full die AD102 chip being used and sold for much more money in data-center products?
AFAIK that's correct, so there is no way a 4090 Ti or Ada Titan would be released with the full die enabled. The 4090 is pretty heavily cut down; relative to the full die it's more comparable to a 3080 than a 3090, so there is enough room to fit in another defective AD102 bin notably above it.
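For a rough back-of-envelope of how cut down each flagship is (assuming the commonly cited SM counts for these dies, which are not stated in the article, so treat the exact figures as approximate):

```python
# Assumed SM counts (enabled, full die) for each flagship; these are the
# widely reported figures, not numbers taken from this article.
dies = {
    "RTX 3080 (GA102)": (68, 84),
    "RTX 3090 (GA102)": (82, 84),
    "RTX 4090 (AD102)": (128, 144),
}

for name, (enabled, full) in dies.items():
    # Fraction of the full die's SMs that are actually enabled.
    print(f"{name}: {enabled}/{full} SMs enabled = {enabled / full:.1%}")

# Roughly: 3080 ~81%, 3090 ~98%, 4090 ~89%. The 4090 leaves far more of the
# die disabled than the 3090 did, so a faster salvage bin could slot above it.
```

On those assumed numbers, the 4090 sits much closer to a 3080-style cut than a near-full 3090, which is the point being made here.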
goat1:

This literally doesn't matter. The 4090 is so powerful you really don't need anything else. They should just release the 5090 when it's ready.
Blackwell has already been delayed until 2025. With AMD refusing to compete, they have no reason to stick with a two-year cycle until Intel starts posing a threat.
https://forums.guru3d.com/data/avatars/m/246/246564.jpg
The 2080 Ti was different: it was the only card in the lineup that could actually (barely) do ray tracing at the time. The 4090 Ti is wholly unnecessary. Now, if they rebranded it as a Titan and sold it for $5,000, that would be a whole different thing, but it doesn't seem like they're keen on releasing something in such low volume. Probably not much interest from board partners either. I think geogan nailed the main reason, anyway.
https://forums.guru3d.com/data/avatars/m/199/199386.jpg
Rocking a 3060 at 2160p @ 144 Hz. Nvidia really messed themselves over with their scaling tech because, seriously, the power of the 4090/3090 is overkill unless you are running 8K; but of course games do not even use 8K textures, mostly because of their size and the disk space they take. Unreal Engine 5 got rid of the need for normal maps, and got rid of tessellation (buggy nastiness), purely to save memory and make games more efficient.
https://forums.guru3d.com/data/avatars/m/266/266726.jpg
geogan:

Isn't every full die AD102 chip being used and sold for much more money in data-center products? So why would they redirect any of these full chips to much lower-profit consumer gaming products? The only reason they use the "fewer cores" versions in any consumer products is that they were "defective" (one or more blocks broken) and couldn't be used as full dies for data-center products. The "all blocks working" AD102 dies would always be kept for the high-profit data-center products.
There are other kinds of bins besides fully functional, such as high-leakage or low-frequency parts that do not pass the required specification. I could see Nvidia selling a handful of high-leakage 4090 Tis at a ridiculous price; I'm pretty sure they have done this in the past.
data/avatar/default/avatar38.webp
The RTX 4090 Ti is cancelled because the 600 W TDP is far too high and may be dangerous. 😉 But I'm super satisfied with the incredible performance of the RTX 4090. <3 😀
https://forums.guru3d.com/data/avatars/m/275/275921.jpg
Thanks AMD
data/avatar/default/avatar27.webp
Loobyluggs:

Rocking a 3060 at 2160p @ 144 Hz. Nvidia really messed themselves over with their scaling tech because, seriously, the power of the 4090/3090 is overkill unless you are running 8K; but of course games do not even use 8K textures, mostly because of their size and the disk space they take. Unreal Engine 5 got rid of the need for normal maps, and got rid of tessellation (buggy nastiness), purely to save memory and make games more efficient.
Sorry, but the only games a 3060 can run at 2160p 144 Hz are something like CS:GO or Rainbow Six Siege. I can easily max out my 6900 XT at just 1440p 100-165 Hz, and there are plenty of games where even a 4090 is not able to max out a 4K 144 Hz monitor, especially if RT is involved.