Forthcoming GeForce RTX 4060 Ti and RTX 4060 Boast 8GB Memory Standard

https://forums.guru3d.com/data/avatars/m/268/268248.jpg
How can you boast about 8GB of VRAM in 2023? :O
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
After thinking about this for some time, I have to say I guess Nvidia does not want people with a xx60 card to keep it for long. That's the one and only reason behind the design of those cards. They never were the best-performing card tier anyway, and that's what people seem to forget, but it's pretty clear they are not built to be kept for longer than a single generation. As entry-level gaming cards, I guess Nvidia wants to get people to buy a xx70 the next time, once they realize what's up with gaming GPUs. And after that, they realize that xx70 cards will soon be limited by their 12GB, which AMD does not do much to tackle if they just slap 16GB on their cards. Which in the end leads to people buying higher-tier Nvidia cards, not same-tier AMD cards. Brilliant business-wise for Nvidia, not so much for AMD tbh.
data/avatar/default/avatar25.webp
If it were priced at $300, maybe they could get away with 8GB of VRAM for non-AAA gamers. It really depends on how big that segment is: people who are okay with a low-tier GPU for lighter new games.
https://forums.guru3d.com/data/avatars/m/246/246088.jpg
fantaskarsef:

After thinking about this for some time, I have to say I guess Nvidia does not want people with a xx60 card to keep it for long. That's the one and only reason behind the design of those cards. They never were the best-performing card tier anyway, and that's what people seem to forget, but it's pretty clear they are not built to be kept for longer than a single generation. As entry-level gaming cards, I guess Nvidia wants to get people to buy a xx70 the next time, once they realize what's up with gaming GPUs. And after that, they realize that xx70 cards will soon be limited by their 12GB, which AMD does not do much to tackle if they just slap 16GB on their cards. Which in the end leads to people buying higher-tier Nvidia cards, not same-tier AMD cards. Brilliant business-wise for Nvidia, not so much for AMD tbh.
Haven't the **60 cards always been the best sellers, though? I've also read, and this may just be hearsay, that people who like to turn graphics down to gain an edge in some games, higher FPS I'd imagine, flock to the **60 series cards.
https://forums.guru3d.com/data/avatars/m/196/196426.jpg
That $300 is the limit for sure for this performance + VRAM capacity class. But I'm pretty sure Nvidia will ask $449 for it, daylight robbery... unless their 4070 sales are even worse than they appear to be, in which case they'll understand that such a price will not fly.
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
pegasus1:

Haven't the **60 cards always been the best sellers, though? I've also read, and this may just be hearsay, that people who like to turn graphics down to gain an edge in some games, higher FPS I'd imagine, flock to the **60 series cards.
Well, one could argue people are buying 60 card after 60 card just because they don't last. Ever thought about it that way? 😀 Of course, I'm just theorizing, but does it look to you as if they're trying to offer the best deal they can? I'm just trying to guess Nvidia's plan behind their offers so one can make the best buy. I can always tone down settings to get FPS; I don't need a new generation's card for that either, or so I've heard.
https://forums.guru3d.com/data/avatars/m/246/246088.jpg
fantaskarsef:

Well, one could argue people are buying 60 card after 60 card just because they don't last. Ever thought about it that way? 😀 Of course, I'm just theorizing, but does it look to you as if they're trying to offer the best deal they can? I'm just trying to guess Nvidia's plan behind their offers so one can make the best buy. I can always tone down settings to get FPS; I don't need a new generation's card for that either, or so I've heard.
Again, I might be wrong, as I'm not really one for following the gaming media, but the most popular games are far from graphically demanding. Do people keep upgrading their **60 cards the same way others upgrade their top-tier cards, or their cars, or coke-head girlfriends? I like a bit of immersion myself, so mil flight sims, absorbing FPS, and any mil-based RTS; I want the best eye candy I can get and don't mind paying a premium for it. I always see life as it is, not what I wish it was.
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
pegasus1:

Again, I might be wrong, as I'm not really one for following the gaming media, but the most popular games are far from graphically demanding. Do people keep upgrading their **60 cards the same way others upgrade their top-tier cards, or their cars, or coke-head girlfriends? I like a bit of immersion myself, so mil flight sims, absorbing FPS, and any mil-based RTS; I want the best eye candy I can get and don't mind paying a premium for it. I always see life as it is, not what I wish it was.
Good point there, usually you don't upgrade your coke-head gf, since either she breaks or you're a coke head too. 😀 Yeah, I'm just like that too. I have no issue playing Overwatch or Battlefield on my 2080 Ti with minimal details for max FPS. But SP games like RPGs and some nice RTS games, my oldest love genre, that's what I buy high-end cards for.
https://forums.guru3d.com/data/avatars/m/145/145154.jpg
Is there a limit on how many users I can ignore? Why do I keep coming to Guru3D when so many discussions look like this thread?
data/avatar/default/avatar01.webp
Nvidia would like to get back to the pre-COVID days when many would upgrade their GPUs each generation. These cards will never really be very good at RT games, yet Nvidia is pushing it hard.
https://forums.guru3d.com/data/avatars/m/247/247876.jpg
Do I need more than 8GB to play at 1080p without ultra settings?
https://forums.guru3d.com/data/avatars/m/216/216349.jpg
fantaskarsef:

After thinking about this for some time, I have to say that I guess Nvidia does not want people with a xx60 card to keep it for long. The one and only reason behind their design of those cards. I mean, they never were the best performing card tier anyway, and that's what people seem to forget. But it's pretty clear, that they are not built to be kept for longer than a single generation anyway. As entry level gaming cards, I guess Nvidia wants to get people to buy a xx70 the next time, just after they realize what's up with gaming GPUs. And after that, they realize that xx70 cards are soon limited by their 12GB when AMD does not do much to tackle that if they just slap 16GB on those cards. Which in the end leads to people buying higher tier Nvidia cards, not same tier AMD cards. Which is brilliant, business wise for Nvidia, not so much for AMD tbh.
I think it's the other way around. Budget buyers tend to keep their stuff for longer because they care more about money (or need to, or whatever), and most are still "stuck" on 1080p screens, so a 60 card makes perfect sense. Even the 8GB is enough for them. The big problem, like others have already said, is the prices. At 250/300€ these would be very nice cards for the mainstream/budget audience. The problem is that they are going to cost around 500€...
https://forums.guru3d.com/data/avatars/m/55/55855.jpg
AMD's upcoming 8GB cards are also DOA then :P
https://forums.guru3d.com/data/avatars/m/235/235224.jpg
Embra:

These cards will never really be very good at RT games, yet Nvidia is pushing it hard.
The 4060 Ti should have roughly 50.7 RT TFLOPS; in RT performance alone it sits between a 3070 Ti and a 3080. These cards are pretty capable but will have other drawbacks if you go higher than 1080p.
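(For context, here's a back-of-envelope sketch of where a figure like 50.7 RT TFLOPS could come from. The core count, boost clock, and the ~2.3x RT-to-shader ratio are assumptions pieced together from rumored Ada specs, not confirmed numbers; Python:)
[code]
# Back-of-envelope check of the ~50.7 RT TFLOPS figure.
# Assumed (rumored, unconfirmed): 4352 CUDA cores, ~2.54 GHz boost,
# and Nvidia's Ada convention of quoting RT-core throughput at
# roughly 2.3x the FP32 shader TFLOPS.

cuda_cores = 4352        # rumored 4060 Ti shader count
boost_clock_ghz = 2.535  # rumored boost clock

# FP32 TFLOPS: 2 ops per core per clock (FMA) * cores * clock (GHz) / 1000
fp32_tflops = 2 * cuda_cores * boost_clock_ghz / 1000
rt_tflops = fp32_tflops * 2.3  # approximate Ada RT-to-shader ratio

print(f"FP32: {fp32_tflops:.1f} TFLOPS, RT: {rt_tflops:.1f} TFLOPS")
# -> FP32: 22.1 TFLOPS, RT: 50.7 TFLOPS
[/code]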
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
The 4060 Ti's main selling point will be DLSS 3 frame generation, but as I've said before, turning it on increases VRAM usage, so it's a moot point.
https://forums.guru3d.com/data/avatars/m/239/239175.jpg
So dumb. At least they could have given the Ti a 192-bit bus and 12GB. But Nvidia basically tells people to just f off.
https://forums.guru3d.com/data/avatars/m/235/235224.jpg
Undying:

The 4060 Ti's main selling point will be DLSS 3 frame generation, but as I've said before, turning it on increases VRAM usage, so it's a moot point.
You keep saying this, but it's not true, just like when you keep saying ReBAR doesn't work on PCIe 3.0. [youtube=CTotMcGUj5A] I just checked in Cyberpunk and it didn't add to my VRAM either.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
Spets:

You keep saying this, but it's not true, just like when you keep saying ReBAR doesn't work on PCIe 3.0. [youtube=CTotMcGUj5A] I just checked in Cyberpunk and it didn't add to my VRAM either.
ReBAR theoretically works on PCIe 3.0, but in reality it doesn't. HU tested it on a 10900K. As for FG, you are wrong; it increases VRAM usage quite a bit. I posted Dying Light 2 screenshots yesterday from Zwormz's YT, and VRAM jumps from 8.6 to 10.7GB with FG enabled. Owen tested Cyberpunk and it also increased there, so idk what you were doing and how you tested.
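(For anyone who wants to settle the FG/VRAM question on their own machine instead of comparing YouTube overlays, here's a minimal sketch that logs actual VRAM usage while you toggle frame generation in-game. It assumes Nvidia's NVML Python bindings, installed with pip install nvidia-ml-py; take readings with FG off, then on, and compare.)
[code]
# Log VRAM usage of GPU 0 every two seconds; Ctrl+C to stop.
# Run alongside the game, note the reading with frame generation
# off, toggle it on, and compare.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 1024**3:.2f} / "
              f"{mem.total / 1024**3:.2f} GiB")
        time.sleep(2)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
[/code]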
https://forums.guru3d.com/data/avatars/m/190/190660.jpg
As many have already said, DOA. You already have to turn down textures from ultra to high, or even medium, with 8GB cards, even at 1080p, and it's still barely enough sometimes. I would not touch cards with 8GB or less from this point onward, except at entry-level prices and/or out of absolute necessity.