GeForce RTX 4060 gets AD107 GPU with the full 3072 shader cores

https://forums.guru3d.com/data/avatars/m/248/248994.jpg
If this is true, wouldn't it actually be good news for 4060 Ti? If the vanilla 4060 uses a full chip, 4060 Ti obviously would need to use a different chip. An earlier rumour suggested 4060 and 4060 Ti use the same chip. But then again, rumours are always rumours, nothing more.
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
Kaarme:

If this is true, wouldn't it actually be good news for 4060 Ti? If the vanilla 4060 uses a full chip, 4060 Ti obviously would need to use a different chip. An earlier rumour suggested 4060 and 4060 Ti use the same chip. But then again, rumours are always rumours, nothing more.
Not necessarily... 3,072 shaders is less than a fifth of the 4090's count. I don't have high hopes of these being good value. It just seems like the 4090 is a horrible value and everything below it is somehow worse in this bizarro duopoly world.
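A quick check of that gap, taking the RTX 4090's published 16,384 CUDA cores as the reference point against the rumoured 3,072 here:

```python
# Sanity check on the shader-count gap between the rumoured
# RTX 4060 (full AD107) and the RTX 4090 (AD102).
cores_4090 = 16384  # RTX 4090 spec
cores_4060 = 3072   # rumoured RTX 4060

ratio = cores_4090 / cores_4060
print(f"The 4090 has {ratio:.2f}x as many shaders")  # ~5.33x
```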
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
Neo Cyrus:

Not necessarily... 3,072 shaders is less than a fifth of the 4090's count. I don't have high hopes of these being good value. It just seems like the 4090 is a horrible value and everything below it is somehow worse in this bizarro duopoly world.
Nothing is good value these days, with prices having been raised so much. But perhaps the 4060 Ti won't be a joke performance-wise, even if the price will inevitably be a bad joke, just like the prices of the other 4000 series cards. It would be pretty funny if the 4060 really used a laptop chip, with a laptop memory architecture, yet still cost 100+ bucks more than the 3060.
https://forums.guru3d.com/data/avatars/m/288/288652.jpg
Neo Cyrus:

the 4090 is a horrible value and everything below it is somehow worse
This, 100%. Either spend a month's salary on a good GPU, or buy a GPU that is worse than last gen and more expensive. We just can't have nice things at a reasonable price anymore...
https://forums.guru3d.com/data/avatars/m/268/268248.jpg
So it's the same laptop chip on a card, no changes whatsoever? This smells kind of like the 6500 XT...
https://forums.guru3d.com/data/avatars/m/90/90667.jpg
Yikes! A measly 8GB already renders it near useless, and the price will probably cut it down further. Only AMD can make this card look valid, judging by recent releases.
https://forums.guru3d.com/data/avatars/m/248/248291.jpg
Only 8GB, on a GPU with only 8 lanes of PCIe, and only a 128-bit bus. I can already see people buying this card and having to lower texture resolution and RT to avoid performance issues. And considering how Nvidia is pricing their GPUs, this is probably going to cost around $500, even more in euros. What a bad joke of a GPU.
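On the x8 lanes point, the rough one-way link-bandwidth arithmetic, assuming a PCIe 4.0 slot (on an older PCIe 3.0 board these figures halve, which is where x8 cards tend to hurt most):

```python
# Approximate one-way PCIe 4.0 bandwidth: 16 GT/s per lane,
# with 128b/130b line coding, so ~2 GB/s usable per lane.
def pcie4_bandwidth_gbs(lanes: int) -> float:
    gt_per_s = 16.0        # PCIe 4.0 transfer rate per lane
    encoding = 128 / 130   # 128b/130b line-code efficiency
    return lanes * gt_per_s * encoding / 8  # bits -> bytes

print(round(pcie4_bandwidth_gbs(8), 1))   # x8 link: ~15.8 GB/s
print(round(pcie4_bandwidth_gbs(16), 1))  # x16 link: ~31.5 GB/s
```

When the 8GB buffer overflows and textures spill over the bus, that halved link is exactly where the stutter shows up.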
data/avatar/default/avatar08.webp
Seems like the authors behind the rumours are heavily downplaying the RTX 4060. Some people have bought AMD stock? Even if possible, it would be a surprise to see only 8 GB on the RTX 4060 after the RTX 3060 had 12 GB. Also, a 115 W TGP on desktop is a plain joke, while it is realistic for an extreme laptop configuration. I know the GTX 1060 consumed below 120 W, but then, all the models above it back then consumed way less than they do now. I recall Nvidia hasn't downgraded the VRAM amount within the same product tier in the past, for example between the GTX 1060 and RTX 2060 (both have 6 GB, though both have alternative models as well). Another possibility is that there will again be two RTX xx60 models with different amounts of VRAM, which is wrong in my opinion if they have a significant performance difference, but not a problem for those who are aware of it. I'm expecting (the fastest) RTX 4060 to be a successful GPU sales-wise for offering somewhat appealing value in today's market. EDIT. Additional notes.
https://forums.guru3d.com/data/avatars/m/268/268248.jpg
@GamerNerves Well, they have the 3060 Ti with 8GB, and the 3060 with 8GB that costs exactly the same as the 12GB version... With a 128-bit bus they have to go with either 8GB or 16GB...
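The reason a 128-bit bus forces 8GB or 16GB: GDDR6 devices present a 32-bit interface and commonly top out at 2GB (16Gb) per chip, so capacity comes in steps of one chip per 32-bit channel, doubled only by clamshell mounting. A sketch of that arithmetic, under those assumptions:

```python
# VRAM capacities reachable on a given memory bus, assuming
# 32-bit GDDR6 devices at 2GB each (the common current density).
def vram_options_gb(bus_width_bits: int, chip_gb: int = 2) -> tuple:
    chips = bus_width_bits // 32     # one device per 32-bit channel
    single = chips * chip_gb         # normal mounting
    clamshell = single * 2           # two devices share each channel
    return single, clamshell

print(vram_options_gb(128))  # -> (8, 16): the 4060's rumoured bus
print(vram_options_gb(192))  # -> (12, 24): the 3060's 192-bit bus
```

The same arithmetic is why the 192-bit 3060 landed on 12GB: 6GB would have been too little, and 12GB was the next step up.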
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
Venix:

@GamerNerves Well, they have the 3060 Ti with 8GB, and the 3060 with 8GB that costs exactly the same as the 12GB version... With a 128-bit bus they have to go with either 8GB or 16GB...
It's highly unlikely that the 4060 will come in any other configuration. 16GB of VRAM on a low-to-midrange card makes no sense.
https://forums.guru3d.com/data/avatars/m/270/270041.jpg
Undying:

It's highly unlikely that the 4060 will come in any other configuration. 16GB of VRAM on a low-to-midrange card makes no sense.
Sadly we have reached an awkward point for VRAM. Games are demanding 10 or 12GB as a minimum, but sticking 16GB on lower-end cards seems odd. Then we get to the odd position where a 3060 (or Ti) does better than a 3070 Ti because it actually has the VRAM. Not sure if DirectStorage can fix this, or whether we need something else... or if we'll reach a point where cheaper VRAM gets slapped on as texture storage, though that might be an odd 970 situation all over again. I do feel that 12GB needs to be made the minimum, with higher cards getting 16/24 or more.
data/avatar/default/avatar07.webp
These are aimed at 1080p systems, so they wouldn't bother with more than 8GB. Most of the people buying these will be running everything turned down to the lowest settings to get that sweet, sweet fictional 240Hz competitive advantage anyway, so even 8GB is overkill for the target audience.
https://forums.guru3d.com/data/avatars/m/268/268248.jpg
Undying:

It's highly unlikely that the 4060 will come in any other configuration. 16GB of VRAM on a low-to-midrange card makes no sense.
Yeah, I believe the 4060 will come with 8GB as well.
https://forums.guru3d.com/data/avatars/m/56/56686.jpg
I think the minimum it should have is 12GB like the 3060 has; I won't buy an 8GB card again, especially not at these insane prices. Then again, I see the xx60 as a midrange card, not mid-low or low end; the xx50 cards are low end and the xx30 lower still, but Nvidia has been doing everything in its power to wipe those cards from existence while jacking the prices up. Three generations ago an xx70 Ti was a $450 card; now it's an $850 card, in most cases more. These days I run everything at 1440p via DSR as long as the game allows it; 1080p is no longer good enough IMO, especially not if dynamic resolution or DLSS/FSR are involved. In the games I've seen with FSR, even high quality looks worse to me, though certainly better than 720p in most cases.
https://forums.guru3d.com/data/avatars/m/279/279272.jpg
What is the point of having ray tracing cores and advanced DLSS 3/DLSS 2 tech with merely 8 GB of VRAM, when it is apparent that games from this point forward will require 10 GB+ of VRAM even at 1080p with tuned settings and ray tracing?
data/avatar/default/avatar10.webp
Probably the worst x60 card in history, relative to the time it is released in.
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
Dragam1337:

Probably the worst x60 card in history, relative to the time it is released in.
Relative to everything really if it turns out to be as bad as the specs make it out to be.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
Dragam1337:

Probably the worst x60 card in history, relative to the time it is released in.
I'm sure Nvidia will enable FG in their graphs and market it as 2x-4x faster. 😀
https://forums.guru3d.com/data/avatars/m/268/268248.jpg
Dragam1337:

Probably the worst x60 card in history, relative to the time it is released in.
With the 460, 660, 760, 1060, and 2060 being decent midrangers, as is the 3060 12GB, the title of relatively worst currently belongs to the 960. After the reviews, we'll have to see whether the 960 keeps that crown.
data/avatar/default/avatar14.webp
Venix:

With the 460, 660, 760, 1060, and 2060 being decent midrangers, as is the 3060 12GB, the title of relatively worst currently belongs to the 960. After the reviews, we'll have to see whether the 960 keeps that crown.
Yeah, the 260 was really good as well... But yeah, the 960 was dog trash, especially in the default 2GB version. This 4060 will be at least on par with that in terms of awfulness... it has 4GB less VRAM than the 3060, which was the one thing the 3060 had going for it... -_-'