GeForce RTX 4060 gets AD107 GPU with the full 3072 shader cores

Undying:

I'm sure Nvidia will enable FG in their graphs and market it as 2x-4x faster. 😀
Lol, I don't doubt that they will claim that!
Dragam1337:

Lol, I don't doubt that they will claim that!
They did claim the 4070 Ti was 2-4x faster than the 3090 Ti, which was obviously BS, so why not do that here as well?
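For a rough sense of how a headline multiplier like that gets assembled, here's a back-of-the-envelope sketch in Python. Every figure below is an illustrative assumption, not Nvidia's actual benchmark data:

```python
# Hypothetical numbers showing how "Nx faster" marketing can stack
# upscaling and frame generation on top of the real rendering uplift.
native_fps = 40            # assumed: frames actually rendered at native res
dlss_perf_uplift = 1.7     # assumed: uplift from DLSS Performance upscaling
fg_multiplier = 1.9        # assumed: FG nearly doubles *presented* frames

rendered = native_fps * dlss_perf_uplift    # ~68 fps actually rendered
presented = rendered * fg_multiplier        # ~129 fps shown on screen

# The marketing slide divides presented frames by the old card's native fps.
print(f"{presented / native_fps:.1f}x 'faster'")  # -> 3.2x 'faster'
```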
The better question: Nvidia knows that we know they're doing this, so why do they keep doing it and claiming 2x faster? From the marketing/PR side they probably don't know the difference between DLSS Performance, native, and FG vs. no FG; all they see is "better" performance. The problem is the actual lying about where those gains come from. I'm sure there's an asterisk somewhere saying "with DLSS 3 and/or FG", but it's usually in such small print that most people aren't going to see it. They should state in big letters where the gains come from, and show the gains at native and at DLSS, with and without FG.

Both FG and DLSS are stopgaps, IMO, and shouldn't be used as a crutch the way the industry seems to want to. I'm interested in native gains and native RT gains, not DLSS and now FG. FG is nothing more than the PC version of the fake-Hz interpolation TVs have been doing for years. Last I looked, most good TVs can do that in game mode with little to no latency hit, so if I were going to use it anywhere, it would be on the TV. But it seems to generate artifacts that wouldn't normally be there, like DLSS and FSR can, and that's not something I really want.
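To make the "fake Hz" comparison concrete, here's a minimal Python sketch of the naive frame blending that basic TV interpolation amounts to. Actual frame generation (DLSS 3 uses optical flow and game motion vectors) is far more sophisticated, but the core property is the same: the next real frame must already exist before the in-between frame can be shown, which is where the latency comes from.

```python
import numpy as np

def interpolate_frame(prev_frame: np.ndarray, next_frame: np.ndarray,
                      t: float = 0.5) -> np.ndarray:
    """Synthesize an in-between frame by linearly blending two real ones.

    Note the dependency: next_frame must be fully rendered before this
    intermediate frame can be displayed, so presentation is delayed by
    at least one real frame interval.
    """
    blend = (1.0 - t) * prev_frame.astype(np.float32) \
          + t * next_frame.astype(np.float32)
    return blend.astype(prev_frame.dtype)

# Two hypothetical 1080p RGB frames: black, then white.
prev_frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
next_frame = np.full((1080, 1920, 3), 255, dtype=np.uint8)

middle = interpolate_frame(prev_frame, next_frame)
print(middle[0, 0])  # -> [127 127 127]: a ghosted in-between pixel,
                     # the kind of blending artifact you see on moving edges
```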
While the performance is no doubt going to leave a lot to be desired, it's worth keeping in mind that the real issue is the price. The 4060 isn't mid-range, it's entry level, and that's been true for a couple of cycles now. When the 4060 drops it'll be the slowest card of its generation for quite some time, and the second slowest until the 5000 series. It's not like we're getting xx30 or xx10 cards these days. Don't compare it to the 2060; compare it to a 1650 Ti.
Which is another problem: xx60 has been mid-range for as long as I can remember. Now they're marketing it as less, but still using the xx60 naming and kicking the price up to boot. If it's entry level at $600+, like most of us think it's going to be, that is anything but "entry level" pricing.
Nvidia charges £800 for a 4070 Ti and limits the card's performance through memory and bandwidth, so £500 for an entry-level card (1080p gaming) probably makes sense to them, as they are trying to make out that "mining" prices are the new norm. New-gen mid-range cards should be able to play modern games at 1440p and have no less than 12GB of memory.
illrigger:

These are aimed at 1080p systems, so they wouldn't bother with more than 8GB. Most of the people buying these will be running everything turned down to the lowest settings to get that sweet, sweet fictional 240 Hz competitive advantage anyway, so even 8GB is overkill for the target audience.
Even at 1080p, when you factor in ray tracing and ultra settings, 8GB is a limitation. According to TS (and HU), at 1080p with ray tracing enabled and ultra quality, the 3070 Ti performs worse than a 6700 XT and a 3060 in HL, and the only plausible explanation for that is a lack of VRAM. It's even behind the A770 according to TS. 8GB is not enough in 2023 unless you play old games or indie games, don't care about ray tracing, or are fine running games at mid-to-high settings with RT. But if that's the case, wtf would you spend that amount of money on a new GPU when a cheap used 20-series GPU would do the trick just fine? If this card sold for $200 US then yeah, 8GB would be fine. But it won't be sold at that price, so no, it's not fine.
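As a rough illustration of how fast 8GB fills up at those settings, here's a back-of-the-envelope budget. Every figure is an assumed ballpark for a modern AAA title, not a measurement from HL or any specific game:

```python
# All values in MB; assumed ballpark figures, not measured data.
width, height = 1920, 1080
bytes_per_pixel = 16      # assumed: several fat render targets' worth, combined
gbuffer_targets = 6       # assumed: G-buffer plus lighting/post buffers

render_targets = width * height * bytes_per_pixel * gbuffer_targets / 2**20
texture_pool   = 4500     # assumed: ultra texture streaming pool
rt_structures  = 1500     # assumed: BVH acceleration structures for RT
geometry_misc  = 1200     # assumed: meshes, shaders, constant buffers
os_overhead    = 600      # assumed: desktop/DWM and driver reservations

total_mb = render_targets + texture_pool + rt_structures \
         + geometry_misc + os_overhead
print(f"~{total_mb / 1024:.1f} GB")  # -> ~7.8 GB: right at the ceiling of an 8GB card
```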
tsunami231:

Which is another problem: xx60 has been mid-range for as long as I can remember. Now they're marketing it as less, but still using the xx60 naming and kicking the price up to boot. If it's entry level at $600+, like most of us think it's going to be, that is anything but "entry level" pricing.
Back in the 560 days, a 60-class card was considered an entry-level gaming card. Then lately it moved to mid-range (price-wise; the perf is questionable for mid-range). If it keeps going like that, soon the xx30 will be considered a gaming card. You'll run games at 720p 30 fps, but with the magic of DLSS it will display at 4K 100 fps. The price will be $500.
PinguX:

New-gen mid-range cards should be able to play modern games at 1440p and have no less than 12GB of memory
Agree. 1440p is definitely mid-range now, and I'd say mid-quality RT settings are too. An 8GB card can't run HL with those settings. The 3070 Ti got 5 fps minimum and 17 fps average in HL at 1440p RT ultra (mid settings would not help much). The 2080 Ti with the same settings runs HL at 28/41 (sort of playable; mid settings would probably make it playable). Outside of VRAM limitations, the 3070 Ti is usually more powerful than the 2080 Ti.
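Those minimums are consistent with VRAM spillover: once assets overflow into system RAM they get fetched over PCIe, which is roughly an order of magnitude slower than the card's own memory. A quick sanity check on the numbers (the bandwidth figures are published specs; the fps figures are the ones quoted above):

```python
# RTX 3070 Ti on-board bandwidth vs. the PCIe link it spills over.
gddr6x_bw = 608   # GB/s: 19 Gbps GDDR6X on a 256-bit bus
pcie4_x16 = 32    # GB/s: PCIe 4.0 x16 theoretical peak

print(f"on-board memory is ~{gddr6x_bw / pcie4_x16:.0f}x faster than PCIe")

# Frame times behind the quoted fps figures: 200 ms spikes are textbook thrashing.
for fps in (5, 17, 28, 41):
    print(f"{fps:>2} fps = {1000 / fps:.0f} ms per frame")
```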
moab600:

Yikes! The poor 8GB already renders it useless, and the price will probably cut it down further. Only AMD can make this card look valid, judging by recent releases.
How exactly does only having 8GB of VRAM render the card useless? The card will work just fine at 1080p.
MonstroMart:

Back in the 560 days, a 60-class card was considered an entry-level gaming card. Then lately it moved to mid-range (price-wise; the perf is questionable for mid-range). If it keeps going like that, soon the xx30 will be considered a gaming card. You'll run games at 720p 30 fps, but with the magic of DLSS it will display at 4K 100 fps. The price will be $500.
The 560 was only considered "entry level" by those who would never buy anything but the top-end card. The 560 was a solid budget/mid-range card. In their day, the GT/GTS 240 and the GT 640 were the entry-level gaming cards.
MonstroMart:

Even at 1080p, when you factor in ray tracing and ultra settings, 8GB is a limitation. According to TS (and HU), at 1080p with ray tracing enabled and ultra quality, the 3070 Ti performs worse than a 6700 XT and a 3060 in HL, and the only plausible explanation for that is a lack of VRAM. It's even behind the A770 according to TS. 8GB is not enough in 2023 unless you play old games or indie games, don't care about ray tracing, or are fine running games at mid-to-high settings with RT. But if that's the case, wtf would you spend that amount of money on a new GPU when a cheap used 20-series GPU would do the trick just fine? If this card sold for $200 US then yeah, 8GB would be fine. But it won't be sold at that price, so no, it's not fine.
Most of the people buying mid-range cards aren't trying to play at maximum graphics settings... And there's no reason for mid-range cards to target 1440p when 1080p is still the dominant resolution by a large margin.
MonstroMart:

Even at 1080p, when you factor in ray tracing and ultra settings, 8GB is a limitation. According to TS (and HU), at 1080p with ray tracing enabled and ultra quality, the 3070 Ti performs worse than a 6700 XT and a 3060 in HL, and the only plausible explanation for that is a lack of VRAM. It's even behind the A770 according to TS. 8GB is not enough in 2023 unless you play old games or indie games, don't care about ray tracing, or are fine running games at mid-to-high settings with RT. But if that's the case, wtf would you spend that amount of money on a new GPU when a cheap used 20-series GPU would do the trick just fine? If this card sold for $200 US then yeah, 8GB would be fine. But it won't be sold at that price, so no, it's not fine.
Idk where you saw that. Every game with ray tracing support runs miles better on Nvidia cards, with Hogwarts Legacy as an exception. Is that a sign of the future? We'll see. In Returnal, which just came out, even the 3060 Ti beats the 6700 XT by 20% when RT is on.
sykozis:

Most of the people buying mid-range cards aren't trying to play at maximum graphics settings... And there's no reason for mid-range cards to target 1440p when 1080p is still the dominant resolution by a large margin.
I'm talking about 1080p here. Not sure how one would consider 1080p at any settings to be maximum graphics settings. What's next? 720p RT will be "maximum graphics settings" just because a one-grand GPU won't be able to run it without DLSS 3?
Undying:

Idk where you saw that. Every game with ray tracing support runs miles better on Nvidia cards, with Hogwarts Legacy as an exception. Is that a sign of the future? We'll see. In Returnal, which just came out, even the 3060 Ti beats the 6700 XT by 20% when RT is on.
Maybe HL is the only one for now, but personally I would not feel comfortable with an 8GB card going forward, unless I did not care much about RT and would not mind turning it off. But if that were the case, I would not spend $500 on a GPU; I'd just buy a PS5 or an Xbox. I expect this kind of GPU to cost $300 USD.