GeForce RTX 4060 would be equivalent to an RTX 3070 Ti in performance

I remember a time, pre-COVID, before the eBay scammers and the bitcoin boom, when graphics vendors offered new generations of cards with a nice performance bump over the previous generation at (more or less) the same price point. Now it seems we are expected to pay a premium for something we took for granted before, and that absolutely sucks in my view. NVIDIA seem to have forgotten about offering "value for money" in a rush to please their shareholders, presumably after raking in profits for the two years during COVID when prices were extremely high and their cards were impossible to get hold of at MSRP. They are also behind AMD when it comes to offering VRAM on their products, keeping the higher amounts for the ridiculously expensive cards. AMD seem to have settled on 16 GB as standard, yet NVIDIA seem to think 12 GB is enough, again offering less for more.
I wonder if NVidia is still two generations ahead in their R&D. Because with their pricing, one might think they're not. Or if they are, then what they have sucks and they're trying to make as much money as they can while they still can πŸ˜›
Dragam1337:

It truly has offensively poor specs - but honestly, that has been the case with all 4000 series GPUs, aside from the 4090. The 4050 will likely end up with a 64-bit bus... lol.
So true. It seems Nvidia is copying Intel and artificially segmenting (crippling) the lower cards so they can justify the prices of the higher-end GPUs...
That Time Spy score is not 3070 Ti performance level; even my 3070 scores higher. The card will be bottlenecked by the 128-bit bus. $499? No thanks.
I just want a good and cheap xx60 card (with AV1 encode/decode and PCIe x16). Yeah, I know, it’s probably not coming. But hey, I can wait almost forever.
Dragam1337:

It truly has offensively poor specs - but honestly, that has been the case with all 4000 series GPUs, aside from the 4090. The 4050 will likely end up with a 64-bit bus... lol.
If it really has 3070 Ti performance, uses less power, and hopefully costs a lot less, then I don't think it matters. Part of me wonders if Nvidia managed to just optimize the GPU in a way where it needs less memory bandwidth. There have been enough changes in the past decade to reduce CPU usage, PCIe bandwidth, and memory consumption.
Kaarme:

Oh, yes, this sounds like a Porsche designer saying that this metal part was replaced with a plastic part to bring the price of the sports car down. The only real way to drop the graphics card price is to drop the price of the GPU chip, but Jensen needs to show the investors that he can keep the multi-billion profits growing, so it's not an option. So, instead, the price of the GPU chip is raised, and then starts the desperate hunt for any pennies that could be pinched elsewhere, including the memory.
A real problem is the crappy yield on massive monolithic chips. Despite getting a yield bump from going to a smaller process, Nvidia has the lowest yield of any design at the node. This is inherently expensive, which is not to excuse Nvidia for their pricing, but to highlight part of their rationale for keeping the same margins despite increased costs (incl. from TSMC) and passing all of them downstream without any sugar coating (like a wider bus). Even the monolithic AMD dies coming out are smaller and higher yield (but let's wait for their pricing). Nvidia, imho, is trying to cash out their last-gen oversupply problems with the 4xxx series.
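For what it's worth, the die-size part of that argument can be illustrated with a toy Poisson yield model; a minimal sketch, with made-up defect densities and die areas (not TSMC or Nvidia figures):

```python
# Toy model: with random defects scattered over a wafer, the chance that a die
# has zero defects falls exponentially with its area (simple Poisson yield model).
# All numbers below are illustrative assumptions, not real fab data.
import math

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Fraction of dies expected to come out defect-free."""
    defects_per_mm2 = defects_per_cm2 / 100.0
    return math.exp(-die_area_mm2 * defects_per_mm2)

defect_density = 0.10  # defects per cm^2 (hypothetical)

for name, area_mm2 in [("big monolithic die (~600 mm^2)", 600),
                       ("mid-size die (~300 mm^2)", 300),
                       ("small chiplet (~150 mm^2)", 150)]:
    print(f"{name}: ~{poisson_yield(area_mm2, defect_density) * 100:.0f}% defect-free dies")
```

Same defect density, very different share of sellable dies: that is the cost pressure that chiplet/MCM designs are meant to relieve.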
A bigger cache can only take you so far. The 4070 Ti already has performance drops when going to higher resolutions. AMD did it last gen with Infinity Cache, but still, when the resolution increased or games were memory heavy, it lost performance compared to its counterpart from Nvidia. This gen they beefed up the memory subsystem; Nvidia went the other way (except for their halo product, the 4090). This thing with a 128-bit bus as an xx60 class card is just funny. They should top it off with just 8 GB of VRAM and price it at $500. They are just counting on DLSS 3 frame generation to fix it all, but having 20 fps and making it up with interpolation to get 40 will not work well. Yields are not bad; there has been no news saying that they have large defects. And only AD102 can be considered a big chip; the others are pretty small when you compare them to the GA (Ampere) dies.
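On the frame-generation point, here is a quick back-of-the-envelope sketch of why interpolating 20 fps up to 40 fps doesn't feel like native 40 fps; the frame rates are just illustrative examples:

```python
# Sketch: frame generation inserts a frame between two rendered frames, so the
# displayed rate doubles but the game still samples input at the rendered rate,
# and latency roughly tracks the base frame time (plus interpolation overhead).

def frame_time_ms(fps: float) -> float:
    """Frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

base_fps = 20                 # hypothetical rendered frame rate
displayed_fps = 2 * base_fps  # one generated frame per rendered frame

print(f"Rendered {base_fps} fps  -> {frame_time_ms(base_fps):.1f} ms per real frame")
print(f"Displayed {displayed_fps} fps -> {frame_time_ms(displayed_fps):.1f} ms between shown frames")
print(f"Native {displayed_fps} fps would sample input every {frame_time_ms(displayed_fps):.1f} ms,")
print(f"but with frame generation input stays tied to the {frame_time_ms(base_fps):.1f} ms cadence.")
```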
Venix:

Yes, the paper specs of the Ada cards are low (bus width etc.)... that said, the cards perform. What is offensive is the price. As a 1060 owner I would love to see the 4060 at 200 euros... but I know that is not going to happen... EVER. I would be surprised if I see the card at 350 to 400... which is very high for a 4060 card... I am afraid Nvidia will try to launch the card at 500 USD MSRP.
I guess 350 for the 1060 and 550 for the 1070 - right on the edge of too expensive but just low enough to tempt you.
Remekra:

Yields are not bad; there has been no news saying that they have large defects. And only AD102 can be considered a big chip; the others are pretty small when you compare them to the GA (Ampere) dies.
The yields are terrible compared to every other processor of any type at the node, which means they're "not bad" only for Nvidia, who's used to worse. And this gen they're dealing with competition using the highest yield ever achieved for a GPU thanks to MCM. "Pretty small" only applies relative to Nvidia's own lineup, as every other processor of any type at each of the last three nodes has had better yields than Nvidia's. But your points on cache and memory subsystems are spot on.
Pryme:

Yes, the GTX 1060 6GB was on par with the 980 and was easy to overclock to 2.1 GHz while maintaining low power consumption. I upgraded to it from the 960 and it was double the performance. Now, as I recall very well, the 960 was another 128-bit bus card, with Nvidia stating that the bus had plenty of bandwidth for games... yeah, right. And will this 4060 end up with 8 GB or 16 GB of VRAM? This will be fun.
Yeah, the 960 was basically just a 980 chopped exactly in half, whereas the 1060 was at least more than just half of a 1080 (it had 75% of the ROPs and bus width, but half the shaders and texture units).
Meh, priced right I would buy it, but by priced right I mean no more than $399, and even that is too much for an xx60 series card.
I am in this really strange spot where I can still run most games (not things like Cyberpunk, for example, but that game still needs a bit of work) on my 1080p screen with my Nvidia 1080, so unless I get 4K high refresh rate for a decent price (under 700?) I simply have no reason to upgrade. I could probably get a 6800 and run a 1440p screen (no RT though), but I don't see the point in spending that much money on a "slight" upgrade.
I believe this would be the RTX 4060 Ti.
Why don't people just buy a 3070, 3080, or 3090/Ti, given the prices are about the same?! They're available now....
Looks like the tests were run on a system with a PCIe 3.0 bus. Not sure how much that influences the performance, but it's at least something to keep in mind. For the people wondering how a 128-bit bus can match the 256-bit bus of the 3070 Ti: the L2 cache has been scaled up from 4 MB to 32 MB, which, like the larger L3 (Infinity Cache) on the newer Radeon cards, will have large benefits (arguably even more than adding an L3, as L2 caches are normally much lower latency and higher bandwidth than your typical L3 cache).
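To make that concrete, here is a rough sketch of how a big L2 can stretch a narrow bus; the bus widths, data rates, and hit rate below are illustrative assumptions, not published figures for these cards:

```python
# Back-of-the-envelope: if a fraction of memory requests is served from a large
# on-chip L2, the external GDDR only has to serve the misses, so the bandwidth
# "seen" by the shaders is roughly raw_bandwidth / miss_rate.
# All numbers below are illustrative assumptions.

def dram_bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Theoretical GDDR bandwidth in GB/s for a given bus width and data rate."""
    return bus_bits * gbps_per_pin / 8.0

def effective_bandwidth_gbs(dram_gbs: float, l2_hit_rate: float) -> float:
    """Effective bandwidth when only cache misses reach DRAM."""
    return dram_gbs / (1.0 - l2_hit_rate)

narrow = dram_bandwidth_gbs(128, 18.0)  # e.g. 128-bit GDDR6 @ 18 Gbps -> 288 GB/s
wide   = dram_bandwidth_gbs(256, 19.0)  # e.g. 256-bit GDDR6X @ 19 Gbps -> 608 GB/s

# Assume the much larger L2 catches roughly half the traffic (made-up hit rate).
print(f"128-bit card, 50% L2 hit rate: ~{effective_bandwidth_gbs(narrow, 0.50):.0f} GB/s effective")
print(f"256-bit card, raw bandwidth:   ~{wide:.0f} GB/s")
```

The flip side, as noted above, is that the hit rate drops at higher resolutions and in memory-heavy games, and then the narrow bus shows.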
Texter:

18 times the bandwidth of a 2004 GPU. 19 years later. And even that 6800GT had a 256 bit memory bus. 😎
You still running a 20-year-old machine? What OS can handle such old hardware?
Valken:

Why don't people just buy a 3070, 3080, or 3090/Ti, given the prices are about the same?!
Some do, some are waiting to see the price/perf ratio of the 4060(Ti). And I guess some really want DLSS 3?
BLEH!:

You still running a 20-year-old machine? What OS can handle such old hardware?
Hardly ever use it. But on its 4th PSU it's running Windows XP SP3 Ultimate Edition, iirc... Technically it can browse the internet via LAN, but I've upgraded the (still very cool looking) case to S.H.E.L.F. mode. It's very good at keeping 5 external HDDs, a huge LEGO X-wing and a big Transformer at exactly 105.4 cm above floor level when stacked on top of a less fancy (empty) big tower from 2000. And that 6800GT died in 2016 or so (applause), so there's a GeForce2 GTS inside it now.
Valken:

Why don't people just buy a 3070, 3080, or 3090/Ti, given the prices are about the same?! They're available now....
Dude, I've waited 2+ years already, and I will not buy a two-year-old card at close to MSRP.... I can wait and see the rest of the 4xxx and AMD's 7xxx models, and when I see those I'll decide!