4GB GeForce RTX 3050 Ti Coming to Notebooks


Click here to post a comment for 4GB GeForce RTX 3050 Ti Coming to Notebooks on our message forum
https://forums.guru3d.com/data/avatars/m/245/245459.jpg
You'd think that raytracing on a fairly lowly GPU like the 3050 Ti would be kind of redundant... you'd think the GPU wouldn't have enough overall performance capacity to ensure that raytracing could be enabled whilst retaining any decent fps. The tensor cores are probably more useful, as they could be used for DLSS to realise a performance bump.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
4GB is far too low to make use of any raytracing capabilities; it's just marketing.
https://forums.guru3d.com/data/avatars/m/282/282473.jpg
DLSS will be mighty useful for weak-ass laptop GPUs.
https://forums.guru3d.com/data/avatars/m/273/273678.jpg
6GB is the minimum for raytracing, so either this doesn't have it, or it's not 4GB.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
Astyanax:

6GB is the minimum for raytracing, so either this doesn't have it, or it's not 4GB.
Even 6GB is on the edge. I think DF tested the 2060 6GB in Youngblood and it was having some issues due to VRAM when the 2070 was fine.
data/avatar/default/avatar07.webp
I wonder how well DLSS can help low-end GPUs. Imagine a GTX 1060 6GB: could it push 60 FPS in, let's say, Horizon Zero Dawn at 1080p with DLSS? That could be a game changer.
https://forums.guru3d.com/data/avatars/m/165/165018.jpg
cucaulay malkin:

DLSS will be mighty useful for weak-ass laptop GPUs.
Yup, I can say that DLSS on my 2070S is very welcome in a game like Cyberpunk 2077.
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
On top of what you all already said, I'd like to point out it's using GDDR6, not the stupidly named GDDR6X, which is quad data rate rather than double. Just call it GQDR6, nVidia, you clowns. It's going to be half the memory bandwidth of what most people are thinking, so it can't even swap data in rapidly like a high-end card to mitigate some of the 4GB choking that will happen.
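For a rough sense of the gap, peak bandwidth is just the per-pin data rate times the bus width. A quick sketch; the 3050 Ti line (128-bit GDDR6 at 12 Gbps) is an assumption based on the rumoured spec, while the desktop figures are the published ones:

```python
# Peak theoretical memory bandwidth: (data rate per pin, Gbps) * (bus width, bits) / 8 -> GB/s
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

configs = {
    "RTX 3050 Ti mobile (assumed 128-bit GDDR6 @ 12 Gbps)": (12, 128),
    "RTX 3070 desktop (256-bit GDDR6 @ 14 Gbps)": (14, 256),
    "RTX 3080 desktop (320-bit GDDR6X @ 19 Gbps)": (19, 320),
}

for name, (rate, width) in configs.items():
    print(f"{name}: {bandwidth_gb_s(rate, width):.0f} GB/s")
# ~192 GB/s vs 448 GB/s vs 760 GB/s -- the low-end part has only a fraction of the
# bandwidth available to paper over VRAM pressure.
```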
https://forums.guru3d.com/data/avatars/m/197/197287.jpg
Undying:

4GB is far too low to make use of any raytracing capabilities; it's just marketing.
It'll likely be able to do ray tracing better than an 11GB 1080 Ti.
https://forums.guru3d.com/data/avatars/m/282/282473.jpg
Undying:

Even 6GB is on the edge. I think DF tested the 2060 6GB in Youngblood and it was having some issues due to VRAM when the 2070 was fine.
and BF5
Kool64:

Yup, I can say that DLSS on my 2070S is very welcome in a game like Cyberpunk 2077.
Except the 2070S can play this just fine without DLSS. For an entry-level mobile GPU, having DLSS is fantastic news; it's going to be make or break in more demanding games for those laptops. AMD's upcoming mobile GPUs are dead in the water if they don't have a good equivalent.
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
Aura89:

It'll likely be able to do ray tracing better than an 11GB 1080 Ti.
That'd still firmly put it in "utterly worthless" territory. You already know the 1080 Ti doesn't have the hardware the software was originally made to address, and would use an intentionally bad, brute-force, sloppy method. Ray tracing is not an option on a 1080 Ti by design; I get literally 7 fps in Control with it on, on a 1080 Ti. It was meant to let people see the new graphics and nudge them into an upgrade. If nVidia wanted to make it better, they definitely could. Beating "technically it functions" is not a useful bar.

Even with a single RT setting enabled, the card still has to build its RT BVH tree and eat up a (relative) ton of RAM. Having RT as an option on a low-bandwidth 4GB card is a joke, as opposed to just spending the budget on more RAM or more rasterization power. And as far as I'm concerned, RT in most games is pretty worthless right now anyway, unless you're playing the handful of games that have nice reflections. Even the lighting doesn't do a single bounce outside of CP2077's Lunatic setting.

One more note to throw in: if they're counting "CUDA cores" the same way as on the higher-end cards, it has half that count in reality; the other half are units that CAN function as CUDA cores depending on what's running, but it would never be all of them. That's the main reason the super-high-core-count 3080/3090s perform way lower than the CUDA count would indicate: in reality they have half that, plus whatever the flexible units contribute depending on the workload.

Honestly, with RT on at 1440p, I think even 10GB of GDDR6X is really pushing how low they can go. For an item priced at what halo products used to cost, it was a monumental dick move on nVidia's part to seriously go with 10GB. I can't imagine it not causing obviously RAM-limited performance loss in 1-2 years at 4K, versus having just a few GB more. Would it have been possible to mix RAM chip sizes and go for 15GB? Or does that require a more elaborate memory controller that's not worth going for, or something? Hell, even going for a 384-bit bus and making it 12GB would have been so much safer than 10, considering 10 is on the edge already. I would have gladly paid an extra $100, or whatever that leather jacket demands, for 20GB as opposed to 10GB on a card that I'll have for who knows how many years. But no, they had to segment normal RAM quantity up into super-halo territory at "if you have to ask the price you can't afford it".
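On the memory-size question, the capacities mostly fall out of bus width times per-chip density, since each GDDR6/GDDR6X package hangs off a 32-bit channel. A rough sketch (the 15GB mixed-density split is purely hypothetical):

```python
# VRAM capacity from bus width and per-chip density:
# each GDDR6/GDDR6X package sits on a 32-bit channel, so
#   capacity = (bus_width / 32) * chip_density
def vram_gb(bus_width_bits: int, chip_density_gb: int) -> int:
    return (bus_width_bits // 32) * chip_density_gb

print(vram_gb(320, 1))  # 10 GB -> the RTX 3080 layout (ten 8Gb chips)
print(vram_gb(320, 2))  # 20 GB -> same bus with 16Gb (2GB) chips
print(vram_gb(384, 1))  # 12 GB -> the wider-bus option mentioned above
print(vram_gb(128, 1))  #  4 GB -> the rumoured mobile 3050 Ti layout
# A 15GB card on a 320-bit bus would mean mixing densities (five 2GB + five 1GB
# chips), which is presumably what the "more elaborate memory controller"
# question above is about.
```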
https://forums.guru3d.com/data/avatars/m/197/197287.jpg
Neo Cyrus:

That'd still firmly put it in "utterly worthless" territory. You already know the 1080 Ti doesn't have the hardware the software was originally made to address
That's simply my point: that amount of memory doesn't necessarily equal amazing performance for ray tracing. This card could have 20GB of memory and, because of its hardware, would likely perform the same at ray tracing tasks. Likewise, an RTX 3090 could have 4GB of memory and still smoke the 3050 Ti at ray tracing tasks (no, I'm not saying it'd perform the same as a normal 3090 with normal amounts of RAM). So to state that THIS card will be horrible at ray tracing because of its 4GB of memory is literally a moot point. Though I disagree with the premise that this card will not be able to do ray tracing at all; I do believe that at specific resolutions, DLSS, graphics settings and specific games, if someone wants to turn on some ray tracing, they'll be able to do it, and that's literally the whole point: choice.
https://forums.guru3d.com/data/avatars/m/56/56686.jpg
Are the xx50 series now priced where the xx60 used to be, seeing as xx60 series prices have gone up along with every other series above them?
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
tsunami231:

Are the xx50 series now priced where the xx60 used to be, seeing as xx60 series prices have gone up along with every other series above them?
Yes. Price-wise the 3050 Ti will stand where the 2060 was last gen, but you have less memory and similar performance, so it's justified.
https://forums.guru3d.com/data/avatars/m/56/56686.jpg
Undying:

Yes. Price-wise the 3050 Ti will stand where the 2060 was last gen, but you have less memory and similar performance, so it's justified.
Justified? That's highly debatable. If you go by how it's "justified", the xx50 cards are $100 more, the xx60 cards $50-100 more, and the xx70/xx80 cards $150-200 more every few gens; pretty soon GPUs will be so expensive it would be cheaper for most people to just buy consoles. I'm wary about buying GPUs, or even rebuilding my PC, with prices getting stupider and stupider each new gen.
https://forums.guru3d.com/data/avatars/m/280/280231.jpg
True 360 fps 720p GPU.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
tsunami231:

Justified? That's highly debatable. If you go by how it's "justified", the xx50 cards are $100 more, the xx60 cards $50-100 more, and the xx70/xx80 cards $150-200 more every few gens; pretty soon GPUs will be so expensive it would be cheaper for most people to just buy consoles. I'm wary about buying GPUs, or even rebuilding my PC, with prices getting stupider and stupider each new gen.
I was being sarcastic. Nothing is justified this gen and people are paying more for less. Sad times...
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
Aura89:

So to state that THIS card will be horrible at ray tracing because of its 4GB of memory is literally a moot point.
It's not, and that ignores a lot of what I said. The 1K series had no dedicated RT hardware and therefore no expectation of it; this thing does, and it's in the name. Yet it won't really be RT-capable in reality, largely because of the prehistoric 4GB of RAM. That's definitely not a moot point. The RAM is the big problem. Comparing it to the 1K series is a complete false equivalence. The software is not written for RT on the 1K series; there's only some alpha software there to make it look as bad as possible as a demo/3K-series sales pitch, period.
Aura89:

Though I disagree with the premise that this card will not be able to do ray tracing at all; I do believe that at specific resolutions, DLSS, graphics settings and specific games, if someone wants to turn on some ray tracing, they'll be able to do it, and that's literally the whole point: choice.
If someone wants to enjoy their "cinematic" 720p 24 fps lowest-setting RT shadows with DLSS on, resulting in an internal resolution of roughly 480p at best (the "quality" setting renders at about 1/2.25 of the output pixel count), they can pretend it's fine all they want, but to anyone with reasonable standards that is effectively incapable of RT. Do you know a single person in real life who actually plays with such settings? I don't. That's not a real choice, that's just a lie/tease. The budget should have been used to make it a more capable card, not to promise things it's incapable of delivering.

RT exists almost exclusively in GPU-murdering titles, and will for the foreseeable future. Even with everything cranked down to take it easy on both RAM and rasterization requirements, that 4GB is going to choke on any attempt at RT usage; it will cause slowdowns. Pick any RT title with DLSS, and tell me how that mobile RTX 3050 Ti would handle RT at 720p running internally at ~480p. Spoiler: abysmally, if at all. CP2077? Lol. Control? Lol. The Medium? LOL! That's not changing, ever; it will be forever worthless at RT.

And that example is just for the ridiculous argument of "negative 9001p at 0.2 fps looks fine, you're just an elitist". No one would tolerate that; it's not enough to look tolerable with the current DLSS 2 implementation. Even at four times that pixel count, the internal ~960p of "quality" DLSS at 1440p, it's just barely enough to reconstruct the image if you ask me; there's still distracting image data loss... and smearing.

nVidia are NOT giving a choice. They're giving a middle finger, and laughing as ignorant buyers pick up something they think will give them RT. That is, unless you think sub-30 fps at smeary, reconstructed 80s resolutions, in games that more often than not will just crash, is a choice. That's not an exaggeration, that's the truth.
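For reference, here's a small sketch of what the internal render resolutions work out to, using the standard DLSS 2 per-axis scale factors (individual games can tweak these slightly):

```python
# Approximate internal render resolution for the common DLSS 2 modes
# (per-axis scale factors: Quality 2/3, Balanced ~0.58, Performance 1/2, Ultra Performance 1/3)
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str):
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for out_w, out_h in [(1280, 720), (2560, 1440)]:
    for mode in ("Quality", "Performance"):
        print(f"{out_h}p {mode}: {internal_resolution(out_w, out_h, mode)}")
# 720p Quality:  (853, 480)    720p Performance:  (640, 360)
# 1440p Quality: (1707, 960)   1440p Performance: (1280, 720)
```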