4GB GeForce RTX 3050 Ti Coming towards Notebooks
Robbo9999
You'd think that raytracing on a fairly lowly GPU like the 3050 Ti would be kind of redundant.....the GPU probably wouldn't have enough overall performance to keep any decent fps with raytracing enabled. The tensor cores are probably more useful, as they could be used for DLSS to realise a performance bump.
Undying
4GB is far too low to make use of any raytracing capabilities; it's just marketing.
cucaulay malkin
DLSS will be mighty useful for weak-ass laptop GPUs
Astyanax
6GB is the minimum for raytracing, so either this doesn't have it, or it's not 4GB.
Undying
kapu
I wonder how well DLSS can help low-end GPUs. Imagine a GTX 1060 6GB: could it push 60 FPS in, let's say, Horizon Zero Dawn at 1080p with DLSS? That could be a game changer.
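For a rough sense of what upscaling could buy, here's a back-of-envelope sketch. Note this is purely my illustration, not anything from the thread: it assumes fps scales roughly inversely with rendered pixel count (real gains are lower, since the upscaling pass itself costs a few milliseconds), and in practice DLSS needs tensor cores, which a GTX 1060 doesn't have. The base fps figure is a hypothetical.

```python
# Back-of-envelope estimate of a DLSS-style upscaling speedup.
# Assumption: fill-rate-bound fps scales ~inversely with rendered pixels.

def estimated_fps(native_fps: float, render_scale: float) -> float:
    """native_fps at the target resolution; render_scale is the per-axis
    internal resolution factor (e.g. 0.5 means rendering at half width/height)."""
    return native_fps / (render_scale ** 2)

base = 40.0  # hypothetical fps at native 1080p
print(estimated_fps(base, 0.667))  # "Quality"-style ~2/3 per axis: ~90 fps upper bound
print(estimated_fps(base, 0.5))    # "Performance"-style half-res: 160.0 fps upper bound
```

So even under optimistic assumptions, whether a given card clears 60 fps depends entirely on where it starts at the internal resolution.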
Kool64
Neo Cyrus
On top of what you all already said, I'd like to point out it's using GDDR6, not the stupidly named GDDR6X, which is quad data rate rather than double. Just call it GQDR6, nVidia, you clowns. It's going to be half the memory bandwidth of what most people are thinking, so it can't even swap in data rapidly like a high-end card to mitigate some of that 4GB choking which will happen.
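The bandwidth gap is easy to put numbers on. Peak bandwidth is just bus width (in bytes) times per-pin data rate; the 3050 Ti configuration below is an assumption on my part (the laptop part's bus width wasn't confirmed at the time), the 3080 figures are the known desktop spec.

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bytes) * per-pin Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

# Assumed 3050 Ti laptop config: 128-bit bus, 14 Gbps GDDR6
print(bandwidth_gb_s(128, 14))   # 224.0 GB/s
# RTX 3080 for comparison: 320-bit bus, 19 Gbps GDDR6X
print(bandwidth_gb_s(320, 19))   # 760.0 GB/s
```

Under those assumptions the laptop part has well under a third of a 3080's bandwidth to feed any RT workload with.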
Aura89
cucaulay malkin
Neo Cyrus
That'd still firmly put it in "utterly worthless" territory. You already know the 1080 Ti doesn't have the hardware the software was originally made to address, and would use an intentionally bad, brute-force, sloppy method. Ray tracing is not an option on a 1080 Ti by design; I get literally 7 fps in Control with it on, on a 1080 Ti. It was meant to let people see the new graphics and nudge them into an upgrade. If nVidia wanted to make it better, they definitely could.
Beating "technically it functions" is not a useful bar. Even with a single RT setting enabled, it still has to build its RT BVH tree and eat up a (relative) ton of RAM. Having RT as an option on a low-bandwidth 4GB card is a joke, as opposed to just spending the budget on more RAM or more rasterization power. And as far as I'm concerned, RT in most games is pretty worthless right now anyway, unless you're playing the handful of games that have nice reflections. Even the lighting doesn't do a single bounce outside of CP2077's Psycho setting.
And one more note to throw in: if they're counting the number of "CUDA cores" the same way as the higher-end cards, it has half that count in reality; the other half CAN have some of the units function as CUDA cores depending on what's running, but it would never be all of it. That's the main reason the super-high-core-count 3080/3090s perform way lower than the CUDA count would indicate: they have half that in reality, plus maybe more depending on what's running.
Honestly, with RT on at 1440p, I think even 10GB of GDDR6X is really pushing how low they can go. For an item priced at what halo products used to cost, it was a monumental dick move on nVidia's part to seriously go with 10GB. I seriously can't imagine it not causing obviously RAM-limited performance loss in 1-2 years at 4K, versus having just a few GB more.
Would it have been possible to mix RAM chip sizes and go for 15GB? Or does that require a more elaborate memory controller that's not worth going for, or something? Hell, even going for a 384-bit bus and making it 12GB would have been so much safer than 10, considering 10 is on the edge already.
I would have gladly paid an extra $100, or whatever that leather jacket demands, for 20GB as opposed to 10GB on a card that I'll have for who knows how many years. But no, they had to segment normal RAM quantity up in super halo territory at "if you have to ask the price you can't afford it".
Aura89
tsunami231
Are XX50 series cards now priced where XX60 used to be? Seeing as XX60 prices have gone up along with every other series above them.
Undying
tsunami231
itpro
True 360fps 720p gpu
Undying
Neo Cyrus