GeForce RTX 3070 Ti Performance Leaks in Ashes of the Singularity
Leakbench (a Twitter user) has revealed the first benchmark of the GeForce RTX 3070 Ti in Ashes of the Singularity (AotS).
Tested at the High quality preset and paired with an AMD Ryzen 9 3900X, the GPU achieves 150, 102 and 90 FPS at 1080p, 1440p and 2160p, respectively. Compared to its younger sibling, the GeForce RTX 3070, the new graphics card delivers 9% more performance at 1440p. You can see the details of each test below.
The GeForce RTX 3070 Ti will be equipped with the GA104-400 graphics chip, featuring 6144 CUDA cores (256 more than the RTX 3070), 192 TMUs, 192 Tensor cores, 96 ROPs and 48 RT cores, paired with 8 GB of GDDR6X memory at 19 Gbps which, together with a 256-bit memory interface, provides a bandwidth of 608 GB/s. This new GPU is expected to offer 50% more performance than its predecessor, the GeForce RTX 2070 SUPER, as NVIDIA revealed in its presentation. Finally, it will go on sale at a suggested price of $599 next Thursday, June 10.
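As a quick sanity check on the quoted specs, peak memory bandwidth for a GDDR-style interface is simply the per-pin data rate times the bus width in bits, divided by 8 to convert bits to bytes. A minimal sketch (the function name is illustrative, not from any vendor tool):

```python
def memory_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s.

    data_rate_gbps: effective per-pin data rate (19 Gbps for the 3070 Ti's GDDR6X)
    bus_width_bits: memory interface width (256-bit here)
    """
    return data_rate_gbps * bus_width_bits / 8  # 8 bits per byte

# 19 Gbps on a 256-bit bus:
print(memory_bandwidth_gb_s(19, 256))  # 608.0 GB/s, matching the article
```

The same formula reproduces the RTX 3070's 448 GB/s (14 Gbps GDDR6 on the same 256-bit bus), which shows where most of the Ti's bandwidth gain comes from.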
Senior Member
Posts: 8305
Joined: 2008-07-31
I'm most definitely not wrong, and I was absolutely gaming back then. My first GPU was a GeForce 2 Ti.
Just go look at benchmarks from back then. They explicitly prove what I said.
To be clear: I'm not saying memory doesn't matter at all, ever. But we still have cards that are exactly the same with 2GB and 4GB models, or 4GB and 8GB models, and they still, TO THIS DAY, show very little difference between them.
We've gotten to a point where we have so much memory that even games that "use" it do not actually USE it, and the whole console thing is a bunch of BS because the consoles run the entire system on said memory and use the memory differently than a PC. It's not an apples-to-apples comparison.
Give 3 examples.
And to be clear: you must have the exact same GPU with and without 16GB of memory. Failing that, you must have the same architecture, and show that there is a difference in relative performance not based on the fact that the GPU itself with less memory is slower, and that there are no bottlenecks at lower resolutions masking the memory difference. Anything that skews the results due to OTHER factors does not count.
Good luck.
Senior Member
Posts: 18902
Joined: 2008-08-28
There are a few examples, but between 4GB and 8GB GPUs like the RX 580. Newer games, from the not-so-new Rise of the Tomb Raider to Resident Evil Village, let 8GB cards play at higher detail settings. Doom Eternal is the best example of it: a card can have enough GPU power to run Nightmare textures but not enough VRAM, like the GTX 1660 Super.
12-16GB comes next, especially now that ray tracing uses extra VRAM and resolutions are higher.
Senior Member
Posts: 13438
Joined: 2018-03-21
You're misunderstanding something about the RAM on these consoles.
It isn't dedicated to System or Video; it's a unified pool. You're getting exactly 16GB total, and some of that is actually locked away and not available to games/the user.
Senior Member
Posts: 2990
Joined: 2005-09-27
I'm not going to insult your intelligence by suggesting that you believe what you just said.
Senior Member
Posts: 927
Joined: 2003-07-02
Ridiculous. A 970 served people fine for 6-7 years, myself included: 40-60 FPS, ultra everything. It had 3.5GB.
People keep pushing this overstated narrative. Look at the evidence for yourself; it's all over YouTube.
VRAM only matters for 4K in this day and age.