A GeForce RTX 3070 Ti with 16GB GDDR6 is listed at Lenovo

https://forums.guru3d.com/data/avatars/m/248/248721.jpg
If the RTX 3070 Ti 16GB GDDR6 is a real thing, then I guess a few months from now an RTX 3080 20GB GDDR6X will happen for sure, probably not before Christmas. But a 3080 20GB could hurt 3090 24GB sales by a lot, so NVidia will wait until the last possible moment to release the 20GB version of the 3080.
data/avatar/default/avatar37.webp
Wasn't there a leak out of Asia, from either Gigabyte or MSI, stating they had something like 72 different models in the 3xxx lineup? If so, we can expect many variations. And if these specs are accurate for the 3070 Ti and the price is only 100 bucks higher, I could see this card being way more popular than the 3070 or 3080, since it has the most memory of them all. There were leaks stating the 3080 was also going to have different memory variations.

The real kicker, though, is holding out for six months waiting for the 3080 Ti to drop: what price it will be, and how it's going to squeeze in between the 3090 and 3080 when their specs are already so close. If Nvidia gets the 80 Ti out for 999, I'll probably finally upgrade from my 1080 Ti.

Holy crap, are 2080 Ti prices dropping like mad on eBay. I saw a guy commenting on Reddit yesterday that he sold his 2080 Ti for 950 the day before the announcement, so whoever he sold it to will be pretty sad now. I certainly know that once the 3000 series is fully released, my 1080 Ti will be worth 250 bucks at most. Wondering how well it's still going to do with CP2077? Guess I'll see if I end up waiting for the 80 Ti instead of getting the 90, which won't sell for crap once the 80 Ti comes out...
https://forums.guru3d.com/data/avatars/m/218/218363.jpg
Ah, nVidia is back to the "waiting for a Ti-model" strategy just like with the 1080 🙂 Noice.
https://forums.guru3d.com/data/avatars/m/280/280231.jpg
CrazY_Milojko:

If the RTX 3070 Ti 16GB GDDR6 is a real thing, then I guess a few months from now an RTX 3080 20GB GDDR6X will happen for sure, probably not before Christmas. But a 3080 20GB could hurt 3090 24GB sales by a lot, so NVidia will wait until the last possible moment to release the 20GB version of the 3080.
The 3090 is a card for 8K. Anyone who buys it for a 4K TV does it only for e-peen reasons. So no, a 3080 Ti 20GB will not hurt 3090 sales at all; it's still a 4K GPU. The same goes for the 3070 Ti 16GB, which will also target 4K/1440p resolutions.
data/avatar/default/avatar15.webp
itpro:

The 3090 is a card for 8K. Anyone who buys it for a 4K TV does it only for e-peen reasons. So no, a 3080 Ti 20GB will not hurt 3090 sales at all; it's still a 4K GPU. The same goes for the 3070 Ti 16GB, which will also target 4K/1440p resolutions.
8K? It can't even do 4K/60 without DLSS in some games, or the 4K/120 that's available on 4K OLED TVs.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
MegaFalloutFan:

8K? It can't even do 4K/60 without DLSS in some games, or the 4K/120 that's available on 4K OLED TVs.
Yeah, I don't think it will do 8K (without DLSS), but to say the 3090 won't do 4K60, when a 2080 Ti nearly does, isn't accurate either.
https://forums.guru3d.com/data/avatars/m/280/280231.jpg
MegaFalloutFan:

8K? It can't even do 4K/60 without DLSS in some games, or the 4K/120 that's available on 4K OLED TVs.
Denial:

Yeah, I don't think it will do 8K (without DLSS), but to say the 3090 won't do 4K60, when a 2080 Ti nearly does, isn't accurate either.
It will be capable of playing at 8K resolution without crashes or lag. True resolution. All other cards will just easily stream 8K+ textures at 4K resolution; it isn't quite the same. But it is just early adoption, like the first RTX series without true ray-tracing power. Just as the Titan RTX first gave us decent RT performance, now we get 8K gaming on a single GPU for the first time.
https://forums.guru3d.com/data/avatars/m/282/282600.jpg
I had a thought: just like Nvidia wanting to market the 3090 as an 8K card to justify the price and memory capacity, it's also possible, given the potentially huge performance upgrades, that they are going to shift how they work the stack, in particular the Ti and Super variants. Since 4K is now a thing for many more people, memory requirements go up. Ti could become just a memory-capacity upgrade, released alongside the standard card. Then the performance-based Super would come later with the extra cores/clocks. This way they can further segment the market, add more marketing opportunities, keep mindshare longer, and counter anything AMD does.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
itpro:

It will be capable of playing at 8K resolution without crashes or lag. True resolution. All other cards will just easily stream 8K+ textures at 4K resolution; it isn't quite the same. But it is just early adoption, like the first RTX series without true ray-tracing power. Just as the Titan RTX first gave us decent RT performance, now we get 8K gaming on a single GPU for the first time.
Define 'play' and 'lag'. It's not playing Cyberpunk 2077 at 8K at any framerate over 15 fps, in which case I would argue that's not playing and is 100% lag.
https://forums.guru3d.com/data/avatars/m/282/282600.jpg
MegaFalloutFan:

8K? It can't even do 4K/60 without DLSS in some games, or the 4K/120 that's available on 4K OLED TVs.
How would you know that the 3090 can't do 4K60? It looks like the 3080 will be able to do that in all but some crazy games like Metro or Microsoft's sims.
data/avatar/default/avatar31.webp
Buying a regular 3080 is a bad choice, and the 10GB of VRAM makes it worse. You'd be a fool to buy one and not wait for a 3080 Ti if even a 3070 (Ti) has more. I got screwed with the GTX 980; I even had two in SLI when they announced the Ti. I'm never ever buying another "80"; it's the worst value of the lineup nowadays, when you know 100% that something better is coming.

The 3090, though, will have to be way above a regular 3080 to be worth the money. Why buy a $1500 card if, like with the 80, you know you're buying the inferior version? Pretty sure what people want is 4K at 100+ fps, or 60 fps at max settings; nobody cares about 8K. If the 3080 (which you shouldn't buy anyway, wait for the Ti) doesn't do it, then yes, the 3090 will make sense. Great marketing from Nvidia, but without benchmarks it's all hype.
https://forums.guru3d.com/data/avatars/m/270/270041.jpg
Denial:

Define 'play' and 'lag'. It's not playing Cyberpunk 2077 at 8K at any framerate over 15 fps, in which case I would argue that's not playing and is 100% lag.
I think you might be lucky to get 15 fps at 8K, haha! Not without DLSS anyway. Unsure why people are saying the 3090 is an 8K card? It's a 4K high-refresh-rate card, not 8K.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
Ricepudding:

I think you might be lucky to get 15 fps at 8K, haha! Not without DLSS anyway. Unsure why people are saying the 3090 is an 8K card? It's a 4K high-refresh-rate card, not 8K.
They are saying it because Jensen said it's an 8K card, but he specifically said that was with DLSS. So the output is 8K, but the render is something like 4K being upscaled.
https://www.nvidia.com/content/dam/en-zz/Solutions/geforce/news/geforce-rtx-3090-8k-hdr-gaming/watch-dogs-legion-geforce-rtx-1k-4k-8k-screenshot-resolution-comparison.jpg
https://www.nvidia.com/en-us/geforce/news/geforce-rtx-3090-8k-hdr-gaming/
https://www.nvidia.com/content/dam/en-zz/Solutions/geforce/news/geforce-rtx-3090-8k-hdr-gaming/geforce-rtx-3090-8k-gaming-performance.png
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
In Soviet Russia video cards have twice the GDDR because the Soviets had the DDR.
https://forums.guru3d.com/data/avatars/m/225/225084.jpg
Well, I'm sorry Nvidia, but there's no chance I'm going to buy a 3080 10GB now. Not going to happen, as I now know there are Ti versions or a 20GB edition coming. I really hope 3080 sales are slow, so it forces Nvidia to release other models in between the 3070/3080/3090.
https://forums.guru3d.com/data/avatars/m/270/270041.jpg
Denial:

They are saying it because Jensen said it's an 8K card, but he specifically said that was with DLSS. So the output is 8K, but the render is something like 4K being upscaled.
https://www.nvidia.com/content/dam/en-zz/Solutions/geforce/news/geforce-rtx-3090-8k-hdr-gaming/watch-dogs-legion-geforce-rtx-1k-4k-8k-screenshot-resolution-comparison.jpg
https://www.nvidia.com/en-us/geforce/news/geforce-rtx-3090-8k-hdr-gaming/
https://www.nvidia.com/content/dam/en-zz/Solutions/geforce/news/geforce-rtx-3090-8k-hdr-gaming/geforce-rtx-3090-8k-gaming-performance.png
Where's the graph from? Because beyond Control and Wolfenstein (most likely due to RTX), that is some impressive fps considering it's just raw performance without DLSS. If it's getting 60 fps at 8K, that's impressive even if it's in lighter/esports games; 4K120 should be quite easy considering how much less demanding it is. 4K to 8K is a similar jump to 1080p to 4K in terms of the pixel increase, so I'm quite impressed. Chances are Cyberpunk will run like Control, mind you: 8 fps xD
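The pixel math in that comparison checks out; a minimal sketch (plain Python, just arithmetic on the standard resolution dimensions) to verify that both jumps are the same 4x ratio:

```python
# Pixel counts for the common resolutions mentioned above (width x height).
resolutions = {
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

# 4K has 4x the pixels of 1080p, and 8K has 4x the pixels of 4K,
# so the 4K -> 8K jump is the same ratio as the 1080p -> 4K jump.
print(pixels["4K"] / pixels["1080p"])  # 4.0
print(pixels["8K"] / pixels["4K"])     # 4.0
```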
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
Ricepudding:

Where's the graph from? Because beyond Control and Wolfenstein (most likely due to RTX), that is some impressive fps considering it's just raw performance without DLSS. If it's getting 60 fps at 8K, that's impressive even if it's in lighter/esports games; 4K120 should be quite easy considering how much less demanding it is. 4K to 8K is a similar jump to 1080p to 4K in terms of the pixel increase, so I'm quite impressed. Chances are Cyberpunk will run like Control, mind you: 8 fps xD
It's from the link above the graph: https://www.nvidia.com/en-us/geforce/news/geforce-rtx-3090-8k-hdr-gaming/ And yeah, I'm actually kind of impressed with the non-DLSS 8K performance in some of those titles.
https://forums.guru3d.com/data/avatars/m/242/242134.jpg
Not sure why HH and others are talking about a Ti not (even) being a Super, when those are two different things. A Super is a chip "refinement" down the road; xxxx Ti cards are cut versions of the next bigger chip. And if a "Ti" is identical to the non-Ti version but has more VRAM, then it's just that: the same card with more VRAM, not a Ti. That said, if someone can show me where in the past 20 years there was an Nvidia Ti that only had more RAM (but the same chip/shaders/clocks), I will adjust my view.

And no card is limited to or excluded from certain resolutions. I can play at 8K on a GTX 760, as long as the game isn't demanding; I'm just stating that the resolution does not equal the GPU load or performance level. A 1060 can run NFS Rivals at 4K/30 just fine, so much for what is or isn't a "4K" card... Not everyone plays the same games, and there are lots of older games I can run at a steady 60 fps at 4K with maxed-out settings and even ultra texture mods on something like a 2070. But that doesn't mean I can't spend MY money on a 3090, just because I don't have an 8K monitor or TV.
https://forums.guru3d.com/data/avatars/m/270/270017.jpg
I don't need a super-powerful graphics card; the 2070 Super I have in here now with the 3950X Ryzen is PLENTY more than enough, possibly 2x what I need. HOWEVER, designing 3D content/assets for today's and tomorrow's games leaves a LOT to be desired when it comes to 8GB of VRAM (and that's without running the modeling software alongside the destination game engine in its editing mode). 16GB (or better) or bust. They've got to get this down to the $400~500 range, and then it'll be a good bargain and make people want to upgrade.

I wanted to get a 16GB Radeon VII back in the first week of December last year, but I just couldn't deal any more with AMD's crash-tastic Radeon drivers after the Polaris "issues". Somewhat sorry I didn't, as a normally 1~2-second model face-splitting operation on a 1.5-million-triangle model takes 1~5 MINUTES when you're out of VRAM (yes, it really, genuinely sucks; and yes, it IS that bad). Dare I say I ran out of VRAM on an 8GB card with one model at 1080p...

So while the generation-over-generation performance jump is good, 8GB~10GB cards aren't going to last long with next-gen games, when I can make a tunnel entrance and 1/8th mile of tunnel (which is one model) weighing approx 150~500MB for the model, no issue, and 200~300MB or more for the textures (and that's a range of 512-pixel to 4K square and rectangular textures, with one set of road-surface textures being 4K x 8K, as it's a driving game). But now where do I put the rest of the city when I've already used up over 500MB of my VRAM just on ONE tunnel ENTRANCE model and its textures???
For those that aren't in the 3D modeling scene: when you have a detailed model, yes, it eats a LOT of space, but then you also have to have progressively simpler LODs as the viewport is backed away from the object, and you need an accurate collision mesh (though this is much simpler than the 100%-quality detail mesh), all of which takes up more space even if the main detail level itself is only 150MB on disk. I included a render of what it looks like below (ray traced, at that, even though our game engine doesn't currently have ray tracing). The tunnel was over 500MB for just this full-detail tunnel mesh alone; removing the 3D plants and grasses in favor of randomly auto-populated in-game ones took it down to about 160MB on disk for ALL models combined (with the main model split in half due to low-VRAM issues, so double this figure to roughly 300MB, plus textures).

That's right, folks: if I don't re-use tunnel sections, one mile of this tunnel, provided all sections use identical texture sources (in a perfect world they would), would take approximately half the VRAM on an 8GB card. So 16GB is a good start, 32GB would really be nice, and 48~64GB would be blissful. I would "settle" for 16GB in a hot minute, though.

Ray tracing, of course, adds a LOT of VRAM overhead: due to the complexity of the lighting model, you need a LOT of VRAM to store all that data, plus the models, plus the textures, plus the frame buffer, etc. DX11's single-threaded draw-call issues held back gaming for a dog's age (even though, technically, DX11 can multi-thread draw calls to a limited extent, it's nowhere near as good or as easy to use as DX12's implementation). But once you're free of that draw-call limit, the model complexity afforded by a looser draw-call budget leads to a very, very quick exhaustion of VRAM.
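The back-of-the-envelope VRAM math above can be sketched like this (a rough Python sketch using hypothetical round numbers taken from the post's own estimates, not measured data):

```python
# Back-of-the-envelope VRAM budget using the post's ballpark figures.
# All sizes in MB; these are the poster's rough estimates, not measurements.

section_mb = 500          # one 1/8th-mile tunnel section, model + textures
sections_per_mile = 8     # assuming no re-use of section geometry
card_vram_mb = 8 * 1024   # an 8GB card

mile_mb = section_mb * sections_per_mile
print(mile_mb)                           # 4000
print(round(mile_mb / card_vram_mb, 2))  # 0.49 -- roughly half the card
```

Instancing (re-using one section's geometry and textures eight times) would keep only a single copy resident, which is why re-use matters so much at these asset sizes.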
So if you guys/gals out there want truly awesome, immersive game environments, let those 8GB/10GB VRAM cards sit on the shelf, especially at these prices. I am really sorry I didn't spring for a 16GB card last year, when my Radeon RX 480 8GB had such bad driver issues that it even scrambled the desktop (it isn't broken; it works fine in my old 4790K machine with Windows 7). This tunnel will be featured as part of my BeamNG Drive city map mod project, titled "Los Injurus City Map". It's on BeamNG's website under mods, but I won't link it due to it being off-topic and also forum rules.