GeForce RTX 2060 Could Be Based on Six Models

Click here to post a comment for GeForce RTX 2060 Could Be Based on Six models on our message forum
If true, nVidia has lost its compass...
This is just stupid. I can already see the confusion and speculation with pre-builts and laptop variants. A lot of less-familiar users will end up with the "lesser" model while having paid for the one with the faster memory.
m4dn355:

If true, nVidia has lost its compass...
Yes for us, but not for them when it comes to earning money with the product. The GTX 1060 came in many versions too, and it was (and still is) one of the most profitable products of the 10-series line.
This is stupid even by Nvidia standards...
Nice, makes it easier to get scammed.
Have they solved the problems with artifacts in the new RTX cards?
I think that in this class of GPUs, 4GB should now be the minimum, and even that is getting low for today's games.
Why don't they just name every card they have 2060, from the 2030s all the way up to the 2080s, and get it over with!?
Oh, what a surprise, NV's got 12 xx60 cards. You could never guess that.
To be honest I do not think NVIDIA would release 3 and 4 GB models initially but will keep that open as an option in a later stage to create some different SKUs in different markets.
There shouldn't be 3GB variants, period. That's just going to bottleneck a GPU core like this. 3GB is fine for a 2050 (non-Ti), but it doesn't belong in a 2060. Anyway, if the 6GB variants were marked as "2060 Ti" with slight clock boosts, I think that'd be OK. I don't really understand why Nvidia has avoided the Ti suffix on its x60 models for so many years.
The 6GB variant will probably be expensive, so no wonder Nvidia will bring us 3GB versions.
They are plotting to dominate the mid-range, as I see it: Vega 56 vs RTX 2060 6GB; RX 590 vs RTX 2060 4GB; Fury X vs RTX 2060 3GB.
warlord:

They are plotting to dominate the mid-range, as I see it: Vega 56 vs RTX 2060 6GB; RX 590 vs RTX 2060 4GB; Fury X vs RTX 2060 3GB.
I'm not 100% sure, but I think the Fury X is better than the 590.
schmidtbag:

I'm not 100% sure, but I think the Fury X is better than the 590.
Not in all games, especially when you hit the wall of 4GB VRAM depletion.
Inferior products to confuse buyers and justify the ridiculously expensive prices of the "official" series. "See, our biscuits kept the same price as last year! They're half the weight, but still worth it, it's the same marvelous brand!" All of them should have 6GB, and the price should be realistic.
I have a feeling that if this is true, GDDR5 will be for pre-builts and Asian markets only. In the West we will see only GDDR6, in 4GB and 6GB. The others don't make sense, since the price difference between each model can't be more than $5-7.
Full list of variants from Gigabyte:
Gigabyte RTX 2060 6GB GDDR6 AORUS Xtreme, GV-N2060AORUS X-6GC
Gigabyte RTX 2060 6GB GDDR5 AORUS Xtreme, GV-N2060AORUS X-6GD
Gigabyte RTX 2060 6GB GDDR6 GAMING OC, GV-N2060GAMING OC-6GD
Gigabyte RTX 2060 6GB GDDR6 WindForce 2X OC, GV-N2060WF2OC-6GD
Gigabyte RTX 2060 6GB GDDR6 WindForce 3X OC, GV-N2060WF3OC-6GD
Gigabyte RTX 2060 6GB GDDR6 OC, GV-N2060OC-6GD
Gigabyte RTX 2060 6GB GDDR6 Mini ITX OC, GV-N2060IXOC-6GD
Gigabyte RTX 2060 6GB GDDR6 AORUS, GV-N2060AORUS-6GC
Gigabyte RTX 2060 6GB GDDR5 AORUS, GV-N2060AORUS-6GD
Gigabyte RTX 2060 6GB GDDR6 GAMING, GV-N2060GAMING-6GD
Gigabyte RTX 2060 6GB GDDR6 WindForce 2X, GV-N2060WF2-6GD
Gigabyte RTX 2060 6GB GDDR5, GV-N2060D5-6GD
Gigabyte RTX 2060 6GB GDDR5 Mini ITX OC, GV-N2060IX-6GD
Gigabyte RTX 2060 4GB GDDR6 AORUS Xtreme, GV-N2060AORUS X-4GC
Gigabyte RTX 2060 4GB GDDR5 AORUS Xtreme, GV-N2060AORUS X-4GD
Gigabyte RTX 2060 4GB GDDR5 GAMING OC, GV-N2060GAMING OC-4GD
Gigabyte RTX 2060 4GB GDDR5 WindForce 2X OC, GV-N2060WF2OC-4GD
Gigabyte RTX 2060 4GB GDDR5 WindForce 3X OC, GV-N2060WF3OC-4GD
Gigabyte RTX 2060 4GB GDDR5 OC, GV-N2060OC-4GD
Gigabyte RTX 2060 4GB GDDR5 Mini ITX OC, GV-N2060IXOC-4GD
Gigabyte RTX 2060 4GB GDDR6 AORUS, GV-N2060AORUS-4GC
Gigabyte RTX 2060 4GB GDDR5 AORUS, GV-N2060AORUS-4GD
Gigabyte RTX 2060 4GB GDDR5 GAMING, GV-N2060GAMING-4GD
Gigabyte RTX 2060 4GB GDDR5 WindForce 2X, GV-N2060WF2-4GD
Gigabyte RTX 2060 4GB GDDR5, GV-N2060D5-4GD
Gigabyte RTX 2060 4GB GDDR5 Mini ITX, GV-N2060IX-4GD
Gigabyte RTX 2060 3GB GDDR6 AORUS Xtreme, GV-N2060AORUS X-3GC
Gigabyte RTX 2060 3GB GDDR5 AORUS Xtreme, GV-N2060AORUS X-3GD
Gigabyte RTX 2060 3GB GDDR5 GAMING OC, GV-N2060GAMING OC-3GD
Gigabyte RTX 2060 3GB GDDR5 WindForce 2X OC, GV-N2060WF2OC-3GD
Gigabyte RTX 2060 3GB GDDR5 WindForce 3X OC, GV-N2060WF3OC-3GD
Gigabyte RTX 2060 3GB GDDR5 GAMING OC, GV-N2060OC-3GD
Gigabyte RTX 2060 3GB GDDR5 Mini ITX OC, GV-N2060IXOC-3GD
Gigabyte RTX 2060 3GB GDDR6 AORUS, GV-N2060AORUS-3GC
Gigabyte RTX 2060 3GB GDDR5 AORUS, GV-N2060AORUS-3GD
Gigabyte RTX 2060 3GB GDDR5 GAMING, GV-N2060GAMING-3GD
Gigabyte RTX 2060 3GB GDDR5 WindForce 2X, GV-N2060WF2-3GD
Gigabyte RTX 2060 3GB GDDR5, GV-N2060D5-3GD
Gigabyte RTX 2060 3GB GDDR5 Mini ITX, GV-N2060IX-3GD
This is insane. Somehow it reminds me of this weirdo from XFX: https://videocardz.net/xfx-geforce-9600-gso-1536mb-far-cry-2-edition/ That 9600 was as pointless as over 30 of these variants from Gigabyte.
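The one thing the leaked model codes do encode consistently is the VRAM capacity in the suffix (-6G, -4G, -3G). As a minimal sketch, assuming that suffix format holds (the trailing C/D letter is not a reliable GDDR6/GDDR5 marker in this list, so it is ignored here), the capacity can be pulled out like this:

```python
import re

# A handful of model codes taken from the leaked Gigabyte list above.
codes = [
    "GV-N2060AORUS X-6GC",
    "GV-N2060GAMING OC-6GD",
    "GV-N2060WF2OC-4GD",
    "GV-N2060D5-3GD",
]

def capacity_gb(code: str) -> int:
    """Extract the VRAM capacity (in GB) from a Gigabyte-style model code.

    Assumes the code ends in "-<digits>G<C or D>", as in the leaked list.
    """
    m = re.search(r"-(\d+)G[CD]$", code)
    if not m:
        raise ValueError(f"unrecognised model code: {code}")
    return int(m.group(1))

for c in codes:
    print(c, "->", capacity_gb(c), "GB")  # e.g. "GV-N2060D5-3GD -> 3 GB"
```

Grouping the full 39-entry list by this suffix is what separates it into the 6GB, 4GB, and 3GB families the thread is arguing about.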
Ultimate "Rebrandeon" anyone?
Right, this is getting crazier.
BlackZero:

How come the Fury X does so well at higher resolutions then?
Well, the thing is that killing all cards with 4GB VRAM is easy as long as you forbid games from using a lower texture LOD. But at 1080p, there is a point beyond which higher texture resolution makes no visible difference, and that point is reached before 4GB cards start to suffer. Therefore 4GB cards will continue to do just fine. What I would like to see is actually improved IQ through 32x AF. (This bar should have been raised a long time ago.) Instead, higher texture IQ is achieved via increased rendering resolution, which is much more wasteful of computational resources. (That's one of the reasons people actually use downsampling.)