Gigabyte RTX 4070 may come with a number of different RAM configurations to choose from

https://forums.guru3d.com/data/avatars/m/198/198862.jpg
I'm getting that 16GB 4070 if true and the price is below the 4070 Ti.
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
It wouldn't surprise me if Nvidia had planned a couple of different 4070s, just like there were supposed to be two 4080s. If Jensen is a slow learner, he might still try again with a couple of different versions of the 4070. A 160-bit bus would allow 10GB of VRAM with the same 2GB chips. It would be totally pointless, though.
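For reference, the arithmetic behind that claim: each GDDR6/GDDR6X chip sits on a 32-bit slice of the memory bus, so the bus width fixes the chip count and the chip density fixes the total VRAM. A minimal sketch (the card labels in the comments are just the configurations discussed in this thread):

```python
# Bus width -> chip count -> VRAM capacity for GDDR6/6X, where each
# chip occupies one 32-bit channel. Illustrative sketch, not a spec sheet.

BITS_PER_CHIP = 32  # one GDDR6/6X chip per 32-bit channel

def vram_gb(bus_width_bits: int, chip_density_gb: int = 2) -> int:
    """Total VRAM for a given bus width and per-chip density in GB."""
    chips = bus_width_bits // BITS_PER_CHIP
    return chips * chip_density_gb

for bus in (128, 160, 192, 256):
    print(f"{bus}-bit bus -> {bus // BITS_PER_CHIP} chips -> {vram_gb(bus)} GB")
# 128-bit -> 4 chips ->  8 GB   (RTX 3060 8GB)
# 160-bit -> 5 chips -> 10 GB   (the hypothetical 10GB 4070 above)
# 192-bit -> 6 chips -> 12 GB   (RTX 4070 Ti)
# 256-bit -> 8 chips -> 16 GB   (the rumoured 16GB variant)
```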
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
Kaarme:

It wouldn't surprise me if Nvidia had planned a couple of different 4070s, just like there were supposed to be two 4080s. If Jensen is a slow learner, he might still try again with a couple of different versions of the 4070. A 160-bit bus would allow 10GB of VRAM with the same 2GB chips. It would be totally pointless, though.
160-bit/10GB - $499, 192-bit/12GB - $599, 256-bit/16GB - $699. Optimistic, but it could be true. That leaves space for a $330 4060 and a $399 4060 Ti.
https://forums.guru3d.com/data/avatars/m/274/274425.jpg
Maybe everyone should just pre-order and hope for the best? Sometimes, you get lucky...
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
Undying:

160-bit/10GB - $499, 192-bit/12GB - $599, 256-bit/16GB - $699. Optimistic, but it could be true. That leaves space for a $330 4060 and a $399 4060 Ti.
Considering that what was originally supposed to be the 4080 12GB, and later became the 4070 Ti, only has a 192-bit bus with 12GB of memory, I find it extremely hard to believe Nvidia somehow made a 4070 16GB with a 256-bit bus.
https://forums.guru3d.com/data/avatars/m/227/227994.jpg
Undying:

I'm getting that 16GB 4070 if true and the price is below the 4070 Ti.
More memory most likely means a higher price, since GPU memory has never been cheap. If the 16GB model is gonna be cheaper, it's probably gonna be 100 bucks below the Ti.
https://forums.guru3d.com/data/avatars/m/268/268248.jpg
Releasing multiple SKUs with different amounts of CUDA cores and/or memory and bus width has traditionally been the xx60 SKU strategy: GTX 1060 3/5/6GB... 2060 6/12GB... 3060 8/12GB... so they've moved that to the "4070" now?
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
Arguably, it's a bad strategy with every SKU. In reality, it makes the classification as "4060" and "4070" meaningless if you change anything at all about the GPU itself, like the enabled shader count.
data/avatar/default/avatar01.webp
Kaarme:

Considering that what was originally supposed to be the 4080 12GB, and later became the 4070 Ti, only has a 192-bit bus with 12GB of memory, I find it extremely hard to believe Nvidia somehow made a 4070 16GB with a 256-bit bus.
Well, there were the RTX 3060 8GB and 12GB models with 128-bit and 192-bit buses respectively, so it wouldn't be the first multi-SKU adventure from NVIDIA.
data/avatar/default/avatar34.webp
If the choice is between a single model and two different ones offering different performance, I'd choose the former. It would be okay to have even five models with the only difference being the VRAM amount, but when they give cards of unequal performance the same name, they are simply trying to fool customers, and that is not acceptable. How could anyone know for sure, without doing research, that the model with more memory is faster? I believe there will be one model with largely the same specs as the RTX 4070 Ti, just with fewer units and cores, but if they do otherwise, I wish they would state in the name that the cards are not equal in performance. The RTX 4070 I believe will actually materialize will likely cost 600 €, but at 500 €, perhaps one day, it would be a somewhat appealing product, with 12 GB of VRAM of course. EDIT: quick correction
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
GamerNerves:

The RTX 4070 I believe will actually materialize will likely cost 600 €, but at 500 €, perhaps one day, it would be a somewhat appealing product, with 12 GB of VRAM of course.
Aren't you from Finland as well? The 4070 will cost 800 euros over here, if not more. Assuming this rumour of multiple models turns out to be true, and thus there's going to be a lesser model of the 4070, it might cost only 700-750 euros. If we are lucky.
data/avatar/default/avatar03.webp
Kaarme:

Aren't you from Finland as well? The 4070 will cost 800 euros over here, if not more. Assuming this rumour of multiple models turns out to be true, and thus there's going to be a lesser model of the 4070, it might cost only 700-750 euros. If we are lucky.
Yes I am, and I agree that 700 € is a very possible price, but I was thinking about the wider European market in my post. The cheapest RTX 4070 Ti in Finland is now 940 €, which is atrocious! So indeed, you are closer to the truth in this case, because it sadly seems that Nvidia and AMD won't be pressured to push prices down before summer, since they are likely waiting for the release of the lesser models before making such a move.

Nvidia, I suppose, already has its midrange models awaiting launch, but it is playing the waiting game because it still has RTX 3000 series stock and seems to believe that milking the higher-end cards in the meanwhile is a viable tactic. I think it is not, and it will result in weak revenue. This year it is very possible that Nvidia releases its midrange models earlier than AMD, as we saw that AMD didn't even have its drivers in a respectable state when releasing the RX 7900 series, and it might lack satisfactory manufacturing capacity due to the Covid-related interruptions and the current global economic turmoil.

I bet AMD is working hard to make its midrange models available as fast as possible and would like to market them instead of the RX 7900 series, which has sold okay, I guess, judging by retail stocks, though to my understanding those cards have not been manufactured in particularly high numbers. AMD needs the market share, and more money for rapid investments, to avoid losing to Intel once Intel really gets its game together on both the CPU and GPU fronts, perhaps within the next few years, even before the AM6 socket. Nvidia still has a significant lead on AMD, especially after purchasing more manufacturing capacity in the summer, and it seems to have capable designs planned quite far into the future, though innovation is the key that can change the tide any day.
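As a rough sanity check on those euro figures: European street prices stack VAT (24% in Finland at the time) and a retailer margin on top of the USD MSRP, which is quoted before tax. A sketch with an assumed exchange rate and margin (both are illustrative, not sourced):

```python
# Rough conversion from a US MSRP (quoted ex-tax) to an expected Finnish
# street price. Exchange rate and retailer margin are assumptions for
# illustration; Finnish VAT was 24% at the time of this thread.

USD_TO_EUR = 0.93   # assumed early-2023 exchange rate
VAT_FI = 0.24       # Finnish VAT
MARGIN = 0.05       # assumed retailer/AIB premium

def street_price_eur(msrp_usd: float) -> float:
    return msrp_usd * USD_TO_EUR * (1 + MARGIN) * (1 + VAT_FI)

for msrp in (599, 699, 799):
    print(f"${msrp} MSRP -> ~{street_price_eur(msrp):.0f} EUR")
# $599 -> ~725 EUR, $699 -> ~846 EUR, $799 -> ~967 EUR
```

By this rough math, the $799 4070 Ti landing around 940 € in Finland is about what you'd expect, and the 700-800 € guesses above line up with a $599-699 MSRP.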
https://forums.guru3d.com/data/avatars/m/246/246088.jpg
It can be confusing enough as it is with just a single SKU per model, certainly when you add in multiple variations of the same manufacturer's model with more or fewer 'X's in the name. For years it's been possible to buy (purely as an example) a top-end, fancy-named **70 at the same price as a faster vanilla **70 Ti; now add differing bus widths and VRAM capacities that may all perform differently in various games depending on hardware utilisation.
https://forums.guru3d.com/data/avatars/m/266/266726.jpg
Worth mentioning that Nvidia can mix memory densities; they've done it in the past.
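True: the GTX 550 Ti and GTX 660, for instance, reached 1GB/2GB totals on a 192-bit bus by putting higher-density chips on some channels, with the catch that the "extra" memory sits on fewer channels and is slower to reach. A hedged sketch of the accounting (the exact chip layout is from memory and may be off):

```python
# Mixed-density memory configs: total capacity is just the sum of the
# per-channel chip densities, so totals a uniform layout can't reach
# become possible. Densities below are illustrative.

def total_vram_gb(chip_densities_gb: list[float]) -> float:
    """Each entry is one 32-bit channel's chip density in GB."""
    return sum(chip_densities_gb)

uniform_192bit = [0.25] * 6            # 6 x 2Gb chips -> 1.5 GB, the 'natural' size
mixed_192bit = [0.25] * 4 + [0.5] * 2  # 4 x 2Gb + 2 x 4Gb -> 2.0 GB (GTX 660-style)

print(total_vram_gb(uniform_192bit), total_vram_gb(mixed_192bit))  # 1.5 2.0
```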
https://forums.guru3d.com/data/avatars/m/284/284177.jpg
Celcius:

Maybe everyone should just pre-order and hope for the best? Sometimes, you get lucky...
Agreed, people sometimes forget... a 3060 Ti could/can match the RTX 2080 Super... if an RTX 4060 Ti performs like an RTX 3090 at 150-200 watts, it's a slam dunk! Good Lawd, the energy savings.
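The energy-saving arithmetic is easy to sanity-check. Assuming an RTX 3090 draws roughly 350W under load versus a hypothetical 150-200W 4060 Ti, and some made-up usage and tariff numbers:

```python
# Back-of-the-envelope yearly energy cost if a ~150-200W card matched a
# ~350W RTX 3090. Usage hours and electricity price are assumptions.

HOURS_PER_DAY = 3    # assumed daily gaming time
EUR_PER_KWH = 0.30   # assumed electricity tariff

def yearly_cost_eur(watts: float) -> float:
    kwh = watts / 1000 * HOURS_PER_DAY * 365
    return kwh * EUR_PER_KWH

for watts in (350, 200, 150):
    print(f"{watts}W -> ~{yearly_cost_eur(watts):.0f} EUR/year")
# 350W -> ~115, 200W -> ~66, 150W -> ~49: roughly 50-65 EUR/year saved
```

Real savings obviously depend on actual play time and local electricity prices.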
https://forums.guru3d.com/data/avatars/m/145/145154.jpg
More and more, I suspect my next GPU is going to be an Intel. At least they're still (apparently) trying to attract new GPU customers, not simply take them for granted with silly naming schemes (on GPUs with pre-scalping built into the MSRP). Pushing every tier into more and more "elite" pricing is why I use an AMD card now after 15 years of exclusively nvidia cards. It's simply not worth paying the nvidia tax. To be fair, AMD pricing has also gotten worse since my 5700XT purchase, but nvidia's epic greed seems to know no bounds.
https://forums.guru3d.com/data/avatars/m/246/246088.jpg
0blivious:

More and more, I suspect my next GPU is going to be an Intel. At least they're still (apparently) trying to attract new GPU customers, not simply take them for granted with silly naming schemes (on GPUs with pre-scalping built into the MSRP). Pushing every tier into more and more "elite" pricing is why I use an AMD card now after 15 years of exclusively nvidia cards. It's simply not worth paying the nvidia tax. To be fair, AMD pricing has also gotten worse since my 5700XT purchase, but nvidia's epic greed seems to know no bounds.
Everybody has a different reason to buy what they buy, a purchase analysis if you will. Some may base a decision on finance, others on morality or even availability.
data/avatar/default/avatar08.webp
0blivious:

More and more, I suspect my next GPU is going to be an Intel. At least they're still (apparently) trying to attract new GPU customers, not simply take them for granted with silly naming schemes (on GPUs with pre-scalping built into the MSRP). Pushing every tier into more and more "elite" pricing is why I use an AMD card now after 15 years of exclusively nvidia cards. It's simply not worth paying the nvidia tax. To be fair, AMD pricing has also gotten worse since my 5700XT purchase, but nvidia's epic greed seems to know no bounds.
You do realize that Intel at the moment is trying to buy market share by making a loss on their GPUs. It also feels like they are trying to get rid of inventory at the moment. If the rumors are correct, they're about to exit the consumer GPU market and concentrate only on the professional and data center markets. I checked the biggest online retailer here in The Netherlands and they only have a single Intel Arc SKU available, an 8GB Arc A750 with VGA, DVI and HDMI connectors. Not what I would call an exciting GPU to have (especially with only 8GB, which is not really that future proof). But it is cheap at 275 euros, I'll give Intel that.
https://forums.guru3d.com/data/avatars/m/246/246088.jpg
Crazy Joe:

You do realize that Intel at the moment is trying to buy market share by making a loss on their GPUs. It also feels like they are trying to get rid of inventory at the moment. If the rumors are correct, they're about to exit the consumer GPU market and concentrate only on the professional and data center markets. I checked the biggest online retailer here in The Netherlands and they only have a single Intel Arc SKU available, an 8GB Arc A750 with VGA, DVI and HDMI connectors. Not what I would call an exciting GPU to have (especially with only 8GB, which is not really that future proof). But it is cheap at 275 euros, I'll give Intel that.
I would imagine that the midrange market is where the profit really is; Intel can compete there.
data/avatar/default/avatar16.webp
pegasus1:

I would imagine that the midrange market is where the profit really is; Intel can compete there.
Sure, but the operative word there is profit. If you sell your product at a loss, then having a large market share only increases your losses. You can try to buy market share by selling a product at a loss, but at some point you need to recoup those losses and make a profit. If Intel feels the investment can't be recouped by selling GPUs to consumers at attractive prices, because they would either still make a loss or have a razor-thin profit margin that makes the time to recoup the investment too long and uncertain, they will stop producing those GPUs.

And let's face it: the Arc series is not a sales hit. Whether that is because of the awful state of the drivers at launch, which left everyone with a sour taste, or the level of performance that doesn't even beat last generation's low- to mid-range cards, is up for debate. But the truth of the matter is that not enough people are willing to buy these cards. Maybe Intel will see how Battlemage turns out in the consumer space before making a decision about the future of its consumer GPUs, but they have already restructured their GPU division in such a way that they can easily step away without hurting the data center and professional GPU side of the business.
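The "buying market share at a loss" logic can be made concrete with a toy model; every number below is hypothetical, and the point is only that losses scale with volume and must be recouped by future per-unit margin:

```python
# Toy break-even model for buying market share at a loss. All figures
# are hypothetical illustrations, not Intel's actual economics.

loss_per_unit = 40       # EUR lost per card sold today (hypothetical)
units_sold = 1_000_000   # cards sold at a loss (hypothetical)
future_margin = 25       # EUR profit per future card (hypothetical)

sunk = loss_per_unit * units_sold
breakeven_units = sunk / future_margin
print(f"Sunk: {sunk / 1e6:.0f}M EUR; need {breakeven_units / 1e6:.1f}M future cards to break even")
# Sunk: 40M EUR; need 1.6M future cards to break even
```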