NVIDIA GeForce RTX 4070 Launch Date Set for April 13th: Leaker Confirms

https://forums.guru3d.com/data/avatars/m/216/216349.jpg
So, 700 dollars for this super ultra GPU?...
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
H83:

So, 700 dollars for this super ultra GPU?...
If it were 700 dollars in the USA, it would be over 860 euros over here. The 4070 Ti can be bought for a little under 1000 euros at the very cheapest. It would still be a disgusting price, but I hope the 4070 starts at 700-750 euros at most. However, I hear leather jackets are awfully expensive these days, so I don't expect Jensen to be able to set such a "low" price.
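The dollars-to-euros jump above is mostly exchange rate plus VAT. A minimal sketch, assuming a euro near dollar parity and Finland's 24% VAT rate (both illustrative assumptions, not figures from the thread):

```python
def eu_street_price(usd_msrp, eur_per_usd=1.0, vat=0.24):
    """Rough EU shelf price: convert the pre-tax US MSRP, then add VAT."""
    return usd_msrp * eur_per_usd * (1 + vat)

# A $700 MSRP at near-parity with 24% VAT lands around 868 euros,
# before any retailer margin - in line with the "over 860 euros" above.
print(round(eu_street_price(700), 2))  # 868.0
```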
https://forums.guru3d.com/data/avatars/m/191/191533.jpg
Kaarme:

If it were 700 dollars in the USA, it would be over 860 euros over here. The 4070 Ti can be bought for a little under 1000 euros at the very cheapest. It would still be a disgusting price, but I hope the 4070 starts at 700-750 euros at most. However, I hear leather jackets are awfully expensive these days, so I don't expect Jensen to be able to set such a "low" price.
Instead of blaming Nvidia, put the blame where it belongs ... your country's VAT. Obviously Europeans love their taxes.
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
Why_Me:

Instead of blaming Nvidia, put the blame where it belongs ... your country's VAT. Obviously Europeans love their taxes.
So, countries should lower their taxes to allow multi-billion corporations to make more profit? Any other bright ideas? The 3070 Ti launch price (without tax, since it's the US price) was $599 (though it was during the mining craze). The 4070 Ti launch price (without tax, since it's the US price) was $799. That's 33% more added suddenly on top of the 3070 Ti's price. But then again, it's not like I wouldn't be used to 33% inflation here in Europe by now, thanks to the effects of the war. Nah, actually, I'm not used to it at all. I have never had a leather jacket personally, but based on Nvidia's pricing policy, leather jacket prices must have also gone up by 33%.
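The 33% figure above is simple arithmetic on the two MSRPs quoted in the post:

```python
def percent_increase(old, new):
    """Percentage increase going from an old price to a new price."""
    return (new - old) / old * 100

# $599 (3070 Ti MSRP) -> $799 (4070 Ti MSRP)
print(round(percent_increase(599, 799), 1))  # 33.4
```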
https://forums.guru3d.com/data/avatars/m/191/191533.jpg
Kaarme:

So, countries should lower their taxes to allow multi-billion corporations to make more profit? Any other bright ideas? The 3070 Ti launch price (without tax, since it's the US price) was $599 (though it was during the mining craze). The 4070 Ti launch price (without tax, since it's the US price) was $799. That's 33% more added suddenly on top of the 3070 Ti's price. But then again, it's not like I wouldn't be used to 33% inflation here in Europe by now, thanks to the effects of the war. Nah, actually, I'm not used to it at all. I have never had a leather jacket personally, but based on Nvidia's pricing policy, leather jacket prices must have also gone up by 33%.
Why lower taxes when Europeans can pay more while complaining about the cost of GPUs? https://pcpartpicker.com/product/kVqPxr/asus-tuf-gaming-geforce-rtx-4070-ti-12-gb-video-card-tuf-rtx4070ti-12g-gaming Asus TUF GAMING RTX 4070 Ti 12GB Video Card $799.99
https://forums.guru3d.com/data/avatars/m/191/191533.jpg
Kaarme:

Sorry, I have no idea why you posted those graphs.
Would you pay $100 more for four more GB of VRAM, 53 more FPS, DLSS 3 and lower power consumption? I know plenty of gamers who would and are. https://pcpartpicker.com/search/?q=rtx+3070+ti RTX 3070 Ti 8GB Video Card
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
Why_Me:

Would you pay $100 more for four more GB of VRAM, 53 more FPS, DLSS 3 and lower power consumption? I know plenty of gamers who would and are. https://pcpartpicker.com/search/?q=rtx+3070+ti RTX 3070 Ti 8GB Video Card
Every new gen is naturally supposed to be stronger than the previous one. Why would anyone otherwise upgrade before the old card breaks, which might well not happen even in a decade? However, the prices aren't supposed to recklessly increase all the time. Otherwise a decent graphics card would already cost $10,000. The additional VRAM probably costs Nvidia and the AIB partners nothing extra, since DRAM has been in overproduction compared to current demand, post-crypto craze and post-Covid (when people are again spending money on things other than home electronics or remote working equipment). The DRAM manufacturers will most likely cut production to drive the prices up, unless they are already doing it. However, Nvidia and the AIBs would have already placed their bulk orders during the lower price period. GDDR6X has also matured as a technology since the initial 3000 series days. Not to mention, the 4070 Ti only has a 192-bit bus and 6 memory chips. The 3070 Ti had a 256-bit bus with 8 memory chips.
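The chip counts above follow directly from the bus widths: each GDDR6X package exposes a 32-bit interface, so the bus width fixes the number of chips, and chip count times per-chip density gives the VRAM total. A small sketch (the 2 GB and 1 GB densities are the chips these two cards actually ship with):

```python
def memory_config(bus_bits, gb_per_chip=2, chip_bits=32):
    """Chip count and total VRAM implied by a memory bus width.

    Each GDDR6X package has a 32-bit interface, so the bus width
    determines how many chips the board must carry.
    """
    chips = bus_bits // chip_bits
    return chips, chips * gb_per_chip

print(memory_config(192))                 # 4070 Ti: (6, 12) -> 6 chips, 12 GB
print(memory_config(256, gb_per_chip=1))  # 3070 Ti: (8, 8)  -> 8 chips, 8 GB
```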
https://forums.guru3d.com/data/avatars/m/191/191533.jpg
Kaarme:

Every new gen is naturally supposed to be stronger than the previous one. Why would anyone otherwise upgrade before the old card breaks, which might well not happen even in a decade? However, the prices aren't supposed to recklessly increase all the time. Otherwise a decent graphics card would already cost $10,000. The additional VRAM probably costs Nvidia and the AIB partners nothing extra, since DRAM has been in overproduction compared to current demand, post-crypto craze and post-Covid (when people are again spending money on things other than home electronics or remote working equipment). The DRAM manufacturers will most likely cut production to drive the prices up, unless they are already doing it. However, Nvidia and the AIBs would have already placed their bulk orders during the lower price period. GDDR6X has also matured as a technology since the initial 3000 series days. Not to mention, the 4070 Ti only has a 192-bit bus and 6 memory chips. The 3070 Ti had a 256-bit bus with 8 memory chips.
Who cares that the 3070 Ti has a 256-bit memory bus ... it gets stomped by the 4070 Ti.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
Why_Me:

Who cares that the 3070 Ti has a 256-bit memory bus ... it gets stomped by the 4070 Ti.
But it loses to the 3090 and 3090 Ti, which it was supposed to beat. At 4K it's severely bandwidth limited, and it will soon be memory limited even at 1440p. I would not buy such a card for 900 EUR.
https://forums.guru3d.com/data/avatars/m/239/239175.jpg
Undying:

But it loses to the 3090 and 3090 Ti, which it was supposed to beat. At 4K it's severely bandwidth limited, and it will soon be memory limited even at 1440p. I would not buy such a card for 900 EUR.
A friend of mine was trying to run an old game (Colin McRae Rally 2005) at 4K with SGSSAA on his 1660. It was stuttering. It runs perfectly on my 980 Ti. Is this the result of 192-bit vs 384-bit memory?
https://forums.guru3d.com/data/avatars/m/278/278626.jpg
Nvidia may need to be careful with the pricing on these. With all the recent driver improvements, the Intel Arc offerings cannot be ignored.
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
RealNC:

A friend of mine was trying to run an old game (Colin McRae Rally 2005) at 4K with SGSSAA on his 1660. It was stuttering. It runs perfectly on my 980 Ti. Is this the result of 192-bit vs 384-bit memory?
Memory bus width is just a physical detail. The bandwidth the card can achieve in practice is more important. Better memory controllers and more advanced memory can produce higher bandwidth over a narrower bus than a wider bus on an older card with old memory and more primitive controllers. Nevertheless, Nvidia and AMD are more concerned about the cost of the components than the performance, so they love to make a narrower bus and use a lower number of memory chips (with a higher capacity per chip if needed). A large cache in the GPU can help with a narrow bus, especially at lower resolutions. All that being said, the 3070 Ti and 4070 Ti both use GDDR6X. The 4070 Ti uses slightly faster memory chips, but they can't fully compensate for the much narrower memory bus. The 3070 Ti actually achieves a higher bandwidth than the 4070 Ti. To compensate, the 4070 Ti has a huge cache in the GPU compared to the 3070 Ti.
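The bandwidth claim above checks out with the usual formula: bus width (bits) times per-pin data rate (Gbps), divided by 8 to get GB/s. The 19 and 21 Gbps rates below are the commonly listed launch specs for these two cards:

```python
def bandwidth_gbs(bus_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s: bus width * per-pin data rate / 8."""
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gbs(256, 19))  # 3070 Ti, 19 Gbps GDDR6X -> 608.0 GB/s
print(bandwidth_gbs(192, 21))  # 4070 Ti, 21 Gbps GDDR6X -> 504.0 GB/s
```

So despite the faster chips, the narrower bus leaves the 4070 Ti with less raw bandwidth, which is what its large L2 cache has to make up for.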
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
Kaarme:

Memory bus width is just a physical detail. The bandwidth the card can achieve in practice is more important. Better memory controllers and more advanced memory can produce higher bandwidth over a narrower bus than a wider bus on an older card with old memory and more primitive controllers. Nevertheless, Nvidia and AMD are more concerned about the cost of the components than the performance, so they love to make a narrower bus and use a lower number of memory chips (with a higher capacity per chip if needed). A large cache in the GPU can help with a narrow bus, especially at lower resolutions. All that being said, the 3070 Ti and 4070 Ti both use GDDR6X. The 4070 Ti uses slightly faster memory chips, but they can't fully compensate for the much narrower memory bus. The 3070 Ti actually achieves a higher bandwidth than the 4070 Ti. To compensate, the 4070 Ti has a huge cache in the GPU compared to the 3070 Ti.
The 3070 Ti has a huge disadvantage in memory capacity. 8 GB is on the low side at the moment, and a wider memory bus cannot compensate for that. You saw in the latest games that the bus does not help if you are memory limited.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
V1CT1MISED:

Since getting a 7900 XTX I have realised how low 8 GB really is. Even 12 GB isn't going to be enough for 4K. I saw highs of 17 GB VRAM usage in the RE4 demo at max settings with RT.
12 GB isn't enough even for 1440p in a number of games. Most recently the RE4 remake you mentioned, especially if you use ray tracing.
https://forums.guru3d.com/data/avatars/m/56/56686.jpg
I look forward to the laugh-to-the-bank price they give it, and I'm sorry, but three memory schemes? Talk about blurring the line on what an xx70 is. They seem to want to do that to everything now.
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
Undying:

The 3070 Ti has a huge disadvantage in memory capacity. 8 GB is on the low side at the moment, and a wider memory bus cannot compensate for that. You saw in the latest games that the bus does not help if you are memory limited.
Yeah, over the last couple of years I've abandoned the idea of getting an 8GB card. 12GB probably would still be enough for me, seeing how I'm rocking a trusty old 1080p screen.
https://forums.guru3d.com/data/avatars/m/56/56686.jpg
I think 12 GB is the bare minimum they should have now. DX:MD easily used 8 GB if you jacked up the AA with everything maxed, and how old is that game? I'm sure 8 GB is easily filled now, at 1440p anyway. I rarely run 1080p on my 1080p monitor these days, except when for some reason I can't get a game to run at 1440p with DSR. Nonetheless, having multiple SKUs of the same named card with different amounts of RAM and/or different bus widths (128-bit, 192-bit, etc.) is stupid; it's a great way to confuse consumers. An xx60 should have one RAM amount and one bus width. The initial 3060 Ti had 8 GB with around 450 GB/s of bandwidth, and at the end of last year that silently changed: the RAM changed and the bandwidth became around 610 GB/s, both configurations being 256-bit. Then the 3060 had two different RAM configs. I get that they probably have too many chips that meet the standard, but stop using the same name for cards with different RAM/bus configs or different amounts of CUDA cores/TMUs/ROPs/etc. If any of that changes, it's no longer the same card. You know, like when they actually made xx40/xx30 cards.
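The "450ish" and "610ish" figures above line up with the standard bandwidth formula, assuming the original 3060 Ti shipped with 14 Gbps GDDR6 and the silent refresh moved to 19 Gbps GDDR6X on the same 256-bit bus:

```python
# Peak bandwidth (GB/s) = bus width (bits) * per-pin rate (Gbps) / 8
original = 256 * 14 / 8  # 14 Gbps GDDR6  -> 448.0 GB/s ("450ish")
refresh = 256 * 19 / 8   # 19 Gbps GDDR6X -> 608.0 GB/s ("610ish")
print(original, refresh)  # 448.0 608.0
```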
https://forums.guru3d.com/data/avatars/m/242/242443.jpg
Undying:

But it loses to the 3090 and 3090 Ti, which it was supposed to beat. At 4K it's severely bandwidth limited, and it will soon be memory limited even at 1440p. I would not buy such a card for 900 EUR.
I know, stupid, right?