ASUS GeForce RTX 4070 DUAL review


Maybe I'll consider this specific card model. While the 4070 won't amaze anyone, considering I'm a 1080p gamer, it would actually be relatively better for me than for someone with a fancy 1440p screen. No 4K gamer should get a 4070 anyway, unless they only play specific games where they know it will perform well.
Could you post the height of the card, in actual mm or inches? I didn't see that. Trying to see if this will fit in my ITX machine.
not bad, better than palit dual. sells at msrp here, available since yesterday. really thinking about it tbh, 190w power, dlss 2/3 support, and in games with multiple rt effects it rips my 6800/3060 ti a new one. even matches 7900xt.
I'll think about it next week, gonna see how better aib versions are priced.
cucaulay malkin:

not bad, better than palit dual. sells at msrp here, available since yesterday.
The thing about this card I don't quite understand is the idle power consumption. It's several watts higher than the FE's, 18W instead of 12W. In fact 18W is the same as the 4080 FE's idle consumption. Where exactly are those watts disappearing? The FE's 12W is really nice. Below it are only AMD's bottom mainstream or entry level cards. Asus 4070 TUF sports 16W idle. MSI Ventus X3 has 15W... I don't quite get it. TUF ought to have heavy duty power delivery compared to the Dual, yet the idle is lower.
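To put the 12 W vs 18 W idle gap in perspective, a quick back-of-the-envelope estimate; the daily hours and electricity price below are made-up assumptions, not figures from the review:

```python
# Back-of-the-envelope: annual cost of the idle-power gap between
# the FE (12 W) and the Dual (18 W). Hours per day and price per
# kWh are assumed for illustration, not taken from the review.
HOURS_PER_DAY = 8      # assumed non-gaming desktop use
PRICE_PER_KWH = 0.30   # assumed electricity price, EUR

def annual_cost(watts, hours_per_day=HOURS_PER_DAY, price=PRICE_PER_KWH):
    # watts -> kW, times hours per year, times price per kWh
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price

gap = annual_cost(18) - annual_cost(12)
print(f"Idle gap costs about {gap:.2f} EUR per year")
```

So with these assumptions the gap is on the order of a few euros per year; the annoyance is more the unexplained inefficiency than the cost.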
Kaarme:

The thing about this card I don't quite understand is the idle power consumption. It's several watts higher than the FE's, 18W instead of 12W. In fact 18W is the same as the 4080 FE's idle consumption. Where exactly are those watts disappearing? The FE's 12W is really nice. Below it are only AMD's bottom mainstream or entry level cards. Asus 4070 TUF sports 16W idle. MSI Ventus X3 has 15W... I don't quite get it. TUF ought to have heavy duty power delivery compared to the Dual, yet the idle is lower.
Silicon quality difference, might be VRAM too
already thought about this, and no way am I buying the 4070. The card itself is a nice 4060Ti successor (not a 4070!), but it's like paying a crypto retail price as regular msrp. I advise not to buy the 4070 unless you're coming from a 2060/2060S or older. 3060Ti to 4070 at 650eur makes zero sense. The price sadly won't come down until amd's competition is n32 not 350-400w rdna2 at -40% discounts.
cucaulay malkin:

already thought about this, and no way am I buying the 4070. The card itself is a nice 4060Ti successor (not a 4070!), but it's like paying a crypto retail price as regular msrp. I advise not to buy the 4070 unless you're coming from a 2060/2060S or older. 3060Ti to 4070 at 650eur makes zero sense. The price sadly won't come down until amd's competition is n32 not 350-400w rdna2 at -40% discounts.
Since I'm coming from Radeon 390, pretty much anything would be an upgrade for me. I don't consider the price especially good, which is why I'm looking at the cheapest models, but on the other hand, if I intend to keep gaming at all, I need to get a new card. I also have an annoying principle of wanting to buy from the latest gen available, so the old AMD 6000 cards aren't an option. They also have too little RT power for the wattage.
Krizby:

Silicon quality difference, might be VRAM too
MSI Ventus is also the cheapest model from MSI's selection, yet it's 15W compared to Dual's 18W. That's what baffles me, otherwise I'd have also immediately thought it's a matter of silicon prioritising. But maybe it still is, and MSI got better chips than Asus.
Kaarme:

MSI Ventus is also the cheapest model from MSI's selection, yet it's 15W compared to Dual's 18W. That's what baffles me, otherwise I'd have also immediately thought it's a matter of silicon prioritising. But maybe it still is, and MSI got better chips than Asus.
Chips with low ASIC quality tend to have less leakage at the same freq/voltage, which is why they are used in laptops. I think the MSI has lower ASIC quality, so it uses less power at the same idle clock/voltage than other 4070s. Under high load, though, the better ASIC chip can run a higher clock at a lower voltage (usually used for OC SKUs).
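That leakage trade-off can be sketched with a toy power model: total power is dynamic power (roughly C·V²·f) plus a per-die leakage floor that tends to be higher on leakier, high-ASIC-quality silicon. All coefficients and wattages below are invented for illustration; real per-die leakage figures aren't public.

```python
# Toy GPU power model: P_total = P_dynamic + P_leakage.
# Dynamic power ~ c_eff * V^2 * f; leakage is a per-die constant.
# All numbers are illustrative, not measurements.

def total_power(c_eff, volts, freq_mhz, leakage_w):
    dynamic = c_eff * volts**2 * freq_mhz  # c_eff folds capacitance units in
    return dynamic + leakage_w

# Two hypothetical 4070 dies at the same idle point (210 MHz, 0.9 V):
low_asic  = total_power(c_eff=0.05, volts=0.9, freq_mhz=210, leakage_w=5.0)
high_asic = total_power(c_eff=0.05, volts=0.9, freq_mhz=210, leakage_w=9.0)
print(low_asic, high_asic)  # the leakier die idles several watts higher
```

Same clock, same voltage, yet the leakier die lands a few watts higher at idle, which is the kind of spread being discussed here.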
Kaarme:

I also have an annoying principle of wanting to buy from the latest gen available, so the old AMD 6000 cards aren't an option. They also have too little RT power for the wattage.
frankly, the rt power is not the main problem here, it's fsr2 vs dlss. witcher 3 with rtgi+rtao runs at ~60 fps avg on a 6800 at 2.5G, CP2077 with rtgi medium and rt reflections will do 50-60 with xess, and dying light 2 with rtgi, rtao and rt reflections runs at 60-80. the performance is okay-ish with 2-3 effects enabled, but fsr2 is a sad joke. the static image is blurry already, the shimmer is just awful, and there's ghosting on moving objects too. https://forums.guru3d.com/threads/fsr-thread.440398/page-61#post-6121884 it's better in witcher 3 and dying light 2, but there's still noticeable shimmer and weak detail reconstruction.
cucaulay malkin:

frankly, the rt power is not the main problem here, it's fsr2 vs dlss. witcher 3 with rtgi+rtao runs at ~60 fps avg on a 6800 at 2.5G, CP2077 with rtgi medium and rt reflections will do 50-60 with xess, and dying light 2 with rtgi, rtao and rt reflections runs at 60-80. the performance is okay-ish with 2-3 effects enabled, but fsr2 is a sad joke. the static image is blurry already, the shimmer is just awful, and there's ghosting on moving objects too. https://forums.guru3d.com/threads/fsr-thread.440398/page-61#post-6121884 it's better in witcher 3 and dying light 2, but there's still noticeable shimmer and weak detail reconstruction.
again, it depends on the games you play. i'm not fond of dlss or fsr as i like real frames 1:1 with source, but as you like 1st person games your observations are valid for them. as far as witcher 3 goes, RT is a sad joke on any gpu with no real benefit except to say it's enabled. CP2077... the number of players using RT is limited to fanboys, because like all other titles the performance hit is horrible. the only time/place i think RT works is at 1440p with a newer card, because it's easier to drive
tunejunky:

as far as witcher 3 goes, RT is a sad joke on any gpu with no real benefit except to say it's enabled. CP2077... the number of players using RT is limited to fanboys, because like all other titles the performance hit is horrible.
God damn you're boring with this fanboy crap. The only people that don't use rt are those whose cards are useless for rt. Everyone with a card better than a 6700xt, where enabling even one effect is a problem for ~60fps, can use rt these days, at least to a degree. I can drive a single player game at 100fps, or enable two-three rt effects and still maintain 55-65fps while enjoying better graphics. I always turn on rtgi and rtao when I can; they always look better and far, far more natural. https://imgsli.com/MTcwMzcx You praise lumen so much, have you seen the performance hit? The only thing you have against rt is that the wrong gpu maker had it working first.
cucaulay malkin:

God damn you're boring with this fanboy crap. The only people that don't use rt are those whose cards are useless for rt. Everyone with a card better than a 6700xt, where enabling even one effect is a problem for ~60fps, can use rt these days, at least to a degree. I can drive a single player game at 100fps, or enable two-three rt effects and still maintain 55-65fps while enjoying better graphics. I always turn on rtgi and rtao when I can; they always look better and far, far more natural. https://imgsli.com/MTcwMzcx You praise lumen so much, have you seen the performance hit? The only thing you have against rt is that the wrong gpu maker had it working first.
i didn't call you a fanboy, but if the description fits... you are completely wrong in your assessment of RT use, and that's a well documented fact. RT is and has been a marketing gimmick for two generations and is only now coming into its own. in fact, RT is limited to enthusiasts only (by requirement) and has always been useless in the mainstream. and as far as "natural" appearances go, i'm glad you're happy. in my book, the graphical power to render such scenes at a playable framerate (50+ fps w/ RT for a non 1st person single player game, or 75+ fps w/ RT for a 1st person one) has only been viable with the top cards from the last and current generations. i'm not the one to be paying for trickery at a lower res.
tunejunky:

i didn't call you a fanboy, but if the description fits...
went right over your head; in your own words, everyone who turns on rt is a fanboy. I guess people who bought 4090s and 7900xtxs need to run games at 150-200fps, or else they're stupid for buying them to use more natural light and correct real-time reflections. Unless the 7900xtx is 1% faster than the 4080, of course, then rtgi like lumen is revolutionary. I play everything on a controller, even 1st person, so 55-65fps is great with freesync, as long as it's smooth and doesn't stutter.
tunejunky:

in fact, RT is limited to enthusiasts only (by requirement)
see? you can even get someone's point across and not call everyone a fanboy in the process.
Maybe it seems obsessive or foolish, but I'm seriously starting to consider the Founders Edition (since you can now buy them in the Nordic countries) because of the idle power issue. It seems like all AIB partner models have high idle consumption compared to the FE. I'm a pretty sporadic gamer, but I use my PC for long hours every week not gaming, so idle power use matters more to me than gaming use. Plus it just doesn't make sense how it's so much higher even for the better models that should be using better chips. Perhaps Nvidia reserves such a bulk of the better chips for itself, for the time being, that the AIB partners have no choice but to set the idle voltage high to be able to mass produce. All that being said, during this research I noticed, for the first time, how high the idle consumption of the AMD 7900 cards is. I wonder if it's because of the MCM design. Perhaps it's still very immature.
Kaarme:

Maybe it seems obsessive or foolish, but I'm seriously starting to consider the Founders Edition (since you can now buy them in the Nordic countries) because of the idle power issue. It seems like all AIB partner models have high idle consumption compared to the FE. I'm a pretty sporadic gamer, but I use my PC for long hours every week not gaming, so idle power use matters more to me than gaming use. Plus it just doesn't make sense how it's so much higher even for the better models that should be using better chips. Perhaps Nvidia reserves such a bulk of the better chips for itself, for the time being, that the AIB partners have no choice but to set the idle voltage high to be able to mass produce. All that being said, during this research I noticed, for the first time, how high the idle consumption of the AMD 7900 cards is. I wonder if it's because of the MCM design. Perhaps it's still very immature.
You're looking at no less than 4x the performance of your 390 🙂 Higher idle consumption... No chip should physically need 30% more power to sustain idle operations compared to its FE brethren. IMHO it has to be either due to a changed PCB or the BIOS requesting more frequent/higher clock micro-fluctuations. Regardless, this issue of custom cards consuming more at idle is real, and it has been with us for some time already. Except it seems that it went from a few watts of difference to several with RTX 4***. PS: The idle voltage itself seems extremely high, ~0.9V. WTH? When did that happen with Nvidia? I mean really... the chip registers roughly the same 0.9V for both idle (210MHz) and Cyberpunk with vsync (1300MHz), according to TPU. Is that due to some technical conundrum like the choice of definition of GPU voltage, or is it simply so? Then again, it looks like this shouldn't be a concern, because that which matters (the power) scales up and down as expected.
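The last point, that power still scales even though the voltage reading sits near 0.9V at both operating points, is consistent with the usual dynamic-power approximation P_dyn ≈ C·V²·f: with V fixed, power tracks clock and activity. A minimal sketch using the two clocks from the post (an illustrative model, not TPU's measurements):

```python
# If voltage is pinned at ~0.9 V at both idle (210 MHz) and a vsync'd
# load (1300 MHz), dynamic power still scales roughly linearly with
# clock (P_dyn ~ C * V^2 * f), so the gap comes from frequency and
# activity, not voltage. Illustrative model only.
V = 0.9  # volts, roughly the same at both operating points per the post

def dynamic_power_ratio(f_low_mhz, f_high_mhz, volts=V):
    # The voltage terms cancel when voltage is equal at both points.
    return (volts**2 * f_high_mhz) / (volts**2 * f_low_mhz)

ratio = dynamic_power_ratio(210, 1300)
print(f"~{ratio:.1f}x dynamic power at the same voltage")  # ≈ 6.2x
```

So a flat voltage readout and power that "scales up and down as expected" are not actually in conflict.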
Noisiv:

You're looking at no less than 4x the performance of your 390 🙂 Higher idle consumption... No chip should physically need 30% more power to sustain idle operations compared to its FE brethren. IMHO it has to be either due to a changed PCB or the BIOS requesting more frequent/higher clock micro-fluctuations. Regardless, this issue of custom cards consuming more at idle is real, and it has been with us for some time already. Except it seems that it went from a few watts of difference to several with RTX 4***. PS: The idle voltage itself seems extremely high, ~0.9V. WTH? When did that happen with Nvidia? I mean really... the chip registers roughly the same 0.9V for both idle (210MHz) and Cyberpunk with vsync (1300MHz), according to TPU. Is that due to some technical conundrum like the choice of definition of GPU voltage, or is it simply so? Then again, it looks like this shouldn't be a concern, because that which matters (the power) scales up and down as expected.
my 4090 idles at 875mV, while my 3090 idled at 725mV, and a 2080Ti can go much lower in minimum voltage (maybe into the 500mV range). Looks like the silicon lottery is to blame for the discrepancy in idle power consumption. Not GPU related, but der8auer tested 13 Ryzen 7600s and found there is huge variation in power consumption between them.
Krizby:

my 4090 idles at 875mV, while my 3090 idled at 725mV, and a 2080Ti can go much lower in minimum voltage (maybe into the 500mV range). Looks like the silicon lottery is to blame for the discrepancy in idle power consumption. Not GPU related, but der8auer tested 13 Ryzen 7600s and found there is huge variation in power consumption between them.
If that's the only reason, I suppose there's no guarantee even the specific FE card one would get from Nvidia would have the low idle power reviews have recorded. Most certainly AIB partners would reserve the better chips for the more expensive models. Chances are they just didn't have enough better chips right from day zero, but later the division would start to appear.
Every single review for the last 2 gens has custom cards idling higher, so I am pretty sure it's not a lottery 🙂
Krizby:

my 4090 idle at 875mV, while 3090 idle at 725mV, 2080Ti can go much lower minimum voltage (maybe in the 500mV range).
How would you hack it to 500mV? You can't drag the curve in AB lower than the available minimum.