NVIDIA GeForce RTX 4070 Launch Date Set for April 13th: Leaker Confirms
cucaulay malkin
The 3070 is worth more used than the 6700 XT. Not only that, it's faster and has better features. You gave in to developers doing a crappy job by releasing unfinished games to cash in early, then fixing them over the next few months.
Brasky
https://www.nerdwallet.com/article/taxes/vat-value-added-tax
https://www.bbc.com/news/explainers-53334098
@Kaarme seems to be confused: YOU/the consumer are paying the VAT, not the evil scary corporation that makes everything you willingly buy. It's called a consumption tax. Feel free to ask questions below and I'll happily google you the answer.
cucaulay malkin
VAT is ultimately collected by the state; it's an easy way to shore up the budget a little.
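To make the mechanics both posters are describing concrete, here's a minimal sketch (the 20% rate and the card price are made-up illustrative numbers, not any real country's VAT): the consumer funds the tax on top of the net price, and the seller merely collects it and remits it to the state.

```python
# Toy model of VAT as a consumption tax. The seller collects the tax
# from the consumer and passes it on to the state; it never comes out
# of the seller's net price. Numbers below are illustrative only.
def price_with_vat(net_price: float, vat_rate: float) -> tuple[float, float]:
    """Return (gross price the consumer pays, VAT the seller remits)."""
    vat = net_price * vat_rate
    return net_price + vat, vat

gross, remitted = price_with_vat(599.0, 0.20)  # hypothetical 20% rate
print(f"Consumer pays {gross:.2f}; seller remits {remitted:.2f} to the state")
```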
cucaulay malkin
People who use 4K for benching the 4070 Ti are right, though. A €900 1440p card? Good grief...
Still, I'd rather use the 4070 Ti for 4K: same performance as the 7900 XT, lower power draw, and DLSS stomps FSR. At 4K both the 4070 Ti and the 7900 XT still need upscaling, and NVIDIA has the better technique at the moment.
Aura89
https://www.guru3d.com/index.php?ct=articles&action=file&id=84933
The issue with that one, though, is that it isn't the majority of games, and there are games that go the opposite way:
https://www.guru3d.com/index.php?ct=articles&action=file&id=84936
So this really just puts the 4070 Ti right back where it is and always will be: around 3090 Ti performance, not really suffering from the smaller VRAM or lower bandwidth. And it certainly does not "lose to the 3090 and 3090 Ti at 4K".
Allocated memory is not a good thing to pay attention to when you're judging the importance of VRAM, or even system RAM for that matter.
A game or application can put whatever it wants into either of those, even if it doesn't need it and gains no performance from it. This happens a lot and is most noticeable with RAM: you install 8 GB, a game uses 7 GB, and you assume that with only 4 GB it would struggle, then you try it and performance is exactly the same.
This obviously isn't always the case; the line between what a program can use and what it needs isn't obvious. But there is a BIG difference between what a program will occupy and what actually changes performance.
A perfect example that has been benchmarked many times is GTA 5, which will easily use 9 GB+ of memory, yet shows no real performance degradation with only 8 GB, or even 4 GB, of RAM in the system.
The same thing applies to VRAM.
Look at performance, not technical numbers.
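The allocated-vs-needed distinction can be sketched in code. This is a toy model of a hypothetical engine's asset cache (made-up sizes, not any real game's behavior): a cache that opportunistically fills whatever budget it's given will report higher "usage" on a system with more memory, without actually needing it.

```python
# Toy sketch: an LRU asset cache that fills whatever memory budget it
# is handed. Reported "usage" tracks the budget, not the requirement.
from collections import OrderedDict

class AssetCache:
    def __init__(self, budget_mb: int):
        self.budget_mb = budget_mb
        self.used_mb = 0
        self.assets = OrderedDict()  # asset name -> size in MB

    def load(self, name: str, size_mb: int):
        # Evict least-recently-loaded assets only when over budget; with
        # a generous budget nothing is evicted and "usage" balloons.
        while self.used_mb + size_mb > self.budget_mb and self.assets:
            _, evicted_size = self.assets.popitem(last=False)
            self.used_mb -= evicted_size
        self.assets[name] = size_mb
        self.used_mb += size_mb

# The same 6 GB of textures "use" whatever memory is available:
big = AssetCache(budget_mb=9000)    # plenty of RAM -> reports 6 GB used
small = AssetCache(budget_mb=4000)  # tight RAM     -> reports 4 GB used
for i in range(60):
    big.load(f"texture_{i}", 100)
    small.load(f"texture_{i}", 100)
print(big.used_mb, small.used_mb)  # 6000 4000
```

Both systems loaded every texture; the smaller one just re-streams evicted assets on demand, which is why a memory counter alone can't tell you whether performance would suffer.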
Not sure what your issues were. My wife has a 3070 and an ultrawide at 3440x1440, same as you, with a 5600X processor, and has never once had an issue with any game she plays. And she plays a lot, with settings as high as is reasonable for a 3070 (which is pretty much cranked in most games).
The only game she can't really crank is ARK, but let's be honest, the only card that can really crank ARK is the 4090... lol
But yeah, definitely no stuttering, never once a "low memory" warning, and she enables ray tracing every chance she gets. Her favorite, though obviously not the "best" experience without DLSS 3.0, was Portal RTX... but that's just because she loves that game.
I also used a 3070 for a few months at the same resolution, the only difference being a 5900X CPU, and can attest that I had none of the issues you describe. And I always run games maxed, no matter what. My philosophy is: if I can't play a game cranked, I won't play it until I can.
By that measure, I could say the 8800 Ultra was a horrible card because it couldn't do 4K yet was "priced like a 4K card".
What you're doing is just moving the goalposts. If card pricing were based on the resolution they could run, we'd have some HEFTY prices: $500+ used to be the 1080p range, and 720p used to be the "omg, I can't believe this card runs this game at 720p!" tier, so by that measure 4K cards should cost $4000+?
Not to mention that if the 3090 and 3090 Ti were 4K cards at $1500 and $2000, and the 4070 Ti performs similarly at 4K for $800, then you got exactly what you asked for: 4K performance for less.
Again, I'm not saying the card should or should not be cheaper, but to be a bit more realistic, how about we all look at what the 4070 Ti was, rather than what it wasn't. It was:
A card with performance similar to the 3090 and 3090 Ti at all resolutions (lower resolutions really hitting the 3090 and 3090 Ti hard), at $800 instead of $1500 or $2000. Is $800 not a price you want to spend? Alright, then the 3090 and 3090 Ti are not cards you should be comparing it to, since you wouldn't want to spend $1500 or $2000 on a GPU either.
I really wish we'd get away from the whole numbering scheme for graphics cards, for one reason only: the whole "But it's a 70-series card! I could never spend that on a 70-series card!" reaction, when realistically the only metric that matters is price to performance.
Because in the end, who really cares what a card is named? The only thing that matters is price to performance. Don't like the performance you're getting for the price? Don't buy it, simple as that. Don't like a card because of its name relative to its price? That should never be part of the decision.
I'd buy an RTX 5010 if it fit my needs for the price, because who cares about the name?
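Price to performance is just a ratio, and the name never enters it. A quick sketch (the dollar and FPS figures below are made-up placeholders, not benchmark results):

```python
# Toy price-to-performance comparison: cost per average frame.
# All numbers are hypothetical; the model name plays no role.
def dollars_per_frame(price: float, avg_fps: float) -> float:
    return price / avg_fps

cards = {
    "hypothetical $800 card": dollars_per_frame(800, 100),
    "hypothetical $1500 card": dollars_per_frame(1500, 105),
}
for name, dpf in cards.items():
    print(f"{name}: ${dpf:.2f} per average frame")
```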
I'm not saying the card is worth it or not, but I don't believe the 4070 Ti was ever marketed as a 4K card; in fact, NVIDIA's own website positions it for 1440p. That's not to say it can't do 4K, and it often wins at 4K versus the 3090 and 3090 Ti, but it really shines at 1440p.
If anyone is buying a 4070 Ti thinking "this is a 4K card, that's why I want it", they have their priorities wrong. Again, it can do 4K, and quite nicely, but even when it was labeled as the 4080 12GB, it was not "the card you want if you want to play at 4K".
By that logic, you could just as well say "well, the 3090 and 3090 Ti beat the 4070 Ti at 8K and 12K, so therefore it sucks"... what's the point?
But again, to focus on your last point that it's "memory limited at 4K and soon even at 1440p": I don't know which reviews you've been looking at, so I'll just go by Guru3D's reviews, where the 4070 Ti meets or beats the 3090 Ti at 4K in the majority of games. It's basically a 3090 Ti in 4K performance, a card that launched at a $1999 MSRP, now available for $799. I don't disagree that it could be cheaper, and an xx70 card with an $800 MSRP is out there, which is why I'm not saying whether the card is worth it or not; there's just not a whole lot to "hate" about it when comparing it to the 3090 or 3090 Ti.
And yes, I know there are outliers; there always are. For instance, this one.