Gigabyte confirms GeForce RTX 4070 Ti graphics cards

mackintosh:

I don't see any reason for this card existing unless it vastly undercuts the 7900s in price. I have a hard time believing there's any chance of that happening. I suppose some DLSS3 shenanigans could give it some leverage.
I can see myself finally upgrading my 2016 GTX 1070; it makes sense for people with such a card.
Personally, I'm waiting for the 7900 XTX to be released so that I can get an upgrade for my 1080. I've had an AMD CPU for a while now and everything has been going smoothly.
100 W more than my 180 W 1070 Ti, which I undervolted on top of that. Looks like getting a card under 200 W at stock isn't happening any time soon. Not even sure I want to see what they'll cost; probably just more stupidity.
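For what it's worth, the stock power cap can at least be scripted, though that's power limiting rather than a true undervolt (which needs vendor tools). Below is a rough sketch using NVML through the nvidia-ml-py bindings; the package, the 180 W target, and the use of GPU index 0 are all assumptions for illustration:

```python
# Rough sketch: cap an Nvidia card's board power via NVML.
# Assumes `pip install nvidia-ml-py`; applying a new limit needs admin rights.
# Note: this is power limiting, not undervolting proper.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)                # first GPU
cur = pynvml.nvmlDeviceGetPowerManagementLimit(gpu)       # milliwatts
lo, hi = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)
print(f"current limit {cur // 1000} W, allowed range {lo // 1000}-{hi // 1000} W")

target_mw = 180_000  # hypothetical 180 W cap, matching the undervolted 1070 Ti
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, max(lo, min(hi, target_mw)))
pynvml.nvmlShutdown()
```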
Agonist:

Prolly cost $850
Even that's way too expensive. Should be $650 tops.
Maddness:

Even that's way too expensive. Should be $650 tops.
It's all way too expensive, and it seems to increase a good bit every gen; they get away with it, along with more and more TDP.
Judging by its performance difference to the 4080, this so-called 4070 Ti (formerly the 4080 12 GB) should actually be a 4070 instead, with a price of US$650. Problem is, Nvidia will probably price it around US$900, as they originally planned.
cucaulay malkin:

I don't think this can be casually ignored.
Something that can’t be casually ignored is the software stack that comes with buying an Nvidia card. For gaming, you’ve got RTX, DLAA, DLSS, PhysX, G-Sync (the last two are less relevant) and so on. For compute, you’ve got OptiX, CUDA, and TensorRT, just to name a small fraction. Add to that historically MUCH better DX11 drivers, beating AMD to market with CUDA, and many people’s personal experience with AMD’s launch-day drivers, and plenty of folks won’t consider an AMD card regardless of price 😀

AMD has released some competitive options like FSR, FreeSync, and so on, but Nvidia has beaten them to market at virtually every point. Compute on AMD is still a mess since OpenCL was largely abandoned, and neither HIP, nor ROCm, nor SYCL can approach CUDA in terms of flexibility, support, or performance. Nvidia has a more mature software stack for every market segment (gaming, compute, deep learning, etc.) and an ecosystem of software tied into it. This is why Jensen can say “we don’t have to compete on price, we compete on quality” 😎

That’s all to say, Nvidia is able to price gouge because consumers perceive a superior product. And when you look at the software, there isn’t an argument against that 😛
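To make the ecosystem point concrete, here's a minimal sketch (assuming only a working PyTorch install; it's illustrative, not vendor documentation). Most ecosystem code is written against the torch.cuda namespace, and AMD's ROCm build of PyTorch copes by exposing HIP devices through that same namespace, so the code below runs on either vendor, but only where ROCm support exists at all:

```python
# Minimal sketch: ecosystem code is typically written against CUDA's namespace.
# On an Nvidia card this dispatches to cuBLAS; on a ROCm build of PyTorch,
# torch.cuda is backed by HIP, so the same code runs where ROCm is supported.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
x = torch.randn(1024, 1024, device=device)
y = x @ x  # matrix multiply, offloaded to the GPU's BLAS library
print(f"ran on {device}, checksum {y.sum().item():.2f}")
```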
winning.exe:

Something that can’t be casually ignored is the software stack that comes with buying an Nvidia card. For gaming, you’ve got RTX, DLAA, DLSS, PhysX, G-Sync (the last two are less relevant) and so on. For compute, you’ve got OptiX, CUDA, and TensorRT, just to name a small fraction. Add to that historically MUCH better DX11 drivers, beating AMD to market with CUDA, and many people’s personal experience with AMD’s launch-day drivers, and plenty of folks won’t consider an AMD card regardless of price 😀
Actually, you can pretty casually ignore them when:
A. There is a good-enough alternative (granted, most of Nvidia's exclusive features can't be so easily replaced).
B. The feature isn't widely implemented (which covers pretty much everything you mentioned except CUDA).
C. Of the applications where the features are implemented, many (not necessarily most) implementations don't impress.
D. The feature is too demanding to be of any practical use for anyone who isn't on a flagship or halo product.
Also, AMD's DX11 drivers are just "fine". Not great, and they may take a little while to mature, but they're okay. Where AMD severely falls behind is their Windows OpenGL drivers.
AMD has released some competitive options like FSR, FreeSync, and so on, but Nvidia has beaten them to market at virtually every point. Compute on AMD is still a mess since OpenCL was largely abandoned, and neither HIP, nor ROCm, nor SYCL can approach CUDA in terms of flexibility, support, or performance. Nvidia has a more mature software stack for every market segment (gaming, compute, deep learning, etc.) and an ecosystem of software tied into it. This is why Jensen can say “we don’t have to compete on price, we compete on quality” 😎 That’s all to say, Nvidia is able to price gouge because consumers perceive a superior product. And when you look at the software, there isn’t an argument against that 😛
I pretty much agree with all this. But as I alluded to earlier, it doesn't really matter if you beat someone to the punch if there wasn't exactly anybody lining up for the feature in the first place. PhysX was cool and all but since it was mostly just a gimmick and vendor-locked, it died pretty quick. You pay extra for Nvidia's tensor cores, so if you're building a server that doesn't need them, that's a wasted expense. AMD is also really starting to step it up lately, particularly with CDNA2. Depending on your workload, Nvidia's software isn't going to be enough. Intel too will be encroaching on Nvidia's server sales - we might be disappointed with Arc but they're bound to do much better on the server side of things.
schmidtbag:

Depending on your workload, Nvidia's software isn't going to be enough. Intel too will be encroaching on Nvidia's server sales
Nvidia ships something like 9 in 10 server GPU accelerators and 8 in 10 consumer [discrete] GPUs at present, so we'll have to wait and see on this conjecture, won't we 😛 You may not use Nvidia's software stack, but I've listed only tangible benefits available to consumers today.
winning.exe:

Nvidia ships something like 9 in 10 server GPU accelerators and 8 in 10 consumer [discrete] GPUs at present, so we'll have to wait and see on this conjecture, won't we 😛
Indeed, hence the more aggressive competitors in the server market (which, to my understanding, has much higher profit margins). Nvidia paved the way to their own success via CUDA. As of today, that, along with their mature AI and OptiX support, is what will keep them dominating the server market for many more years. My point was, if an organization is not bound to Nvidia's proprietary APIs, Intel and AMD have a pretty good chance of pulling the rug out from under Nvidia's feet if they continue on this trajectory.

Intel is honestly the bigger threat to Nvidia as far as I'm concerned. They're a more trusted brand in the server market and, unlike AMD, they actually care about getting APIs widely adopted. Aside from being a little lazy with drivers, AMD did practically nothing to incentivize developers to use OpenCL, let alone their GPUs. Kinda mind-boggling IMO, considering that back in the TeraScale2 days, AMD's compute performance was insane. Anyway, AMD is now seeing how much their reputation has fallen due to their neglect of compute, and they've got a lot of catching up to do. Intel does too, but I'm confident they will outpace AMD, at least in terms of drivers and developer incentives, maybe not so much in hardware capabilities.
You may not use Nvidia's software stack, but I've listed only tangible benefits available to consumers today.
Understood, but you were saying those tangible benefits were not so casually ignored; I'd argue they are for a rather large audience. We enthusiasts on Guru3D are going to show a lot of interest in such benefits, but the average person probably won't care, not at the proposed cost anyway.
@winning.exe & @schmidtbag, you are both right on a lot of fronts! And, oh my god, schmidtbag, AMD's OpenGL on Windows has been broken for over a decade... makes me wonder why they can't, or why they aren't bothering to, fix it at all!
To each their own, but I want an upper mid-range GPU that does it all at 600-700 EUR: no problem with high-refresh rasterization, no problem with 60+ fps ray tracing, a great image reconstruction technique that is widely implemented in games at launch (not via mods later), working great with all APIs from DX9 to DX12/Vulkan, support for the best version of super resolution, and a ton of add-ons built into the driver package like Freestyle filters, while also not using a lot of power for a multi-monitor setup. A working frame-generation technique would be a nice novelty too: an instant fps boost not requiring a CPU upgrade. A card good for literally every use, without having to wait an indeterminate amount of time for things I'm not able to see during the product presentation. And nothing with that new adapter; if the 4070 Ti has one, it's out of the question.
I have a huge game library installed right now, about 3 TB worth, and only these have DLSS:
2.4.12.0: C:\Program Files (x86)\Call of Duty\_retail_
2.3.11.0: C:\Program Files (x86)\Steam\steamapps\common\EVERSPACE™ 2\ES2\Plugins\DLSS\Binaries\ThirdParty\Win64
2.3.4.0: C:\Program Files (x86)\Steam\steamapps\common\Cyberpunk 2077\bin\x64
2.3.2.0: C:\Program Files (x86)\Steam\steamapps\common\Shadow of the Tomb Raider
2.3.1.0: C:\Program Files (x86)\Steam\steamapps\common\Crysis3Remastered\Bin64
2.3.0.0: C:\Program Files (x86)\Steam\steamapps\common\DOOMEternal
2.2.18.0: C:\Program Files (x86)\Origin Games\Battlefield 2042
2.2.11.0: C:\Program Files (x86)\Steam\steamapps\common\No Man's Sky\Binaries
2.2.10.0: C:\Program Files (x86)\Steam\steamapps\common\Red Dead Redemption 2
2.2.9.0: C:\Program Files (x86)\Steam\steamapps\common\F1 2022
2.1.55.0: C:\Program Files\Epic Games\MEEnhancedEdition
2.1.25.0: C:\Program Files (x86)\Steam\steamapps\common\Control
That's 12 games out of about 100. DLSS 3.0 will be even worse; it'll be a year or two before 3.0 is even relevant. I probably have 10% of all my games installed right now, so many are missing off that list, but it's still not a large number.
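For anyone wanting to reproduce a list like that, here's a rough sketch of the scan; it assumes Windows with pywin32 installed, and the library roots are illustrative, so adjust them to your own install locations. It walks each folder looking for nvngx_dlss.dll and prints the DLL's file version:

```python
# Rough sketch: scan game library folders for DLSS DLLs and print versions.
# Assumes Windows with pywin32 installed (`pip install pywin32`).
import os
import win32api

def dll_version(path):
    # Read the PE file-version resource (e.g. 2.3.4.0).
    info = win32api.GetFileVersionInfo(path, "\\")
    ms, ls = info["FileVersionMS"], info["FileVersionLS"]
    return f"{ms >> 16}.{ms & 0xFFFF}.{ls >> 16}.{ls & 0xFFFF}"

# Illustrative library roots -- adjust to your own install locations.
roots = [
    r"C:\Program Files (x86)\Steam\steamapps\common",
    r"C:\Program Files\Epic Games",
    r"C:\Program Files (x86)\Origin Games",
]

for root in roots:
    for dirpath, _, files in os.walk(root):
        for name in files:
            if name.lower() == "nvngx_dlss.dll":
                print(f"{dll_version(os.path.join(dirpath, name))}: {dirpath}")
```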
cucaulay malkin:

To each their own, but I want...
I think it's commendable that someone on a tight budget tries to cram as many options into their purchase as possible. Makes a lot of sense.
beedoo:

I think it's commendable that someone on a tight budget tries to cram as many options into their purchase as possible. Makes a lot of sense.
I mean, that's the whole value-for-money game: trying to get the best possible thing for your needs for your bucks!
cucaulay malkin:

Thought process of imagining price points that don't exist? Price/perf charts show Nvidia is the same as AMD, or better, in rasterization. They crush them in RT on top of that, and have the better image reconstruction technique. Yet you say that even if AMD were better, people would still buy Nvidia, which is BS. Sometimes the things you see in black and white are indeed just the things you see; there's nothing more. Figures are the most important thing if you want to prove a point; otherwise you're just ranting. Calling someone a blind fanboy when they're the one actually looking at the data is ironic. Yeah, AMD is so much better, beats Nvidia by 35 fps at the same price.
[attached chart: 4090 TUF OC im Test.png]
As I said, it was metaphorical: people will stick to a brand irrespective of whether or not logic says it's the best for what they want, where that logic may be driven by performance, price, reliability, after-sales care, etc. How many people buy a car or a type of food because they have always done so? People stick to the same energy provider, the same ISP, go to the same locations for a holiday, irrespective of there being more logical choices. As for looking at the data, that data extends far beyond price/performance; other factors come into play, including "it's my money, I earned it, I'll buy what I want", something it seems those who can't afford it often lament. But strutting around online forums throwing out statistics to criticise people's choices, observations and beliefs isn't a cool look; it smells of rented-flat, cat-owning, lonely, bitter-at-society narcissism.
If they give it the form factor of the RTX 3080 cards at around $800 or less, then sales could be decent. With the coolers of the RTX 3080 cards and a TDP of 285 watts, it could make for a very quiet card. If the AIB partners go with the power connectors of the last generation, it could be a no-brainer, "just drop it in and go", for lots of existing users. I think people would be instantly attracted to a more normal-sized card. Nvidia might not want it emphasized that their flagship cards are very large, but AMD cards will be making that point anyway, so Nvidia might want to offer the option of cards sized at the same dimensions as the last generation. Too bad, though, that it doesn't have just a tad more "oomph". Those looking to upgrade from an RTX 3080 won't be overcome by desire for the RTX 4070 Ti, not unless its pricing is surprisingly reasonable. The form factor of the RTX 4080 is daunting to think of dealing with. I'm puzzled at the decision to not offer anything except jumbo size for them.
pegasus1:

As I said, it was metaphorical: people will stick to a brand irrespective of whether or not logic says it's the best for what they want, where that logic may be driven by performance, price, reliability, after-sales care, etc.
Yeah, but in that particular case, Nvidia vs AMD, it's not all that clear. It seems that Nvidia does have the better overall product, and the price difference isn't always in AMD's favor. See how, in the 17-new-game average I linked, the 3080 does better than the 6900 XT, even though the RT games thrown into the mix aren't particularly RT-intensive.
beedoo:

I think it's commendable that someone on a tight budget tries to cram as many options into their purchase as possible. Makes a lot of sense.
I don't consider 600-700 EUR GPUs "budget", no matter what Nvidia and AMD would like us to think. It's kind of laughable that the 7900 XT, which is a 6800 XT successor on a cut N31 die, got a +250 EUR price increase (650 to 900), and everyone lost their minds over how good the price was.
Reddoguk:

I have a huge game library installed right now, about 3 TB worth, and only these have DLSS... That's 12 games out of about 100.
Now make a list of the most GPU-heavy ones you own and see how many have DLSS when they need it.
Let's see how they price the 4070 Ti, aye? If it ends up being the same as the 4080 12GB, then it's still a stupid product, and the consumer didn't "win" either.
schmidtbag:

Intel is honestly the bigger threat to Nvidia as far as I'm concerned. They're a more trusted brand in the server market and, unlike AMD, they actually care about getting APIs widely adopted. Aside from being a little lazy with drivers, AMD did practically nothing to incentivize developers to use OpenCL, let alone their GPUs. Kinda mind-boggling IMO, considering that back in the TeraScale2 days, AMD's compute performance was insane. Anyway, AMD is now seeing how much their reputation has fallen due to their neglect of compute, and they've got a lot of catching up to do. Intel does too, but I'm confident they will outpace AMD, at least in terms of drivers and developer incentives, maybe not so much in hardware capabilities.
As far as I know, Intel still hasn't delivered the first all-Intel supercomputer (CPU + GPU). I could be wrong, though, if they did it very recently. AMD has been shipping them "all the time" these days, so I doubt AMD's reputation has dropped in that market. Nvidia is still the market leader, I believe, and now that it relies on its own ARM-based CPUs, AMD and Intel no longer benefit on the CPU side either.