Rumor: NVIDIA likely launches GeForce RTX 4090 first then later RTX 4080/70 (Updated)
nizzen
Nice! Then I don't need to wait for the good stuff, like with the RTX 30 series. I had to buy a 3080 first, then a 3090. Waste of time.
Only good thing is that I sold the 3080 for almost the same price as the 3090 LOL
tunejunky
What this tells me, from an industry standpoint, is that Nvidia is tickled pink with the node shrink because of the vastly increased yield.
This has the highest circuit density of any GPU ever designed, and it's monolithic; both of those usually fall victim to lower yields (= higher price and limited availability). So I was impressed last year when Nvidia bought production time at the node; this is the benefit of it.
tl;dr: Nvidia's doing this because they finally can.
With lower-yield processes, Nvidia employed a wide array of cut-down models to minimize waste (and still make money). Now they have less waste to start with, which means more chips per wafer.
This is no permanent solution, as there is still wasted silicon (you are making rectangles out of a circle), just not the kind you can make cut-downs from, which is why they are also moving to an MCM design at some point.
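The "rectangles out of a circle" point can be made concrete with a back-of-the-envelope dies-per-wafer and yield estimate. A minimal sketch, assuming a classic edge-loss approximation and a simple Poisson defect model; the die size, wafer size, and defect density below are illustrative round numbers, not actual Ada figures:

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_w_mm, die_h_mm):
    """Classic approximation: gross dies = wafer area / die area,
    minus the partial dies lost along the circular edge."""
    die_area = die_w_mm * die_h_mm
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area)
    return int(wafer_area / die_area - edge_loss)

def poisson_yield(defect_density_per_cm2, die_area_mm2):
    """Poisson defect model: fraction of dies with zero defects."""
    return math.exp(-defect_density_per_cm2 * die_area_mm2 / 100)

# Illustrative inputs: 300 mm wafer, ~600 mm^2 die, 0.1 defects/cm^2
gross = dies_per_wafer(300, 24, 25)
good = gross * poisson_yield(0.1, 24 * 25)
print(gross, round(good))  # prints "90 49"
```

The model shows why big monolithic dies hurt twice: a larger die means fewer gross candidates per wafer *and* a higher chance each one catches a defect, which is exactly what cut-down SKUs (and eventually MCM) work around.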
Even with the higher yield, the monolithic design means there may be within-wafer variation (the silicon lottery), or the uArch equivalent of "dead pixels", which, knowing Nvidia, will become a cut-down at a later date IF that's a common artefact, but I don't expect it. This node, while new to Nvidia (and the PC master race), has been matured and refined thanks to Apple, who has done very well with high-density monolithic chips (M1/Max).
The most impressive thing to me (and where the statistically higher reject rate comes from) is the job the wire bonders do (the backbone of any "chip") as these process nodes shrink and the circuits become denser. The lithography uses light, which can be focused; the bonder uses wire that needs to be formed somewhat like rebar in construction, but smaller than the smallest insect's legs, thinner than thread, and following a wiring diagram.
Oh, and that's not done by TSMC, Intel, or Samsung.
Raif Uzkan
These days VGA cards are for the rich... I remember times when we used to pay $350 for the best VGA card on the planet, an "AMD RADEON 3850 AGP" (512 MB, 256-bit memory), to use in a P4 PC with Windows 7...
Nowadays cryptocurrency mining has boosted prices, so you have to pay double or triple MSRP to buy a decent VGA card... Doesn't matter if you're a hardcore gamer or a crypto miner...
cucaulay malkin
that's a long wait for 4070
i can see myself buying a 3080 from panic sellers
perellaz
And then you'll have the gamer who buys a 4090 to play Fortnite lol. Completely useless card; I run MSFS in 2K at ultra with a 3070.
suty455
I'm lucky: I have a 6900 XT and a 3090 in my two systems. Although the 3090 is ultimately the faster card in cases such as RT or benchmarks, the reality is the 6900 XT is actually the better card of the two from a user's point of view. RT is never used in online games; in fact, in online games we tend to lower the eye candy to get better frame rates and less lag, and in single-player offline I really struggle to tell the difference between RT on and off, or DLSS on and off, apart from in one way: the heat of the two PCs. Both are identical apart from the GPU, yet the Nvidia machine always runs hotter under load, with a loop temp max of 38 °C; the Radeon runs around 4-5 °C cooler. Same CPU, mobo, case etc., and both have a 1x480 and 2x360 rads with an Aqua Computer pump/res combo. The 6900 XT keeps up but uses a lot less power doing it.
For this reason alone I'm going all Radeon on my AM5 build; I'm hoping RDNA3 will really kick ass.
bobnewels
Nvidia is very good at marketing; they know when to release a product.
Get a faster product out the door before anyone else and it's a win.
It doesn't even matter if AMD/Intel have faster products. You lose sales, reputation, blah blah if you're not first.
alanm
Regardless of what the 4090 sells for (my guess $1800), I think they are preparing consumers for what follows that. They will have an easier time introducing the 4080 at around $999 and it still will be a volume seller. 4080Ti maybe $1400, etc.
I have no doubt these cards cost more to produce than Ampere since fabs increased their prices to meet demand, but Nvidia paid even more just to secure a higher share of the wafer supply than others to avoid any shortages this time around.
Reddoguk
Unless you already have a high-end PSU, I'm guessing a lot of users will need a new one too.
750 W is probably out. 850 W might get away with it, but this really means 1000 W will be the sweet spot, for the 4090 at least.
A factory-OC'd 4090 will be, I'm guessing, 500 W.
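The 1000 W guess can be sanity-checked with rough system power budgeting. A minimal sketch; every component draw and the headroom factor here are assumed round numbers, not measured figures:

```python
def recommended_psu_watts(gpu_w, cpu_w, rest_w=100, headroom=0.25):
    """Sum worst-case component draw, then add headroom so the PSU
    isn't running near its limit (where efficiency and ripple worsen)."""
    total = gpu_w + cpu_w + rest_w
    return total * (1 + headroom)

# Assumed draws: 500 W factory-OC'd 4090, 150 W CPU, 100 W for the rest
print(recommended_psu_watts(500, 150))  # prints 937.5 -> next standard size up is 1000 W
```

With those assumptions, 850 W leaves almost no margin for transient spikes, which is why 1000 W lands as the comfortable choice.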
alanm
(updated rumor) NVIDIA plans to launch GeForce RTX 4090 in August, RTX 4080 in September and RTX 4070 in October
NVIDIA GeForce RTX 4090 is now expected to debut in August. The current launch schedule lists the RTX 4080 with a September launch date, and the RTX 4070 should launch in October. However, our sources expect some dates to change, because AIBs still have lots of RTX 30 inventory left, and the last thing they want is for NVIDIA to announce a new generation...
Aura89
So much whining and complaining
Neo Cyrus
There are rumours that the 4080 won't use the AD102 chip and will be significantly weaker than the 4090, as opposed to the 3080 and 3090 essentially being the same crap. I'm guessing I'll be forced to skip this generation because nVidia are going to try to milk it at something stupid like $3-4K CAD; no thanks, you evil leather jacket. Even $2K would be ludicrous.
Undying
https://abload.de/img/linus-torvalds-linusc9kq6.gif
That's why having competition is so important. You can always say
SameOldx2
Doesn't make sense to me, since these chips are probably binned. They release the top-line card first, when there are going to be more chips for the other cards?