NVIDIA GeForce RTX 4000 Ada Lovelace series design reportedly completed (5nm)

If Nvidia is smart (would be a first in over a decade) they will hard-code (talking non-writable ROM) a firmware/BIOS lockout for Bitcoin and all the other virtual-currency mining on the 4000-series gaming cards. Put them on the market first, then introduce a dedicated 4000-series mining card at 2x the MSRP of the gaming card six months later. Nvidia screwed the pooch with the 3000-series cards; they are going to shelf-warm for a long time until they become obsolete and get clearance pricing for a fraction of what they are getting now. AMD and Intel have a double-wide door to walk through and deliver a serious whooping to Nvidia.
Yes, Nvidia should make cards for miners and charge twice the price; let the miners lap up these amazing cards while those of us with a brain buy up all the 20-series no-one wants.
I_Eat_You_Alive:

If Nvidia is smart (would be a first in over a decade) they will hard-code (talking non-writable ROM) a firmware/BIOS lockout for Bitcoin and all the other virtual-currency mining on the 4000-series gaming cards. Put them on the market first, then introduce a dedicated 4000-series mining card at 2x the MSRP of the gaming card six months later. Nvidia screwed the pooch with the 3000-series cards; they are going to shelf-warm for a long time until they become obsolete and get clearance pricing for a fraction of what they are getting now. AMD and Intel have a double-wide door to walk through and deliver a serious whooping to Nvidia.
They are smart; they don't really care if bitcoin miners buy the cards - either way the cards are sold. Would you, if you had a business? I wouldn't! The hash-rate limiter was to appease people; they aren't fussed it's broken.
Mates, this hobby of ours is very fickle. I only just got a 3090 and now I'm told it will be spanked in only a year's time by a beast with nearly double the unified shaders, from 10496 to over 18000, and from 82 SMs to 144 SMs. Makes you kind of angry knowing that all that cash you spent will be beaten in a year, probably by the 4080. šŸ™ šŸ™ šŸ™
Reddoguk:

Mates, this hobby of ours is very fickle. I only just got a 3090 and now I'm told it will be spanked in only a year's time by a beast with nearly double the unified shaders, from 10496 to over 18000, and from 82 SMs to 144 SMs. Makes you kind of angry knowing that all that cash you spent will be beaten in a year, probably by the 4080. šŸ™ šŸ™ šŸ™
How much do you see online retailers flogging the 4080 for? I say 1k. The 3090 is a great card and very future-proof; enjoy it.
DannyD:

How much do you see online retailers flogging the 4080 for? I say 1k. The 3090 is a great card and very future-proof; enjoy it.
Agreed, it will most likely be a grand, which is just how it's been going lately. You'd really have to flood the market with GPUs, because I believe scalpers will continue doing their thing with graphics cards for years to come; as we already know, these new GPUs will be made in very tight production numbers unless Samsung or TSMC build extra 5nm foundries. Deep down I really believe Nvidia is enjoying the scalper wars, even if they're pretending to care about gamers, because it's allowed them to bump up prices themselves.
Reddoguk:

Mates, this hobby of ours is very fickle. I only just got a 3090 and now I'm told it will be spanked in only a year's time by a beast with nearly double the unified shaders
Don't worry mate, you can always crank the settings 'down' and turn on FSR if your games start to struggle šŸ˜€ It's going to be an expensive 12 months... it's time to retire my Threadripper and get a new one - that means a new CPU, new motherboard and new video card... My credit card will be in tears!
6900xt no good then?
Only worth getting top-end on day one to maximise use and breakdown of cost. I'd have never bought a 3090 this late in the game.
DannyD:

6900xt no good then?
I like the 6900XT, but I always saw this generation as a bit of a stop-gap.
I bought a 3080 to replace my 970 in November last year, and I'm still only halfway through the order queue now. I think I'll wait it out so I can get my card at the original retail price; not overly optimistic about 4080 availability.
NewTRUMP Order:

And don't forget to add 25% to the GPU price if you live in the U.S., for tariffs.
Then don't buy Chinese-made cards. The GPU is made by TSMC in Taiwan, and there are boards made there as well. The one to blame here is China anyway, considering they have tariffs on all US-made goods.
JJJohan1:

I bought a 3080 to replace my 970 in November last year, and I'm still only halfway through the order queue now. I think I'll wait it out so I can get my card at the original retail price; not overly optimistic about 4080 availability.
The 4080 will be more available than the 3080, imo - miners won't be interested this time around. Thinking back, and knowing what we know now, those 30-series were incredible mining cards, probably never to be beaten now the gig's up, at least on the green team (I bet Nvidia mined millions themselves - I know I would have - under the guise of 'stress-testing'). As for scalpers, we saw the dirty online retailers themselves scalp the crap out of the cards. I still think they're going to be quite abundant though, with a massive premium lol.
insp1re2600:

Only worth getting top-end on day one to maximise use and breakdown of cost. I'd have never bought a 3090 this late in the game.
I see your point, but unfortunately I couldn't get my 3090 any earlier and might never have gotten one if I hadn't bought a full PC just to get one. I had the money but was never going to pay a scalper for a graphics card on eBay. I got scalped anyway by the etailer, as I overpaid by at least 1k over normal pricing. The PC should have been about Ā£3500 and I ended up paying Ā£4500, which made the purchase even more painful as Ā£700 of that was on just the graphics card. So not only a bit late to the party, but also charged Ā£2200 for the privilege on a Ā£1500-Ā£1600 card. I could have got two 3080 Tis for that money and run them in SLI.
Reddoguk:

I see your point, but unfortunately I couldn't get my 3090 any earlier and might never have gotten one if I hadn't bought a full PC just to get one. I had the money but was never going to pay a scalper for a graphics card on eBay. I got scalped anyway by the etailer, as I overpaid by at least 1k over normal pricing. The PC should have been about Ā£3500 and I ended up paying Ā£4500, which made the purchase even more painful as Ā£700 of that was on just the graphics card. So not only a bit late to the party, but also charged Ā£2200 for the privilege on a Ā£1500-Ā£1600 card. I could have got two 3080 Tis for that money and run them in SLI.
Ouch. I got my Asus TUF 3090 OC and a Bitspower waterblock for $2.2k total back in Nov 2020, got an LG OLED CX in Jan 2021 for $1200, and have been a happy shut-in for the last couple of months LOL (city under lockdown). The 3090 is already more than capable for 4K gaming, which is the current holy crown of gaming atm, so you might not have a reason to upgrade until 8K gaming is popular šŸ˜€
NewTRUMP Order:

And don't forget to add 25% to the GPU price if you live in the U.S., for tariffs.
Not if sourced from Taiwan.
Undying:

Hopper/Lovelace still using a monolithic die? AMD will win next generation.
Hopper is MCM, Lovelace is monolithic - at least based on all these rumors. Presumably Hopper will be datacenter-only, given Nvidia's published papers on how MCM is going to come with some performance penalties for gaming.
This is actually a very hard place for Nvidia to be in, despite profits and appearances. Nvidia has led in uArch for decades; they have lovingly embraced every square nanometer of die space and pushed the envelope for circuit density. But MCM changes everything, and does it categorically.

AMD has taken a long-term view of competition since Dr. Su took the reins, entering long-term strategic partnerships that are bearing a bumper crop of fruit. Whatever you call the process node, AMD will have an advantage for the next ten years. If you are a smaller company (like AMD vs. Nvidia) you have to play to your strengths. AMD's main strength was fabrication, but unloading GlobalFoundries was necessary to provide the R&D funding that Dr. Su knew was needed on both sides of their portfolio. So they sold the fab but kept the brains, and those brains knew TSMC wanted a taste of x86; together they created Ryzen and, more importantly, Threadripper as a way to introduce new technologies to a stagnant PC market while Intel was resting on its laurels and nibbling bon-bons.

Threadripper used an entirely new and revolutionary technology called Infinity Fabric to sew together (or, as Intel would say, glue) chiplets to create unheard-of performance in the PC marketplace. Not being complacent, AMD shook up the marketplace and became Wall St.'s darling, raising even more R&D dollars to reinvest in this new technology. But GPUs have entirely different internal requirements than CPUs, and their sensitivity to latency is on an entirely different level... thought to be impossible... but no - difficult, not impossible. Which is why Nvidia is in a spot until their technology (and plans for ARM) ripen. And that spot is created by the fabrication experience of AMD and Intel: even though Intel's GPU is mid-market (at best), they have the manufacturing ability to make MCMs.
Even if RDNA 3 as a uArch isn't as good as Nvidia's uArch, it can drastically outperform at any level by scale, which is drastically less expensive because you are making lots of very-high-yield parts that multiply cost savings at every level. Indeed, if marketing's ugly head is nowhere to be seen, you could create a 3090 killer for under $800, with most of that cost coming from DDR and VRMs.

Best of all for enthusiasts, you can truly and comfortably choose the same build quality, plus or minus the MCM count (or disabled cores, or both), and even allow a very large socket (2x-4x) for the datacenter. And I've got to tell you, enterprise loves the idea of more grunt in less space, especially as it will replace many other components with one "card" - basically better than NVLink could ever hope to perform, at lower cost and heat (i.e. one card replacing 3-4). All a single MCM has to do is perform at today's entry level (though it will be better than that) and the economies of scale upset the marketplace. And it couldn't happen fast enough.
Denial:

Hopper is MCM, Lovelace is monolithic - at least based on all these rumors. Presumably Hopper will be data center only, given Nvidia's released papers on how MCM is going to come with some performance penalties for gaming.
I suspect that's smoke. The real issue is how they're going to work around AMD or Intel patents, but they're flush enough to buy licenses. Still, even if they do MCM, they remain behind AMD's production nodes (by contract). But there is no doubt MCM is the future.
tunejunky:

I suspect that's smoke. The real issue is how they're going to work around AMD or Intel patents, but they're flush enough to buy licenses. Still, even if they do MCM, they remain behind AMD's production nodes (by contract). But there is no doubt MCM is the future.
Idk, maybe - the research on that came out in 2017, but I think the point of their papers would still hold true. Gaming GPU requirements are significantly different from scientific/AI workloads, especially when it comes to scheduling - which is basically what they said would lead to issues. More recently they did RC18, an MCM for inferencing; while I'm sure they learn something from projects like this, it's probably much different from doing a traditional GPU as MCM. https://research.nvidia.com/sites/default/files/pubs/2019-08_A-0.11-pJ/Op%2C//HotChips_RC18_final.pdf

Either way, I agree that MCM is 100% the future in both; I just think gaming might be monolithic for another generation. I'm not entirely convinced AMD is doing MCM for gaming next round either - I personally think the MCM leaks are about their next-gen CDNA chip.