Nvidia's Ada Lovelace Successor GPUs Set for 2025 Release, Focus Shifts to AI Demand

Nvidia played the best hand with GPU computing. Having aligned their GPU clusters to data center demands, and by extension to AI nowadays, there is virtually nothing to hinder their growth. On the other hand, bye-bye gaming; it was fun while it lasted, and it was somehow affordable.
It would be cool if AMD released Radeon 8000 next year and managed to soundly beat Nvidia's best. I wouldn't dare to expect too much, but it would be really good for the market. Soon we will be at a point where Nvidia's GPUs are designing their own next generations, and AMD will need to do the same.
NVIDIA had already split their data center GPUs from their consumer/professional GPUs before the Ada Lovelace/Hopper generation, but in the past the differences were more a matter of balance than of capabilities. What you saw before was that data center cards would have more FP64 processing units than the consumer/professional cards. With the current generation there is a quite large difference between the capabilities of Hopper and those of the Ada Lovelace cards. The Hopper cards come with a lot of functionality that increases their performance in AI workloads, especially in the training of AI, whereas the Ada Lovelace cards don't have this and are more focused on general AI work and AI network execution (inference). I don't see this trend changing for future architectures, though of course there is still a significant amount of overlap to be expected between the architectures. There will just be a different focus: one set of capabilities that makes sense for data center cards and another that makes more sense for consumer/professional cards.
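To put a rough number on that FP64 split, here is a minimal timing sketch, assuming PyTorch with a CUDA-capable GPU; the matrix size and iteration count are arbitrary picks. On consumer cards like Ada Lovelace the FP64 figure typically lands at a tiny fraction of FP32, while data center parts are provisioned much closer to a 1:2 ratio:

```python
# Minimal FP32-vs-FP64 matmul timing sketch (assumes PyTorch + a CUDA GPU).
import time
import torch

def matmul_tflops(dtype, n=4096, iters=10):
    a = torch.randn(n, n, device="cuda", dtype=dtype)
    b = torch.randn(n, n, device="cuda", dtype=dtype)
    a @ b                        # warm-up so kernel setup isn't timed
    torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(iters):
        a @ b
    torch.cuda.synchronize()     # wait for the GPU before stopping the clock
    elapsed = time.perf_counter() - start
    return iters * 2 * n**3 / elapsed / 1e12   # ~2*n^3 FLOPs per matmul

print("FP32 TFLOPS:", matmul_tflops(torch.float32))
print("FP64 TFLOPS:", matmul_tflops(torch.float64))  # tiny on GeForce parts
```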
Well... if they only get new hardware out in 2025, they'd better start teasing refreshes, because I'm in the market. Give me that 4090 Ti, you schmucks.
Waiting for AMD to respawn.
fantaskarsef:

Well... if they only get new hardware out in 2025, they'd better start teasing refreshes, because I'm in the market. Give me that 4090 Ti, you schmucks.
I think we will more likely see a 4080 Ti (320-bit bus, 20 GB VRAM) than a 4090 Ti.
Kaarme:

Soon we will be at a point where Nvidia's GPUs are designing their own next generations, and AMD will need to do the same.
We are already there.
Best buy those 4090s now, ladies; it might well be the last dedicated full-fat gaming card from NV. Next season will be 5060S - 5060Si - 5060Six - 5060T - 5060Ti - 5060TiX - 5060TiXX.
Hm, looks like the 4090 will replace the 2080 Ti as the "fastest gaming GPU for the longest time". The 2080 Ti held that record for 24 months, which is unprecedented in GPU history.
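For what it's worth, a back-of-the-envelope sketch of those reigns, assuming the retail launch dates (September 2018 for the 2080 Ti, September 2020 for the 3080 that dethroned it, October 2022 for the 4090) and, hypothetically, a January 2025 successor at the earliest:

```python
# Rough "fastest gaming GPU" reign math, using the launch dates assumed above.
from datetime import date

def months_between(a, b):
    return (b.year - a.year) * 12 + (b.month - a.month)

# RTX 2080 Ti (Sep 2018) held the crown until the RTX 3080 (Sep 2020).
print("2080 Ti reign:", months_between(date(2018, 9, 27), date(2020, 9, 17)), "months")  # 24

# RTX 4090 (Oct 2022) vs. a hypothetical January 2025 successor.
print("4090 reign:", months_between(date(2022, 10, 12), date(2025, 1, 1)), "months")  # 27
```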
tunejunky:

We are already there.
So Ada Lovelace is already more machine than man? I thought it was still the other way around, and we are only getting there.
anticupidon:

Nvidia played the best hand with GPU computing. Having aligned their GPU clusters to data center demands, and by extension to AI nowadays, there is virtually nothing to hinder their growth.
Well, their pricing has the potential to hinder their growth. They're not the only AI player. In terms of AI-ready GPUs, I think Intel is bound to catch up quickly. AMD could easily compete if their devs were putting their fingers on their keyboards rather than up their noses. AMD's server hardware is impressive, but it means nothing if nobody can develop for it. Anyway, while I totally get why Nvidia is deprioritizing the gaming market, I think they're making a mistake in basically allowing both AMD and Intel to catch up.
Kaarme:

It would be cool if AMD released Radeon 8000 next year and managed to soundly beat Nvidia's best. I wouldn't dare to expect too much, but it would be really good for the market. Soon we will be at a point where Nvidia's GPUs are designing their own next generations, and AMD will need to do the same.
I'm sure Intel's Battlemage will help draw a lot more competition.
The current lack of strong competition from AMD's RDNA 3 products
So, this is decided...
schmidtbag:

Well, their pricing has the potential to hinder their growth. They're not the only AI player. In terms of AI-ready GPUs, I think Intel is bound to catch up quickly. AMD could easily compete if their devs were putting their fingers on their keyboards rather than up their noses. AMD's server hardware is impressive, but it means nothing if nobody can develop for it. Anyway, while I totally get why Nvidia is deprioritizing the gaming market, I think they're making a mistake in basically allowing both AMD and Intel to catch up. I'm sure Intel's Battlemage will help draw a lot more competition.
It's not THAT simple. Putting it like that, you or I could easily compete if we put together a team of strong developers and mathematicians. It's more than that: it's vision and business plans developed for years to come, because these big corporations don't think a day at a time. It's also community support: CUDA is very well represented and it is here to stay (OpenCV, TensorFlow, etc.). AMD needs to win support in the community if they want to succeed; they need to change their business strategy and their focus, and that takes a lot of time and money. AMD lacked the vision. Crazy thought: I think if they sold the Radeon brand to Nvidia, so that Nvidia could have a monolithic and a chiplet solution competing with each other (not that Nvidia isn't capable of creating chiplet GPUs, but just for the sake of craziness...), it would be the best thing for the Radeon brand - AMD nearly killed it. Just for the sake of fun and crazy - don't take me seriously on this...
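To make the ecosystem point concrete, here is a minimal sketch, assuming only a stock PyTorch install, of how framework code targets GPUs today. Tellingly, even PyTorch's ROCm builds for AMD hardware surface the GPU through the same torch.cuda namespace; that is the depth of entrenchment AMD has to win back:

```python
# Device-selection sketch (assumes a stock PyTorch install).
import torch

# "cuda" is the canonical GPU device string in PyTorch; ROCm builds for
# AMD GPUs reuse the very same namespace and device string.
device = "cuda" if torch.cuda.is_available() else "cpu"
print("running on:", device)
if device == "cuda":
    # On a ROCm build this reports an AMD GPU despite the "cuda" name.
    print("device 0:", torch.cuda.get_device_name(0))

# Identical user code regardless of vendor once a device is chosen.
x = torch.randn(1024, 1024, device=device)
print((x @ x).shape)
```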
AMD is probably going this route too. I wonder what the gaming landscape will be like in 10 years.
Kool64:

AMD is probably going this route too. I wonder what the gaming landscape will be like in 10 years.
Probably the RTX 3000 and RX 6000 will still be the best value options. 😀
Kool64:

AMD is probably going this route too. I wonder what the gaming landscape will be like in 10 years.
Only consoles for gaming? GPUs replacing the CPU and running the OS, applications, AI, etc.? Direct consequence: all of us being forced to hang a painting of Jensen Huang in our homes and salute it each morning when we wake up?
Kaarme:

So Ada Lovelace is already more machine than man? I thought it was still the other way around, and we are only getting there.
Nvidia used AI to help design Lovelace, and AMD used it for Zen 4.
Kaarme:

It would be cool if AMD released Radeon 8000 next year and managed to soundly beat Nvidia's best. I wouldn't dare to expect too much, but it would be really good for the market. Soon we will be at a point where Nvidia's GPUs are designing their own next generations, and AMD will need to do the same.
AMD is too far behind to catch Nvidia like that; just look at the 4090, a cut-down chip that beats AMD's best card without too much effort. But they can close the gap if they execute properly.
fantaskarsef:

Well... if they only get new hardware out in 2025, they'd better start teasing refreshes, because I'm in the market. Give me that 4090 Ti, you schmucks.
I'm also expecting refreshes from both sides, because the days of a new generation each year are not coming back.
pegasus1:

Best buy those 4090s now, ladies; it might well be the last dedicated full-fat gaming card from NV. Next season will be 5060S - 5060Si - 5060Six - 5060T - 5060Ti - 5060TiX - 5060TiXX.
Normally that would be correct; the problem is that the 4090 is too expensive for 90% of gamers. For example, I've just bought a 6950 XT, a 7700X, an Asus ROG Strix B650E-F Gaming and 32 GB of G.Skill DDR5-6000 for €1,400, while the cheapest 4090 costs more than €1,800 here in Portugal... But for those who can afford one, I agree that the 4090 is the way to go.
schmidtbag:

Well, their pricing has the potential to hinder their growth. They're not the only AI player. In terms of AI-ready GPUs, I think Intel is bound to catch up quickly. AMD could easily compete if their devs were putting their fingers on their keyboards rather than up their noses. AMD's server hardware is impressive, but it means nothing if nobody can develop for it. Anyway, while I totally get why Nvidia is deprioritizing the gaming market, I think they're making a mistake in basically allowing both AMD and Intel to catch up. I'm sure Intel's Battlemage will help draw a lot more competition.
You mean that there are more companies as good as Nvidia regarding AI? Because from what I can see, they are on a different level than anyone else...
H83:

You mean that there are more companies as good as Nvidia regarding AI? Because from what I can see, they are on a different level than anyone else...
Only in some fields. AMD's Instinct MI300 is revolutionary in every meaning of the word, even in applications like ChatGPT and DALL-E, where it reduces training time from months (on a Hopper H100) to weeks; it offers vastly superior throughput to any Hopper accelerator, including the H100 or the gimped H800. The good news for Nvidia is that the MI300 is not going to be made in the numbers the H100 is made in, and that Nvidia is making more profit from the gimped H800 than from the entire consumer GPU division. Also good news for Nvidia: the NVLink-C2C interconnect has proven to be very good at linking the Arm v9 Grace CPU to the GPU in the Grace Hopper accelerators, which means their days of relying exclusively on massive monolithic GPU designs are coming to an end soon.
Gaming is slowly receding from Nvidia's business landscape. They know their best milking potential is better realized in AI, data centers, supercomputing and other areas. They will remain in gaming, but likely on a reduced scale, with products priced too high to generate the sort of volumes they had in the past.