#ProjectBeyond: Nvidia posts cryptic tweet, inviting Twitter speculation

self destruct... RTX 4080/12GB (I hope)
Why are the 4080s so different? The 16GB seems like it'd be a 4080 Ti.
It's called a marketing campaign, guys.
schmidtbag:

Why are the 4080s so different? The 16GB seems like it'd be a 4080 Ti.
9700 cuda vs 16300 isn't a 4080ti
cucaulay malkin:

9700 cuda vs 16300 isn't a 4080ti
Huh? Are you looking at the right specs? It's 9700 for the 16GB and 7600 for the 12GB. That's a 12% difference, which is significant. Not only that but the memory size, memory bandwidth, and power consumption are all pretty dramatically different.
schmidtbag:

Huh? Are you looking at the right specs? It's 9700 for the 16GB and 7600 for the 12GB. That's a 12% difference, which is significant. Not only that but the memory size, memory bandwidth, and power consumption are all pretty dramatically different.
Think he's confusing it with the 4090 specs and I agree it should be a 4080Ti. Maybe it will be, who knows.
schmidtbag:

Huh? Are you looking at the right specs? It's 9700 for the 16GB and 7600 for the 12GB. That's a 12% difference, which is significant. Not only that but the memory size, memory bandwidth, and power consumption are all pretty dramatically different.
are you ? 4090 has 16300 cuda. 9700 cuda card isn't a 4080ti if full 4090 has 16300. more likely to have at least 14000.
cucaulay malkin:

are you ? 4090 has 16300 cuda. 9700 cuda card isn't a 4080ti if full 4090 has 16300. more likely to have at least 14000.
He's not talking about the 4090 - he's talking about the 4080 16GB, which has a few thousand more CUDA cores than the 4080 12GB. He's saying it should be branded as a 4080Ti instead of 4080 16GB. So it would be:

4080 (7600 CUDA)
4080 TI (9700 CUDA)
4090 (16300 CUDA)

That being said I'm pretty sure Nvidia has had previous offerings with same name, different CUDA configurations - 1060 3GB vs 6GB comes to mind. I just think it's bad practice, especially when you have a Ti moniker that everyone is already aware of.
Denial:

4080 TI (9700 CUDA) 4090 (16300 CUDA)
makes zero sense, btw you know the 9700 cuda card is a different die ? no cut down ad102 card ? surely there will be one
cucaulay malkin:

makes zero sense, btw you know the 9700 cuda card is a different die ? no cut down ad102 card ? surely there will be one
Why does that make zero sense? Wouldn't it being a different die make it make more sense?
Denial:

Why does that make zero sense? Wouldn't it being a different die make it make more sense?
zero sense
cucaulay malkin:

zero sense
one hundred cents
Denial:

one hundred cents
zero. it would mean that all not fully functional ad102 won't be sold as 4080ti and also that 4090 will have 70% more shaders than 4080, for 3090 vs 3080 it's like 20%.
Denial:

That being said I'm pretty sure Nvidia has had previous offerings with same name, different CUDA configurations - 1060 3GB vs 6GB comes to mind. I just think it's bad practice, especially when you have a Ti moniker that everyone is already aware of.
Indeed they did have different CUDA configurations. It was about a 10% difference, or only about 100 cores. The difference in CUDA cores between the 4080 12GB and 16GB is enough to make a whole RTX 2050. A 2050 is no slouch. But unlike the 1060s, there's clearly more happening with the different 4080 models, when you consider the 30% increase in power draw, despite the 12% increase in CUDA cores.
cucaulay malkin:

zero sense
How mature.
cucaulay malkin:

zero. it would mean that all not fully functional ad102 won't be sold as 4080ti and also that 4090 will have 70% more shaders than 4080, for 3090 vs 3080 it's like 20%.
As I just mentioned, it's not just the amount of VRAM and CUDA cores. The 16GB is bound to be more than 12% faster. No matter what, there is going to be some hefty gap between the 4080 (of either size) and 4090. Nvidia knows that at this price point and scale, the 12GB will not be of any interest if it's so close in performance. That being said, I still can't help but wonder who would buy the 12GB model over the 16GB. They're both most likely going to have quadruple-digit prices and consume a lot of power, so, it's not like the 12GB is going to be enticing to anyone on a budget.
schmidtbag:

Indeed they did have different CUDA configurations. It was about a 10% difference, or only about 100 cores. The difference in CUDA cores between the 4080 12GB and 16GB is enough to make a whole RTX 2050. A 2050 is no slouch. But unlike the 1060s, there's clearly more happening with the different 4080 models, when you consider the 30% increase in power draw, despite the 12% increase in CUDA cores.

How mature.

As I just mentioned, it's not just the amount of VRAM and CUDA cores. The 16GB is bound to be more than 12% faster. No matter what, there is going to be some hefty gap between the 4080 (of either size) and 4090. Nvidia knows that at this price point and scale, the 12GB will not be of any interest if it's so close in performance. That being said, I still can't help but wonder who would buy the 12GB model over the 16GB. They're both most likely going to have quadruple-digit prices and consume a lot of power, so, it's not like the 12GB is going to be enticing to anyone on a budget.
Lovelace cores for making a Turing card, lol, people in this thread are something else.
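For what it's worth, the core-count gaps being argued over in the thread can be checked with a quick script. The Ada figures below are the rounded rumor numbers quoted by the posters (not confirmed specs at the time); the Ampere counts are the known shipping figures:

```python
# CUDA core counts: Ada numbers are the thread's rounded rumor figures,
# Ampere numbers are the known shipping specs.
specs = {
    "RTX 4080 12GB": 7600,
    "RTX 4080 16GB": 9700,
    "RTX 4090": 16300,
    "RTX 3080": 8704,
    "RTX 3090": 10496,
}

def gap(a: str, b: str) -> float:
    """Percentage by which card b has more CUDA cores than card a."""
    return (specs[b] - specs[a]) / specs[a] * 100

print(f"4080 16GB vs 4080 12GB: +{gap('RTX 4080 12GB', 'RTX 4080 16GB'):.0f}%")
print(f"4090 vs 4080 16GB:      +{gap('RTX 4080 16GB', 'RTX 4090'):.0f}%")
print(f"3090 vs 3080:           +{gap('RTX 3080', 'RTX 3090'):.0f}%")
```

On these numbers the 16GB card carries roughly a quarter more cores than the 12GB card, and the 4090-over-4080-16GB gap of roughly 68% lines up with cucaulay malkin's "70% more shaders" figure, versus roughly 21% for the 3090 over the 3080.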