Nvidia AD106, AD107 Ada Lovelace GPUs Likely to Use PCIe x8 Interface

scumbag move.
cucaulay malkin:

scumbag move.
Why so? Performance will be the same; we never saturated that slot anyway. Better to use those lanes for NVMe, something you had to do anyway, since most of the time the x16 slot was shared with a secondary drive if you added one.
Didn't people say the 4070 was going to beat a 3080? Yeah, maybe, nah. Don't believe the hype, people. Kopite7kimi has constantly changed what the cards have over and over again. lol Nvidia pulled off a good marketing tactic, getting people talking months before release.
asturur:

Why so? Performance will be the same; we never saturated that slot anyway. Better to use those lanes for NVMe, something you had to do anyway, since most of the time the x16 slot was shared with a secondary drive if you added one.
No it won't. Especially for someone using PCIe Gen3, of which there are many, be it on Intel or AMD systems. And especially on Nvidia cards, which usually come with a smaller VRAM buffer, forcing the GPU to fetch more frequently over PCIe. And let's not forget that with DirectStorage, the GPU will be in charge of decompressing data, meaning more accesses through PCIe. This kind of move would only be acceptable on low-end GPUs. Really a scumbag move from Nvidia, only learning the wrong lessons from AMD.
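To put rough numbers on the bandwidth argument above, here is a back-of-the-envelope sketch. The per-lane figures are approximate usable throughput after link encoding, ignoring protocol overhead, so treat the results as ballpark only:

```python
# Approximate usable PCIe throughput per lane, in GB/s, after link encoding
# (128b/130b for Gen3+). Protocol overhead is ignored -- ballpark figures.
PER_LANE_GBPS = {3: 0.985, 4: 1.969, 5: 3.938}

def pcie_bandwidth(gen: int, lanes: int) -> float:
    """One-direction link bandwidth in GB/s for a given generation and lane count."""
    return PER_LANE_GBPS[gen] * lanes

# An x8 card in a Gen4 slot still gets Gen3-x16-class bandwidth...
print(f"Gen4 x8 : {pcie_bandwidth(4, 8):.1f} GB/s")
# ...but drops to half of that in a Gen3 board, which is the scenario above.
print(f"Gen3 x8 : {pcie_bandwidth(3, 8):.1f} GB/s")
print(f"Gen3 x16: {pcie_bandwidth(3, 16):.1f} GB/s")
```

So the complaint is specifically about Gen3 systems: there, an x8 card has roughly 8 GB/s to work with instead of roughly 16 GB/s.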
Horus-Anhur:

No it won't. Especially for someone using PCIe Gen3, of which there are many, be it on Intel or AMD systems. And especially on Nvidia cards, which usually come with a smaller VRAM buffer, forcing the GPU to fetch more frequently over PCIe. And let's not forget that with DirectStorage, the GPU will be in charge of decompressing data, meaning more accesses through PCIe. This kind of move would only be acceptable on low-end GPUs. Really a scumbag move from Nvidia, only learning the wrong lessons from AMD.
Maybe it's cheaper for them, or it's a business tactic. The lower the price, the lower the performance increase. Get more people buying; they need it... Or better yet, on the high end you have to pay more to get x16. What if they did a different version of the high end? A bit like how car companies charge you for extras, haha. Anyway, this was to be expected from Nvidia.
Meathelix1:

Maybe it's cheaper for them, or it's a business tactic. The lower the price, the lower the performance increase. Get more people buying; they need it... Or better yet, on the high end you have to pay more to get x16. What if they did a different version of the high end? A bit like how car companies charge you for extras, haha
Of course it's cheaper for Nvidia and AMD. It's a way to offer an inferior product and make a bigger profit.
Horus-Anhur:

Of course it's cheaper for Nvidia and AMD. It's a way to offer an inferior product and make a bigger profit.
Exactly my point, and this was to be expected. We will have the 5000 series on the same architecture, but instead of limiting it, they will open the 5000 cards up to a larger power draw and performance increase. It's a pure business tactic; now the prices need to follow if they are decreasing the performance. So... a 30-40% increase on the 4090 over the 3090, but at a slightly higher price.
Meathelix1:

Don't believe the hype people Kopite7kimi has constantly changed what the cards have over and over again. lol
How can a "leaker" know for sure what the SKUs will be when Nvidia themselves probably don't know yet which SKU will have which features? It's like Nvidia gearing up to sell an RTX 3080 20GB (possibly named "3080 Ti", as it wasn't decided at that point) in the pipeline first, with manufacturers having already produced the cards, then changing to a 12GB 3080 Ti and then a 12GB 3080 as well, with the 20GB models never being officially released (just unofficially sold to a bunch of mining farms, apparently). Jensen Huang likes to decide things just before their announcement. Perhaps if enough fuss is made in protest about these alleged x8 cards, they'll change their minds or sell them as low-end SKUs instead.
Cave Waverider:

Jensen Huang likes to decide things just before their announcement. Perhaps if enough fuss is made in protest about these alleged x8 cards, they'll change their minds or sell them as low-end SKUs instead.
The average gamer won't have a clue what you are talking about. Why would they fuss about it? Everyone just wants cheaper cards, even if it's only a 20% increase. Also, when has Nvidia ever listened to its customers? They only listen when they lose money.
Cave Waverider:

How can a "leaker" know for sure what the SKUs will be when Nvidia themselves probably don't know yet which SKU will have which features? It's like Nvidia gearing up to sell an RTX 3080 20GB (possibly named "3080 Ti", as it wasn't decided at that point) in the pipeline first, with manufacturers having already produced the cards, then changing to a 12GB 3080 Ti and then a 12GB 3080 as well, with the 20GB models never being officially released (just unofficially sold to a bunch of mining farms, apparently). Jensen Huang likes to decide things just before their announcement. Perhaps if enough fuss is made in protest about these alleged x8 cards, they'll change their minds or sell them as low-end SKUs instead.
Because Nvidia and AMD have to sample AIBs in time so they can start their work. This has to be done months in advance. Also, at this point, a few SKUs are already in production to build stock for the official launch. And there is always someone who says something. Another way is to look at driver updates, or kernel updates for Linux. Most of the features we know about RDNA3 come from this.
Horus-Anhur:

Because Nvidia and AMD have to sample AIBs in time so they can start their work. This has to be done months in advance. Also, at this point, a few SKUs are already in production to build stock for the official launch. And there is always someone who says something. Another way is to look at driver updates, or kernel updates for Linux. Most of the features we know about RDNA3 come from this.
The x16 mid-range might never have been in production then? These so-called leaks and images could be coming from the next-gen 5000 series: 3000 MHz at 800 watts under load, using PCIe 5.0. Again, we don't know; it could be totally fake. I will be happy with a 4090 at $2500 AUD with a 25-35% increase over the 3090, as that will be a huge upgrade from a 2080. Any higher and they are just ripping everyone off, trying to make money back on the leftover 3000 series stock.
cucaulay malkin:

scumbag move.
Agree. Everything to save some pennies...
Meathelix1:

The x16 might never have been in production then? These so-called leaks and images could be coming from the next-gen 5000 series: 3000 MHz at 800 watts under load, using PCIe 5.0. Again, we don't know; it could be totally fake. I will be happy with a 4090 at $2500 AUD with a 25-35% increase over the 3090, as that will be a huge upgrade from a 2080. Any higher and they are just ripping everyone off, trying to make money back on the leftover 3000 series stock.
Mid-range and high-end GPUs have always had PCIe x16 lanes. With the 6600, AMD decided to cut corners and just give gamers 8 lanes, on a GPU that had an MSRP of $379. This was a complete rip-off. And it's very sad that Nvidia has decided to screw consumers the same way.
H83:

Agree. Everything to save some pennies...
All we should be worrying about now is the prices. If it's not the hyped 80% increase, then what is it, and how much is it going to cost?
Horus-Anhur:

Mid-range and high-end GPUs have always had PCIe x16 lanes. With the 6600, AMD decided to cut corners and just give gamers 8 lanes, on a GPU that had an MSRP of $379. This was a complete rip-off. And it's very sad that Nvidia has decided to screw consumers the same way.
So they are cutting back on lanes, trying to make their mid-range cheaper.
Meathelix1:

So they are cutting back on lanes, trying to make their mid-range cheaper.
Cheaper to produce, yes, but cheaper sales price? Probably not. 😉
H83:

Agree. Everything to save some pennies...
After multiple years of immense profit.
Meathelix1:

All we should be worrying about now is the prices. If it's not the hyped 80% increase, then what is it, and how much is it going to cost? So they are cutting back on lanes, trying to make their mid-range cheaper.
The only thing they care about is making it cheaper for themselves and how they can spin it to dupes.
Not seeing a problem here; 3.0 x8 is plenty.
Astyanax:

Not seeing a problem here; 3.0 x8 is plenty.
You mean PCIe 4.0 x8, and no, it's not.
Undying:

You mean PCIe 4.0 x8, and no, it's not.
3.0 x8 has not been saturated by any title. This isn't AMD selling an x4 card.
Astyanax:

3.0 x8 has not been saturated by any title. This isn't AMD selling an x4 card.
The 6600/XT has issues when exceeding its VRAM limitations at 4.0 x8. It behaves the same as x4.
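The equivalence behind that last point is easy to check: per-lane throughput roughly doubles with each PCIe generation, so a link's relative bandwidth can be compared as lanes times two to the generation. A sketch, treating the generational doubling as exact:

```python
# Relative link bandwidth in arbitrary units: per-lane rate roughly doubles
# each PCIe generation, so lanes * 2**gen works as a unitless comparison.
def relative_bandwidth(gen: int, lanes: int) -> int:
    return lanes * 2 ** gen

# A Gen4 x8 card in a Gen3 slot negotiates Gen3 x8, which carries the same
# bandwidth as a Gen4 x4 link -- hence "it behaves the same as x4".
print(relative_bandwidth(3, 8) == relative_bandwidth(4, 4))  # True
```

So on an older Gen3 board, an x8 card is effectively running with Gen4 x4 bandwidth, which is where the VRAM-spillover pain shows up.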