NVIDIA Ends SLI Support and is Transitioning to Native Game Integrations

Moderator:
The end of an era. I mean, SLI has been pretty much dead for a while now, but damn did it look cool back in the day having two 8800 GT cards in my build.
The question is how easy or difficult it is for game devs to include it in their games. Most likely not worth it, but who knows.
Once it was automatic: two cards, nearly 200% performance. At some point engines changed and it no longer worked well 🙁
The fact that the 3080 does not have an SLI connector is very bad imho. I understand pushing the 3090 more, but knowing that two 3080s would make the 3090 pointless, and blocking that... seems bad.
I feel bad for a friend that just bought a second 1080 Ti XD
I remember the time when Nvidia and ATI were saying multi-GPU is the future.
It - the use of multiple GPUs to speed up framerate - is not dead. The requirement of the SLI bridge connector is dead. Game devs now need to implement multi-GPU support directly in their engine code. Probably not easy. But IMO it should be compulsory for VR game engines, so we could use one GPU per eye to get decent framerates in certain tough applications. Probably not as critically needed for desktop. Anyway, as always, it suits Nvidia for it not to be used - they sell more expensive high-end cards that way.
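For what it's worth, the "one GPU per eye" idea maps fairly directly onto Vulkan 1.1 device groups, where a device mask routes subsequent commands to a subset of the GPUs in the group. A minimal C++ sketch; the two Record* engine hooks are hypothetical stand-ins, not any shipping engine's API:

```cpp
#include <vulkan/vulkan.h>

// Hypothetical engine hooks, for illustration only.
void RecordLeftEye(VkCommandBuffer cmd);
void RecordRightEye(VkCommandBuffer cmd);

// Assumes 'cmd' was allocated from a logical device created over a
// two-GPU device group (Vulkan 1.1).
void RecordStereoFrame(VkCommandBuffer cmd) {
    vkCmdSetDeviceMask(cmd, 0b01);  // GPU 0 only: left eye
    RecordLeftEye(cmd);
    vkCmdSetDeviceMask(cmd, 0b10);  // GPU 1 only: right eye
    RecordRightEye(cmd);
    vkCmdSetDeviceMask(cmd, 0b11);  // both GPUs again for compositing
}
```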
asturur:

The fact that the 3080 does not have an SLI connector is very bad imho. I understand pushing the 3090 more, but knowing that two 3080s would make the 3090 pointless, and blocking that... seems bad.
No it wouldn't. 10GB of VRAM already makes the 3080 useless... It's going to be obvious when playing any games that demand a lot of VRAM. Get these reviewers to use The Division 2 as a benchmark; gone will be that 10GB of VRAM. Or run Neon Noir from CryEngine - it chews up more VRAM than you've got.
zhalonia:

I feel bad for a friend that just bought a second 1080 Ti XD
I would too.....!
The Division 2 allocates 80% of the GPU's total VRAM (the first game, I think, had that at 70%), so it's not much of a measurement of actual consumption without also checking real usage. Pretty sure Vulkan or D3D12 in explicit multi-GPU mode could also combine the totals, but it would require the developer to implement that support. It's part of D3D12 and has been used in, what, Ashes of the Singularity back early on, and then what?
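For context, the explicit multi-GPU mode mentioned above is exposed in Vulkan 1.1 as device groups (D3D12 has an equivalent linked-node model). A hedged C++ sketch of how an engine would discover them, assuming a Vulkan 1.1 loader is present:

```cpp
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;  // device groups are core in 1.1

    VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ici.pApplicationInfo = &app;
    VkInstance instance;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);
    std::vector<VkPhysicalDeviceGroupProperties> groups(
        groupCount, {VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES});
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

    // A group with more than one physical device can be driven as a single
    // logical device; how work and memory are split across the GPUs is then
    // up to the engine, not the driver.
    for (uint32_t i = 0; i < groupCount; ++i)
        std::printf("group %u: %u GPU(s)\n", i, groups[i].physicalDeviceCount);

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```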
Not sure if I'm misunderstanding, but it seems like they're actually doing the proper thing for SLI to continue? I mean, instead of manually tweaking each profile for a game, they're saying "the major APIs have native support, so let's just use that". It's equivalent to them saying "we're dropping RTX in favor of native RT implementations that exist inside DX12 and Vulkan". How is that bad news?
That's bad news... I was going to buy two 20GB 3080s and NVLink them to get 40GB of VRAM for rendering...
Ricardo:

Not sure if I'm misunderstanding, but it seems like they're actually doing the proper thing for SLI to continue? I mean, instead of manually tweaking each profile for a game, they're saying "the major APIs have native support, so let's just use that". It's equivalent to them saying "we're dropping RTX in favor of native RT implementations that exist inside DX12 and Vulkan". How is that bad news?
VRAM is an issue - NVLink creates a shared pool; without it, each GPU needs to load everything into its own VRAM.
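For rendering and compute apps, that kind of pooling usually shows up as peer-to-peer memory access. A small sketch using the CUDA runtime API (C++ compiled with nvcc, assuming a two-GPU system) of the check an application would make:

```cpp
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    int canAccess = 0;
    // Can device 0 read/write device 1's memory directly (NVLink or PCIe)?
    cudaDeviceCanAccessPeer(&canAccess, /*device=*/0, /*peerDevice=*/1);
    if (canAccess) {
        cudaSetDevice(0);
        cudaDeviceEnablePeerAccess(/*peerDevice=*/1, /*flags=*/0);
        std::printf("GPU 0 can address GPU 1's VRAM directly\n");
    } else {
        std::printf("No peer access: each GPU needs its own copy of the data\n");
    }
    return 0;
}
```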
The 3080 didn't have SLI in the first place.
Ricardo:

Not sure if I'm misunderstanding, but it seems like they're actually doing the proper thing for SLI to continue? I mean, instead of manually tweaking each profile for a game, they're saying "the major APIs have native support, so let's just use that". It's equivalent to them saying "we're dropping RTX in favor of native RT implementations that exist inside DX12 and Vulkan". How is that bad news?
Came here to say the same thing. I think it'd be nice if Nvidia could offer users the chance to force-enable mGPU support (and maybe they will) but if Nvidia is working together with developers, that really ought to yield a better outcome than we've seen before.
Pepehl:

VRAM is an issue - NVLink creates a shared pool; without it, each GPU needs to load everything into its own VRAM.
If the game itself is talking directly to the GPUs, it can more efficiently manage resources. There's no need to copy all game data into both GPUs if they're not processing the same thing. With traditional mGPU setups, the drivers did all the work, so the only reliable way to split the workload was for each GPU to share pretty much all the same resources as the others.
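A hedged illustration of that point in Vulkan 1.1 terms: within a device group, the engine can place an allocation on just one GPU via a device mask instead of mirroring it everywhere. The function name here is illustrative, not from any real engine:

```cpp
#include <vulkan/vulkan.h>

// Assumes 'device' is a logical device created over a multi-GPU group
// (via VkDeviceGroupDeviceCreateInfo).
VkDeviceMemory AllocateOnOneGpu(VkDevice device, uint32_t gpuIndex,
                                uint32_t memoryTypeIndex, VkDeviceSize bytes) {
    VkMemoryAllocateFlagsInfo flags{VK_STRUCTURE_TYPE_MEMORY_ALLOCATE_FLAGS_INFO};
    flags.flags = VK_MEMORY_ALLOCATE_DEVICE_MASK_BIT;
    flags.deviceMask = 1u << gpuIndex;  // allocate only on this GPU

    VkMemoryAllocateInfo info{VK_STRUCTURE_TYPE_MEMORY_ALLOCATE_INFO};
    info.pNext = &flags;
    info.allocationSize = bytes;
    info.memoryTypeIndex = memoryTypeIndex;

    VkDeviceMemory memory = VK_NULL_HANDLE;
    vkAllocateMemory(device, &info, nullptr, &memory);
    return memory;
}
```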
Pepehl:

That's bad news... I was going to buy two 20GB 3080s and NVLink them to get 40GB of VRAM for rendering...
Only the 3090 has NVLink support, though.
Ricardo:

Not sure if I'm misunderstanding, but it seems like they're actually doing the proper thing for SLI to continue? I mean, instead of manually tweaking each profile for a game, they're saying "the major APIs have native support, so let's just use that". It's equivalent to them saying "we're dropping RTX in favor of native RT implementations that exist inside DX12 and Vulkan". How is that bad news?
It's bad news because studios have said since the beginning of DX12 that they wouldn't waste time and money catering to the small crowd who uses SLI/Crossfire, and with Nvidia stating that they will no longer support it either, it means SLI is dead and buried. Of course, this is just a confirmation of what we already knew since the introduction of DX12. For me it makes no difference because I only use a single GPU, but for those who like mGPU solutions it's bad news.
Seems to me like mGPU just needs a marketing adjustment: instead of marketing it at the 0.1% of gamers who want to buy two prosumer-level GPUs, focus more on the mid segment. Add mGPU to recommended specs to make it seem like a more viable upgrade path for people who don't want to buy a new top-tier GPU. If someone told me doubling up my 670 would make Cyberpunk more playable, I'd definitely spend 60 quid on a second-hand one for just that one game alone. Ofc this assumes it's not an insane amount of hassle for devs to get this stuff functioning.
The last multi-GPU setup I had was the ATI HD 5970, and it was a piece of shit. Had it for 8 months and had nothing but trouble, then swapped it for a single GTX 580.
asturur:

Once it was automatic: two cards, nearly 200% performance. At some point engines changed and it no longer worked well 🙁
Nope, it never worked like that; it was more like 2 = 1.5 under good conditions... Crossfire ended up a bit better, but the GPUs were worse at the time. Both are dead now. Hopefully both brands will now follow a more natural way to do multi-GPU (but it still won't give 2 = 2 lol).