NVIDIA Ends SLI Support and is Transitioning to Native Game Integrations (read terminated)

https://forums.guru3d.com/data/avatars/m/267/267641.jpg
It's still better than CrossFire 🙂 BTW the only real way is to integrate it into Unreal Engine; after that all other engines would have to keep up... The last really good SLI was with the Voodoo 2.
https://forums.guru3d.com/data/avatars/m/79/79740.jpg
On the plus side we won't have complainers in the drivers section whining about SLI driver issues.
https://forums.guru3d.com/data/avatars/m/56/56686.jpg
It was always going to happen; I'm sure people will not like this, though. Honestly, I never liked SLI. Great when it worked, but most of the time it was more trouble than it was worth. Maybe that will change now that games can use it natively, provided the devs implement it for their game in a way that works BEST for their own game.
https://forums.guru3d.com/data/avatars/m/270/270008.jpg
Pepehl:

That's bad news... I was going to buy two 3080s (20 GB) and NVLink them to get 40 GB of VRAM for rendering...
How did you plan to do that with no NVLink connector on the 3080?
https://forums.guru3d.com/data/avatars/m/270/270008.jpg
Frankly, I think they are dropping support now because they need to work on drivers for their next-generation MCM approach.
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
Now it comes down to the devs of the engines themselves to offer a "kit" for game devs to implement. And the GPU companies would be wise to invest some of their resources into working on such a kit with the engine devs.
https://forums.guru3d.com/data/avatars/m/216/216349.jpg
alanm:

On the plus side we won't have complainers in the drivers section whining about SLI driver issues.
I think this is the main reason for the death of SLI/CrossFire: too much hassle for a very small group of buyers. And to make things worse, many of them bought their second card used, meaning that Nvidia and AMD wouldn't see a penny for their troubles with mGPU...
https://forums.guru3d.com/data/avatars/m/201/201182.jpg
geogan:

It - the use of multiple GPUs to speed up framerate - is not dead. The requirement of the SLI bridge connector is dead. Game devs need to implement multi-GPU support directly into their code while writing their game engine. Probably not easy. But IMO it should be compulsory for VR-related game engines and code, so we could use one GPU per eye to get decent framerates in certain tough applications. Probably not as critically needed for desktop. Anyway, as always, it suits Nvidia for it not to be used: they sell more expensive high-end cards that way.
But a hardware bridge between cards is still a requirement for multi-GPU to scale up performance, whether traditional SLI profiles or "Native Game Integrations" are used, correct?
https://forums.guru3d.com/data/avatars/m/156/156133.jpg
Moderator
Pepehl:

VRAM is an issue: NVLink creates a shared pool; without it, each GPU will need to load everything into its own VRAM.
NVLink with full VRAM pooling was only available on the Quadro RTX cards. It also was not featured in DX12 or Vulkan.
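To make the pooling distinction above concrete: without a shared pool, every GPU must hold its own full copy of the working set, so the usable VRAM for one scene does not grow with card count. A toy illustration (the function name and numbers are hypothetical, purely for the arithmetic, not NVIDIA specs):

```python
# Hypothetical sketch: usable VRAM for a single working set,
# with and without memory pooling across cards.

def usable_vram_gb(per_card_gb, num_cards, pooled):
    """Usable memory for one working set spread over num_cards GPUs."""
    if pooled:
        # With pooling (e.g. NVLink on Quadro RTX), the cards' VRAM
        # aggregates into one shared address space.
        return per_card_gb * num_cards
    # Without pooling, each GPU loads its own full copy, so usable
    # capacity is capped at a single card's VRAM.
    return per_card_gb

print(usable_vram_gb(10, 2, pooled=False))  # 10
print(usable_vram_gb(10, 2, pooled=True))   # 20
```

So two hypothetical 10 GB cards without pooling still behave like 10 GB for rendering, which is the point being made about the 3080 plan.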
https://forums.guru3d.com/data/avatars/m/282/282657.jpg
I was very disappointed by FS2020's weak performance and was hoping SLI might help, especially when using multi-display setups. Now, with the purchase of Arm in mind, Nvidia is resettling its position. In the near future, I believe, we won't need so much graphics power anymore, because all the compute power will come from the cloud. Nvidia, with its virtual GPU solutions and cloud services, is already providing its compute power to content creators and developers. So only the basic tasks will be done on your workstation GPU(s), Xbox-X, PS7 or PS8; the rest will be streamed. Therefore a fast internet connection will be more important than powerful client graphics cards. FS2020 is a very first step from Microsoft Azure in this direction. In the future the whole world will be calculated "outside" and then streamed to the multiplayer environment.
https://forums.guru3d.com/data/avatars/m/273/273678.jpg
SLI can't fix CPU bottlenecks.
https://forums.guru3d.com/data/avatars/m/175/175902.jpg
JamesSneed:

How did you plan to do that with no NVLink connector on the 3080?
It was originally planned to have one; they have kept it on the 3090... Anyway, NVLink isn't SLI, and SLI isn't the only solution for multiple NVIDIA GPUs... Only SLI is dead (I even thought it was already dead years ago).
https://forums.guru3d.com/data/avatars/m/197/197287.jpg
rl66:

It was originally planned to have one; they have kept it on the 3090...
Proof of this statement, not conspiracy theory, please? Otherwise I'm not sure why you are posting it.
https://forums.guru3d.com/data/avatars/m/165/165018.jpg
Ahh, the good old days of the 8800 GTX, GTX 295, and my 660 Tis are finally over. Though I had planned to get a second 295 back in the day.
https://forums.guru3d.com/data/avatars/m/242/242573.jpg
ruthan:

It's still better than CrossFire 🙂 BTW the only real way is to integrate it into Unreal Engine; after that all other engines would have to keep up... The last really good SLI was with the Voodoo 2.
Yeah. I had a pair of Diamond Monster 3D II's back then and they had great scaling. Unreal... MechWarrior 2 3dfx edition... Descent... Blood... and Shadow Warrior. Years later I got a pair of EVGA 8800 GTS 320 MB cards. They were great too. Haven't done SLI since.
https://forums.guru3d.com/data/avatars/m/270/270288.jpg
What about NVLink?
data/avatar/default/avatar14.webp
It amazes me that you even need to have SLI enabled in the drivers at all for DirectX 12 and Vulkan. Those APIs use explicit multi-GPU, where the API and the engine have control over the multi-GPU scaling, not the drivers, as was the case with DirectX 11 and OpenGL, which used implicit multi-GPU with the drivers doing all the heavy lifting. Technically, you don't even need an SLI bridge to get them working. But only Ashes of the Singularity, Strange Brigade, and possibly Shadow of the Tomb Raider have implemented the explicit heterogeneous/multi-display-adapter mode, which allows multi-GPU without an SLI/NVLink bridge or SLI being enabled in the drivers, since the DX12/Vulkan API is responsible for all the heavy lifting and not the drivers.
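As a rough sketch of what "explicit" means in the post above: under DX12/Vulkan it is the engine, not the driver, that decides which GPU handles which frame, e.g. round-robin alternate-frame rendering (AFR). This toy model uses plain integer GPU indices as stand-ins; a real engine would hold device handles enumerated from the API, and the function name here is purely hypothetical:

```python
# Toy model of explicit alternate-frame rendering (AFR) scheduling:
# with explicit multi-GPU, the engine itself assigns each frame to a
# GPU, rather than the driver doing it implicitly behind the API.

def afr_schedule(num_frames, num_gpus):
    """Assign each frame index to a GPU round-robin, AFR-style."""
    return [frame % num_gpus for frame in range(num_frames)]

print(afr_schedule(6, 2))  # [0, 1, 0, 1, 0, 1]
```

The scheduling itself is trivial; the hard part the thread alludes to is that the engine must also manage cross-GPU resource copies and synchronization that the driver used to handle implicitly.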
data/avatar/default/avatar30.webp
Well that sucks, I loved SLI. Never had an issue with it and it always just worked for me. I had SLI for just about every generation ever since it was available, starting with my 7800 GTX, then 8800 GTS, GTX 260 Core 216, GTX 570, GTX 780. The 10 series was the first time in YEARS I had gone with a single GPU, and I wasn't happy about it because I was afraid I wouldn't get the level of performance I was used to. I had to go from the regular 1080 to the Ti to get it to run where I felt happy with the performance. RIP old friend, it was fun!