NVIDIA Working on Tile-based Multi-GPU Rendering Technique Called CFR - Checkered Frame Rendering

@Netherwind I was going to say the same, except for more recent cards like the 700/900 series. I was just surprised when people bought a second 460 for SLI later on, instead of stepping up to the next tier, even a cut-down SE model.
Dribble:

The basic problem is that it's now down to the game devs. DX11 multi-GPU was mostly done in drivers, so it was down to the GPU driver writers (i.e. AMD/NVIDIA). The low-level nature of DX12/Vulkan moves most of that work over to the game devs. For a lot of game devs it's too hard and just not worth the effort, hence no SLI/Crossfire. In one sense they are lazy. Alternatively you could argue that the shift to low-level APIs has made the driver writers lazy: their job is now really simple because they've pushed a lot of the work they used to do over to the game devs. IMO that's part of the reason AMD pushed low-level APIs so hard. DX11 and other high-level APIs mean GPU drivers are complex, and because NVIDIA had more people they would do a better job. Mantle and now Vulkan/DX12 push that job over to the game devs and make driver writing simple, which effectively levels the driver-writing field, and that suits AMD.
A lot of modern DX11 games don't support SLI/Crossfire anymore either, though; the reason given by AMD and NVIDIA is that it's no longer viable with the various newer rendering techniques games use today.
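The driver-vs-engine split Dribble describes can be sketched in a toy way. This is purely illustrative Python, not any real D3D/Vulkan API: under DX11-style AFR the driver silently alternates whole frames between GPUs, while under DX12/Vulkan the engine has to address each adapter and partition the work itself. All function names here are hypothetical.

```python
def afr_driver_schedule(frame_index: int, gpu_count: int = 2) -> int:
    """DX11-style AFR: the driver picks the GPU per frame; the game is unaware."""
    return frame_index % gpu_count

def explicit_engine_schedule(draw_calls, gpu_count: int = 2):
    """DX12/Vulkan-style explicit multi-GPU: the engine partitions its own
    work across adapters (here, a naive round-robin split of draw calls)."""
    buckets = [[] for _ in range(gpu_count)]
    for i, call in enumerate(draw_calls):
        buckets[i % gpu_count].append(call)
    return buckets

# Frames 0..3 alternate GPU 0,1,0,1 under driver-side AFR:
print([afr_driver_schedule(f) for f in range(4)])         # [0, 1, 0, 1]
# Under the explicit model the engine owns (and must debug) the partitioning:
print(explicit_engine_schedule(["sky", "terrain", "ui"]))
```

The point of the sketch is where the second function lives: in the DX11 world its equivalent sat inside the driver; in the DX12/Vulkan world every engine team has to write and maintain it themselves.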
Yxskaft:

A lot of modern DX11 games don't support SLI/Crossfire anymore either, though; the reason given by AMD and NVIDIA is that it's no longer viable with the various newer rendering techniques games use today.
Which is untrue... they simply don't bother putting in the work. See Metro Exodus, for instance: it works great with SLI once you have done the work yourself. Even in the new Assassin's Creed games there is near-perfect SLI scaling if you force it on, but of course it flickers because they haven't bothered making SLI profiles for them. In Battlefield V there is no official SLI support, but the game scales perfectly if you use the BF4 SLI profile...
angelgraves13:

This is obviously for Ampere or Hopper. Guess we're getting MCM late next year with either Ampere or Hopper. I've heard rumors that Ampere won't even be coming out for consumers, and that it's basically like Volta. So I guess we're getting Hopper late next year and the 2080 Ti Super will be the best of the best next year for some time.
I really hope this isn't true... I can't really see it being true either, tbh.
More plausible than not. They make money on the non-consumer stuff, so you won't always get a regular GPU, especially if they have good yields on chips and don't need to sell them as a cut-down/lesser unit.
Using this innovation with a chiplet design is a good guess, but a return of SLI is also a possibility. The problem has always been game developers not wanting, for sensible reasons, to put in the extra effort to support multi-GPU configurations, but this new approach seems simpler to me and could work well at the driver level. The aim is surely to make games run better at 4K, and today there are many well-earning enthusiast customers who would certainly buy another GPU if it helped at that resolution. NVIDIA is surely after the big bucks with this. I have nothing more concrete to add, but this feature is certainly something to keep an eye on, and it should bode well for NVIDIA soon enough.
The market for single (even big) GPUs is bigger than that. Not even looking globally, just India and China: how many millions will buy GPUs in those two countries alone? I doubt many more people would buy a second card, given the cost involved. And like many others, I'm not against SLI, but if I can have a single card I prefer that (better cooling, smaller PSU, less power wasted), and you generally don't get 2x the performance versus a single card. Again, that's looking at ALL players AND ALL games, not just the 20 or maybe 50 out of however many PC games. Lastly, with multiple cards no one will run 720p, so you need VRAM, and right now only the Ti goes past 8 GB, so I'm still buying the biggest/most expensive card just to have the amount of VRAM needed to really make use of the horsepower.
angelgraves13:

This is obviously for Ampere or Hopper. Guess we're getting MCM late next year with either Ampere or Hopper. I've heard rumors that Ampere won't even be coming out for consumers, and that it's basically like Volta. So I guess we're getting Hopper late next year and the 2080 Ti Super will be the best of the best next year for some time.
Nah, it's for NVLink. The original SLI bridge just doesn't have the bandwidth to do checkerboarding.
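The bandwidth argument can be checked with a quick back-of-envelope calculation. The assumptions are mine, not from the thread: 4K at 60 fps, 4 bytes per pixel, and each frame the secondary GPU ships roughly half the finished frame to the display GPU for compositing. The link figures in the comments are approximate published numbers (legacy SLI bridge ~1 GB/s, HB SLI ~4 GB/s, Turing NVLink tens of GB/s).

```python
# Back-of-envelope estimate of checkerboard compositing traffic at 4K/60.
W, H, FPS, BYTES_PER_PX = 3840, 2160, 60, 4

frame_bytes = W * H * BYTES_PER_PX          # one full 4K frame in bytes
half_frame_per_sec = frame_bytes / 2 * FPS  # half of every frame, every second
gb_per_s = half_frame_per_sec / 1e9

print(f"compositing traffic: ~{gb_per_s:.2f} GB/s")
# ~1 GB/s of color data alone, before any depth, overlap, or
# post-processing traffic: already at the limit of a legacy SLI
# bridge, while NVLink has ample headroom.
```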
Dribble:

The basic problem is that it's now down to the game devs. DX11 multi-GPU was mostly done in drivers, so it was down to the GPU driver writers (i.e. AMD/NVIDIA). The low-level nature of DX12/Vulkan moves most of that work over to the game devs. For a lot of game devs it's too hard and just not worth the effort, hence no SLI/Crossfire. In one sense they are lazy. Alternatively you could argue that the shift to low-level APIs has made the driver writers lazy: their job is now really simple because they've pushed a lot of the work they used to do over to the game devs. IMO that's part of the reason AMD pushed low-level APIs so hard. DX11 and other high-level APIs mean GPU drivers are complex, and because NVIDIA had more people they would do a better job. Mantle and now Vulkan/DX12 push that job over to the game devs and make driver writing simple, which effectively levels the driver-writing field, and that suits AMD.
Hence most support went the way of the dodo after OS updates and driver releases. With OS updates, as in going from Windows 7 through 10, support was lost for sure...
Did someone go back to the '90s and unbox a late-generation 3dfx card? The checkered technique is not new. But I guess there is an SLI cycle: single cards eventually beat multi-GPU again, then when the current generation plateaus we go back to multi-GPU... etc.
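For readers who haven't seen the checkered idea before, here is a minimal sketch of what checkerboard tile assignment between two GPUs might look like. The tile size and layout are arbitrary assumptions for illustration only; NVIDIA has not published how CFR actually partitions the frame.

```python
TILE = 8  # tile edge in pixels (hypothetical choice)

def gpu_for_pixel(x: int, y: int, gpu_count: int = 2) -> int:
    """Map a pixel to a GPU by checkerboarding its tile coordinates:
    adjacent tiles alternate GPUs like squares on a checkerboard."""
    tx, ty = x // TILE, y // TILE
    return (tx + ty) % gpu_count

# Neighbouring tiles land on different GPUs:
print(gpu_for_pixel(0, 0))   # tile (0,0) -> GPU 0
print(gpu_for_pixel(8, 0))   # tile (1,0) -> GPU 1
print(gpu_for_pixel(8, 8))   # tile (1,1) -> GPU 0
```

The appeal over AFR is load balance: since each GPU renders tiles scattered across the whole frame, expensive regions are split between cards instead of landing entirely on one.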
According to the 3DCenter thread, it does show a 40-50% FPS increase in the games tested, using two RTX 2080 Tis at 4K:

Metro Exodus (DX11/DX12): works (DLSS not compatible!)
Battlefield V (DX11/DX12): not compatible
Borderlands 3: DX11 works, DX12 crashes
Chernobylite (DX11): works
Crysis 3 (DX11): works
Shadow of the Tomb Raider: DX12 doesn't start, DX11 works
Deus Ex: Mankind Divided: DX12 doesn't start, DX11 works
GRID (2019) (DX12): crash
Control (DX12): stability problems
F1 2019 (DX12): crash
Hitman 2 (DX11+DX12): visual problems, flicker
Forza Horizon 4 (DX12/UWP): crash, BSOD
The Elder Scrolls Skyrim SE (DX11): no scaling
Final Fantasy XV (DX11): no scaling
A Plague Tale: Innocence (DX11): works
Mafia III (DX11): works
Monster Hunter: World (DX11): crash
Tomb Raider (2013): uneven GPU load
Middle-earth: Shadow of Mordor (DX11): works (shadows sometimes show problems)
Devil May Cry 5 (DX11): works
Quantum Break (DX11): no scaling
Resident Evil 7 (DX11): works
Far Cry 5 (DX11): terrain flickers
Resident Evil 2 Remake (DX11): works
The Division 2 (DX12): works, z-fighting and frame-pacing problems

https://www.forum-3dcenter.org/vbulletin/showpost.php?p=12144578&postcount=3586

Edit: Curious how his testing didn't need NVLink, though bandwidth is likely compromised.
I remember a card called the XGI Volari V8, I think, and I believe it used tiled rendering. Or maybe it was another defunct GPU maker from around the same time. It had some AA or shading issues (or both) due to the tiling. It's been a while, I can't remember. Some console used it too back in the day.
pharma:

According to the 3DCenter thread, it does show a 40-50% FPS increase in the games tested, using two RTX 2080 Tis at 4K:

Metro Exodus (DX11/DX12): works (DLSS not compatible!)
Battlefield V (DX11/DX12): not compatible
Borderlands 3: DX11 works, DX12 crashes
Chernobylite (DX11): works
Crysis 3 (DX11): works
Shadow of the Tomb Raider: DX12 doesn't start, DX11 works
Deus Ex: Mankind Divided: DX12 doesn't start, DX11 works
GRID (2019) (DX12): crash
Control (DX12): stability problems
F1 2019 (DX12): crash
Hitman 2 (DX11+DX12): visual problems, flicker
Forza Horizon 4 (DX12/UWP): crash, BSOD
The Elder Scrolls Skyrim SE (DX11): no scaling
Final Fantasy XV (DX11): no scaling
A Plague Tale: Innocence (DX11): works
Mafia III (DX11): works
Monster Hunter: World (DX11): crash
Tomb Raider (2013): uneven GPU load
Middle-earth: Shadow of Mordor (DX11): works (shadows sometimes show problems)
Devil May Cry 5 (DX11): works
Quantum Break (DX11): no scaling
Resident Evil 7 (DX11): works
Far Cry 5 (DX11): terrain flickers
Resident Evil 2 Remake (DX11): works
The Division 2 (DX12): works, z-fighting and frame-pacing problems

https://www.forum-3dcenter.org/vbulletin/showpost.php?p=12144578&postcount=3586

Edit: Curious how his testing didn't need NVLink, though bandwidth is likely compromised.
All those games work perfectly with old AFR. BF5, Shadow of the Tomb Raider, etc. all show 70-95% SLI scaling. https://i.imgur.com/GPvubdz.jpg https://i.imgur.com/IYzL8Ng.jpg (FPS capped at 58 in RTSS on the last screenshot.)
Why does everyone think using chiplets means it will run as SLI of some sort? All it is is splitting up parts of the GPU to make it easier and cheaper to create a variety of GPUs and improve yields.
DeskStar:

I really beg to differ... Only lazy development from lazy developers made it look like it wasn't worth it. Anything from DICE is a scalable dream... Play Valve games and they're amazing. Play a Crytek game and it's a dream with more than one card... Lazy development is what got multi-card setups the axe... I can still game at 6880x2440 in some games maxed out on my quad-SLI Titans... But SLI has been dead because of other garbage... Brand-new systems with the most powerful card cannot run what I run at 120+ FPS at 6880x2440. I know because I just built another one... A five-year-old computer can run "supported" games faster than a system built today... Facts.
Really? Can you play the latest versions of DICE games, e.g. BF5, using the latest Frostbite engine, or are you talking about older games running older Frostbite versions like BF3 or BF4? I think that is the reason your 5-year-old computer can run "supported" games: because maybe these games are old and use the older APIs and game engines.
geogan:

Really? Can you play the latest versions of DICE games, e.g. BF5, using the latest Frostbite engine, or are you talking about older games running older Frostbite versions like BF3 or BF4? I think that is the reason your 5-year-old computer can run "supported" games: because maybe these games are old and use the older APIs and game engines.
Really... I don't own BF5 so I can't say. And my backlog of games is so old that maybe this does benefit me. I'm thinking I might try my older system on Windows 7 Ultimate once again, as that was best for performance with this particular setup.
DeskStar:

Really... I don't own BF5 so I can't say. And my backlog of games is so old that maybe this does benefit me. I'm thinking I might try my older system on Windows 7 Ultimate once again, as that was best for performance with this particular setup.
Yes, of course... Years ago I had dual-GPU cards like the 7990, and with older DirectX 9 based games (back when SLI was supported by drivers) some games had good SLI scaling... I haven't had more than one GPU since then though, because of all the trouble it caused even back then.
I have an SLI 1080 Ti setup. Will CFR work on my PC, or is it only for RTX cards that use NVLink?