VESA Adds Adaptive-Sync to DisplayPort Video Standard - Updated AMD

But it only works if the monitor has the hardware for it?
Is that the same as Gsync?
Probably. Why can't they make something that sits between the two ends of the cable so it can be used on all DisplayPort monitors?
The monitor has to support Displayport 1.2a but I don't think it requires specific hardware. It does require firmware changes obviously to account for variance in the frame display. So I doubt any previously released monitors will ever get support. But it should work on the last few generations of AMD hardware + a monitor that has support for it.
The monitor has to support Displayport 1.2a but I don't think it requires specific hardware. It does require firmware changes obviously to account for variance in the frame display. So I doubt any previously released monitors will ever get support. But it should work on the last few generations of AMD hardware + a monitor that has support for it.
Guess I'm stuffed then, since mine only supports 1.1a.
I dunno, I might not even be right. I'm inclined to believe AMD over random forum users, but a guy over at overclock.net posted this:
I REALLY don't want to have the FreeSync discussion again, because that horse has been beaten to death on this forum, by myself and a few others. So let me just sum it up this way, and if you don't believe me, check the forums here for the sources... VBlank is an OPTIONAL feature within DP 1.2a. Just because a device has 1.2a doesn't mean it has the OPTIONAL VBlank feature. Any display that is going to use it WILL need an ASIC (scaler), just like we see with G-Sync. AMD has said they won't develop this; they want display vendors to do it. Nvidia did it themselves, thus the G-Sync module. The one time FreeSync was "shown" was actually a proof-of-concept demo using eDP on a very specific notebook. The hardware to do variable refresh rate on a desktop display hasn't been developed yet outside of G-Sync, which is an ASIC. "FreeSync" would require its own ASIC. Who is going to develop that, and at whose expense? I literally cannot stress this enough: outside of paper, FreeSync doesn't exist. The hardware for it doesn't exist yet, the supply chain of vendors doesn't exist, and AMD doesn't even have VBlank enabled on their GPUs! Although I heard a rumor that someone dug into their driver and found a disabled option for VBlank.
Maybe Hilbert or someone can look into this with AMD itself, because in the original press release AMD definitely made it sound like no extra hardware would be required and their current GPUs are capable. Like I said, I'm inclined to believe AMD over some random forum poster, but I really don't know enough about monitors to say.
Is that the same as Gsync?
It sounds more like an official version of what AMD announced as "FreeSync" a few months back. They pointed out that VESA already incorporates most of what is required for this feature, without all of the specialized hardware and licensing fees that Nvidia's G-Sync requires (somewhat unnecessarily, it seems now). But yeah, the end result is G-Sync-like features built into all devices conforming to the VESA DisplayPort spec.
Do you think the VG248QE has the 1.2a standard? I mean, will I be ready for AMD FreeSync?
Do you think the VG248QE has the 1.2a standard? I mean, will I be ready for AMD FreeSync?
No. Not only does the 248QE not have 1.2a, but from what I'm reading, FreeSync does require VBLANK support in the scaler chip. So it does technically require hardware, although anyone can implement it in an ASIC as opposed to having to go through Nvidia, so FreeSync monitors should be significantly cheaper. I have no idea whether they will be better at doing what they are supposed to do or how the input lag will work out. So basically I guess it goes like this:

AMD FreeSync requirements: a monitor with a scaler chip that supports VBLANK, plus any graphics card that supports it. The scaler can be an ASIC (significantly cheaper than an FPGA) and can be developed by anyone, which should also drive down costs.

Nvidia G-Sync requirements: an Nvidia-supported G-Sync monitor (an FPGA implementation, although Nvidia could eventually switch to an ASIC as well), plus an Nvidia graphics card that supports it.

AMD's solution should be cheaper when it eventually comes out (it could be a while). Nvidia's solution should be upgradable in the future via firmware (why else would they use an FPGA?). One may be better than the other in terms of performance: Nvidia's FPGA carries some beefy hardware (RAM and such), and while I'm not sure exactly what it's doing, maybe it could lead to lower input lag. Dunno; until one of these companies discloses more about what's going on, it's basically impossible to say.
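The basic variable-refresh idea both approaches rely on can be sketched roughly like this: the display holds its vertical blanking interval open until the next frame arrives, clamped to the panel's supported refresh range. This is a toy model with made-up numbers, not anything from the DisplayPort spec or either vendor:

```python
# Toy model of variable refresh: the scaler stretches the vertical
# blanking interval (VBlank) until the GPU delivers the next frame,
# clamped to the panel's refresh range. All numbers here are
# illustrative assumptions, not from any real spec.

PANEL_MIN_HZ = 30.0   # hypothetical panel lower bound
PANEL_MAX_HZ = 144.0  # hypothetical panel upper bound

def refresh_interval_ms(frame_render_ms: float) -> float:
    """Clamp the frame time to the panel's supported refresh window."""
    min_interval = 1000.0 / PANEL_MAX_HZ   # ~6.94 ms at 144 Hz
    max_interval = 1000.0 / PANEL_MIN_HZ   # ~33.3 ms at 30 Hz
    return min(max(frame_render_ms, min_interval), max_interval)

# A 20 ms frame (50 fps) is displayed as soon as it is ready,
# a 4 ms frame waits for the panel's 144 Hz ceiling, and a
# 50 ms stall falls back to the 30 Hz floor.
print(refresh_interval_ms(20.0))
print(refresh_interval_ms(4.0))
print(refresh_interval_ms(50.0))
```

Whether the GPU, the driver, or the scaler decides when that interval ends is exactly what the thread is arguing about.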
Administrator
News item updated with a Q&A/FAQ from AMD.
There is a big difference between AMD's and Nvidia's implementations: AMD's predicts the framerate, Nvidia's doesn't. So I'd rather take Nvidia's implementation.
There is a big difference between AMD's and Nvidia's implementations: AMD's predicts the framerate, Nvidia's doesn't. So I'd rather take Nvidia's implementation.
That's not true. It's a baseless assumption. Both work the same at the low level.
Is that the same as Gsync?
No. It's not the same as G-Sync. G-Sync is "proprietary" and will only ever work with NVidia hardware. Adaptive-Sync is now part of the DisplayPort 1.2a standard and can be used by both AMD and NVidia, if NVidia chooses to support it.
Right in nVidia's stomach... AMD opted to make their sync technology an OPEN STANDARD, and guess what? Everyone wants it. I hope nVidia learns from this.
That's not true. It's a baseless assumption. Both work the same at the low level.
It's not baseless; look at their whitepaper for FreeSync. With Nvidia, the monitor waits for the GPU to set the VBI; with FreeSync, the driver guesses the VBI. Granted, it has to be an assumption, because AMD's FreeSync is still a paper-launch product. Plus FreeSync absolutely won't have the main advantage G-Sync gets from its direct communication: G-Sync has no lag because of it. G-Sync is absolutely superior for gaming.
"Freesync" isn't open, but Adaptive-Sync is, as it's now part of the DisplayPort 1.2a spec. Since Adaptive-Sync has been adopted as part of the DisplayPort 1.2a spec, anyone can use it. That's the great part about industry standards. Anyone that wants to use them, can. NVidia could have pushed for "G-Sync" to be adopted as part of the DisplayPort spec had they wanted to. Instead, as is their nature, they went the proprietary route. Nothing against NVidia for doing so as from a business perspective it was in their best interest. I, personally, have no interest in it from either company as it has no benefit for me.
Well, I guess the differences between Adaptive-Sync and G-Sync are exactly what AMD says. G-Sync is hardware, likely a patent-protected piece of hardware, plus some kind of proprietary signaling specification for it. Adaptive-Sync, on the other hand, is a specification saying how a graphics card should talk to hardware to support variable refresh rates. We will have to see what hardware and fidelity requirements there will be for hardware to be classified as Adaptive-Sync capable, and how the quality of the resulting hardware will compare. One would have to guess that Nvidia would like to get the G-Sync chips and monitors made capable of working with Adaptive-Sync and certified by VESA, so they can get an early lead as a vendor of such hardware. But one question remains: will there be any fidelity differences when a G-Sync-equipped monitor gets its signaling from Adaptive-Sync versus G-Sync? I also wonder if vendors will have to deal with Nvidia patents if they make their own hardware implementations to support Adaptive-Sync. Nvidia GPUs are known to perform better with the CUDA API than with OpenCL, so I wonder if a similar situation will apply here. Hopefully there are no technical reasons in the Adaptive-Sync standard that will prevent the highest-quality implementations of it from reaching parity with the full G-Sync stack. Until such parity is reached, you know Nvidia will market their advantage to no end.
Adaptive-Sync is also hardware-restricted. As far as I can see, you need a monitor and video card that support DisplayPort 1.2a, and most likely a DisplayPort 1.2a cable. I'm still hoping for something that goes between the video card and monitor that works for ANY user using DisplayPort.
That's it then for Gsync ? Dead on arrival ?
Please. AMD's is an open standard and Nvidia can implement it if they choose. But no, they want us to believe we need to buy an expensive chip that costs more than $100? Get real. If VESA makes it a standard spec, all future vendors can get the tech... even Intel.
It's only expensive because they chose to use an FPGA, I guess because they plan on issuing firmware upgrades for it (for example, they talked about removing polling in an update). Also, FreeSync still requires the scaler chip to implement VBLANK, so it's not like it's free anyway; it requires all-new scalers.
That's it then for Gsync ? Dead on arrival ?
Not exactly dead. At least not yet. Whether or not it dies, depends entirely on NVidia. If NVidia chooses to support Adaptive-Sync, then G-Sync might possibly, quietly fade away. If NVidia chooses to ignore Adaptive-Sync, then G-Sync will still have a solid place in the market. I'm sure regardless of whether or not NVidia supports Adaptive-Sync, they could likely keep pushing G-Sync and the NVidia loyalists will still pay the price premium so long as display makers are willing to go along.
Not exactly dead. At least not yet. Whether or not it dies, depends entirely on NVidia. If NVidia chooses to support Adaptive-Sync, then G-Sync might possibly, quietly fade away. If NVidia chooses to ignore Adaptive-Sync, then G-Sync will still have a solid place in the market. I'm sure regardless of whether or not NVidia supports Adaptive-Sync, they could likely keep pushing G-Sync and the NVidia loyalists will still pay the price premium so long as display makers are willing to go along.
At this point I don't care which takes hold, as long as one of the two replaces flawed vsync with something better that doesn't have the performance hits vsync does. But as of now BOTH are nothing more than white-paper ideas, as neither is on the market in any meaningful way for the end user to appreciate. By the time one of them takes hold, I will probably have built a new PC, and I've been "building" a new PC for 3+ years now.