AMD FreeSync will be adopted by VESA
As you guys know, AMD recently announced a competitive and, may I say, very cheap solution to tackle Nvidia's G-Sync. It goes under the name FreeSync. The problem thus far has been that monitors need to support a new protocol, and to make a monitor protocol a standard you need to talk to VESA. Well, good news: AMD's FreeSync will be adopted by VESA.
The graphics card renders frames at a dynamic rate, while the monitor refreshes at a static Hz; the two don't really match. The irony is that the solution to all this is very simple in its bare essence: G-Sync and FreeSync get rid of screen tearing and sync stuttering/pulsing by letting the display refresh when a frame is actually ready.
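To make the mismatch concrete, here is a minimal illustrative sketch (not vendor code) comparing when a frame reaches the screen on a fixed 60 Hz display versus an adaptive-sync display; the frame times and the 30-144 Hz panel range are assumptions for the example, not figures from either spec.

```python
import math

FIXED_REFRESH_MS = 1000 / 60  # ~16.67 ms per refresh at 60 Hz

def present_fixed(render_ms):
    """Fixed-refresh display with v-sync on: a frame that misses the
    refresh deadline waits for the next whole refresh interval,
    so its on-screen latency jumps in 16.67 ms steps (stutter)."""
    return math.ceil(render_ms / FIXED_REFRESH_MS) * FIXED_REFRESH_MS

def present_adaptive(render_ms, min_ms=1000 / 144, max_ms=1000 / 30):
    """Adaptive sync: the display refreshes as soon as the frame is done,
    clamped to an assumed 30-144 Hz panel range."""
    return min(max(render_ms, min_ms), max_ms)

# Hypothetical per-frame render times in milliseconds.
frames = [14.0, 18.0, 16.0, 22.0, 15.0]
for r in frames:
    print(f"render {r:5.1f} ms -> fixed {present_fixed(r):5.2f} ms, "
          f"adaptive {present_adaptive(r):5.2f} ms")
```

Note how an 18 ms frame takes a full 33.3 ms to appear on the fixed display (it missed one refresh and waits for the next), while the adaptive display shows it at 18 ms; that waiting is exactly the stutter both technologies remove.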
Basically, the protocol will be embedded into DisplayPort 1.2a. There is a catch, though: monitor manufacturers are free to decide whether or not to support the technology.
If you'd like to learn more about the overall technology, have a peek at our G-Sync article. The two companies use different methodologies, but the outcome is nearly identical.
Senior Member
Posts: 2843
Joined: 2009-09-15
So basically everyone who already owns a monitor/HDTV and has it connected to their discrete GPU is out of luck for enjoying either FreeSync or G-Sync. Correct?
Or did I miss something?
Senior Member
Posts: 1089
Joined: 2010-09-16
Regardless of who implements this first, I'll be glad to see things like screen tearing gone. But this will take a while. I was thinking about buying a G-Sync monitor, but now I'm probably just going to wait and see how FreeSync does first.
Senior Member
Posts: 774
Joined: 2006-05-08
arg! DisplayPort is limited to 60 Hz

aka you'll never get more than 60 fps

my current monitor does up to 72 Hz... blah.
hence why I do NOT use DP but HDMI.

Senior Member
Posts: 1843
Joined: 2005-08-12
arg! DisplayPort is limited to 60 Hz

aka you'll never get more than 60 fps

my current monitor does up to 72 Hz... blah.
hence why I do NOT use DP but HDMI.

I don't think that's actually true.
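It isn't: DisplayPort 1.2 has far more bandwidth than a 60 Hz ceiling would imply. Here is a rough back-of-the-envelope check; the ~20% blanking padding is an assumed figure for illustration, not a value from a spec timing table.

```python
# DisplayPort 1.2 (HBR2): 4 lanes x 5.4 Gbit/s, with 8b/10b encoding
# leaving 80% of the raw rate as usable video bandwidth.
LANES = 4
HBR2_GBPS_PER_LANE = 5.4
ENCODING_EFFICIENCY = 0.8  # 8b/10b line encoding

usable_gbps = LANES * HBR2_GBPS_PER_LANE * ENCODING_EFFICIENCY  # 17.28 Gbit/s

def required_gbps(width, height, hz, bits_per_pixel=24, blanking=1.2):
    """Approximate data rate for a video mode; `blanking` pads ~20%
    for horizontal/vertical blanking (an assumed round figure)."""
    return width * height * hz * bits_per_pixel * blanking / 1e9

print(f"usable bandwidth:  {usable_gbps:.2f} Gbit/s")
print(f"1080p @ 144 Hz:   ~{required_gbps(1920, 1080, 144):.2f} Gbit/s")
print(f"1440p @ 120 Hz:   ~{required_gbps(2560, 1440, 120):.2f} Gbit/s")
```

Even with generous blanking overhead, 1080p at 144 Hz needs well under half of the link's usable bandwidth, so a 72 Hz panel is no problem over DP 1.2. The 60 Hz limit people run into is usually a cable, adapter, or monitor-EDID issue, not the standard itself.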
Senior Member
Posts: 152
Joined: 2013-11-26
So at first glance AMD is one step ahead of Nvidia here, although it would be interesting to see the pros and cons of the two technologies.