List of FreeSync - Adaptive Sync Compatible Monitors

Thanks for that info.
Mantle/DX12 will work with FreeSync/G-Sync without problems, because both solutions use the graphics card's display output to do all the synchronization.
Yeah, that's what I thought.
I wonder what the chances are of current high-end monitors being made compatible. There was, if I recall, a mention that it's possible a firmware upgrade could be all that's needed, although I'm not sure how likely that is. It would be great if my XL2720Z could be "upgraded" to FreeSync. I know I've got an Nvidia card, but in six months, who knows. That's why I got this BenQ instead of a G-Sync one.
Quote: "I wonder what the chances are of current high-end monitors being made compatible... It would be great if my XL2720Z could be 'upgraded' to FreeSync."
BenQ has been misleading customers for years. When I got mine, I bought it because their website stated there would be FW updates. There is only one type of display that got an official FW update you can download and flash into the unit yourself, and only if you have a special programming adapter; otherwise you have to send them your monitor and they'll do it in-house. And since BenQ has a FreeSync screen now, I don't think they will consider older screens as candidates for upgrades even if they are physically capable. If the new XL2730Z doesn't cost too much, I'll get one along with a new GPU; otherwise I'll take another FreeSync screen. But I really like BenQ's pivot stands.
Fail... you haven't even checked LG's homepage. YES, there is a 34" 1080p monitor: http://www.lg.com/uk/monitors/lg-34UM65-P
Surprised, but a 'justified' fail if you will 😀, given that the rest of LG's 34" ultra-wide line-up is this: 34UM95 = 3440x1440, 34UC97 = 3440x1440, 34UM67 = 3440x1440, while all the rest, at 2560x1080, are 29" models.
So much for being anywhere near 'free' when manufacturers only put it on those penny-pusher models. Make a 24" 1080p IPS 60Hz model, please.
The "technology" is free....meaning that neither AMD nor VESA are charging an additional licensing fee for it. The ASIC and panel needed for such feature still costs money and as such increases the cost of the displays that support it. The companies making the ASIC have to get paid for their product. The companies making the panel have to get paid for their product. Anyone that expects relatively new "technology" to be free when it hits the market is out of their mind. It's not free to develop or implement and as such typically results in higher prices until R&D is covered and uptake increases to the point where profit margins reach a desirable level for the companies making such products.
Quote: "The AMD Radeonâ„¢ R9 295X2, 290X, R9 290, R9 285, R7 260X and R7 260 GPUs additionally feature updated display controllers that will support dynamic refresh rates during gaming. http://support.amd.com/en-us/search/faq/219
Any info about this one? There's nothing on their homepage yet. Is it IPS too? Nixeus NX-VUE24, 24" 1080p 144Hz.
Quote: "YES, there is a 34" 1080p monitor: http://www.lg.com/uk/monitors/lg-34UM65-P"
That is a BEAUTIFUL monitor. An ultrawide screen that's also IPS, drool drool... There are also several Dell panels that are ultra-wide and IPS. Great for movies, and games such as Command & Conquer or Skyrim would look f*cking great on such a screen. Once several more ultra-wide IPS panels are released at around 2-4ms, 120-144Hz and 2K or 4K, I'll definitely be getting one...
Quote: "That is a BEAUTIFUL monitor. An ultrawide screen that's also IPS, drool drool..."
CS 1.6 would be very bad on a 21:9 screen. Its FOV is horizontally locked, not vertically: going from 4:3 to 16:9/16:10 already costs you the top and bottom of the view, and on 21:9 the loss of the upper and lower areas would be too big. It would be quite a big disadvantage, and I don't know of any modification that keeps the vertical FOV and expands the horizontal. :bang:
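To put numbers on that: when the horizontal FOV is locked (so-called Vert- scaling), the visible vertical angle shrinks as the screen gets wider. A minimal sketch of the trigonometry, assuming a locked 90° horizontal FOV (the exact value is only for illustration):

```python
import math

def vertical_fov(h_fov_deg: float, aspect: float) -> float:
    """Vertical FOV for a locked horizontal FOV (Vert- scaling)."""
    h = math.radians(h_fov_deg)
    return math.degrees(2 * math.atan(math.tan(h / 2) / aspect))

# A wider aspect ratio means less of the scene visible above and below.
for name, aspect in [("4:3", 4 / 3), ("16:9", 16 / 9), ("21:9", 21 / 9)]:
    print(f"{name:>5}: vertical FOV = {vertical_fov(90.0, aspect):.1f} deg")
```

At 90° horizontal that works out to roughly 73.7° of vertical FOV at 4:3, 58.7° at 16:9 and 46.4° at 21:9, which is exactly the top-and-bottom loss described above.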
Quote: "CS 1.6 would be very bad on a 21:9 screen. Its FOV is horizontally locked, not vertically..."
Haha, agreed, which is why I specifically mentioned games such as Skyrim, C&C, Warhammer, Dota and Warcraft. Some shooters will not look right due to the FOV, as you already mentioned...
I keep thinking back to a year ago, when Nvidia's Tom Petersen first responded to FreeSync in an article posted here on Guru3D: http://www.guru3d.com/news-story/nvidia-responds-to-amd-freesync.html

Quote: "Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand."

and

Quote: "When asked about a potential VESA standard to enable dynamic refresh rates, Petersen had something very interesting to say: he doesn't think it's necessary, because DisplayPort already supports 'everything required' for dynamic refresh rates via the extension of the vblank interval. That's why, he noted, G-Sync works with existing cables without the need for any new standards. Nvidia sees no need and has no plans to approach VESA about a new standard for G-Sync-style functionality, because it already exists."

What bothers me most is that instead of working on a standard, Nvidia developed a separate module that they only allow to work with Nvidia hardware, and it comes at a premium price. So this is the way I understand it: FreeSync could work on both AMD and Nvidia hardware, but Nvidia won't support it because they invested in G-Sync. G-Sync could work on both Nvidia and AMD hardware, but Nvidia won't allow it. I prefer to support a standard, so unless Nvidia changes direction or offers G-Sync without the huge premium, I will likely go red team on my next GPU upgrade...
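For anyone curious what "extension of the vblank interval" buys in practice, here is a toy timing simulation; my own sketch, not vendor code, and the frame completion times and refresh limits are made up for illustration. A fixed-refresh display has to hold each finished frame until the next scan-out slot, while a variable-refresh display simply stretches vblank until the frame is ready:

```python
import math

FIXED_MS = 1000 / 60     # classic 60 Hz monitor: a scan-out slot every 16.7 ms
MIN_GAP_MS = 1000 / 144  # VRR panel's fastest refresh (144 Hz)

# Made-up times (ms) at which the GPU finishes each frame.
frame_done_ms = [10.0, 32.0, 47.0, 77.0, 89.0]

def fixed_scanout(times):
    """Each frame waits for the next fixed vblank slot -> judder."""
    return [math.ceil(t / FIXED_MS) * FIXED_MS for t in times]

def vrr_scanout(times):
    """Scan-out starts as soon as the frame is ready (vblank is stretched),
    limited only by the panel's maximum refresh rate."""
    last, out = -MIN_GAP_MS, []
    for t in times:
        last = max(t, last + MIN_GAP_MS)
        out.append(last)
    return out

print("fixed 60 Hz:", [f"{t:5.1f}" for t in fixed_scanout(frame_done_ms)])
print("VRR:        ", [f"{t:5.1f}" for t in vrr_scanout(frame_done_ms)])
```

With fixed refresh those frames land at 16.7, 33.3, 50.0, 83.3 and 100.0 ms; with VRR they are shown at 10, 32, 47, 77 and 89 ms, exactly when they were finished. Below the panel's minimum refresh rate a real implementation also has to re-send the previous frame, but the core mechanism really is just that stretched vblank.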
Quote: "I keep thinking back to a year ago, when Nvidia's Tom Petersen first responded to FreeSync... G-Sync could work on both Nvidia and AMD hardware, but Nvidia won't allow it."
The cable between an nVidia graphics card and a G-Sync monitor is a standard DP cable. If you knew what kind of signaling nV uses, you could make G-Sync work with Intel/AMD graphics, in the same way an nVidia 3D Vision monitor can be forced to run with AMD/Intel graphics in 3D mode (strobing, which was initially not possible because nV locked those features). And when I look at it, nV 3D Vision collides with the variable sync of G-Sync, so only one of those can be used at a time, while passive 3D can be used with G-Sync/FreeSync. The backlight strobe feature, which people around the world have tried to unlock for improved response times, has its purpose. But hacking a G-Sync monitor to work with AMD/Intel HW gives no benefit, as FreeSync works nearly the same (the same, for the average user), and the differences that do exist are not to FreeSync's disadvantage.
Quote: "What bothers me most is that instead of working on a standard, Nvidia developed a separate module that they only allow to work with Nvidia hardware... I will likely go red team on my next GPU upgrade..."
Yeah, I can see why you're factoring the monitor into the GPU choice now; I'm really tempted to do the same... I'm already considering NOT going for the 1440p Swift and instead getting a cheap 120Hz monitor without G-Sync, because if Nvidia doesn't make it cheaper (which they won't for the next months or years), I might consider going with AMD. If only their GPUs, and not the monitors, were my reason to go red... Absurd, isn't it, to choose the GPU because of your monitor? I haven't seen people go for Nvidia because of G-Sync...
Quote: "Yeah, I can see why you're factoring the monitor into the GPU choice now... I haven't seen people go for Nvidia because of G-Sync..."
Some did move for G-Sync, but considering that it would lock ex-AMD users onto nVidia's side, the G-Sync module should have been practically given away by nV, since that would bring revenue in the future. They had almost an entire year to gain unquestionable dominance of the PC market.
Quote: "Yeah, I can see why you're factoring the monitor into the GPU choice now... I haven't seen people go for Nvidia because of G-Sync..."
So I never really stick with AMD or NVIDIA; it always depends on who is the best at the time. I went from a 2GB 5850 to a GTX 670 to two R9 290s (one bought in early November 2013 for $380 that unlocked to an X, and the other an actual X I got used for $200 in late April 2014, when all those people who bought 20 290X cards at $500-700 a pop found out it was a pipe dream). I go back and forth; before the 5850 I had NVIDIA.

BUT!!! This TIME!!!!!!!!! I have 2 main PCs: one is MINE, and one I build and maintain so my little brother, who is in college, can play top-notch games with me. I replaced the two GTX 580s in his system with a single Gigabyte GTX 970 G1 Gaming, and I found out that NVIDIA SCREWED ME!!! The card utterly fails if more than 3.5GB of VRAM is used; the 980 does not have this problem, and nobody knew that at launch. NVIDIA says it has something to do with the way the RAM is partitioned or some junk like that. They knew about this from the get-go, because they are the ones who did the partitioning; for what reason is unknown. They say it's because the last 512MB behaves slower, but it could have been just to cripple it compared to the GTX 980, because that is what NVIDIA does: they play around with stuff. IT IS BS!!! In a bunch of games I have to run with lowered texture settings; I don't have to do that on my 290X setup.

So now not only is AMD going the legit, ethical way with FreeSync, but NVIDIA's budget enthusiast-level card is JUNK!!!! Absolute JUNK!!! It is in no way, shape or form a better card than the 290X. I thought it was, which is why I bought my brother the $370 GTX 970 G1 Gaming when I could have grabbed a used R9 290 or 290X for $200-250. NOW I have a problem with NVIDIA. It will take at least 1 year, if not 2, to clear that issue up. They will have to do 2 things: make an enthusiast budget card that works, AND enable Adaptive-Sync/FreeSync on all of their cards. The latter could take weeks, months, or years, but it WILL happen.
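For what it's worth, the 3.5GB cliff is easy to demonstrate. Below is a rough probe in the spirit of the community VRAM benchmarks that exposed the issue; it is my own sketch, assuming a CUDA-capable PyTorch install, and the chunk size and iteration count are arbitrary. The driver does not guarantee where each chunk lands, so treat the output as indicative, but on a 970 the chunks that end up in the last 512MB segment should show clearly lower write bandwidth:

```python
import time
import torch

assert torch.cuda.is_available(), "needs a CUDA-capable GPU"
CHUNK_MB = 256
ITERS = 10

# Allocate VRAM in 256 MB chunks until we approach the card's 4 GB.
chunks = []
try:
    while (len(chunks) + 1) * CHUNK_MB <= 4096:
        n_floats = CHUNK_MB * 1024 * 1024 // 4
        chunks.append(torch.empty(n_floats, dtype=torch.float32, device="cuda"))
except RuntimeError:  # out of memory: keep whatever we managed to allocate
    pass

# Time a simple device-side write to each chunk and report bandwidth.
for i, c in enumerate(chunks):
    torch.cuda.synchronize()
    t0 = time.perf_counter()
    for _ in range(ITERS):
        c.fill_(1.0)
    torch.cuda.synchronize()
    dt = (time.perf_counter() - t0) / ITERS
    print(f"chunk {i:2d} (~{(i + 1) * CHUNK_MB} MB): {CHUNK_MB / 1024 / dt:6.1f} GB/s")
```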