AOC Adds Three G90 3-Sided Frameless Gaming Monitors
FrostNixon
So basically the same as the old monitors, but since they're frameless, they're more expensive. Definitely worth it if you're a gamer and all you care about is a cheap, well-performing monitor...
entr0cks
From a productivity standpoint the frameless design is good as well - less dark area and less contrast between screen and frame.
Less eyestrain and less distance to move your eyes - you keep more in peripheral vision. Higher resolution would be nice though 🙂
For me, I'd pay more for a frame like that.
Raider0001
From what I understand G-Sync just died
ChisChas
Nothing of interest here, move along now...
dragonlord
I still see a frame (albeit a smallish one) on those other three sides. I therefore call shenanigans on this "frameless" claim.
sammarbella
HDMI 2.1, which has MANDATORY adaptive sync in the standard, will de facto kill AMD's optional HDMI 1.4 adaptive sync AND Nvidia's proprietary G-Sync.
This will not happen before Q3 2018, with monitors AND TVs adding this standard, which is needed for 4K HDR 4:4:4.
Until then, gamers who want the best performance AND adaptive sync have only one choice...
...if they are not blindfolded (AKA "blindfooled") by AMD PR stunts!
:D
LordKweishan
Freesync vs GSync:
One of the key aspects most people forget or don't even know about is that panels currently need to OVERDRIVE the pixel voltage to hit higher refresh rates. This creates unique PROBLEMS for variable refresh monitors, since the amount of voltage to apply changes depending on how quickly the pixel needs to respond. This can result in smeared or distorted colors.
GSync as a hardware module has things like this in mind and there's likely a "GSync 2" module which will account for other issues and possibly improve on this (i.e. a lookup table of voltage/time values).
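The lookup-table idea can be sketched roughly like this; the structure, names, and every voltage/time number below are invented purely for illustration, not taken from any real panel or G-Sync module:

```python
# Hypothetical sketch of why variable refresh complicates overdrive: the
# drive voltage that lands a pixel transition on target depends on how long
# the frame stays on screen, so a single fixed-refresh table no longer fits.
# All values here are made up for illustration.

# (start_level, end_level) -> {frame_time_ms: overdrive_voltage}
OVERDRIVE_LUT = {
    (0, 255):   {6.9: 5.2, 11.1: 4.8, 16.7: 4.5, 33.3: 4.2},  # black -> white
    (128, 255): {6.9: 4.6, 11.1: 4.3, 16.7: 4.1, 33.3: 3.9},  # grey -> white
}

def overdrive_voltage(start: int, end: int, frame_time_ms: float) -> float:
    """Pick the LUT entry for the nearest characterized frame time."""
    table = OVERDRIVE_LUT[(start, end)]
    nearest = min(table, key=lambda t: abs(t - frame_time_ms))
    return table[nearest]
```

With a fixed 144Hz panel only the ~6.9ms column would be needed; with variable refresh, every transition has to be characterized across the whole range of possible frame times, which is the kind of work a dedicated module can bake in.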
Freesync monitor quality is all over the map, from monitors that lack LFC support (the range ratio must be at least 2.5x, e.g. 30Hz to 75Hz, since 75/30 = 2.5) to inconsistent overdrive, etc.
Even a 4K GSync monitor rated at 30Hz to 60Hz still works below 30Hz (i.e. 29FPS would redraw each frame thus updating at 58Hz and keeping the tear-free, minimal latency experience) whereas Freesync in that situation would not operate below 30Hz so you'd get either VSYNC ON (stutter and lag) or VSYNC OFF (screen tearing) when the GPU can't output at least 30FPS.
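The frame-multiplication arithmetic above (29FPS redrawn twice for 58Hz, plus the 2.5x LFC range rule) can be sketched in a few lines; the function names and the 30-144Hz range are illustrative, not from any specific monitor:

```python
# Sketch of Low Framerate Compensation (LFC) style frame doubling, assuming a
# hypothetical 30-144Hz variable refresh range. Real scaler firmware is more
# involved; this only shows the arithmetic.

def lfc_refresh(fps: float, vrr_min: float = 30.0, vrr_max: float = 144.0) -> float:
    """Return the refresh rate the panel would actually run at."""
    if fps >= vrr_min:
        return min(fps, vrr_max)  # within range: refresh tracks FPS 1:1
    # Below range: repeat each frame n times so n * fps lands back in range.
    n = 2
    while n * fps < vrr_min:
        n += 1
    return n * fps

def supports_lfc(vrr_min: float, vrr_max: float) -> bool:
    """Rule of thumb: the range must span at least 2.5x (e.g. 30-75Hz)."""
    return vrr_max / vrr_min >= 2.5

print(lfc_refresh(29))        # 58.0 -- each frame drawn twice, still tear-free
print(supports_lfc(30, 75))   # True (75/30 = 2.5)
```

Without LFC the `fps < vrr_min` branch doesn't exist, which is exactly why a monitor falls back to VSYNC ON stutter or VSYNC OFF tearing below its minimum.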
HDMI 2.1 may implement a mandatory Adaptive Sync standard but that doesn't mean that the QUALITY of the final product is guaranteed any more than saying every HDR monitor or HDTV looks the same.
*I personally would like to see GSync work with all Adaptive Sync/Freesync monitors, but still come out with a "Premium" GSync monitor solution if the GSync Module is capable of producing a BETTER result than simply following the standards in place.
**Combining the larger RANGE of brightness to darkness that HDR offers along with the variability of each frame it MAY be that a hardware module like GSYNC is needed for the best experience.
(some 4K, HDR, 144Hz, GSYNC monitors are coming in 2018.. drool!!)
***HDTVs that support FREESYNC, 4K, and HDR are coming. The XB1X supports Freesync, so it will be interesting to see how well that works (and when). Consider a game that is locked to 30FPS but drops below that periodically... at 30FPS you have VSYNC ON, which adds lag/sluggishness, whereas below 30FPS you get screen tearing.
Now with Freesync the framerate would be UNLOCKED so you might even average 40FPS/40Hz, and also have a tear-free experience that feels less sluggish. The game would run much better with no game developer code changes. (Just a supported HDTV and having Microsoft unlock Freesync support in the XB1X).
(I try to use "FPS" to describe the GPU output and "Hz" to describe the monitor or TV refresh rate. For GSYNC/Freesync it's the same value, but for an HDTV it's almost always 60Hz regardless of the GPU output)
Lee83ant
Technical specs aside, regarding Freesync vs G-Sync, a large portion of buyers will be in the same boat as me at the moment: an aging mid-range GPU and looking to upgrade. For most, an RX 570 or 580 will not be a big enough upgrade to justify the cost, leaving the "out of stock" Vega 56, which is priced higher than a GTX 1080, or a Vega 64 priced around a 1080 Ti.
Freesync monitors may be priced better than G-Sync, but does that really matter when you can't buy an AMD GPU in the first place?
Raider0001
https://imgur.com/download/lh0vJAl
what is this sorcery doing in the menu of my cheap Freesync panel?
oh no, Nvidia is not the owner of the overdrive feature?!
HDR is available on FreeSync 2 enabled panels.
Since when is G-Sync a quality certificate? My brother has one of those expensive G-Sync enabled Asus laptops - it doesn't have an overdrive function and that panel is really not good.
Do we have to reach a conclusion about which adaptive sync technology is smoother at a 5-seconds-per-frame slideshow?
Yes, it matters a lot, because Freesync isn't going anywhere, but the performance crown might change hands in the near future.