AOC Adds Three G90 3-sided frameless gaming monitors

data/avatar/default/avatar25.webp
So basically the same as the old monitors, but since they are frameless, they are more expensive. Definitely worth it if you are a gamer and all you care about is a cheap, well-performing monitor...
https://forums.guru3d.com/data/avatars/m/225/225706.jpg
From a productivity standpoint the thin frame is good as well: less dark area and less contrast between screen and frame means less eye strain and less distance to move your eyes, keeping more in your peripheral vision. A higher resolution would be nice, though 🙂 For me, I'd pay more for a frame like that.
https://forums.guru3d.com/data/avatars/m/166/166706.jpg
From what I understand G-Sync just died
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
Raider0001:

From what I understand G-Sync just died
Why?
https://forums.guru3d.com/data/avatars/m/166/166706.jpg
Neo Cyrus:

Why?
Does it offer any advantages over FreeSync now? It's just more expensive. The END
https://forums.guru3d.com/data/avatars/m/260/260826.jpg
Raider0001:

Does it offer any advantages over FreeSync now? It's just more expensive. The END
A FreeSync monitor offers the advantage of being cheaper than a G-Sync one. Stop here if you don't like bad news. It also offers the unique "advantage" that you can enjoy its adaptive sync feature ONLY with lower-performing (fewer FPS) AND at the same time more expensive AMD GPUs (more $$$). If you want an adaptive sync monitor AND don't care about performance, enjoy a FreeSync monitor. If you want an adaptive sync monitor AND care about performance, don't buy a FreeSync monitor. LOL
data/avatar/default/avatar16.webp
Raider0001:

From what I understand G-Sync just died
Educated consumers still buy nVidia. In fact, most everybody does. Logic.
https://forums.guru3d.com/data/avatars/m/166/166706.jpg
sammarbella:

It also offers the unique "advantage" that you can enjoy its adaptive sync feature ONLY with lower-performing (fewer FPS)... LOL
Who invented that bullshit? Because I came here prepared: [youtube=2CE-wSU1KMw]
https://forums.guru3d.com/data/avatars/m/260/260826.jpg
Raider0001:

Who invented that bullshit? Because I came here prepared
That AMD PR stunt did marvels with some people. You came as blinded as the stunt itself. It was a blind and completely subjective "test," because both monitors were locked to the exact same Hz to avoid having to show AND unlock FPS. The winner was obviously the cheaper FreeSync monitor's price tag, not the Vega GPU. My $5,000 car will always win a blind performance test at a locked 30 km/h against a $100,000 Ferrari at the same locked speed. It's cheaper and has the same speed for less money! LOL
https://forums.guru3d.com/data/avatars/m/166/166706.jpg
sammarbella:

That AMD PR stunt did marvels with some people. You came as blinded as the stunt itself. It was a blind and completely subjective "test," because both monitors were locked to the exact same Hz to avoid having to show AND unlock FPS. The winner was obviously the cheaper FreeSync monitor's price tag, not the Vega GPU. My $5,000 car will always win a blind performance test at a locked 30 km/h against a $100,000 Ferrari at the same locked speed. It's cheaper and has the same speed for less money! LOL
You are wrong (they were comparing technologies, not graphics cards)... In the video, according to you, they came blinded; is that correct? Oh, and that test is reproducible... Nvidia never did their own version...
data/avatar/default/avatar13.webp
Nothing of interest here, move along now...
https://forums.guru3d.com/data/avatars/m/166/166706.jpg
TimmyP:

Educated consumers still buy nVidia. In fact, most everybody does. Logic.
I know that, but I would not buy a Vauxhall Astra diesel over a Mustang, per se, just because the Vauxhall is more efficient and has a faster top speed; there are other factors... Unlike with cars, we only have two GPU brands.
data/avatar/default/avatar22.webp
I still see a frame (albeit a smallish one) on those other three sides. I therefore call shenanigans on this "frameless" claim.
data/avatar/default/avatar06.webp
Raider0001:

From what I understand G-Sync just died
The accurate and unbiased truth is that G-Sync has a wider variable refresh range than FreeSync, with capabilities down to 30 Hz and up to 240 Hz, whereas FreeSync monitors generally have a lot less leeway (most are 48 to 75 Hz). G-Sync is also a "premium" technology that is often integrated into higher-quality monitors, whereas FreeSync is a bit hit and miss and can be integrated into cheap panels. Not to mention the fact that Nvidia holds the uncontested performance crown. So yes, while you might be right that FreeSync is helping make adaptive refresh technology affordable and mainstream, it will take some time before G-Sync dies. It might indeed do so, as more expensive TV sets release with FreeSync and force Nvidia to support it, but until then G-Sync is well established, and the big stars (4K and ultrawide HDR monitors) are all releasing with G-Sync.
https://forums.guru3d.com/data/avatars/m/166/166706.jpg
Nolenthar:

The accurate and unbiased truth is that G-Sync has a wider variable refresh range than FreeSync, with capabilities down to 30 Hz and up to 240 Hz, whereas FreeSync monitors generally have a lot less leeway (most are 48 to 75 Hz). G-Sync is also a "premium" technology that is often integrated into higher-quality monitors, whereas FreeSync is a bit hit and miss and can be integrated into cheap panels. Not to mention the fact that Nvidia holds the uncontested performance crown. So yes, while you might be right that FreeSync is helping make adaptive refresh technology affordable and mainstream, it will take some time before G-Sync dies. It might indeed do so, as more expensive TV sets release with FreeSync and force Nvidia to support it, but until then G-Sync is well established, and the big stars (4K and ultrawide HDR monitors) are all releasing with G-Sync.
Did you read the content of this news? There are FreeSync monitors with ranges from 30 Hz up. The performance crown of a given GPU is not a factor in the performance of a given adaptive sync technology; these are two separate things. G-Sync has no meaning or purpose, it is wasted cash and resources: you can have a ~$10 30Hz-240Hz FreeSync monitor driver chip (which is just a new version of a $10 non-FreeSync chip) or a $200 G-Sync board with the same spec.
data/avatar/default/avatar12.webp
Raider0001:

Did you read the content of this news? There are FreeSync monitors with ranges from 30 Hz up.
My post was more general than those few monitors, which indeed have a good range, but having FreeSync on a monitor doesn't mean you'll get such a wide variable range; G-Sync guarantees it. Now, obviously, which GPU has the performance crown matters. I would not buy a FreeSync monitor because my GPU wouldn't support it, and as long as my GPU (or whichever GPU has the performance crown when I buy one) works best with G-Sync, the technology won't die. It will only die if AMD holds the performance crown long enough for most monitor makers to not bother with G-Sync, and it might be very easy for Nvidia to support adaptive sync (after all, it's integrated into the standard), so if that day comes, they might turn it on quickly. So no, G-Sync ain't dead, irrespective of whether or not the technology should live.
https://forums.guru3d.com/data/avatars/m/260/260826.jpg
HDMI 2.1, which makes adaptive sync MANDATORY in the standard, will de facto kill both AMD's optional HDMI 1.4 adaptive sync AND Nvidia's proprietary G-Sync. This will not happen before Q3 2018, when monitors AND TVs add this standard, which is needed for 4K HDR 4:4:4. Until then, gamers who want the best performance AND adaptive sync have only one choice... ...if they are not blindfolded (AKA "blindfooled") by AMD PR stunts! :D
data/avatar/default/avatar22.webp
FreeSync vs GSync: One of the key aspects most people forget or don't even know about is that you currently need to OVERDRIVE the voltage to the pixels for higher refresh rates. This creates unique PROBLEMS for variable refresh monitors, since the amount of voltage to apply changes depending on how quickly the pixel needs to respond. Getting it wrong can result in smeared or distorted colors. The GSync hardware module has things like this in mind, and there's likely a "GSync 2" module that will account for other issues and possibly improve on this (i.e. a lookup table of voltage/time values).

FreeSync is all over the map in monitor quality, from the lack of LFC support on many panels (the max/min refresh ratio must be at least 2.5x, such as 30Hz to 75Hz, since 75/30 = 2.5) to inconsistent overdrive, etc. Even a 4K GSync monitor rated at 30Hz to 60Hz still works below 30Hz (i.e. at 29FPS each frame is redrawn twice, updating at 58Hz and keeping the tear-free, minimal-latency experience), whereas FreeSync in that situation would not operate below 30Hz, so you'd get either VSYNC ON (stutter and lag) or VSYNC OFF (screen tearing) when the GPU can't output at least 30FPS.

HDMI 2.1 may implement a mandatory Adaptive Sync standard, but that doesn't mean the QUALITY of the final product is guaranteed, any more than saying every HDR monitor or HDTV looks the same.

*I personally would like to see GSync work with all Adaptive Sync/FreeSync monitors, but still come out with a "premium" GSync monitor solution if the GSync module is capable of producing a BETTER result than simply following the standards in place.

**Combining the larger RANGE of brightness to darkness that HDR offers with the variability of each frame, it MAY be that a hardware module like GSYNC is needed for the best experience. (Some 4K, HDR, 144Hz, GSYNC monitors are coming in 2018... drool!!)

***HDTVs that support FREESYNC, 4K, and HDR are coming. The XB1X supports FreeSync, so it will be interesting to see how well that works (and when).
Consider a game that is locked to 30FPS but drops below that periodically: at 30FPS you have VSYNC ON, which adds lag/sluggishness, whereas below 30FPS you get screen tearing. With FreeSync the framerate would be UNLOCKED, so you might even average 40FPS/40Hz and also have a tear-free experience that feels less sluggish. The game would run much better with no game-developer code changes (just a supported HDTV and Microsoft unlocking FreeSync support on the XB1X). (I try to use "FPS" to describe the GPU output and "Hz" to describe the monitor or TV refresh rate. For GSYNC/FreeSync they're the same value, but for an HDTV it's almost always 60Hz regardless of the GPU output.)
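The LFC ratio rule and the frame-multiplication behaviour described above can be sketched in a few lines. This is a minimal illustration, not any vendor's API; the function names are hypothetical:

```python
def supports_lfc(min_hz: float, max_hz: float) -> bool:
    """AMD's Low Framerate Compensation needs the panel's max/min
    refresh ratio to be at least 2.5 (e.g. 30-75 Hz: 75/30 = 2.5)."""
    return max_hz / min_hz >= 2.5


def effective_refresh(fps: float, min_hz: float, max_hz: float):
    """When FPS drops below the panel's minimum, redraw each frame
    2x, 3x, ... until the refresh rate lands inside the variable
    window (e.g. 29 FPS on a 30-60 Hz panel is shown at 58 Hz).
    Returns None if no multiple fits the window."""
    multiplier = 1
    while fps * multiplier < min_hz:
        multiplier += 1
    rate = fps * multiplier
    return rate if rate <= max_hz else None
```

For instance, `supports_lfc(48, 75)` is False (75/48 ≈ 1.56), which matches the point above that a typical 48-75 Hz FreeSync panel cannot do LFC and simply stops syncing below 48 FPS.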
data/avatar/default/avatar38.webp
Technical specs aside, regarding FreeSync vs G-Sync, a large portion of buyers will be in the same boat as me at the moment: an aging mid-range GPU and looking to upgrade. For most, an RX 570 or 580 will not be a big enough upgrade to justify the cost, leaving the "out of stock" Vega 56, which is priced above a GTX 1080, or a Vega 64 priced around a 1080 Ti. FreeSync monitors may be priced better than G-Sync ones, but does that really matter when you can't buy an AMD GPU in the first place?
https://forums.guru3d.com/data/avatars/m/166/166706.jpg
LordKweishan:

FreeSync vs GSync: One of the key aspects most people forget or don't even know about is that you currently need to OVERDRIVE the voltage to the pixels for higher refresh rates. This creates unique PROBLEMS for variable refresh monitors, since the amount of voltage to apply changes depending on how quickly the pixel needs to respond. Getting it wrong can result in smeared or distorted colors. ...The GSync hardware module has things like this in mind...
https://imgur.com/download/lh0vJAl What is this sorcery doing in the menu of my cheap FreeSync panel? Oh no, Nvidia is not the owner of the overdrive feature?!
LordKweishan:

**Combining the larger RANGE of brightness to darkness that HDR offers with the variability of each frame, it MAY be that a hardware module like GSYNC is needed for the best experience.
HDR is available with FreeSync 2 enabled panels.
Nolenthar:

My post was more general than those few monitors which indeed have a good range but having Freesync on a monitor doesn't mean you'll have such a variable range. GSync guarantees it.
Since when is G-Sync a quality certificate? My brother has one of those expensive G-Sync enabled Asus laptops: it doesn't have an overdrive function, and the panel is really not good. Do we have to decide which adaptive sync technology is smoother for a five-second-per-frame slideshow?
Lee83ant:

... Freesync monitors may be priced better then G Sync but does that really matter when you cant buy a AMD GPU in the first place?
Yes, it matters a lot, because FreeSync isn't going anywhere, but the performance crown might change hands in the near future.