AOC Adds Three G90 3-Sided Frameless Gaming Monitors

Raider0001:

Does it offer any advantages over FreeSync now? It's just more expensive. The END
It works on nVidia cards. nVidia has the large majority of the market that would want any type of adaptive sync. And yes, according to every analysis ever, G-Sync is more consistent across brands due to nVidia's strict requirements and that otherwise worthless module that jacks up the price even more. G-Sync isn't going anywhere anytime soon.
Raider0001:

From what I understand, G-Sync just died.
Raider0001:

Does it offer any advantages over FreeSync now? It's just more expensive. The END
Raider0001:

Did you read the content of this news? There are monitors with FreeSync ranges from 30Hz up. The performance crown of a given GPU is not a factor of a given adaptive sync technology's performance - these are 2 separate things. G-Sync has no meaning or purpose; it is wasted cash and resources. You can have a ~10 dollar 30Hz-240Hz FreeSync monitor driver chip (which is just a new version of a 10 dollar non-FreeSync chip) or a 200 dollar G-Sync board of the same spec.
I know you're here simply trying to confirm your biases without actually looking for legitimate answers. Others have given you the advantages of G-Sync over FreeSync and why it's not dead. Going to reiterate, including advantages of FreeSync vs. G-Sync, so that it's all here:

1) Abundance: FreeSync >>> G-Sync. Advantage: FreeSync. By a long shot.

2) Cost: The G-Sync premium is generally $200-$250. Advantage: FreeSync. By a long shot.

3) Refresh rate range: G-Sync: a 30Hz minimum refresh rate is a requirement, regardless of the monitor in question and regardless of the maximum refresh rate (60, 100, 120, 144, 165, 180, 240). All monitors support going below the minimum limit without disabling G-Sync functionality. FreeSync: Nothing is a requirement. Some Korean monitors even have a laughable 48-60Hz range. Low Framerate Compensation (LFC) is only enabled when the spread between the minimum and the maximum is at least 2.5x (as sketched below). You can adjust the range via CRU, and for several monitor models you can actually achieve the 2.5x spread (but not necessarily for every sample of that monitor model). For some monitors like the MG278Q, if you want 144Hz as the maximum refresh rate, then your minimum HAS to be 57Hz. Some samples can hit 55Hz; others might just miss that 57Hz (e.g. 58Hz). Advantage: G-Sync. By a long shot.

4) Response times: G-Sync: G-Sync modules are fine-tuned for each panel that is G-Sync certified. Variable overdrive is implemented so that when your refresh rate changes, you don't get ghosting / overshoot. Static overdrive on a variable refresh rate monitor is a bad idea, and Nvidia knew this from the start. Very infrequently do you get complaints of G-Sync monitors having flicker, except on menus and in cases where the framerate goes down to zero or close to it. FreeSync: Nothing is guaranteed. It is up to the monitor manufacturer to implement overdrive correctly. So you get a bunch of monitors that have variable overdrive working, another bunch that has ghosting / overshoot, another bunch that has borked overdrive settings when FreeSync is enabled, etc. Much more frequent complaints of flicker on FreeSync monitors, sometimes going as far as affecting an entire model. Advantage: G-Sync. By a long shot.

5) Strobing: G-Sync: Every single monitor (perhaps with an exception or two) implements ULMB. With TN panels, ULMB supports higher refresh rates, closer or equal to the maximum the monitor is capable of in normal / G-Sync mode. With IPS / AHVA panels, often lower refresh rates than what the monitor is capable of in normal / G-Sync mode. ULMB generally works, but is not necessarily perfectly implemented on every model. FreeSync: Nothing is guaranteed. It is up to the manufacturer to implement strobing or not - some implementations work great, others are laughable. Advantage: G-Sync. By a long shot.

6) Compatibility: G-Sync works all the way back to a GTX 680 (released in March 2012). FreeSync works all the way back to a 290X (released in November 2013). Advantage: G-Sync, if ever so slightly now (the 680 is simply outdated; the 290X is still a very capable card).
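To make the LFC rule in point 3 concrete, here is a minimal sketch in Python. The 2.5x min-to-max threshold is the rule of thumb cited in the post above; the function name is made up for illustration and is not AMD's actual driver logic:

# Minimal sketch of the reported LFC condition: the spread between the
# minimum and maximum of the variable refresh range must be at least 2.5x.
def lfc_enabled(vrr_min_hz: float, vrr_max_hz: float) -> bool:
    return vrr_max_hz / vrr_min_hz >= 2.5

print(lfc_enabled(48, 60))   # False (1.25x): the "laughable" Korean range
print(lfc_enabled(57, 144))  # True  (2.53x): the MG278Q target range
print(lfc_enabled(58, 144))  # False (2.48x): just misses, as noted above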
Raider0001:

https://imgur.com/download/lh0vJAl What is this sorcery doing in the menu of my cheap FreeSync panel? Oh no, Nvidia is not the owner of the overdrive feature?!
You're being dense on purpose. Reread what he said. He talked about variable overdrive. The option you have just shown us controls the overdrive level, which could be variable (probably not), static (highest probability), or not working at all (still likely with some FreeSync monitors...).
Since when is G-Sync a quality certificate? My brother has one of those expensive G-Sync-enabled Asus laptops - it doesn't have an overdrive function, and that matrix is really not good. Do we have to reach a conclusion on which adaptive sync technology is better for 5-second-per-frame slideshow smoothness?
Nice of you to talk about G-Sync as a whole, then summon ONLY a particular subset of G-Sync monitors - laptops. Not only are they a relatively new breed, they also lack the G-Sync module that I assume you want dead (understandable, given its cost - not understandable, given variable overdrive - which could perhaps be implemented GPU-side with a display standard update, or not). If the display does not give you a menu with an overdrive function, it does not mean that it lacks overdrive - that would be madness, particularly if the display in question has an IPS-type panel. As for the "matrix" (I assume you mean the panel?) not being good - G-Sync is not necessarily a guarantee of good panel quality. It is more so a guarantee of fluidity and cleanliness of motion - something gamers tend to care more about (although I disagree with the trajectory many gamers take with TN panels that have absolutely horrible color presets but "low" response times).
Yes, it matters a lot, because FreeSync isn't going anywhere, but the performance crown might in the near future.
Time is money, and waiting for AMD to catch up might be worth less to someone than just ponying up the $200-$250 extra for a G-Sync monitor, getting a guarantee of motion smoothness / cleanliness, and pairing that up with a GPU that performs (often much) better than what the competition from AMD is capable of (e.g. Vega 56 / 64 vs. 1080Ti). Now, with the latest Titan V release - although that GPU is clearly not meant for gaming, and yields are horribly low - it is an indicator that Nvidia are (if slightly) more than a generation ahead of AMD. Their GTX 1080 released in mid-2016. AMD's competition arrived a year later, with considerably worse power draw and comparable (sometimes worse, sometimes better) performance. Let's not talk about the laughable Vega 64 - that card is just nonsense. If Nvidia manage to get yields up in the next few months to a year, then by the time AMD has any response to the GTX 1080Ti (if they can save themselves from Vega), Nvidia would already have a big, fat Volta GPU on the market. So, in conclusion, it really doesn't make sense to talk of a shifting performance crown when that crown is not likely (read: at all) to be changing hands any time soon.
yasamoka:

I know you're here simply trying to confirm your biases without actually looking for legitimate answers. Others have given you the advantages of G-Sync over FreeSync and why it's not dead. [...]
Have you seen the video link I put up there? There are people not seeing variable overdrive; it's virtual, response times too, because AMD already won the input lag war in VR, which I assume works everywhere else too. G-Sync features are similar to some audio-voodoo stuff really, or Apple marketing. Good that you write so much while putting up zero proof that something works in reality. Diamond cables work for ya? Good. It is obvious to me that Nvidia would do absolutely everything in their power to make sure you buy their $200 crap watching just numbers on the web page, and of course Nvidia, being the single best monitor driver chip manufacturer, did the best driver at a 1.0 version while all of the other brands battled their way to survive for ~50 years? G-SYNC: Nothing is mandatory! Not even a G-SYNC board inside.
Raider0001:

Have you seen the video link I put up there? There are people not seeing variable overdrive; it's virtual, response times too, because AMD already won the input lag war in VR, which I assume works everywhere else too.
More nonsense. I have no idea what you're getting at. Show input lag figures for Nvidia vs. AMD where you "assume" they work.
G-Sync features are similar to some audio-voodoo stuff really, or Apple marketing. Good that you write so much while putting up zero proof that something works in reality. Diamond cables work for ya? Good.
I have a G-Sync monitor (ViewSonic XG2703-GS) and I have tested what I have said up there. What do you have to go on? Some nonsense about your brother's G-Sync laptop? Sure. Your method of proving things by "assumption" seems to work really well. If you have any contention with what I have said up there, then counter-argue. I will have to take you for a kid if all you can do is respond that I have provided zero proof. But yes, erect strawmen as if we are people who would buy expensive cables that offer no advantage. Such great arguing skills - we all know more than you do here, and you're not fooling us at all.
It is obvious to me that Nvidia would do absolutely everything in their power to make sure you buy their $200 crap watching just numbers on the web page, and of course Nvidia, being the single best monitor driver chip manufacturer, did the best driver at a 1.0 version while all of the other brands battled their way to survive for ~50 years?
Unlike you, we here actually own G-Sync monitors and have tested them extensively. You have nothing to go by, we have everything to go by.
G-SYNC: Nothing is mandatory! Not even a G-SYNC board inside.
More crap.
@Raider0001, just stop typing and re-read the above. Yasamoka is making valid points. If you had a bad experience with G-Sync, it doesn't mean the technology sucks. Gaming on a laptop is an issue in itself from my point of view: less stability, possible heat issues, and tuned-down hardware as an attempt not to melt what's inside. While I am not interested in G-Sync or FreeSync, I don't bash them, nor do I tell people to git gud and learn to stabilize frames to match the panel refresh rate. Each has its strengths, as yasamoka already mentioned.
yasamoka:

6) Compatibility: G-Sync works all the way back to a GTX 680 (released in March 2012). FreeSync works all the way back to a 290X (released in November 2013). Advantage: G-Sync, if ever so slightly now (the 680 is simply outdated; the 290X is still a very capable card).
FreeSync is supported back to the HD7000 series. Aside from that, I really don't care to read any more because this thread appears to have turned into another pathetic AMD bashing thread, just like every other thread that mentions anything even remotely related to AMD. From what I've seen over the last few years, this is no longer a hardware enthusiast forum. It's an Intel/NVidia enthusiast forum. Any time AMD is mentioned to any extent, all the Intel or NVidia loyalists pop up and start bashing AMD. It's time you people grow up. You're actually damaging this forum's reputation.
sykozis:

FreeSync is supported back to the HD7000 series.
Only for video playback. We've never seen it in action even then.
Aside from that, I really don't care to read any more because this thread appears to have turned into another pathetic AMD bashing thread, just like every other thread that mentions anything even remotely related to AMD. From what I've seen over the last few years, this is no longer a hardware enthusiast forum. It's an Intel/NVidia enthusiast forum. Any time AMD is mentioned to any extent, all the Intel or NVidia loyalists pop up and start bashing AMD. It's time you people grow up. You're actually damaging this forum's reputation.
Are you seriously suggesting that *I'm* an Nvidia fanboy? Dear, you must have a goldfish memory. Yeah, whenever someone posts what's actually going on in the GPU market, and criticizes one of the companies, people jump to call them a fanboy. Try your nonsense with someone else; you know my post history over the years on this forum - to accuse me of being a fanboy is utterly laughable.
yasamoka:

Only for video playback. We've never seen it in action even then. Are you seriously suggesting that *I'm* an Nvidia fanboy? Dear, you must have a goldfish memory. Yeah, whenever someone posts what's actually going on in the GPU market, and criticizes one of the companies, people jump to call them a fanboy. Try your nonsense with someone else; you know my post history over the years on this forum - to accuse me of being a fanboy is utterly laughable.
I'm well aware of your posting history. You seem to completely miss my point here though. This thread actually has nothing to do with AMD. This thread concerns monitors from AOC. Yes, those monitors have "FreeSync" as a listed feature. Big deal. If people want a FreeSync monitor, so be it. I own one myself. I've had no issues with it.

However, in my case, FreeSync is completely useless since every game I play runs well in excess of 200fps resulting in having a framerate limit set to 60fps for most games. That aside, there was no reason for NVidia to be mentioned in this thread as these monitors don't support G-Sync. This thread also has nothing to do with the GPU market seeing as the news article is specifically about 3 new (and vastly overpriced) AOC monitors.

Of course, at no point did I refer to anyone as a "fanboy" nor did I make such an accusation of you but you respond to me with childish insults. I don't know how old you are and quite frankly I don't care. Childish behavior is damaging to a tech forum's reputation and credibility, regardless of the accuracy of the information.

You moved from AMD to NVidia. Good for you. I moved from NVidia back to AMD again in my main system. Made more sense than sticking with NVidia when I had already planned to switch to Ryzen upon its release.

I personally have no preference in regards to hardware. I buy what fits my needs at the time. My point still stands though. There was mention in the news post about a feature specific to AMD and the thread turned towards bashing AMD, as usual. Support, regardless of how limited, is still support. I could always throw my HD7950 in and test it, but what purpose would that serve aside from irritating me? I don't actually use FreeSync anyway.
sykozis:

I'm well aware of your posting history. You seem to completely miss my point here though. This thread actually has nothing to do with AMD. This thread concerns monitors from AOC. Yes, those monitors have "FreeSync" as a listed feature. Big deal. If people want a FreeSync monitor, so be it. I own one myself. I've had no issues with it.
The post of mine you had replied to was itself a reply to another member's post claiming that G-Sync is dead. It was a reply to an off-topic post where the member had demonstrated a particular obstinacy that misinforms the reader on these forums. I have thus chosen to correct him on the few points he had asked about regarding what G-Sync could possibly have over FreeSync.
However, in my case, FreeSync is completely useless since every game I play runs well in excess of 200fps resulting in having a framerate limit set to 60fps for most games. That aside, there was no reason for NVidia to be mentioned in this thread as these monitors don't support G-Sync. This thread also has nothing to do with the GPU market seeing as the news article is specifically about 3 new (and vastly overpriced) AOC monitors.
That's only because you are not aware of the latency advantage of FreeSync even when dealing with framerates close (almost equal) to the refresh rate, aside from the occasional stutter many games might experience every once in a while in V-Sync on scenarios. For one, when V-Sync gets engaged, input latency rises considerably. This latency can be reduced via a framerate limiter (in-game / RTSS) that reduces the number of buffered frames. The latency can then be reduced even further by making sure you never engage V-Sync - that is, by setting a framerate limit ever so slightly below your refresh rate (e.g. 58FPS) and then enabling V-Sync for frametime compensation in the case where a frame is rendered and presented slightly faster than 60Hz (in less than 16.67ms), since the framerate limiter is not completely accurate (and it cannot be on an OS that is not real-time). Frames that take slightly longer than 16.67ms would cause a stutter with V-Sync on; with FreeSync engaged, they are, for the most part, undetectable, unless they're a major spike (e.g. 50ms). All you have to do to reap the benefits of FreeSync even on a 60Hz display when your games are running well beyond that is to cap your framerate slightly below 60Hz (57-58FPS) and enable V-Sync. That's all.
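To put numbers on that capping advice, here is a minimal sketch in Python. It assumes an ideal framerate limiter; real limiters jitter, which is exactly why the post leaves V-Sync on as a safety net:

# Minimal sketch: why capping slightly below the refresh rate keeps frames
# inside the variable refresh window instead of engaging V-Sync's buffering.
REFRESH_HZ = 60
scanout_ms = 1000 / REFRESH_HZ  # 16.67ms between refreshes at 60Hz

for cap_fps in (60, 58, 57):
    frame_ms = 1000 / cap_fps
    # A frame slower than one scanout is handled by FreeSync; a frame at or
    # faster than one scanout can pile up against V-Sync and add latency.
    inside_vrr = frame_ms > scanout_ms
    print(f"{cap_fps} FPS cap: {frame_ms:.2f}ms/frame -> "
          f"{'stays in VRR range' if inside_vrr else 'can engage V-Sync'}")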
Of course, at no point did I refer to anyone as a "fanboy" nor did I make such an accusation of you but you respond to me with childish insults. I don't know how old you are and quite frankly I don't care. Childish behavior is damaging to a tech forum's reputation and credibility, regardless of the accuracy of the information.
By implication, when you say:
I really don't care to read any more because this thread appears to have turned into another pathetic AMD bashing thread, just like every other thread that mentions anything even remotely related to AMD. From what I've seen over the last few years, this is no longer a hardware enthusiast forum. It's an Intel/NVidia enthusiast forum. Any time AMD is mentioned to any extent, all the Intel or NVidia loyalists pop up and start bashing AMD. It's time you people grow up. You're actually damaging this forum's reputation.
What do you mean to tell me? It's pretty clear that you're claiming I'm an Nvidia fanboy. No need to lie about it, that's exactly what you meant.
You moved from AMD to NVidia. Good for you. I moved from NVidia back to AMD again in my main system.
See? You already know.
Made more sense than sticking with NVidia when I had already planned to switch to Ryzen upon its release.
How is that even related? What does the CPU choice have to do with the GPU choice? If anything, with Ryzen's slightly lower IPC and the need, here and there, for game patches in order to take advantage of the new architecture, an equivalent (performance identical) Nvidia card would arguably deliver a slightly better framerate in CPU-limited scenarios due to less driver overhead in DX11 games, which are still the majority of games in this day and age. That's aside from the obvious benefits of an AMD card: FreeSync support - which you don't utilize anyway, along with a better control panel and arguably a better feature set in some cases and higher VRAM on the 480 / 580 models, along with better / worse performance depending on the title, and the obvious benefits of an Nvidia card: higher efficiency, ShadowPlay, and G-Sync.
I personally have no preference in regards to hardware. I buy what fits my needs at the time. My point still stands though. There was mention in the news post about a feature specific to AMD and the thread turned towards bashing AMD, as usual.
The thread didn't turn into anything but a reply to someone falsely claiming that G-Sync is dead just because they are biased towards AMD and would not listen to a well-made argument in favor of the technology they unreasonably despise.
Support, regardless of how limited, is still support. I could always throw my HD7950 in and test it, but what purpose would that serve aside from irritating me? I don't actually use FreeSync anyway.
As is painfully obvious here, we are talking about G-Sync / FreeSync support for games. FreeSync is supported in windowed mode. However, there is no telling whether a graphics card that only supports FreeSync in video playback is able to control the entire desktop in variable refresh rate mode - if it were, what then would prevent it from supporting games in windowed mode? With the vast majority of our video watching being in a browser, and browsers not supporting exclusive fullscreen, how useful is video playback support, really? That is aside from the fact that enabling G-Sync / FreeSync for windowed mode applications sets the entire display to that refresh rate - good luck moving a mouse cursor at 24 / 30Hz - that is, if the display in question supports a 24Hz minimum / 30Hz minimum / LFC with a 2.5x range. For a 60Hz monitor, that's 24Hz - 60Hz, pretty much the maximum range you are likely to see on a 60Hz FreeSync display.
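For the range arithmetic at the end there, a quick Python sketch (again assuming the 2.5x LFC rule of thumb from earlier in the thread; the helper name is illustrative):

# Highest minimum refresh that still yields a 2.5x spread for LFC.
def lfc_required_min_hz(vrr_max_hz: float) -> float:
    return vrr_max_hz / 2.5

print(lfc_required_min_hz(60))   # 24.0 -> the 24Hz-60Hz range mentioned above
print(lfc_required_min_hz(144))  # 57.6 -> why ~57Hz minimums come up on 144Hz panels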
yasamoka:

That's only because you are not aware of the latency advantage of FreeSync even when dealing with framerates close (almost equal) to the refresh rate, aside from the occasional stutter many games might experience every once in a while in V-Sync on scenarios.
And there you go assuming again....
yasamoka:

What do you mean to tell me? It's pretty clear that you're claiming I'm an Nvidia fanboy. No need to lie about it, that's exactly what you meant.
You assumed I was referring to you because of the placement of the statement within the response.
yasamoka:

See? You already know.
Yes, I know you switched from AMD to NVidia. You did mention that you have a G-Sync monitor. You also made a self-inclusive statement in regards to testing G-Sync. There's also the "EVGA GTX 1080Ti SC" that appears under your avatar.
yasamoka:

How is that even related? What does the CPU choice have to do with the GPU choice? If anything, with Ryzen's slightly lower IPC and the need, here and there, for game patches in order to take advantage of the new architecture, an equivalent (performance identical) Nvidia card would arguably deliver a slightly better framerate in CPU-limited scenarios due to less driver overhead in DX11 games, which are still the majority of games in this day and age. That's aside from the obvious benefits of an AMD card: FreeSync support - which you don't utilize anyway, along with a better control panel and arguably a better feature set in some cases and higher VRAM on the 480 / 580 models, along with better / worse performance depending on the title, and the obvious benefits of an Nvidia card: higher efficiency, ShadowPlay, and G-Sync.
How is it related? I can get chipset and graphics drivers from a single website. It's called simplicity. If NVidia designed x86 CPUs, I'd build a system using CPU and GPU from NVidia. If Intel produced capable dedicated graphics cards (or even had a capable iGPU), I'd be using CPU and GPU from Intel. I don't have much free time so, the fewer websites I have to check for driver updates, the better.

In regards to FreeSync. It is enabled in the monitor. It's also enabled in Radeon Settings. I haven't disabled it because I see no need to. I play BF4, TL2 and when my son is here, I play Roblox with him. I've noticed no difference between this monitor at 60fps framerate limit with FreeSync enabled and the 24" display it replaced with the same 60fps framerate limit. Maybe there's a difference, maybe there isn't. It's more likely that I just don't care enough to worry about it though.

Much like my feelings in regards to the potential impact of FreeSync, I don't care enough to share everything I know with this or any other forum that I post on. Instead, I allow people to make assumptions in regards to what I know. I have nothing to prove to anyone on this or any other forum that I post on. I don't work in this industry and haven't since 2004 when I quit working as a consultant and system builder.

Besides, I find it entertaining when people like you start making assumptions about what I know, just as I found it entertaining when the last company I worked as a consultant for thought they would survive just fine on their automated backup systems after I quit. Had they hired someone to replace me, they might not have lost a multi-million dollar lawsuit in 2005.
yasamoka:

As is painfully obvious here, we are talking about G-Sync / FreeSync support for games. FreeSync is supported in windowed mode. However, there is no telling whether a graphics card that only supports FreeSync in video playback is able to control the entire desktop in variable refresh rate mode - if it were, what then would prevent it from supporting games in windowed mode? With the vast majority of our video watching being in a browser, and browsers not supporting exclusive fullscreen, how useful is video playback support, really? That is aside from the fact that enabling G-Sync / FreeSync for windowed mode applications sets the entire display to that refresh rate - good luck moving a mouse cursor at 24 / 30Hz - that is, if the display in question supports a 24Hz minimum / 30Hz minimum / LFC with a 2.5x range. For a 60Hz monitor, that's 24Hz - 60Hz, pretty much the maximum range you are likely to see on a 60Hz FreeSync display.
For me, the vast majority of video watching is done through Kodi or the HDHomerun View software. So, it doesn't affect me anyway. However, as I said, limited support is still support.
sykozis:

And there you go assuming again....
Yeah?
However, in my case, FreeSync is completely useless since every game I play runs well in excess of 200fps resulting in having a framerate limit set to 60fps for most games.
You clearly don't know about the latency advantage. Not an assumption. This isn't about your ego; it's fine not to know.
You assumed I was referring to you because of the placement of the statement within the response.
I'm regular enough with my comprehension to assume that when you quote my post then respond with an accusation levelled at a group of people, then I am included. Sheesh.
How is it related? I can get chipset and graphics drivers from a single website. It's called simplicity. If NVidia designed x86 CPUs, I'd build a system using CPU and GPU from NVidia. If Intel produced capable dedicated graphics cards (or even had a capable iGPU), I'd be using CPU and GPU from Intel. I don't have much free time so, the fewer websites I have to check for driver updates, the better.
Dat reason ... The last time I visited a website to update my drivers for my desktop was, like, never... Windows 10 takes care of that, as do GeForce Experience (Nvidia) and Crimson (AMD). So I really, really, really can't see your point. At all. Zilch. Basing a hardware purchase decision on whether you visit one website or two, if ever, after you install the hardware ...
In regards to FreeSync. It is enabled in the monitor. It's also enabled in Radeon Settings. I haven't disabled it because I see no need to. I play BF4, TL2 and when my son is here, I play Roblox with him. I've noticed no difference between this monitor at 60fps framerate limit with FreeSync enabled and the 24" display it replaced with the same 60fps framerate limit. Maybe there's a difference, maybe there isn't. It's more likely that I just don't care enough to worry about it though.
Back to point 1, where I lamented that you clearly did not know about the latency advantage of FreeSync, which I have explained in detail. You did not even bother to read how such an advantage is obtained, yet you still insist that I am "assuming". Alright.
For me, the vast majority of video watching is done through Kodi or the HDHomerun View software. So, it doesn't affect me anyway. However, as I said, limited support is still support.
No, it is not, and don't try to twist this any other way. Video playback support is pretty much equivalent to no support, and no amount of arguing semantics will change this. I have never even read talk of someone taking advantage of that video playback support, while on the other hand, I am one of the people who wished I actually had FreeSync support when I was on 7970s - which, by the way, would not even allow me to enable FreeSync for whatever reason after CRU EDID overrides to my Korean monitor - so there goes video playback support... Again, to repeat this for crystal clarity: FreeSync "Video Playback" support is completely pointless, especially if one's video watching is done in exclusive fullscreen (which is probably the only mode video playback support works in), at which point any respectable player (e.g. MPC-HC) actually has an automatic refresh rate changer depending on the framerate of the video content you are playing...
You really need to check your overly inflated ego. Not knowing and not caring are 2 completely different things. In this case, I simply don't care. If I cared, I'd have spent more than the 10 seconds it took to find the list of FreeSync supported graphics cards. You appear to have some desperate need to prove you know more than everyone else. Yes, Windows 10 can take care of driver updates. That's great. I prefer to control driver updates myself. Personal preference. As such, I don't expect you to understand it. For you to understand it, you'd have to get past your ego. Also, Windows isn't the only operating system I use....
sykozis:

You really need to check your overly inflated ego. Not knowing and not caring are 2 completely different things. In this case, I simply don't care. If I cared, I'd have spent more than the 10 seconds it took to find the list of FreeSync supported graphics cards. You appear to have some desperate need to prove you know more than everyone else.
What is blatantly obvious is that *you* are the one who has some desperate need to argue semantics in a topic you do not understand properly, despite essentially not caring about the topic.
Yes, Windows 10 can take care of driver updates. That's great. I prefer to control driver updates myself. Personal preference. As such, I don't expect you to understand it. For you to understand it, you'd have to get past your ego.
You are. Matching hardware brands. Because. You don't want to visit more than one website to. Download drivers. Which is done only a handful of times at most. During a hardware life cycle...
Also, Windows isn't the only operating system I use....
Same issue... You can't be serious, matching brands just so you don't visit more than one website for such a silly thing. You really can't be serious.