ASUS Releases TUF Gaming VG279QL1A Gaming Monitor

Loobyluggs:

Review, HH; it seems this is a swansong monitor for 1080p? Many boxes ticked here.

Asus? I'm sure it's way overpriced.

itpro:

HDR400 is a scam. All new games like RDR2 and Gears 5 assume a minimum HDR peak brightness of 500/600 nits. I think sites should mention this in every article. False marketing is bad; 400 nits was never HDR.

Loobyluggs:

itpro said:

HDR400 is a scam. All new games like RDR2 and Gears 5 assume a minimum HDR peak brightness of 500/600 nits. I think sites should mention this in every article. False marketing is bad; 400 nits was never HDR.

I stand to be corrected here, but isn't "HDR" listed by the games company as compatibility with a colour-space size/spread? So, like a mastering process?

itpro:

Loobyluggs said:

I stand to be corrected here, but isn't "HDR" listed by the games company as compatibility with a colour-space size/spread? So, like a mastering process?

Not really; you're both wrong and correct. HDR does use a wide colour gamut (WCG) beyond sRGB, but a monitor also has an absolute maximum brightness, whether it has local dimming or not. The highlight areas an app or game renders are designed around a corresponding nit level. Meaning: if you force more nits than the monitor's brightness can actually deliver, you lose image clarity, with detail loss, white crush, contrast and shadow errors, etc. For example, on my 600-nit monitor I can safely choose anything from 500 to 700 nits, no worries.

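A minimal Python sketch of that white-crush point (illustrative numbers, not any vendor's actual pipeline): it hard-clips content luminance at the panel's peak, so highlights mastered above that peak all collapse into the same white.

```python
# Minimal sketch of "white crush": luminance in nits (cd/m^2),
# hard-clipped at the panel's peak brightness. Real tone mappers
# roll highlights off more gently, but the ceiling is the same.

def naive_clip(content_nits, display_peak_nits):
    """Hard-clip content luminance to what the display can show."""
    return min(content_nits, display_peak_nits)

highlights = [450, 550, 650]  # three distinct highlight levels in a scene

# On a 400-nit panel all three collapse to the same white; the detail
# that distinguished them is gone:
print([naive_clip(n, 400) for n in highlights])  # [400, 400, 400]

# A 600-nit panel keeps most of the gradation:
print([naive_clip(n, 600) for n in highlights])  # [450, 550, 600]
```
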
1080p and HDR 400 lol...

gx-x:

itpro said:

HDR400 is a scam. All new games like RDR2 and Gears 5 assume a minimum HDR peak brightness of 500/600 nits. I think sites should mention this in every article. False marketing is bad; 400 nits was never HDR.

Well, it is now; look up HDR 400. Besides, you don't want 400 nits blasting you while browsing. A calibrated screen would be at 120 cd/m², gamma 2.2, colour temperature 6500 K. Then, if you had real HDR, you would have it enabled for movies and such.

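As a rough sketch of what that SDR calibration target means in practice, assuming a simple power-law response (the numbers are illustrative):

```python
# Minimal sketch of the SDR calibration target mentioned above:
# white at 120 cd/m^2 with a gamma 2.2 transfer curve. Illustrative only.

PEAK_NITS = 120.0  # calibrated white level, cd/m^2
GAMMA = 2.2

def sdr_luminance(signal):
    """Map a normalized video signal (0..1) to displayed luminance in nits."""
    return PEAK_NITS * signal ** GAMMA

for s in (0.25, 0.50, 0.75, 1.00):
    print(f"signal {s:.2f} -> {sdr_luminance(s):5.1f} nits")
# signal 0.50 -> ~26 nits: the curve reserves most of the range for highlights
```
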
Loobyluggs:

itpro said:

Not really; you're both wrong and correct. HDR does use a wide colour gamut (WCG) beyond sRGB, but a monitor also has an absolute maximum brightness, whether it has local dimming or not. The highlight areas an app or game renders are designed around a corresponding nit level. Meaning: if you force more nits than the monitor's brightness can actually deliver, you lose image clarity, with detail loss, white crush, contrast and shadow errors, etc. For example, on my 600-nit monitor I can safely choose anything from 500 to 700 nits, no worries.

So the games company doesn't have to do anything to switch on HDR for you? It's automatic for the monitor?

itpro:

gx-x said:

Well, it is now; look up HDR 400. Besides, you don't want 400 nits blasting you while browsing. A calibrated screen would be at 120 cd/m², gamma 2.2, colour temperature 6500 K. Then, if you had real HDR, you would have it enabled for movies and such.

No. In Windows 10 you set it to around 50% on average and it's great. So if my HDR brightness were low, I wouldn't enjoy everything. It isn't worth switching HDR on and off in the OS if you've got a nice monitor.

Loobyluggs said:

So the games company doesn't have to do anything to switch on HDR for you? It's automatic for the monitor?

Quite the opposite. The signalling isn't automatic, meaning you must manually switch on HDR from the OS, the video application, or the game. Then, to get a calibrated image, you must edit settings everywhere, from the GPU vendor's control panel to the in-game menu. Only HDR10/DV movies are ready to play correctly out of the box.

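For context on why HDR10 content can "ask" for more nits than any monitor delivers: HDR10 encodes absolute luminance on the SMPTE ST 2084 (PQ) curve, up to 10,000 nits. A small Python sketch of the published PQ EOTF:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized HDR10 code value (0..1)
# to absolute luminance in nits, up to 10,000. Constants are from the
# published standard; the helper name is just illustrative.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_to_nits(code):
    """ST 2084 EOTF: normalized signal -> luminance in cd/m^2."""
    e = code ** (1 / M2)
    return 10000 * (max(e - C1, 0) / (C2 - C3 * e)) ** (1 / M1)

print(round(pq_to_nits(0.50)))  # ~92 nits
print(round(pq_to_nits(0.75)))  # ~1000 nits: already beyond most monitors
print(round(pq_to_nits(1.00)))  # 10000 nits: no consumer panel gets close
```

So a disc or game can legitimately carry highlights mastered at 1,000 nits and up, and it is the display (or player) that has to tone-map them down to what the panel can show.
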
gx-x:

Look, it's simple: there are two ways to properly calibrate a display: 1. for Windows, 2. for Mac. Both will use 120 cd/m²; neither will use more. You like what you like, that's fine, but that's not colour accurate.

itpro:

gx-x said:

Look, it's simple: there are two ways to properly calibrate a display: 1. for Windows, 2. for Mac. Both will use 120 cd/m²; neither will use more. You like what you like, that's fine, but that's not colour accurate.

It is beyond accurate. HDR usually enables a wide colour gamut (DCI-P3), which is larger and more accurate than sRGB/Adobe RGB. What are you talking about? cd/m² means nothing. Nobody can live with a monitor dimmer than a mobile phone. Eyes hurt more from lower brightness than from higher.

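For what it's worth, the "larger" part of that claim is easy to quantify from the standard published chromaticities of each gamut's primaries; a short Python sketch (note that a bigger gamut triangle says nothing by itself about accuracy, which is the point being disputed):

```python
# Sketch: how much "wider" DCI-P3 is than sRGB, using the CIE 1931 xy
# chromaticities of each gamut's RGB primaries (standard published values).
# A bigger triangle means more saturated colours are representable; it does
# not by itself make a display more accurate.

def triangle_area(primaries):
    """Shoelace formula for the area of the RGB primary triangle in xy."""
    (rx, ry), (gx, gy), (bx, by) = primaries
    return abs(rx * (gy - by) + gx * (by - ry) + bx * (ry - gy)) / 2

SRGB = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
DCI_P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

print(triangle_area(DCI_P3) / triangle_area(SRGB))  # ~1.36 in xy terms
```
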
gx-x:

That's just not true. It's not how that works at all.

Loobyluggs:

So what you're saying is, the marketing BS from games companies, monitor makers, and movie studios is contradictory and we shouldn't worry about it?

gx-x:

If a monitor does 400 nits while its darkest black is 0.4 nits, you are looking at a washed-out picture. Calibration is required, and usually after calibration the monitor will no longer be showing 400 nits (or whatever the marketed value is; think "dynamic contrast" marketing, it's just that, marketing) but will have darker blacks, proper contrast and colours, and an overall much better and more vivid picture than when it was doing 400 nits for the sake of doing 400 nits. Of course, some monitors will show 400 nits (again, insert whatever value) and have good blacks, so it varies, really.

As was said, there is a "pin" in HDR content that tells the display to turn on HDR mode. In, say, the latest Doom you can turn HDR on/off. Movies should have a "pin" in the signal to always tell the display that the movie is HDR and negotiate whether it should play as HDR or SDR. But that's an entirely different topic. 🙂

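To put rough numbers on that contrast point (values illustrative):

```python
# Sketch of static contrast: peak white divided by black level.
# 400 nits over a 0.4-nit black is only 1000:1, which is why a bright
# panel can still look washed out; deeper blacks matter as much as nits.

def contrast_ratio(peak_white_nits, black_level_nits):
    return peak_white_nits / black_level_nits

print(contrast_ratio(400, 0.40))  # 1000.0  -> 1000:1
print(contrast_ratio(400, 0.04))  # 10000.0 -> same peak, much deeper image
```
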
LCD... 1080p... It's starting to feel like 14nm++++++ here.

itpro:

K.S. said:

It's the opposite. It's the muscles surrounding your eyes, the ones that help with focus, that strain and thus feel more sore from lower brightness. Your optic nerve, iris, cornea, etc. will suffer from higher brightness over extended periods, on top of too much, too fast, too bright (impact flash).

It depends on exposure time. It's worse to strain your eyes for 8 hours at low brightness, trying hard to distinguish anything, than to take an occasional 8-second peak-brightness hit here and there.

itpro:

K.S. said:

Ahh, shall we call it a draw then? Well fought! 😛 :) Just don't go staring into any monitors now, kid, you'll poke your eye out!

Haha, I await the day we'll wear sunglasses to watch indoors. Just remember to mention me. :P 😀