AOC CU34G2X monitor review




Ultrawide monitors are becoming more popular these days. In the PC world, the 21:9 aspect ratio covers resolutions such as 2560×1080 (WFHD) and 3440×1440 (WQHD), and the latter is what we will be talking about, because today we’re looking at the AOC CU34G2X monitor, introduced in August 2020. It isn’t the most recent display product around, but it deserves a closer look.
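As a quick aside, neither marketed "21:9" resolution is exactly 21:9. A few lines of Python (purely illustrative arithmetic, not from the review) make the actual ratios explicit:

```python
# Compare the exact aspect ratios of common "21:9" resolutions.
# 21:9 reduces to 7:3 ~ 2.333; both marketed resolutions are slightly wider.
resolutions = {
    "WFHD (2560x1080)": (2560, 1080),  # exactly 64:27 ~ 2.370
    "WQHD (3440x1440)": (3440, 1440),  # exactly 43:18 ~ 2.389
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w / h:.3f} (true 21:9 would be {21 / 9:.3f})")
```

In practice "21:9" is marketing shorthand; the CU34G2X's 3440×1440 panel is closer to 43:18.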
omnimodis78
Member
Posts: 92
Posted on: 02/13/2021 07:04 PM
I've been out of the monitor market for a few years, and as an early adopter of G-SYNC, I'm a little confused by this: "G Sync technology did work" - so is this monitor a G-SYNC monitor or not? Has this tech evolved where G-SYNC monitors no longer require the dedicated chip on the monitor side?
Venix
Senior Member
Posts: 1596
Posted on: 02/14/2021 08:55 PM
If your monitor supports FreeSync over DisplayPort, you can enable it; NVIDIA still calls it G-Sync. There are also monitors without modules that NVIDIA has certified as having good compatibility. If a monitor isn't certified, the driver will warn you that it is not supported and there might be issues, but from what I saw around the net it works fine the majority of the time. So try at your own risk: if you have a FreeSync monitor with DP and an NVIDIA card, just try it, no reason not to.
Pictus
Senior Member
Posts: 129
Posted on: 02/15/2021 05:17 PM
Thank you for the review!
A tip: in DisplayCAL, create a "Curves + Matrix" profile instead of "XYZ LUT + Matrix".
You can also provide the profile for people without calibration devices.

Most of the time I get better profiles by using the monitor's Custom Mode,
setting the luminance to 120 cd/m2 and gamma to 2.2, and adjusting the RGB channels to 6500K
with deviations as low as possible.
"XYZ LUT + Matrix" or "Curves + Matrix" profiles?
https://ninedegreesbelow.com/galleries/making-evaluating-monitor-profiles.html#what-kind-of-monitor-profile
For a gamer, I have no doubt "Curves + Matrix" profiles are much better!
If you are a photographer/retoucher (probably using a wide-gamut monitor) and
need color accuracy for better matching prints, XYZ LUT will give more precision.
"XYZ LUT + Matrix" profiles give better calibration results, but visually they
can be rougher or less pleasant to the eye...
They are also less compatible with some programs.
It would also be good to test the monitor for PWM, maybe with a simple test like this:
https://iristech.co/pwm-flicker-test/
Or something like this:
Why PWM is BAD! (at least low-frequency PWM is)
https://www.notebookcheck.net/Why-Pulse-Width-Modulation-PWM-is-such-a-headache.270240.0.html
Ah, Chris wrote this one, so he can answer that.
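For readers wondering what a "Curves + Matrix" profile actually stores: essentially one 1D tone curve per channel plus a 3×3 matrix mapping device RGB to XYZ. A rough illustrative Python sketch of that pipeline (not DisplayCAL's implementation; the matrix here is the standard sRGB-to-XYZ D65 matrix, and the curve is a plain gamma 2.2, where a real profile would use measured values):

```python
# Illustrative "Curves + Matrix" style transform: per-channel tone
# curves (here a plain gamma 2.2) followed by a 3x3 RGB-to-XYZ matrix.
# The matrix is the standard sRGB-to-XYZ (D65) matrix; a real profile
# would store curves and a matrix measured from the actual display.

GAMMA = 2.2

# Rows give the X, Y, and Z contributions of linear R, G, B.
RGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]

def curves_plus_matrix(rgb):
    """Apply per-channel tone curves, then the 3x3 matrix."""
    linear = [c ** GAMMA for c in rgb]                    # the "Curves" part
    return [sum(m * c for m, c in zip(row, linear))       # the "Matrix" part
            for row in RGB_TO_XYZ]

# Full white (1, 1, 1) should land on the D65 white point:
# X ~ 0.9505, Y = 1.0, Z ~ 1.0890
print(curves_plus_matrix([1.0, 1.0, 1.0]))
```

An "XYZ LUT" profile replaces this compact curves-plus-matrix model with a full 3D lookup table, which is why it can be more precise but also heavier and less widely supported.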
Senior Member
Posts: 1925
Buyer's remorse is still a thing, huh.