LG 34UM67 AMD FreeSync Monitor Review


Tested & Reviewed

LG 34UM67 With FreeSync / Adaptive Sync 
Gaming without screen tearing or sync stutter

In this review we will test the FreeSync-compatible LG 34UM67, a 34-inch 2560x1080 ultra-wide screen priced at 579 EUR / 599 USD. AMD tackles stutter and tearing while gaming with a different approach, provided you build a setup with the right combination of hardware. The LG 34UM67 is a truly lovely looking, extra-wide IPS monitor with great image quality.

When NVIDIA announced G-Sync, AMD quickly realized it already had something similar available, hidden and harbored deep within a technology originally intended for laptops. To jump on that bandwagon AMD figured: hey, if we can get manufacturers to build monitors that can handle a dynamic refresh rate, and develop FreeSync into a VESA standard, then you would get the same experience as G-Sync offers, but at lower cost, as you do not need an expensive G-Sync module - right?

Here is the underlying problem: the graphics card is a dynamic device that outputs its frames at a varying FPS, while your monitor has a static refresh rate (Hz), and the two don't really match up. To get the maximum out of your graphics card you can turn off VSync, but that results in multiple rendered images within a single displayed frame, and the overlapping difference is what you see as screen tearing. With VSync activated something similar happens: the graphics card tries to stay as close to 60 FPS as possible, but if it is incapable of sustaining 60 FPS you see an effect that we call soft sync stuttering. Soft sync stuttering is relative, honestly; screen tearing, however, is just a nasty thing, so tearing is the biggest culprit. The hardcore FPS gamer obviously wants an extremely high framerate, and for these frag-masters the alternative is simply disabling VSync. But with that same 35 FPS framerate on a 60 Hz screen, you'd see visible screen tearing. This is also why framerate limiters are so popular, as you try to keep each rendered frame in line with your monitor's refresh rate. These are the main reasons for all sorts of screen anomalies. Ever since the start of the 3D graphics revolution we have simply gotten used to these sync stutters and screen tearing, and to compensate we have been purchasing extremely fast dedicated graphics cards to be able to match that screen refresh rate as closely as possible.
Over the years the industry has tried to solve problems like VSync stutter and tearing in basically two ways. The first is to simply ignore the refresh rate of the monitor altogether and update the image being scanned out to the display mid-cycle; this you all know as "VSync Off Mode", and it is the default way most FPS gamers play. The second is VSync itself, which holds each rendered frame until the next refresh. If you were to freeze the display at 1 Hz with VSync off, this is what you would see: the epitome of graphics rendering evil, screen tearing. We have taken screen tearing and VSync stutter for granted for many years, but anno 2015 there now are solutions for it.
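The mismatch described above is easy to model. The toy script below simulates what happens with VSync off when a GPU renders at 35 FPS on a 60 Hz screen: each new frame gets flipped in mid-scanout, and the scanline where the swap lands is where the tear appears. All numbers are illustrative, not measurements from the LG 34UM67.

```python
# Toy model of screen tearing with VSync off: a new frame is flipped
# to the display the moment it is ready, wherever the scan-out
# currently is, so a visible tear line appears at that scanline.

REFRESH_HZ = 60   # static monitor refresh rate
FPS = 35          # GPU render rate, out of step with the monitor
LINES = 1080      # vertical resolution being scanned out

scanout_period = 1.0 / REFRESH_HZ   # time to scan out one full screen
frame_period = 1.0 / FPS            # time between rendered frames

# Walk through the first few rendered frames and report the scanline
# at which each one is flipped in, i.e. where the tear shows up.
for n in range(1, 6):
    t = n * frame_period                            # moment frame n is ready
    phase = (t % scanout_period) / scanout_period   # progress through scanout
    tear_line = int(phase * LINES)
    print(f"frame {n}: tear at scanline {tear_line}")
```

Because 35 does not divide evenly into 60, the tear line drifts to a different position every frame, which is exactly why tearing is so visible at that framerate.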

NVIDIA has G-Sync and AMD now has FreeSync, a technology that eliminates the problems that come with VSync (both on and off) versus what is displayed on your monitor. Basically, FreeSync is a synonym for Adaptive Sync, a technology that was developed for the mobile market years ago; with desktop monitors not being compatible, nobody paid any attention to it. Once FreeSync was announced, AMD did three things:

  1. Further develop FreeSync as a monitor protocol and enable it in drivers
  2. Talk to the big monitor players in the market to get support
  3. Get Adaptive Sync / FreeSync accepted as a VESA standard
AMD succeeded on all three points, as we'll show you today.

DisplayPort 1.2a & eDP - The Adaptive Sync Protocol

The FreeSync / Adaptive Sync protocol is embedded as a VESA protocol into DisplayPort 1.2a and eDP (embedded DisplayPort). There is a catch though: monitor manufacturers are free to decide whether or not to support the technology. The Video Electronics Standards Association (VESA) announced the addition of "Adaptive-Sync" to its popular DisplayPort 1.2a video interface standard. So you know, technology-wise NVIDIA does more or less the same thing, yet it requires a 200 USD module to be embedded into a compatible monitor. It is still uncertain whether NVIDIA is going to support the actual Adaptive Sync standard; the latest 900 series cards have not been made compatible.

AMD on its end claims that as long as a manufacturer sticks to the DisplayPort 1.2a standard, the monitor can support FreeSync / Adaptive Sync, and as such that support would be free. So are G-Sync and Adaptive Sync the same thing? Yes and no... basically NVIDIA requires you to have an extra 150~200 USD part in your monitor. Of course NVIDIA could support Adaptive Sync easily at driver level (in fact a laptop driver with Adaptive Sync renamed as Mobile G-Sync leaked a little while ago), but right now they feel that G-Sync is the better working alternative. So we know that NVIDIA could support Adaptive Sync; however, right now they earn quite a bit of revenue from that 200 USD PCB+IC, so we doubt it will happen anytime soon. And yes, there are subtle differences between the two technologies, but we doubt they are big enough to make a substantial difference. In the end we can only hope that Adaptive Sync will become a standard in monitors; if everybody supports it, prices will come down fast.

The LG monitor we test today, model 34UM67, is equipped with a 34-inch IPS panel that has a native 2560x1080 pixel "ultra-wide" resolution and a maximum refresh rate of 75 Hz. The FreeSync range, however, is limited: it is active only between 48 Hz and 75 Hz.
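That 48-75 Hz window matters in practice: inside it the panel refreshes in step with the GPU, while outside it the driver has to fall back to classic VSync on/off behavior. The sketch below illustrates the idea for this monitor's range; the fallback wording is our illustration, not AMD's actual driver logic.

```python
# Sketch of a FreeSync variable-refresh window, using the 48-75 Hz
# range of the LG 34UM67. The fallback policy outside the window is
# illustrative only, not AMD's real driver implementation.

FREESYNC_MIN_HZ = 48
FREESYNC_MAX_HZ = 75

def refresh_behaviour(fps: float) -> str:
    """Describe what the display does for a given GPU frame rate."""
    if FREESYNC_MIN_HZ <= fps <= FREESYNC_MAX_HZ:
        # In range: the panel's refresh tracks the GPU frame rate.
        return f"adaptive: panel refreshes at {fps:.0f} Hz"
    if fps > FREESYNC_MAX_HZ:
        # Above range: frames are capped, or tear with VSync off.
        return "above range: capped at 75 Hz, or tearing with VSync off"
    # Below range: back to the old VSync stutter / tearing trade-off.
    return "below range: falls back to VSync stutter or tearing"

for fps in (35, 48, 60, 75, 90):
    print(f"{fps:>3} FPS -> {refresh_behaviour(fps)}")
```

Note that 35 FPS, the framerate from our tearing example earlier, falls below this monitor's window, so a demanding game can still drop out of the adaptive range.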

