AMD FreeSync Review With the Acer XG270HU Monitor


How/why would it possibly support FreeSync?
Some stupid leaks on the internet are saying it's the same as the UE590 series, which supports it:
AMD today announced the consumer electronics industry's first-ever ultra-high-definition monitors to feature FreeSync technology. FreeSync enables dynamic refresh rates synchronized to the frame rate of AMD Radeon graphics cards and APUs to reduce input latency and reduce or eliminate visual defects during gaming and video playback. Samsung plans to launch the screen-syncing technology in March 2015, starting with the Samsung UD590 and UE850, and eventually across all of Samsung's UHD lineups.

"We are very pleased to adopt AMD FreeSync technology to our 2015 Samsung Electronics Visual Display division's UHD monitor roadmap, which fully supports open standards," said Joe Chan, Vice President of Samsung Electronics Southeast Asia Headquarters. "With this technology, we believe users including gamers will be able to enjoy their videos and games with smoother frame display, without stuttering or tearing on their monitors."

From what we have been told, there will be five monitors in total in March 2015 that support AMD FreeSync across these two series: the Samsung UD590 series will consist of 23.6" and 28" models, and the UE850 series will have 23.6", 27" and 31.5" models. The Samsung UD590 series currently consists of a 28-inch LED 4K UHD monitor with a native resolution of 3840 x 2160, a 1-millisecond (GTG) response time and a brightness of 370 cd/m². The price tag on the Samsung UD590 series U28D590D display is $558 at Best Buy or $599 at Newegg.

We are not sure if the models that you can buy today can be firmware-updated to support AMD FreeSync or if hardware changes are needed. Legit Reviews has reached out to AMD and asked them to clarify, since this series is already available and this announcement might cause some confusion. AMD recently stated that a FreeSync display would be out this year, so we are still hopeful that a non-Samsung-branded display that supports FreeSync will arrive in 2014. Read more at http://www.legitreviews.com/samsung-amd-freesync-supporting-displays-coming-2015_154351#tLXud2dxacZV5wqQ.99
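An aside on the mechanism being announced here: "dynamic refresh rates synchronized to the frame rate" means the display starts a refresh when the GPU finishes a frame, within whatever refresh window the panel supports, instead of scanning out on a fixed clock and forcing finished frames to wait (or tear). A minimal timing sketch of the difference, with a hypothetical 40-144 Hz panel window and made-up frame times:

[code]
# Minimal sketch contrasting fixed-refresh V-Sync with adaptive refresh.
# The panel window and frame times are illustrative, not from any real monitor.

FIXED_HZ = 60                       # classic fixed-refresh display
MIN_HZ, MAX_HZ = 40, 144            # hypothetical adaptive-sync window

def vsync_display_ms(frame_ready_ms):
    """V-Sync: a finished frame waits for the next fixed 1/60 s tick."""
    tick = 1000 / FIXED_HZ
    return (int(frame_ready_ms // tick) + 1) * tick

def adaptive_display_ms(frame_ready_ms):
    """Adaptive sync: scan out when the frame is ready, within the window."""
    earliest = 1000 / MAX_HZ        # the panel cannot refresh faster than this
    latest = 1000 / MIN_HZ          # past this the image decays (the low-end
                                    # case debated later in this thread)
    return min(max(frame_ready_ms, earliest), latest)

for ready_ms in (5.0, 14.0, 22.0):  # frame completion times, ms after last refresh
    print(f"frame ready at {ready_ms:4.1f} ms -> "
          f"v-sync displays it at {vsync_display_ms(ready_ms):4.1f} ms, "
          f"adaptive at {adaptive_display_ms(ready_ms):4.1f} ms")
[/code]

The waiting (or, with V-Sync off, the mid-scan tearing) is exactly the latency and visual-defect problem the press release refers to.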
The reality, though, is that both standards have become a bit proprietary: with FreeSync you must have a compatible AMD Radeon graphics card.
Isn't the monitor in the test just using the optional VESA Adaptive-Sync feature of the DisplayPort spec? There shouldn't be anything AMD-specific on the monitor side, so unlike Nvidia's G-Sync it is not GPU-bound proprietary hardware, regardless of the fact that AMD calls it FreeSync in their drivers.
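That reading is consistent with how Adaptive-Sync works at the protocol level: the monitor merely advertises its supported vertical refresh range, which any driver can read from the display's EDID. A rough sketch of extracting that range from a classic 128-byte EDID block; the descriptor offsets follow the EDID spec, but the sample bytes are invented:

[code]
# Rough sketch: read the advertised vertical refresh range from a classic
# 128-byte EDID block. Offsets follow the EDID spec; the sample bytes are
# invented for illustration.

def vertical_refresh_range(edid: bytes):
    """Return (min_hz, max_hz) from the Display Range Limits descriptor."""
    for offset in (54, 72, 90, 108):       # the four 18-byte descriptor slots
        block = edid[offset:offset + 18]
        # Display descriptors begin 00 00 00 <tag>; tag 0xFD is Range Limits.
        if block[0:3] == b"\x00\x00\x00" and block[3] == 0xFD:
            return block[5], block[6]      # min/max vertical rate, Hz
    return None                            # no range advertised

# Hypothetical descriptor advertising a 40-144 Hz range:
fake_edid = bytearray(128)
fake_edid[54:61] = b"\x00\x00\x00\xfd\x00" + bytes([40, 144])
print(vertical_refresh_range(bytes(fake_edid)))  # -> (40, 144)
[/code]

Nothing in that handshake is vendor-specific; what the driver does with the advertised range is where the FreeSync branding comes in.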
Nvidia Explains Why Their G-Sync Display Tech Is Superior To AMD's FreeSync
Forbes: Credit where credit is due: I give Nvidia props for inventing a solution to screen tearing, stutter, and input lag well before an open standard existed. But will we ever see a day when Nvidia ditches their proprietary solution in favor of Adaptive-Sync?

Tom Petersen: "When we invented G-Sync, we determined very early on that in order to accomplish everything we wanted, we needed to be on both sides of the problem — at the front end where we're controlling the GPU, and the back end inside of the monitor. As FreeSync comes to market, we'll be able to compare the different strategies and see which one's more effective. For us, having the module inside the panel allows us to deliver what we think is a very good experience across a full range of operating frequencies for refresh rate or framerate. We have some really significant technology inside that module dealing with the low end of refresh rates. So as a game transitions from 45fps down to 25fps and back, and games get really intense, our tech kicks in and delivers a smooth experience over that transition."

Forbes: Let's talk about the minimum response times that both G-Sync and Adaptive-Sync support.

Tom Petersen: "First of all, the Adaptive-Sync spec has no minimum. Both have the ability to communicate any range, so there's nothing about the base specs that are different. What's interesting, though, is the reason there are panel-specific refresh limits. LCD images decay after a refresh; you kind of paint the screen and it slowly fades. That fade is just related to the panel. The reason there's an Adaptive-Sync spec and a G-Sync module is because that lower limit is variable depending on the technology inside the panel. But games don't know about that! So what do you do when a game has a lower FPS than the minimum rate you want to run your panel at? Because when they run below that minimum rate, things start to flicker, and that's a horrible experience."

Forbes: So what specifically does Nvidia do to combat that?

Tom Petersen: "I can't go into too much detail because it's still one of our secret sauces. But our technology allows a seamless transition above and below that minimum framerate that's required by the panel. PC Perspective wrote an article guessing how we did that, and they're not that far off…"

Forbes: You've said in the past that one of the crucial goals of G-Sync was to never introduce screen tearing, no matter what.

Tom Petersen: "You never want to introduce stutter, either. It's a complex problem, which is why we think you need some of that secret sauce in both the driver and the module. In contrast, AMD's not doing that. As you transition from the high frequencies to the low frequencies of FPS, they have some jarringly negative experiences coming out of their zone. If you take any of their panels and run it from whatever frequency is in the zone to any frequency out of the zone at the low end, the experience is not good at all. Now, G-Sync addresses two problems at the low end: tearing and stutter. Stutter is caused by having a repeat of the exact same frame. You show all these new frames, then suddenly, if you're not able to keep up with the refresh rate minimum, you see that frame twice. That's what V-Sync does, repeating a frame. But when you repeat a frame, motion stops, and that's why you feel a stutter. G-Sync doesn't do that. It adjusts the refresh rate to keep it above that minimum rate, and we have other techniques like shifting and centering to avoid stutters.

It's a tough problem, which again requires the module inside the monitor. If you download our new G-Sync demo, which purposefully goes in and out of that zone, and run it on your AMD FreeSync monitor, you'll see that whether it's tearing or stuttering, the experience is not great. You can set whatever framerate range you want, and it operates across all monitors to compare the differences."

Ghost In The Machine

Tom Petersen: "There's also a difference at the high frequency range. AMD really has three ranges of operation: in the zone, above the zone, and below the zone. When you're above the zone, they have a feature which I like (you can either leave V-Sync on or V-Sync off) that we're going to look at adding, because some gamers may prefer that. The problem with high refresh rates is this thing called ghosting. You can actually see it with AMD's own Windmill demo or with our Pendulum demo. Look at the trailing edge of those lines and you'll see a secondary image following it."

Note to readers: I have seen this firsthand on the Acer FreeSync monitor I'm reviewing, and PC Perspective noticed it with two additional FreeSync monitors, the BenQ XL2730Z and LG 34UM67. To illustrate the problem, they recorded the aforementioned monitors running AMD's Windmill demo, as well as the same demo running on a G-Sync enabled Asus ROG Swift. Ignore the stuttering you see (this is a result of recording at high speed) and pay attention to the trailing lines, or ghosting. I agree that it's jarring by comparison.

Tom Petersen: "We don't do that. We have anti-ghosting technology so that regardless of framerate, we have very little ghosting. See, variable refresh rates change the way you have to deal with it. Again, we need that module. With AMD, the driver is doing most of the work. Part of the reason they have such bad ghosting is because their driver has to specifically be tuned for each kind of panel. They won't be able to keep up with the panel variations. We tune our G-Sync module for each monitor, based on its specs and voltage, which is exactly why you won't see ghosting from us. We also support the majority of our GPUs going back to Kepler. The 650 Ti Boost is the oldest GPU we support, and there are a lot of gaps in their GPU support. It's a tough problem and I'm not meaning to knock AMD, but having that module allows us to exercise more control over the GPU and consequently offer a deeper range of support."

[youtube]watch?v=-ylLnT2yKyA[/youtube]
http://www.forbes.com/sites/jasonevangelho/2015/03/23/nvidia-explains-why-their-g-sync-display-tech-is-superior-to-amds-freesync/2/ Sounds like we'll be hearing about more differences in time. I know this topic has been touched on, but it's nice to have more insight.
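For what it's worth, the PC Perspective guess Petersen alludes to is frame repetition: below the panel's minimum refresh, show each rendered frame an integer number of times so the effective refresh rate stays inside the panel's window. A toy sketch of that speculated approach (not Nvidia's actual algorithm; the panel limits here are hypothetical):

[code]
# Toy sketch of low-framerate frame repetition, the technique PC Perspective
# speculated G-Sync's module uses below the panel's minimum refresh rate.
# Not Nvidia's actual algorithm; the panel limits are hypothetical.

PANEL_MIN_HZ = 40
PANEL_MAX_HZ = 144

def refresh_plan(game_fps):
    """Pick a repeat count that keeps the panel inside its valid range."""
    repeats = 1
    # If the game renders slower than the panel's minimum, scan each frame
    # out multiple times to raise the effective refresh rate.
    while game_fps * repeats < PANEL_MIN_HZ:
        repeats += 1
    return repeats, min(game_fps * repeats, PANEL_MAX_HZ)

for fps in (144, 60, 45, 25, 12):
    n, hz = refresh_plan(fps)
    print(f"{fps:>3} fps -> show each frame {n}x, panel refreshes at {hz:.0f} Hz")
[/code]

Because each repeat still lands on a multiple of the game's own frame cadence, motion timing is preserved, which is the claimed advantage over V-Sync's fixed-interval frame repeats.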
Looking at the surface of the wings and the background of the footage, PCPer used different panel settings on the different monitors. The LG and Acer panels clearly have very elevated contrast, and multiple monitors show ghosting with those settings - it is a trait of the monitor rather than of FreeSync. Brad Chacos from PCWorld denied that his LG 34UM67 shows ghosting. http://www.pcworld.com/article/2900901/acers-500-amd-freesync-monitor-drastically-undercuts-nvidia-g-sync-pricing.html
The Swift and the BenQ both use the same panel. Honestly, the PC Perspective article is whatever, but in the comments one of their editors clarifies why it happens: G-Sync automatically buffers frames to prevent ghosting. A FreeSync monitor can do this too, but it requires extra hardware. The BenQ monitor does have an anti-blur feature, but it's disabled when FreeSync is active. He goes into far more detail in the comments, but that's basically the gist of it.
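Some context on why per-panel tuning keeps coming up in this argument: ghosting is conventionally fought with pixel overdrive, where the scaler drives a pixel past its target value so the slow liquid-crystal transition settles in time, typically via lookup tables tuned to the specific panel (and refresh rate, which is what variable refresh complicates). A toy illustration of the idea; the single gain constant is invented, not any vendor's real tuning:

[code]
# Toy illustration of pixel overdrive, the usual anti-ghosting technique.
# Real scalers use lookup tables tuned per panel and per refresh rate; the
# single gain constant here is invented for illustration.

OVERDRIVE_GAIN = 0.35  # hypothetical tuning constant

def overdriven_level(current: int, target: int) -> int:
    """Drive a pixel past its target so the slow LCD transition lands on it."""
    overshoot = (target - current) * OVERDRIVE_GAIN
    return max(0, min(255, round(target + overshoot)))

# Dark grey to light grey: drive brighter than the target value...
print(overdriven_level(64, 192))   # -> 237
# ...large steps clip at the rail; over-aggressive tuning is what shows up
# as inverse ghosting (bright coronas behind moving objects).
print(overdriven_level(0, 255))    # -> 255 (clamped)
[/code]

Tune the gain for 144 Hz and then run the panel at 50 Hz and the overshoot is wrong for the new frame time, which is one plausible reading of why ghosting shows up only in some variable-refresh scenarios.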
A comment explaining the cause of a problem that doesn't exist? There are many comments on OcUK from users who already own these panels. All of them deny that the ghosting problem exists.
Yeah, probably the same people that claimed they didn't see the RGB striping/incorrect pixel mapping problem on the Swift, then backpedaled weeks later when it became apparent that it's an intrinsic part of the panel. Some people need to justify their $$ purchases, I guess. Sorry, but I trust PC Perspective with a high-speed camera over subjective posts on a forum.
I trust random posters all the time. Just not today :wanker: Oh, and the guy you're responding to uses a funny definition of "ALL": http://forums.overclockers.co.uk/showthread.php?p=27816774
No other tech sites, no real users, just PCPer and NV card owners can see it? Interesting 🙂 Edit: after some reflection, I came up with an idea: Nvidia should gift every FreeSync monitor buyer a high-speed camera, so that everyone could see the inferiority of their newly acquired monitor :)
I think Nvidia is trying to do damage control. The bottom line is that manufacturers can now add whatever features they want to a FreeSync monitor, like anti-ghosting etc. Consumers can buy what they want, as it's an open standard. Sorry Nvidia, but FreeSync has the better future.