AMD Shows First Displays with FreeSync Support

Now I'd like them to be available this year, supported by both NV and AMD. Also, before buying one or the other, I'd rather see some reviews including frame jitter, overhead, and end-to-end lag measurements.
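As a side note on how such a measurement could work (a minimal sketch with made-up names and data, not any reviewer's actual methodology): frame jitter is just the spread of frame-to-frame intervals, so a trace of presentation timestamps is enough to quantify it.

```python
# Minimal jitter computation over a trace of frame presentation timestamps.
# The trace below is invented data; real reviews would capture timestamps
# in hardware (e.g. with a high-speed camera) rather than in software.
import statistics

def frame_jitter_ms(timestamps_ms):
    """Jitter = standard deviation of the frame-to-frame intervals."""
    intervals = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    return statistics.stdev(intervals)

# A nominally 60 fps trace (16.7 ms target) with one stutter at the 5th frame.
trace = [0.0, 16.7, 33.4, 50.1, 83.5, 100.2]
print(f"jitter: {frame_jitter_ms(trace):.2f} ms")
```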
If I'm not mistaken, FreeSync requires DisplayPort 1.2a. Darn, I guess even with FreeSync I'll need to buy a new monitor.
The FreeSync Off monitor is completely different. They could have used a really crap monitor for that instead... Bad comparison is bad!
> FreeSync requires DisplayPort 1.2a. Darn, I guess even with FreeSync I'll need to buy a new monitor.

Yep. I personally won't be bothering with a new monitor until 2020 or so, since I bought my current EIZO CX240 not long ago.
It isn't supposed to improve image quality, so it doesn't matter if it's a super IPS or a crappy TN monitor; it should do the same thing, and that is removing screen tearing.
> The FreeSync Off monitor is completely different. They could have used a really crap monitor for that instead... Bad comparison is bad!
Should have used a CRT for FreeSync OFF 😀
I do wonder if there will be much power saving coming from this technology on the monitor side. I can understand it will allow graphics cards to idle at lower wattage because they're not having to pump out 60 fps when you're browsing static web pages or using Word, etc. There needs to be more motivation for monitor manufacturers to get on board. Laptops and other mobile devices, on the other hand, would probably have their battery life extended considerably, and IMO they should be the technology to lead the charge on this one.
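To illustrate the idle-wattage point (a hypothetical sketch with made-up numbers, not anything from the Adaptive Sync spec): with a variable refresh range, the driver only has to refresh as fast as the content actually changes.

```python
# Hypothetical refresh-rate policy for a variable-refresh panel: hold the
# panel at its minimum rate when nothing on screen has changed, otherwise
# track the real render rate. The range values are illustrative.
MIN_HZ, MAX_HZ = 30, 60

def pick_refresh_hz(frames_rendered_last_second, content_changed):
    if not content_changed:
        return MIN_HZ  # static web page / Word document: idle at the minimum
    # Otherwise follow the actual render rate, clamped to the panel's range.
    return max(MIN_HZ, min(MAX_HZ, frames_rendered_last_second))

print(pick_refresh_hz(3, content_changed=False))   # 30 -> less GPU/panel work
print(pick_refresh_hz(47, content_changed=True))   # 47 -> refresh tracks rendering
```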
> The FreeSync Off monitor is completely different. They could have used a really crap monitor for that instead... Bad comparison is bad!
> Yep. I personally won't be bothering with a new monitor until 2020 or so, since I bought my current EIZO CX240 not long ago.

I just swapped my display for another IPS because the original had quirky overdrive (I got tired of doing the same button-mashing sequence every time I powered it on...). But anyway, I'll stick with my ASUS MX239H (76 Hz FTW!) until there are ~24"-27" 4K screens that can do better than 60 frames per second and have variable refresh rate support. Also, no TN crap. It will probably take quite some time...
> Seems Nvidia went to a lot of effort for no reason with G-Sync; well, no reason other than to exclude AMD (or charge hefty licensing).

You could say the same about AMD's Mantle, and how DX12, OpenGL 5, and Apple's "Metal" are making it obsolete. The stupid thing is AMD did this whole FreeSync/G-Sync thing a while ago, except it was meant only for laptops as a way to help save power. Anyway, considering this isn't supposed to actually improve the graphics, I think it's a funny marketing ploy that AMD's presentation shows better contrast on the monitor with FreeSync on.
Wonder if some manufacturers will release firmware updates for their current monitors.
> Wonder if some manufacturers will release firmware updates for their current monitors.

I doubt a DisplayPort 1.1 monitor can be firmware-upgraded to 1.2a. Pretty sure it's still hardware-based.
> The FreeSync/Adaptive Sync protocol will be embedded into DisplayPort 1.2a and eDP (embedded DisplayPort). There is a catch, though: monitor manufacturers are free to decide whether or not to support the technology.

That's fine. Gamers are free to avoid monitors from manufacturers that don't support Adaptive Sync... It would be more profitable for them to support it, and early adopters will get the most attention.
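Since support is decided per monitor, the sink has to advertise the capability for the GPU to use it. As a hedged sketch (the read_dpcd helper and the fake table are mine, and this is not AMD's actual driver code): DisplayPort sinks expose an "MSA timing parameters ignored" bit in their DPCD capability registers, which is the kind of flag a variable-refresh driver can probe.

```python
# Probe a DisplayPort sink for a variable-refresh prerequisite: the
# "MSA timing parameters ignored" capability bit, which in the DP spec
# lives in DPCD register 0x007 (bit 6). read_dpcd() is hypothetical.
DPCD_DOWN_STREAM_PORT_COUNT = 0x007
MSA_TIMING_PAR_IGNORED = 1 << 6

def supports_adaptive_sync(read_dpcd):
    """read_dpcd(addr) -> int: one byte from the sink's DPCD."""
    return bool(read_dpcd(DPCD_DOWN_STREAM_PORT_COUNT) & MSA_TIMING_PAR_IGNORED)

# A fake DPCD table standing in for a monitor that advertises the bit.
fake_dpcd = {DPCD_DOWN_STREAM_PORT_COUNT: MSA_TIMING_PAR_IGNORED | 0x01}
print(supports_adaptive_sync(fake_dpcd.get))  # True
```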
I bet Nvidia won't use this technology simply because they're too proud of G-Sync; they may even add a new feature (i.e., gimmick) and use that to justify the existence of G-Sync. Another thing is the cost. G-Sync was never cheap to begin with and now looks like a complete rip-off next to FreeSync. I doubt Nvidia will have enough time to recoup the cost of bringing G-Sync to market. edit: But kudos to Nvidia for actually looking at how to fix screen tearing and coming up with a solution, even though it costs as much as an entry-level GPU.
Damn, an optional standard means many will avoid it or charge more for its inclusion. I'd need to see it in a TV, though.
> The FreeSync Off monitor is completely different. They could have used a really crap monitor for that instead... Bad comparison is bad!
> Yep. I personally won't be bothering with a new monitor until 2020 or so, since I bought my current EIZO CX240 not long ago.

The "bad" monitor you are talking about looks to be the 27" ASUS, most likely the PB278 PLS. Anyway, it doesn't affect FreeSync, since image quality is not its purpose. Dave Baumann:

> The demo was a full "FreeSync" demo, i.e. controlled variable refresh rate. DisplayPort Adaptive Sync is not, however, FreeSync; it is purely part of the ecosystem specification that enables FreeSync to work. FreeSync uses the specification plus GPU hardware and software to sync the refresh rates. During display initialisation the monitor EDID will send back the timing ranges available on the monitor to the GPU, and the GPU drivers will store these for operation and control over when to send the VBLANK signal. During a game the GPU will send a VBLANK signal when a frame is rendered and ready to be displayed; if a frame is taking longer to render than the lowest refresh allows, the prior frame will be resent, only to be updated with the new frame as soon as it is finished within the timing range.
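For what it's worth, the control flow Baumann describes can be sketched in a few lines (my own names and example numbers, not AMD's driver logic): the EDID-reported range bounds how long the GPU may hold a frame and when it must resend one.

```python
# Rough sketch of the VBLANK scheduling described above. The refresh range
# would come from the monitor's EDID at display initialisation; the values
# here are illustrative.
MIN_HZ, MAX_HZ = 40, 144
MAX_WAIT_MS = 1000.0 / MIN_HZ   # longest the panel can hold one frame
MIN_WAIT_MS = 1000.0 / MAX_HZ   # shortest allowed frame-to-frame gap

def next_scan_out(frame_ready, ms_since_last_vblank):
    """Decide what the GPU sends at this instant."""
    if frame_ready and ms_since_last_vblank >= MIN_WAIT_MS:
        return "VBLANK + new frame"              # frame finished inside the range
    if ms_since_last_vblank >= MAX_WAIT_MS:
        return "VBLANK + resend previous frame"  # renderer too slow: repeat old frame
    return "wait"                                # hold until ready or until timeout

print(next_scan_out(frame_ready=True,  ms_since_last_vblank=10.0))   # new frame
print(next_scan_out(frame_ready=False, ms_since_last_vblank=26.0))   # resend
```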
> I bet Nvidia won't use this technology simply because they're too proud of G-Sync; they may even add a new feature (i.e., gimmick) and use that to justify the existence of G-Sync. Another thing is the cost. G-Sync was never cheap to begin with and now looks like a complete rip-off next to FreeSync. I doubt Nvidia will have enough time to recoup the cost of bringing G-Sync to market. edit: But kudos to Nvidia for actually looking at how to fix screen tearing and coming up with a solution, even though it costs as much as an entry-level GPU.

Nvidia's solution is expensive because it uses an FPGA instead of an ASIC, which would be significantly cheaper. They did this so they can deliver firmware updates in the future. They've already talked about removing polling and other things to help reduce latency and performance impact. It's also possible they can come up with some other neat tricks via firmware.
The monitor on the left is an already existing 60 Hz monitor, but it was built with hardware capable of pulling this off, so all it took was a firmware update. In the YouTube video I noticed that the FreeSync screen has a built-in webcam at the top center. I guess it may be a bit hard to identify which screen it is from this and other physical features, but not impossible. Another thing the guy stated in the video: FreeSync can go from 9 to 240 Hz ATM.
Edit: Based on the external power brick and stand, it is a Nixeus. In some pictures you can see the button labels at the bottom right, but none are clear enough to make out the language and confirm the exact model.
Edit2: Found it: NX-VUE27D, 2560x1440 S-IPS (eBay/Newegg ≈ $400).
It doesn't have to be a purely software upgrade. I can remember NV showcasing G-Sync with an ASUS 32" 4K monitor, which was hacked, with a gaping hole at the rear and cables and PCBs sticking out.