2016 model Samsung TVs to support YouTube HDR
Mannerheim
HL2 has been supporting HDR for ages with all monitors..
ttnuagmada
Mannerheim
Denial
Mannerheim
Denial
It's not the same. It's like comparing a 4K downsample on a 1080p monitor to a real 4K screen. It offers some advantages, but it isn't replacing a 4K screen.
For starters, color banding is significantly reduced on a 10-bit display:
http://i.imgur.com/lWqWHnd.png
http://abload.de/img/8-bite8xve.jpg - 8-bit
http://abload.de/img/10-bit6uybq.jpg - 10-bit
You also get access to a significantly wider color spectrum, especially reds, which are noticeably different in HDR.
https://i.imgur.com/OdQFWJC.png
Some more information about how HL2 and other games use it:
In the end it's not just marketing. I have a Samsung 8500 with HDR support, and while Marco Polo (a Netflix show that supports HDR) isn't that great, Obduction in HDR is beautiful. I'll be buying the first QHD/4K 144Hz HDR monitor that's available, regardless of the price.
Edit: Additional reading for those interested
https://developer.nvidia.com/implementing-hdr-rise-tomb-raider
https://developer.nvidia.com/preparing-real-hdr
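To make the banding point concrete, here's a quick illustrative sketch (not taken from any of the links above): quantize the same smooth gradient to 8 and 10 bits per channel and count how many distinct steps survive.
```cpp
// Sketch: quantize a smooth gradient to 8-bit and 10-bit codes and count
// how many distinct steps remain. Fewer steps across the same range is
// what shows up on screen as visible banding.
#include <cmath>
#include <cstdio>
#include <set>

// Quantize a normalized value in [0,1] to an integer code with 'bits' of precision.
static int quantize(double v, int bits) {
    const int levels = (1 << bits) - 1;          // 255 for 8-bit, 1023 for 10-bit
    return static_cast<int>(std::round(v * levels));
}

int main() {
    const int samples = 100000;                  // a finely sampled gradient
    std::set<int> steps8, steps10;
    for (int i = 0; i < samples; ++i) {
        double v = static_cast<double>(i) / (samples - 1);
        steps8.insert(quantize(v, 8));
        steps10.insert(quantize(v, 10));
    }
    // 8-bit keeps 256 distinct steps, 10-bit keeps 1024 -- four times finer
    // gradation across the same range.
    std::printf("8-bit steps:  %zu\n", steps8.size());
    std::printf("10-bit steps: %zu\n", steps10.size());
}
```
Same idea scaled up is why a 10-bit gradient (a sky, a spotlight falloff) looks smooth where the 8-bit version shows visible bands.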
ivymike10mt
Xionor
JonasBeckman
The image is just showcasing how the difference would look. An actual 6-bit, 8-bit, 10-bit or whatever check (usually a grayscale-type image with clearly defined borders between each color grade) would be done from a single image, and how that then looks would depend on the display. 🙂
It's also further complicated by the whole 6+2 = 8 and 8+2 = 10 bit stuff that a lot of monitors do. (A true 10-bit panel, well, you'd probably notice that from the price point alone, I'm guessing, heh.) My own display is 8+2 for example, but of course it's marketed as 10-bit, and the actual panel info is something you'd have to look up in a test where someone actually took the display apart.
EDIT: This seems to explain it pretty well, though it too uses a more visible comparison between two images.
http://www.ronmartblog.com/2011/07/guest-blog-understanding-10-bit-color.html
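For the 6+2 / 8+2 point above, here's a rough sketch of the idea (illustrative only; the particular 10-bit code and frame counts are made up for the example): the panel only has 8-bit steps, so it alternates between the two nearest 8-bit codes over a few frames and lets your eye average them out to the in-between value.
```cpp
// Sketch of "8+2" frame rate control / temporal dithering: an 8-bit panel
// approximates a 10-bit value by flickering between the two nearest 8-bit
// codes so that the time-average lands on the intended 10-bit level.
#include <cstdio>

int main() {
    const int target10  = 514;                // a 10-bit code we want to show
    const int low8      = target10 / 4;       // nearest 8-bit code below (128)
    const int high8     = low8 + 1;           // nearest 8-bit code above (129)
    const int remainder = target10 % 4;       // how often the higher code is needed

    long long sum = 0;
    const int frames = 4000;                  // simulate a run of frames
    for (int f = 0; f < frames; ++f) {
        // Show the higher code on 'remainder' out of every 4 frames.
        int shown = (f % 4 < remainder) ? high8 : low8;
        sum += shown * 4;                     // scale back up to 10-bit for comparison
    }
    // The average over time hits the 10-bit target even though every single
    // frame is an 8-bit value.
    std::printf("target 10-bit: %d, averaged: %.2f\n",
                target10, static_cast<double>(sum) / frames);
}
```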
As far as I understand, back in the CRT days there was early experimentation with up to what I think was called 48-bit or some such.
(Not exactly sure what that would correspond to; 8-bit per channel works out to 24-bit total and 10-bit to 30-bit, so 48-bit would be 16 bits per channel, probably what the recent 16-bit LCD displays refer to?)
Not really sure how it is with gaming and 10-bit either. Alien: Isolation is among the earliest I believe to offer support for deep color (30-bit instead of 24-bit, i.e. 10-bit per channel), and I'm guessing DirectX 12 and Vulkan also help a bit with more standards and whatnot. We're starting to see a few games supporting actual HDR now, such as Obduction via UE4 or Shadow Warrior 2 via their own RoadHog engine tech.
And as far as AMD and Nvidia driver tech goes, I think AMD supports 10-bit on most GCN GPU models, and Nvidia on Maxwell and Pascal, with the little side note that OpenGL requires exclusive full-screen mode for it.
(Probably about the same with 12-bit support too, although I don't think there are many actual monitors doing 12-bit or higher, so at the moment that's mainly in more recent TVs.)
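To tie the 10-bit / HDR output talk to something concrete, here's a minimal sketch of what requesting it looks like on the application side in Vulkan (assuming the VK_EXT_swapchain_colorspace extension is enabled; this isn't how any specific game mentioned here does it, just the general shape): the app enumerates the surface formats the driver and display expose and picks a 10-bit HDR10 one if it's there, otherwise it falls back to plain 8-bit.
```cpp
// Sketch: look for a 10-bit RGB format with the HDR10/ST.2084 color space
// among the surface formats the driver and display actually expose.
// Assumes a VkPhysicalDevice and VkSurfaceKHR already exist.
#include <vulkan/vulkan.h>
#include <vector>

// Returns true and fills 'out' if the surface offers 10-bit HDR10 output;
// otherwise the app would fall back to an 8-bit sRGB swapchain.
bool pick_hdr10_format(VkPhysicalDevice gpu, VkSurfaceKHR surface, VkSurfaceFormatKHR* out) {
    uint32_t count = 0;
    vkGetPhysicalDeviceSurfaceFormatsKHR(gpu, surface, &count, nullptr);
    std::vector<VkSurfaceFormatKHR> formats(count);
    vkGetPhysicalDeviceSurfaceFormatsKHR(gpu, surface, &count, formats.data());

    for (const VkSurfaceFormatKHR& f : formats) {
        if (f.format == VK_FORMAT_A2B10G10R10_UNORM_PACK32 &&
            f.colorSpace == VK_COLOR_SPACE_HDR10_ST2084_EXT) {
            *out = f;
            return true;
        }
    }
    return false;
}
```
Whether that HDR10 entry shows up at all depends on the GPU, driver and display combination, which is basically the support matrix being described above.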
Denial
Xendance