2016 model Samsung TVs to support YouTube HDR

HL2 has been supporting HDR for ages with all monitors..
HL2 has been supporting HDR for ages with all monitors..
You're confusing HDR with "HDR".
My Samsung 2016 has supported HDR on YouTube for a long time now? Had it for 3 weeks, and it supported it out of the box. HDR videos are tagged with "HDR" in the YouTube app, and it looks like true HDR (I've seen 4K Blu-rays with HDR). But I have not seen an HDR category in the YouTube app. Maybe that is what the update is about?
Don't be so sure. The HDR videos look good in the YouTube app, but I had to manually install the 1154 firmware before the TV actually kicked into HDR mode with those videos.
You're confusing HDR with "HDR".
Yes... HL2 is real HDR :3eyes:
Yes... HL2 is real HDR :3eyes:
HDR in HL2 renders the image in 10-bit and then maps it to 8-bit. HDR in modern TVs/apps is 10-bit throughout the entire process.
HDR in HL2 renders the image in 10-bit and then maps it to 8-bit. HDR in modern TVs/apps is 10-bit throughout the entire process.
So... we don't need an HDR TV... it can be done with 8-bit also. I call this marketing hype =)
So... we don't need an HDR TV... it can be done with 8-bit also. I call this marketing hype =)
It's not the same. It's like comparing a 4K downsample on a 1080p monitor to a real 4K screen. It offers some advantages, but it isn't replacing a 4K screen. For starters, color banding is significantly reduced on a 10-bit display:
http://i.imgur.com/lWqWHnd.png
http://abload.de/img/8-bite8xve.jpg - 8-bit
http://abload.de/img/10-bit6uybq.jpg - 10-bit
You also get access to a significantly wider color spectrum, especially reds, which are noticeably different in HDR: https://i.imgur.com/OdQFWJC.png
Some more information about how HL2 and other games use it:
It's not related to the whole 'HDR' vs 'HDR' vs 'HDR' thing, but I wanted to add a bit more info to the "mapping" process you mentioned, because you're missing an important step in mapping from 10-bit to 8-bit. Rendering to a higher-depth render target did allow the intermediate processing steps to have more detail, but if you simply mapped from 10-bit to 8-bit, the final result would look the same as if you had just rendered to 8-bit (excluding the changes made during the intermediate steps).

This is where adaptation comes in. Just like our pupils expanding or contracting in response to the average luminosity of the light falling on our retina, most games with HDR use the average luminosity of the HDR target to change the mapping curve, leading to hot spots in dark rooms and dark spots in bright rooms. This actually results in a loss of detail in those areas (something has to be lost in the 10 -> 8 bit downscale!), but because the average luminosity is used, the optimal brightness range covers more of the scene, meaning more detail on average.

Incidentally, the HDR target also allowed bloom to work better, since there could be a distinction between something that was white and something that was bright (up to a point, of course). In tandem, bloom and adaptation helped make light appear much more natural and less flat. Although adaptation does most of the work, bloom is the visual cue your brain naturally parses as a bright light, so even though the monitor can't create a light source bright enough to force you to shield your eyes, you still interpret it that way.
In the end it's not just marketing. I have a Samsung 8500 with HDR support, and while Marco Polo (a Netflix show that supports HDR) isn't that great, Obduction in HDR is beautiful. I'll be buying the first QHD/4K 144Hz HDR monitor that's available regardless of the price. Edit: Additional reading for those interested: https://developer.nvidia.com/implementing-hdr-rise-tomb-raider https://developer.nvidia.com/preparing-real-hdr
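For anyone curious what the "average luminosity drives the mapping curve" idea described above looks like in code, here is a minimal sketch, assuming Python/numpy and a simple Reinhard-style curve - not HL2's actual implementation, just the general shape of exposure adaptation plus tone mapping and the final 8-bit quantization:

```python
import numpy as np

def tonemap_with_adaptation(hdr_rgb, key=0.18, eps=1e-6):
    """Map a linear HDR image (float, arbitrary range) to 8-bit output.

    Exposure is driven by the scene's log-average luminance, mimicking
    eye adaptation: dark scenes get boosted, bright scenes get pulled down.
    """
    # Approximate relative luminance (Rec.709 weights).
    lum = (0.2126 * hdr_rgb[..., 0] +
           0.7152 * hdr_rgb[..., 1] +
           0.0722 * hdr_rgb[..., 2])

    # Log-average luminance of the whole frame -> adaptation level.
    avg_lum = np.exp(np.mean(np.log(lum + eps)))

    # Scale the scene so its average lands on a chosen "key" brightness.
    scaled = hdr_rgb * (key / (avg_lum + eps))

    # Simple Reinhard-style curve: compresses highlights, keeps mid-tones.
    mapped = scaled / (1.0 + scaled)

    # Quantize to 8 bits per channel -- this is where detail at the
    # extremes is lost, as described above.
    return np.clip(np.round(mapped * 255.0), 0, 255).astype(np.uint8)

# Example: a dark frame with one very bright spot gets exposed for the dark
# average, so the bright spot blows out (the "hot spot in a dark room").
frame = np.full((4, 4, 3), 0.02, dtype=np.float32)
frame[0, 0] = 50.0  # a bright light source
print(tonemap_with_adaptation(frame))
```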
It's not the same. It's like comparing a 4K downsample on a 1080p monitor to a real 4K screen. [...]
Great post. Just to add: the wider color space, Rec.2020 (like in the pic), is part of the UHD standard - HDR itself is a somewhat different UHD element/component. That WCG comes more from improved backlighting or filtering systems, denser pixels, etc. If you use a full RGB signal you can technically get UHD color without HDR itself.
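To illustrate that wide color gamut and HDR really are separate ingredients: the gamut part is just a different set of primaries, and converting between Rec.709 and Rec.2020 is a plain linear-matrix operation on linear RGB, independent of the HDR transfer curve. A rough numpy sketch (the primaries and white point are the published chromaticities; the derivation is the standard one, not any particular TV's pipeline):

```python
import numpy as np

def rgb_to_xyz_matrix(primaries_xy, white_xy):
    """Build a linear RGB -> CIE XYZ matrix from chromaticity coordinates."""
    def xy_to_xyz(x, y):
        return np.array([x / y, 1.0, (1.0 - x - y) / y])

    # Columns: XYZ of the R, G, B primaries (unscaled).
    p = np.stack([xy_to_xyz(x, y) for x, y in primaries_xy], axis=1)
    white = xy_to_xyz(*white_xy)
    # Scale each primary so that R=G=B=1 reproduces the white point.
    scale = np.linalg.solve(p, white)
    return p * scale

D65 = (0.3127, 0.3290)
REC709_PRIMARIES  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
REC2020_PRIMARIES = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

M709  = rgb_to_xyz_matrix(REC709_PRIMARIES, D65)
M2020 = rgb_to_xyz_matrix(REC2020_PRIMARIES, D65)

# Linear Rec.709 RGB -> linear Rec.2020 RGB (no HDR transfer function involved).
M_709_to_2020 = np.linalg.inv(M2020) @ M709

# Pure Rec.709 red sits well inside the Rec.2020 gamut:
print(M_709_to_2020 @ np.array([1.0, 0.0, 0.0]))
```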
It's not the same. It's like comparing a 4K downsample on a 1080p monitor to a real 4K screen. [...]
LOL @ the bull**** 8/10-bit screenshots. All those 8/10-bit comparisons are complete marketing scams and fakes - they're actually showing 5-bit vs 8-bit colors. 8-bit and 10-bit mode should show exactly the same photo on a normal 8-bit monitor, because the monitor shouldn't be able to display the extra 2 bits of color (which only take the form of extra-smooth gradients between two shades of the same color). So why are 8-bit and 10-bit mode showing different photos on an 8-bit monitor? You obviously don't need a 10-bit monitor since you can see 10-bit color on your 8-bit monitor just fine, right? Christ...
The image is just showcasing how the difference would look; an actual 6-bit, 8-bit, 10-bit or whatever check (usually a grayscale-type image with clearly defined borders between each color grade) would be a single image, and how that looked would then depend on the display. 🙂

It's also further complicated by the whole 6+2 = 8 and 8+2 = 10 bit stuff that a lot of monitors have. (True 10-bit, well, you'd probably notice that from the price point alone, I'm guessing, heh.) My own display is 8+2, for example, but of course it's marketed as 10-bit, whereas the actual info is something you'd have to look up in a test where someone actually took the display apart.

EDIT: This seems to explain it pretty well, though it too uses a more visible comparison between two images: http://www.ronmartblog.com/2011/07/guest-blog-understanding-10-bit-color.html

As far as I understand, back with CRTs there was early experimentation with up to, I think it was called, 48-bit or some such. (Not exactly sure what that would correspond to - 8-bit would be 24 and 10-bit 30, so probably the recent 16-bit LCD displays?) Not really sure how it is with gaming and 10-bit either; Alien Isolation is among the earliest I believe offering support for deep color (30 instead of 24, i.e. 10-bit), and I'm guessing DirectX 12 and Vulkan also help a bit with more standards and whatnot. We're starting to see a few games supporting actual HDR now, such as Obduction via UE4 or Shadow Warrior 2 via their own RoadHog engine tech.

As far as AMD and Nvidia driver tech goes, I think AMD supports 10-bit on most GCN GPU models, and for Nvidia it's Maxwell and Pascal, with the little side note that OpenGL requires exclusive full-screen mode for it. (Probably about the same with 12-bit support too, although I don't think there are many actual monitors using 12-bit or higher, so at the moment that's mainly in more recent TVs.)
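On the 8+2 (FRC) point above: the panel approximates the in-between 10-bit levels by flickering an 8-bit pixel between two adjacent values across successive frames, so the time average lands near the 10-bit target. A toy sketch of the averaging trick, assuming Python - purely illustrative, not any specific panel controller's logic:

```python
import numpy as np

def frc_frames(value_10bit, n_frames=4):
    """Approximate a 10-bit level on an 8-bit panel via temporal dithering.

    The 10-bit value splits into an 8-bit base level plus a 2-bit remainder;
    the remainder decides how many of the n_frames show base+1 instead of
    base, so the average over time approximates the 10-bit target.
    """
    base, remainder = divmod(value_10bit, 4)   # 10-bit -> 8-bit base + 2-bit fraction
    frames = np.full(n_frames, base, dtype=np.int32)
    frames[:remainder] += 1                    # bump 'remainder' of the frames up one step
    return np.clip(frames, 0, 255)

# A 10-bit level of 513 lies between the 8-bit levels 128 and 129.
frames = frc_frames(513)
print(frames, "average:", frames.mean(), "target:", 513 / 4)
```

Real controllers also spread the flicker spatially between neighbouring pixels so it isn't visible, but the time-averaging idea is the same.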
LOL @ the bull**** 8/10-bit screenshots. [...]
Uh, I'm pretty sure everyone on Guru3D knows it's an example.
So... we don't need an HDR TV... it can be done with 8-bit also. I call this marketing hype =)
With 8-bit colour, colour banding is pretty obvious in things like skies in games.
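A quick way to see why banding shows up in something like a sky: quantize a smooth gradient at 8 and 10 bits and count how many distinct steps survive. A minimal numpy sketch of the effect (illustrative only, not any particular game's or TV's output):

```python
import numpy as np

# A smooth horizontal gradient, like a clear sky, in linear 0..1 values
# spanning a fairly narrow brightness range.
gradient = np.linspace(0.2, 0.4, 3840)

def quantize(values, bits):
    """Round to the nearest representable level at the given bit depth."""
    levels = 2 ** bits - 1
    return np.round(values * levels) / levels

for bits in (8, 10):
    q = quantize(gradient, bits)
    steps = len(np.unique(q))
    print(f"{bits}-bit: {steps} distinct levels across the gradient")

# 8-bit leaves only ~52 steps across the whole ramp (visible bands),
# while 10-bit gives roughly 4x as many, so each band is far less noticeable.
```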