Official requirements for new 8K standard published

The only thing I know is that I'm buying the largest OLED TV I can in 4K before they start switching to that idiotic 8K.
Denial:

8K TVs are the horses. Whether you agree that it's useful or not, no one is going to film/master 8K content when there are no playback devices available.
Almost no one is filming in 4K already, FYI; it takes too much time in post-production. Just like in the video-game industry, with movies they waste time and then rush the final cut. In Avengers: Endgame the actors didn't even wear real suits, just green clothing, and their white gear was added later in post-production. With such a dumb way of making movies (are suits too complicated to make in 2019?), I can hardly see 8K being used for anything other than "direct to TV" productions with little post-processing.
The TVs have to come first, then disc/streaming formats, followed by everything else. They're just getting the ball rolling; as 4K TVs become ubiquitous in the market, they need a new way to sell TVs.
craycray:

Intrigued. What do LG 4K TVs lack today that makes them not real 4K?
Current OLEDs and IPS panels (except the 8000-9000 series) use WRGB, which works out to less than 3K of effective RGB resolution: 2880x2160. Anyway, people don't care that they're paying thousands of dollars for little more than Full HD resolution. They want deep black levels, diluted colors and 150 nits of brightness. I had an LG OLED 55EC930V and was shocked at the colors that thing could produce; then I saw a C6 at a friend's place and it looked AWFUL. https://www.hdtvtest.co.uk/news/rgbw-201510084189.htm
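For what it's worth, the 2880 figure appears to come from the RGBW scheme described in the linked article, where one subpixel in four is white, leaving three-quarters of the coloured subpixels of a true RGB panel. A quick sketch of that arithmetic (the premise itself is disputed further down the thread):

```python
# Arithmetic behind the poster's "2880x2160" claim: in the RGBW layout
# described in the linked hdtvtest article, one subpixel in four is
# white, so only 3/4 of a UHD panel's coloured subpixels remain.
uhd_width = 3840
effective_rgb_width = uhd_width * 3 // 4
print(f"{effective_rgb_width}x2160")  # 2880x2160, as claimed above
```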
Deasnutz:

The TVs have to come first, then disc/streaming formats, followed by everything else. …
That is an interesting point, yet I would add there is more than enough tech as yet unreleased by TV manufacturers, such as: HDMI 2.1, HDCP 2.2, ALLM (Auto Low Latency Mode), 802.11ac, and the HDR formats HLG, HDR10 and Dolby Vision. These are dispensed ad hoc by manufacturers, mostly using prior-release technology such as HDMI 1.4, and I have been most disappointed by television manufacturers not keeping their televisions up to date with the television standards across their range(s). Some of this can be updated via firmware; connectivity such as HDMI, however, cannot. When we have entire ranges using this tech, then we can all seek to find a television that suits us and will be compliant with content providers.
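On the connectivity point, a rough bandwidth estimate shows why the HDMI generation matters for 8K. A minimal sketch, counting raw pixel data only (real links also carry blanking and encoding overhead, so actual requirements are somewhat higher):

```python
# Rough uncompressed video data rate: pixels x frame rate x bits per pixel.
def data_rate_gbps(width, height, fps, bits_per_pixel=30):
    """30 bits per pixel = 10-bit-per-channel RGB / 4:4:4."""
    return width * height * fps * bits_per_pixel / 1e9

print(f"4K60: {data_rate_gbps(3840, 2160, 60):.1f} Gbit/s")  # ~14.9
print(f"8K60: {data_rate_gbps(7680, 4320, 60):.1f} Gbit/s")  # ~59.7

# HDMI 2.0 carries ~18 Gbit/s and HDMI 2.1 48 Gbit/s raw, so 10-bit 8K60
# needs HDMI 2.1 plus DSC compression; no firmware update to an older
# HDMI port can add that, which is the point about connectivity above.
```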
I'm surprised people are forgetting how 1080p (and 720p) was marketed originally: they were "Full HD" and "HD Ready" respectively. It's also worth noting that 4K and UHD are not the same, at 4096x2160 and 3840x2160 respectively, but these formats are typically distinguished by panel density vs broadcast resolution etc., which doesn't exactly help clarify naming usage, particularly with this new standard either... (Reminds me of USB 3...)
Only Intruder:

I'm surprised people are forgetting how 1080p (and 720p) was marketed originally... It's also worth noting that 4K and UHD are not the same, at 4096x2160 and 3840x2160 respectively... …
Very true. I do remember 1080p TVs being labelled Full HD and 720p TVs being HD Ready as well. The 4096x2160 resolution is for movies (cinema projectors), while 3840x2160 is for TVs/monitors; 3840x2160 is technically UHD, since it is the "4K" for display types that are not cinema projectors.
The requirement will be 8 x 2080 Ti in octo-SLI mode 😉
Only Intruder:

I'm surprised people are forgetting how 1080p (and 720p) was marketed originally: they were "Full HD" and "HD Ready" respectively. …
That is ever so slightly incorrect. "Full HD" included anything that had 1,080 vertical lines, and the reason was that half were interlaced and some were not. This was really a hold-over from pre-DVD-era broadcast standards, where different countries had the upper alternate lines displayed first and the lower alternate lines displayed second, because the broadcast signal could not support a full screen every 50/60 frames per second, so it broadcast half of the screen per frame via interlacing. So, in truth, the 'Full HD' name meant 1,080 lines with either an "i" for interlaced or a "p" for progressive.

The name "HD Ready" simply meant that when the broadcast standard moved to HD (720 lines), the television had the screen resolution to support it; it was 'ready' for when that day arrived. This also had "i" and "p" variants, and even to this day, much broadcast television is still 720 lines.

As a side note, some "HD Ready" televisions actually used a resolution of 1366×768, a cheap hack reusing PC TFT LCD panel resolutions to push "HD Ready" televisions onto unsuspecting purchasers, giving them a slightly higher resolution into which 720-line content was scaled, and of course, being TFT LCD, it looked utterly dreadful. Yet at least the Joneses were impressed: "look darling, it has got HD on the side of it!" I only mention it because this exact conversation is being had right now, today, as we read and type this: "look darling, it has got 4K written on the side of it!" Facepalm.
Loobyluggs:

That is ever so slightly incorrect. "Full HD" included anything that had 1,080 vertical lines... …
Indeed very true, though I only wanted to illustrate the terms as applied to their resolutions, since 1080p itself was never the marketing term. Interlaced and progressive systems were common back then due to the transmission mediums in use (many setups used analogue component cables, for example, and of course Full HD television today is still typically broadcast in 1080i). However, this time around with "4K" we do indeed have separate marketing, in that products can show either 4K or UHD or both. Nevertheless, it's undermining clarity, just as you've demonstrated with how the original "HD" resolutions also had misleading or inaccurate details in their marketing. Edit: it's always worth remembering that marketing never truly portrays the full details of a product, typically just the ideal features.
FeDaYin:

Current OLEDs and IPS panels (except the 8000-9000 series) use WRGB, which works out to less than 3K of effective RGB resolution: 2880x2160. …
Wasn't this fixed in later revisions? That was about 5 years ago now and 4 TV generations ago.
craycray:

Wasn't this fixed in later revisions? That was about 5 years ago now and 4 TV generations ago.
I don't know about the resolution he mentioned being correct, as every single pixel is WRGB; there is no loss of pixels due to WRGB. OLEDs, though, still use the white subpixel to exceed 400 nits of brightness for HDR, and even then ABL kicks in to dim the image. The additional white subpixel reduces colour saturation/colour volume, making the image look more washed out, and the lack of overall brightness makes HDR much darker than it should be as well. It is not the best way to watch HDR. This is why I also hope uLED (microLED) comes to the rescue, for PC use's sake, to give us the best parts of OLED and QLED. Samsung's latest venture with an OLED hybrid could be a good stepping stone.
Still waiting for higher-than-1080p Netflix content to actually be available... and for them to stop stonewalling their users with stupid DRM and allow PC users to actually view higher-than-1080p content, and higher than 720p in browsers other than Edge or their crappy, crappy, absolutely crap desktop app.
1080p: 2,073,600 pixels
4K: 8,294,400 pixels
8K: 33,177,600 pixels

That is a lot of pixels to bring a GPU to its knees, even in a few years. Assuming we go there, which we will.
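Those totals fall straight out of the resolutions; a quick check in Python:

```python
# Pixel counts for the common 16:9 resolutions quoted above.
resolutions = {
    "1080p": (1920, 1080),
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")

# Each step doubles both dimensions, so the pixel count quadruples:
print("8K / 4K:", (7680 * 4320) / (3840 * 2160))     # 4.0
print("8K / 1080p:", (7680 * 4320) / (1920 * 1080))  # 16.0
```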
0blivious:

1080p: 2,073,600 pixels; 4K: 8,294,400; 8K: 33,177,600. That is a lot of pixels to bring a GPU to its knees, even in a few years. …
It will be more than a few years before 8K is manageable, except on high-end multi-card systems, and that's assuming games can use them. It's made worse when the focus is on spending more silicon on ray tracing, which at the moment brings resolution down for most people and definitely reduces frame rate.
I think eye tracking will be the norm before 8K computer displays and TVs are commonplace. A GPU only needs to render in high detail at the location you're looking at; everything else can use a 20% shading rate or less... This should have been done for 4K too, but I guess the tech wasn't ready. For 8K it will be pretty much mandatory... HFR at 8K with 100% render coverage on all 33 million pixels is near impossible (and also completely wasteful) until we get to carbon nanotube transistors or something like that.
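To put rough numbers on that idea, here is a hypothetical sketch: the 5% foveal region is an illustrative guess, and the 20% peripheral shading rate is the figure suggested above.

```python
# Hypothetical foveated-rendering pixel budget at 8K.
# Assumptions (illustrative only): an eye-tracked foveal region covering
# ~5% of the screen is shaded at full rate; the remaining 95% is shaded
# at the 20% rate suggested above.
total_8k = 7680 * 4320  # 33,177,600 pixels
fovea_fraction = 0.05
peripheral_rate = 0.20

shaded = total_8k * (fovea_fraction * 1.0
                     + (1 - fovea_fraction) * peripheral_rate)
print(f"Shaded pixels per frame: {shaded:,.0f}")              # ~7.96 million
print(f"Relative to native 4K: {shaded / (3840 * 2160):.2f}x")  # ~0.96x
```

Under those made-up numbers, foveated 8K costs about the same shading work as native 4K, which is why eye tracking looks so attractive at this resolution.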
0blivious:

1080p: 2,073,600 pixels; 4K: 8,294,400; 8K: 33,177,600. That is a lot of pixels to bring a GPU to its knees, even in a few years. …
I doubt there will ever be a PC or gaming market for 8K, since you would need massive screens to tell the difference vs 4K.
wavetrex:

I think eye tracking will be the norm before 8K computer displays and TVs are commonplace. …
Only if you lead a very lonely life or are wearing a VR headset; other people watching would have a poor time of it. It also depends how far away you sit from the display (as well as its size); further away puts more of the screen in your central view. I don't see this becoming the norm for PC displays.
Mufflore:

Only if you lead a very lonely life
People gaming on their PC are usually gaming by themselves, without somebody watching them game... Don't confuse "being alone" with "gaming alone" (which might not even be alone, but with 99 other people on a giant map that keeps shrinking, or something...). The point was: our biological vision can't take in 33 megapixels all showing a razor-sharp image everywhere. That is not needed.
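As a rough sanity check on that claim, assuming the usual 20/20-vision figure of about 60 pixels per degree:

```python
# Field of view needed to resolve every horizontal pixel, assuming
# ~60 pixels per degree (a common figure for 20/20 visual acuity).
ACUITY_PPD = 60
for label, width_px in [("4K", 3840), ("8K", 7680)]:
    print(f"{label}: {width_px / ACUITY_PPD:.0f} degrees of FOV")
# 4K: 64 degrees, 8K: 128 degrees -- while only the central few degrees
# of human vision are truly sharp, which is the point being made above.
```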
wavetrex:

People gaming on their PC are usually gaming by themselves... The point was: our biological vision can't take in 33 megapixels all showing a razor-sharp image everywhere. That is not needed.
I don't disagree with the idea; it's just not practical on a TV, especially one without mass sales. The idea is already under development for VR headsets, where it could be very useful. Another worthwhile point: is 8K gaming compatible with our bodies anyway (i.e. can we make full use of the extra resolution)? You need a damn huge screen or to sit very close, otherwise those extra pixels are wasted; either way you are going to be moving your head a lot. You can use a 65" 8K screen, but why bother? It's not like using a 32" 4K monitor, because you will be sitting at least twice the distance away from a 65" screen, removing the benefit the higher resolution brings. Take a look at the optimal distances at Rtings: https://www.rtings.com/tv/reviews/by-size/size-to-distance-relationship A 70" 4K screen's optimal distance to see all the pixels is 1.35 m. Halve that again for 8K.
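Mufflore's Rtings figure checks out with a little trigonometry. A sketch, assuming a 16:9 panel and the same ~60 pixels-per-degree acuity threshold as above:

```python
import math

# Distance at which a 16:9 screen reaches ~60 pixels per degree --
# sit further away than this and the extra pixels stop being resolvable.
def optimal_distance_m(diagonal_in, horizontal_px, ppd=60):
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # 16:9 panel width
    pixel_in = width_in / horizontal_px              # one pixel's size
    # Each pixel should subtend 1/ppd degrees of the viewer's vision:
    return pixel_in / math.tan(math.radians(1 / ppd)) * 0.0254

for label, px in [("4K", 3840), ("8K", 7680)]:
    print(f'70" {label}: {optimal_distance_m(70, px):.2f} m')
# 70" 4K: ~1.39 m (close to Rtings' 1.35 m); 70" 8K: ~0.69 m -- halved,
# exactly as described above.
```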