LG Gets Ready for 8K Quad UHD

It's remarkable what technology offers today. The thing is, you barely even get 1080p source material these days (besides PC gaming, of course), so why bother with 4K TVs, let alone 8K? It all reads like the technology is there, but there's little way to use it.
> Since 8K resolution is the highest resolution the human eye is capable of seeing, it will put an end to the resolution discussion.
I don't know who said this, but it's probably not true. My phone has around 440 PPI and it's noticeably sharper than a phone under 400 PPI. Not much, but there IS a difference. Now, an 8K 24" monitor would have around 380 PPI, which is absolutely enough because you don't look at a monitor as closely as you look at your phone. But we CAN see more. And we barely have true 4K cable anywhere. Horrible upscaling incoming.
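The PPI figures above are easy to check: pixel density is just the diagonal resolution over the diagonal size. A minimal sketch (the 5.5" phone size is my assumption for illustration; a 24" 8K panel actually lands a bit under the ~380 quoted):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution in pixels over diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 5.5)))   # a hypothetical 5.5" 1080p phone: ~400 PPI
print(round(ppi(7680, 4320, 24)))    # a 24" 8K monitor: ~367 PPI
```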
This reminds me of the Atari Jaguar, people were like "damn, 64 bits? That's like 4 times the bits". Oh, innocent Atari... I'm looking forward to having a 4K monitor. Lots of giggity when that happens.
> It's remarkable what technology offers today. The thing is, you barely even get 1080p source material these days (besides PC gaming, of course), so why bother with 4K TVs, let alone 8K? It all reads like the technology is there, but there's little way to use it.
This. I'm still sitting on 1080p as I don't see a reason to go any higher. Though I'd love a reason to. 8K, insane.
That graph comparing the sizes of the resolutions is dead wrong. The UHD 4K box should be around half the size of the 8K one, but at least to my eye it isn't.
Administrator
> That graph comparing the sizes of the resolutions is dead wrong. The UHD 4K box should be around half the size of the 8K one, but at least to my eye it isn't.
Yeah, I swapped out the image; I noticed that myself as well. It was actually an image from the LG press release. Anyway, the new one is accurate.
And the sad truth is that the only market that would actually benefit from such a high resolution (PC industry) won't see them for another 5 years.
> It's remarkable what technology offers today. The thing is, you barely even get 1080p source material these days (besides PC gaming, of course), so why bother with 4K TVs, let alone 8K? It all reads like the technology is there, but there's little way to use it.
Tech needs to evolve, and sitting still waiting for everybody else to catch up won't exactly do them any good. As long as they use exact multiples (4K, 8K, 16K) for scaler mappings, playing upscaled native 1080p material shouldn't be a problem. The prices need to be kept in check too, otherwise they're beating a dead horse with all these extra pixels.
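The "correct sizes" point is just integer scaling: 1080p maps cleanly onto 4K and 8K because every source pixel becomes an exact 2x2 or 4x4 block, with no interpolation blur. A quick sketch (function name is mine, for illustration):

```python
def integer_scale(src: tuple, dst: tuple):
    """Return the integer factor if dst is an exact multiple of src, else None."""
    (sw, sh), (dw, dh) = src, dst
    if dw % sw == 0 and dh % sh == 0 and dw // sw == dh // sh:
        return dw // sw
    return None

print(integer_scale((1920, 1080), (3840, 2160)))  # 2 -> each pixel becomes a clean 2x2 block
print(integer_scale((1920, 1080), (7680, 4320)))  # 4 -> a clean 4x4 block at 8K
print(integer_scale((1920, 1080), (2560, 1440)))  # None -> fractional scaling, gets blurry
```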
> I don't know who said this, but it's probably not true. My phone has around 440 PPI and it's noticeably sharper than a phone under 400 PPI. Not much, but there IS a difference. Now, an 8K 24" monitor would have around 380 PPI, which is absolutely enough because you don't look at a monitor as closely as you look at your phone. But we CAN see more. And we barely have true 4K cable anywhere. Horrible upscaling incoming.
It's a generalization, like how the human ear can only perceive between 20 Hz and 20 kHz.
> Since 8K resolution is the highest resolution the human eye is capable of seeing, it will put an end to the resolution discussion.
Doesn't that depend on the size of the screen? If you had a 500" screen at 8K vs a 500" screen at 16K, I would expect you'd be able to see the difference, no?
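It does: what matters is pixels per degree of visual angle, which depends on pixel pitch and viewing distance, not resolution alone. A common rule of thumb puts 20/20 acuity around 60 pixels per degree. A rough sketch (the ~436" width of a 500" 16:9 screen and the 10 ft viewing distance are my assumptions):

```python
import math

def pixels_per_degree(width_px: int, screen_width_in: float, distance_in: float) -> float:
    """Pixels that fit into one degree of visual angle at the given distance."""
    pitch = screen_width_in / width_px                              # one pixel, in inches
    per_pixel_deg = math.degrees(2 * math.atan(pitch / (2 * distance_in)))
    return 1 / per_pixel_deg

# A 500" 16:9 screen is about 436" wide; viewer sits 10 ft (120") away.
print(round(pixels_per_degree(7680, 436, 120)))    # ~37 ppd at 8K  -- below the ~60 threshold
print(round(pixels_per_degree(15360, 436, 120)))   # ~74 ppd at 16K -- above it
```

So at that size and distance the 8K/16K difference would indeed be visible, while the same panel across a living room could already be past the eye's limit.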
> PPI is vastly overrated on TVs, IMO. What are you mostly viewing on your TV? Games, movies, things in motion? Those don't even need 8K resolution. Film itself doesn't have anywhere near that resolution inherent in its source, much less with all the digital tampering and motion resolution going on in fast action films. I'm not even sure I WANT to see all the imperfections of film in 4K unless they treat the elements with great care. It's nice that we are seeing 4K restorations, because they are NEEDED, but in reality I don't even need the 4K source material for a 42-inch. It would be nice, but then again the differences from DVD to Blu-ray were much, much greater. 380 PPI is insane for a TV you sit 5 or more feet away from. Great to have, and I will eventually get one, but it isn't something I honestly care much about, and nor do many other people.
For TVs I agree, but for monitors... eeeh, not so sure. I sit quite close to my 24" monitor.
> It's a generalization, like how the human ear can only perceive between 20 Hz and 20 kHz.
I'm afraid it might be a false generalization. For example, some people think the human eye can't see above 40 FPS. I just feel the urge to beat said people to death with a chair.
> For example, some people think the human eye can't see above 40 FPS. I just feel the urge to beat said people to death with a chair.
Isn't it that the human eye stops being able to see individual frames above 30-40 FPS? We all know we can see the difference between 40 and 60 FPS gaming, so that's not the issue, but once you pass 30-40 FPS you can't see the flickering, so to speak; the frame-rate difference is still easily noticeable. Hard to explain what I mean: below 30 FPS = a slideshow of individual frames, above 30 FPS = motion.
> Tech needs to evolve, and sitting still waiting for everybody else to catch up won't exactly do them any good. As long as they use exact multiples (4K, 8K, 16K) for scaler mappings, playing upscaled native 1080p material shouldn't be a problem. The prices need to be kept in check too, otherwise they're beating a dead horse with all these extra pixels.
Of course you're right from a technical point of view, but it's just annoying that the marketing clerks try to sell us every small improvement as the best thing ever created, although there's barely any use for it. It's like they're patting themselves on the back for working on 8K TVs as if they just won a marathon, and trying to sell it to us as if we're stuck in wheelchairs, while we're already 'top notch' with our 1440p and 4K games.
We don't even have real 1080p yet in Finland... 4K might come in 2030, and 3D maybe 2025 =( ??? Internet TV and torrents are lifesavers =)
Anyone who has seen a 4K TV with proper content knows that 4K is stunningly more detail-rich than 1080p. 8K will give the same jump in available detail, but what kind of performance will be needed to feed it even with video content? We don't even have proper encoders for such a resolution to keep file sizes reasonable (data storage price correlation). 4K passive 3D TVs will have their place, as that is 3840x1080 per eye. And single GPUs will soon be able to render that at a good enough frame rate. But I think 2560x1440 passive 3D (2560x720 per eye) is more reasonable and cost-effective for this year's purchases.
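The encoder point is easy to quantify. Uncompressed 8-bit 4:2:0 video carries 1.5 bytes per pixel per frame, so a back-of-the-envelope sketch of what a two-hour 8K film would weigh raw (all figures are illustrative assumptions):

```python
def raw_video_bytes(width: int, height: int, fps: float, seconds: float,
                    bytes_per_px: float = 1.5) -> float:
    """Uncompressed video size; 1.5 bytes/pixel is 8-bit 4:2:0 chroma subsampling."""
    return width * height * bytes_per_px * fps * seconds

size = raw_video_bytes(7680, 4320, 24, 2 * 3600)
print(round(size / 1e12, 1))   # ~8.6 TB for two hours -- hence the need for new codecs
```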
> Isn't it that the human eye stops being able to see individual frames above 30-40 FPS? We all know we can see the difference between 40 and 60 FPS gaming, so that's not the issue, but once you pass 30-40 FPS you can't see the flickering, so to speak; the frame-rate difference is still easily noticeable. Hard to explain what I mean: below 30 FPS = a slideshow of individual frames, above 30 FPS = motion.
Oh, I understand what you mean. I'm not sure about the exact values, but I know we need around 20 FPS to perceive motion instead of a set of images. The point is that movies can get away with 24 FPS because they have inter-frame blurring. Games do not offer this, and that's exactly why we need higher framerates. Perception is also distorted when you don't have a lot of detail to look for, but when that detail is present, low framerates are very annoying. I'm not sure how accurate that is, but it was the general idea. LE: http://www.100fps.com/how_many_frames_can_humans_see.htm I believe this is the article I read a while back that inspired me to write this post.
Well, to play alright at 4K today you need at least 2-way SLI 980, and that's just until they release some newer games. So to play the same games at 8K you'll need roughly 4 times the power, so 8-way SLI 980. Is this really worth it? Especially when all 4K gaming monitors are 27-30": unless you're planning to play games with a magnifying glass, all that money spent on extra graphics cards buys slightly less noticeable edges. Basically it all boils down to less aliasing...
I mean, 8K is cool and all; for design / science applications I'm sure it'd rock, but for general use... I'm not convinced. We can barely run games at 4K, and we have no "real" source of 4K media at present, so 8K is a bit moot. My current 4K TV (bought because it was on a crazy discount at the time) only upscales 1080p, and while it does look better than my old Sony 1080p screen, it's not actually displaying any 4K content, since there isn't any. More content first, please. But I'll say it's still going to be a nice screen even if it has no 8K media to display 🙂
> Oh, I understand what you mean. I'm not sure about the exact values, but I know we need around 20 FPS to perceive motion instead of a set of images. The point is that movies can get away with 24 FPS because they have inter-frame blurring. Games do not offer this, and that's exactly why we need higher framerates. Perception is also distorted when you don't have a lot of detail to look for, but when that detail is present, low framerates are very annoying. I'm not sure how accurate that is, but it was the general idea. LE: http://www.100fps.com/how_many_frames_can_humans_see.htm I believe this is the article I read a while back that inspired me to write this post.
Yeah, so to make it basic: if we take a strobe light and set it to flash 10 times per second, we'll see the individual flashes. As we increase the speed, at some point around (you say 20) 20-30 flashes per second we will no longer see each flash; the lamp will appear to be a steady, constant light instead of a strobe. I feel as if we can see above that rate, though; old CRTs would still appear to strobe at 40-50 Hz, IIRC.
> I mean, 8K is cool and all; for design / science applications I'm sure it'd rock, but for general use... I'm not convinced. We can barely run games at 4K, and we have no "real" source of 4K media at present, so 8K is a bit moot. My current 4K TV (bought because it was on a crazy discount at the time) only upscales 1080p, and while it does look better than my old Sony 1080p screen, it's not actually displaying any 4K content, since there isn't any. More content first, please. But I'll say it's still going to be a nice screen even if it has no 8K media to display 🙂
Never mind the hardware, that's the least of the trouble; try delivering a 6 GB/s signal to drive 8K to an average house, lol...
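That 6 GB/s figure checks out for an uncompressed 8K signal at 60 Hz with 8-bit RGB (3 bytes per pixel); a quick sanity check:

```python
# Raw 8K stream: width * height * 3 bytes (8-bit RGB) * 60 frames per second
bytes_per_second = 7680 * 4320 * 3 * 60
print(round(bytes_per_second / 1e9, 2))   # 5.97 -- roughly the 6 GB/s quoted above
```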