LG Gets Ready for 8K Quad UHD


Yeah, so to put it simply: if we take a strobe light and set it to 10 flashes per second, we'll see the individual flashes. As we increase the speed, at some point around (you say 20) 20-30 flashes per second we will no longer see each flash; the lamp will appear to be a steady, constant light instead of a strobe. I feel as if we can see above that rate, though; old CRTs would still appear to flicker at 40-50Hz, IIRC.
First of all, this is completely wrong on so many levels. CRTs! You see nearly flicker-free images at 60Hz because the tube uses a phosphor (a luminophore, a material which stays bright for a certain amount of time after being hit by the electron beam). But in the 16ms between refreshes at 60Hz, it dims enough for people to notice. "Flicker free" CRTs ran at 90Hz+ and only thanks to that phosphor persistence. I can see flicker on my 120Hz screen if I enable backlight strobing. If I take a real stroboscope, I have no problem distinguishing flashes at over 300Hz. People who do not see distinct flashes at high frequencies have 2 possible causes: - the time for the light-emitting material to go dark is longer than the delay between flashes - the person has some neurological disease/damage
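To put numbers on the refresh interval mentioned above: the time between refreshes is just the reciprocal of the refresh rate. A quick illustrative sketch (simple arithmetic, not measurements):

```python
# Frame interval for common refresh rates: at 60 Hz the phosphor (or a
# strobed backlight) has about 16.7 ms to dim before the next refresh.
def frame_interval_ms(refresh_hz):
    """Time between refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 90, 120, 300):
    print(f"{hz:>3} Hz -> {frame_interval_ms(hz):.2f} ms between refreshes")
```

This is where the "16ms between refreshes on 60Hz" figure comes from.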
^^ If you can see flicker at 120Hz, and 60Hz is almost flicker-free, how can I be wrong on so many levels when I say I remember seeing flicker at 40-50Hz? You just contradicted yourself.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
^^ If you can see flicker at 120Hz, and almost flicker free at 60Hz, how can I be wrong on so many levels when I say I remember seeing flicker at 40-50Hz? You just contradicted yourself
I explained why people thought that 60Hz CRTs were close to "flicker free" (they used artificial pixel light persistence). You can read more here: http://xcorr.net/2011/11/20/whats-the-maximal-frame-rate-humans-can-perceive/
Importantly, however, flicker can occur from a different source. If you look at the image of a DLP projector and move your gaze outside the screen, you will notice fringes of color. This also occurs if you wave your hand in front of the projector, as shown above. In single-chip DLP projectors, the DLP chip produces black-and-white images, and the colors red, green and blue are created by a spinning color wheel. Wikipedia states that the wheel of common projectors can rotate at 4 or 5x the frame rate of the signal, so 240 or 300 Hz. This is a very high rate, yet clearly you are able to discriminate such large temporal frequencies.
And that does not state that 300Hz is the limit at all.
You completely read what you wanted to and failed to read my post correctly. I was talking about STROBE LIGHTS. Only the last sentence was about CRTs. Read my post again, properly.
Then it was dead on. Because with your "strobe light" (read: stroboscope) you should be able to distinguish flashes even at very high frequencies. Unless that strobe light uses a light-source material which takes more time to dim than the delay between strobes (as I wrote earlier). Industrial inspection of rotating machinery is done while it is running by using high-frequency stroboscopes. If something rotates at 377.35 rpm, you can visually bring it to a stop by strobing at that frequency. Yes, you can use a fraction of the frequency, letting the device do 2/3/4/5/... rotations between flashes if its rpm is in the thousands.
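The stroboscope trick described above is straightforward to compute; a small illustrative sketch (the 377.35 rpm figure is taken from the post above):

```python
# A rotating part appears frozen when the flash frequency equals the
# rotation frequency, or an integer fraction of it (the part then makes
# 2, 3, 4... full turns between flashes).
def freeze_frequencies_hz(rpm, max_divisor=5):
    """Flash frequencies (Hz) that visually freeze a part spinning at `rpm`."""
    base_hz = rpm / 60.0  # rotations per second
    return [base_hz / n for n in range(1, max_divisor + 1)]

for f in freeze_frequencies_hz(377.35):
    print(f"{f:.3f} Hz")
```

The fractional frequencies are what make high-rpm machinery inspectable with a modest strobe.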
Portuguese generalist TV channels still transmit in SD (576i, even though the IPTV service is delivered over fiber...) No 4K content in sight. Shows like The Walking Dead have pretty ****ty quality for 1080p. So, yeah... I wish the whole industry moved as fast as the technology.
They're just seeking new ways to get money out of your wallet; even if it's all bull crap, there are still plenty who buy it. I'll stick with my 1080p TV for a couple of years before I even consider upgrading. As for PC, well, I have 1440p and it looks great; my card can handle it fine, but I doubt it could handle 4K, the drop in fps would be huge, so 8K is ridiculous. There are no video cards on the market that can even handle 4K at a steady 60 fps, let alone 8K. Just a marketing trick so people upgrade again; that's capitalism for yah.
No one beyond proof of concept is ready for 8K. Cable providers are already charging through the nose for HD content, which for the most part isn't much beyond 1080p. Movie studios aren't producing 8K media for distribution, and barely 4K, because storage starts to become iffy. ISPs would love to see full-on 4K and 8K, since bandwidth requirements would shoot up astronomically and they have a chokehold on already severely overpriced per-MB pricing. Even if you woke up tomorrow and suddenly ISPs were super cheap, and 4K and 8K content and storage were everywhere and media was dirt cheap, people would still have to dole out for expensive 4K and 8K screens and monitors. It'll happen, I imagine, just not soon, and not without ISPs and the industry getting their act together; otherwise it may never.
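To give a sense of the storage side of that argument, here is a rough sketch; the bitrates are illustrative ballpark figures for compressed streams, not any provider's actual spec:

```python
# Approximate size of a 2-hour movie at typical compressed bitrates.
def movie_size_gb(bitrate_mbps, hours=2):
    """Approximate size in gigabytes (1 GB = 10^9 bytes)."""
    return bitrate_mbps * 1e6 / 8 * hours * 3600 / 1e9

for label, mbps in [("1080p", 8), ("4K", 25), ("8K", 100)]:
    print(f"{label} at {mbps} Mbit/s: ~{movie_size_gb(mbps):.0f} GB per 2-hour movie")
```

Even generously compressed, each step up roughly triples or quadruples what the ISP has to carry and the studio has to store.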
For TVs I agree, but for monitors, eeeh, not so sure. I sit quite close to my 24" monitor.
It isn't just a matter of how close you sit but how small the object is that you're looking at. On a phone, everything is tiny, so every increase in pixel density can be noticed. Meanwhile, my 1080p 32" display has a stuck pixel... a pink one... and I almost never notice it when watching a movie or playing a game. The finer levels of detail really don't matter unless you have something that requires that extra detail. That being said, if I'm playing an FPS and trying to snipe someone, yeah, the not-so-spectacular PPI really stands out. But in 99% of all other cases, I don't think having a higher pixel density would really improve my experience.
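The size-versus-distance point can be made quantitative: what matters is pixels per degree of visual angle. A minimal sketch, assuming a 16:9 panel and a 24-inch viewing distance (both illustrative choices):

```python
import math

# Pixels per degree of visual angle for a panel of given horizontal
# resolution and diagonal size, viewed from a given distance.
def pixels_per_degree(h_res, diag_in, distance_in=24, aspect=(16, 9)):
    aw, ah = aspect
    width_in = diag_in * aw / math.hypot(aw, ah)   # panel width from diagonal
    ppi = h_res / width_in                         # pixels per inch
    # One degree of visual angle spans this many inches at that distance:
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

print(f"32-inch 1080p: {pixels_per_degree(1920, 32):.0f} px/deg")
print(f"32-inch 8K:    {pixels_per_degree(7680, 32):.0f} px/deg")
```

Around 60 px/deg (one pixel per arcminute) is often cited as the point where extra density stops being visible, which is why the same panel looks coarse at desk distance but fine from across the room.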
I'm afraid that it might be a false generalization. For example some people think the human eye can't see above 40FPS. I just feel the urge to beat those said people to death with a chair.
From what I recall, 40FPS is the maximum number of frames human brains can fully process. In other words, once more frames are added, our brains can't keep up and distinguish the individual frames. You CAN see a difference between 40-60FPS, depending on what you're viewing and how the animation is rendered. But if an animation is done properly, you can't tell the difference between something like 60FPS and 90FPS. There's a chance you could MAYBE see a difference if you have the displays side by side, and you can definitely see a difference depending on what kind of post-processing effects are (or aren't) used. To me, the main appeal of 90Hz+ monitors, or 4K+ monitors, is the ability to play almost any game and have it still look good. But to me, that's not a price I'm willing to pay for the performance I'd have to sacrifice.
My baseball games are in 1080p, that is all I care about.
Film itself doesn't have anywhere close to that resolution inherent in its source, much less with all the digital tampering and loss of motion resolution going on in fast-action films.
Film is easily scanned at 10MP or more, that is, if it was shot on at least 35mm, especially considering directors probably aren't using Kodak Gold-quality stock. Movies filmed in IMAX format can easily go over 30MP.
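For comparison with those film-scan figures, the pixel counts of the video formats discussed in this thread are simple arithmetic:

```python
# Megapixel counts of common video formats.
def megapixels(width, height):
    return width * height / 1e6

formats = {
    "1080p":  (1920, 1080),
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),   # the "Quad UHD" of the article title
}
for name, (w, h) in formats.items():
    print(f"{name}: {megapixels(w, h):.1f} MP")
```

So a ~10MP 35mm scan sits between 4K (~8.3MP) and 8K (~33.2MP), which is roughly what both sides of this exchange are arguing about.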
lol, 8K Quad UHD, can the name get any longer? What's next, 16K Quad Quad UHD? lmfao. Funny how they make 1080p look blurry, like they did back in the day with 480p; I don't remember 1080p getting worse over time. All these ultra-high resolutions are getting stupid; they're only really useful on massive displays, and you need to be fairly close to notice the difference anyway.
This. I'm still sitting on 1080p as I don't see a reason to go any higher. Though I'd love a reason to do so. 8K: insane.
That's because most content isn't 1080p, and even the content that is 1080p is still compressed. Uncompressed 1080p looks extremely nice, provided you're not watching it on a 60+ inch screen. It's great that the tech is advancing, but what's the point if most content is still 720p (in the case of most cable/satellite/OTA TV), with the odd 1080p channel that is horribly compressed? A lot of the "HD" channels I've seen on all of the above look horribly compressed and/or upscaled. Which raises the question: what's the point if the HD we do have now is all horribly compressed or upscaled? Those higher resolutions will just make it look that much worse. Now, if that all changes and we actually get native feeds at those resolutions with no upscaling, and even better no compression, then this is all great news.
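On the compression point: raw, uncompressed video is enormous, which is why every broadcast chain compresses. A quick sketch, assuming 8-bit RGB (24 bits per pixel) at 60fps:

```python
# Uncompressed video bitrate in gigabits per second.
def raw_bitrate_gbps(width, height, fps=60, bits_per_pixel=24):
    return width * height * bits_per_pixel * fps / 1e9

for name, (w, h) in [("720p", (1280, 720)), ("1080p", (1920, 1080)),
                     ("8K UHD", (7680, 4320))]:
    print(f"{name} @ 60fps raw: {raw_bitrate_gbps(w, h):.2f} Gbit/s")
```

Nearly 3 Gbit/s for raw 1080p alone makes it clear why a truly uncompressed feed is never going to reach the living room, and why 8K multiplies the problem by sixteen.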
I would totally want 8K now, after properly seeing a 60" curved 4K TV with real 4K content at store demos. It totally blows Full HD and 2560×1440 out of the water. While it is technically not fully realized NOW, in about 5 years I would predict it takes its place. Same thing with the 720p and 1080p formats back then. And if Japan really is going to push 8K, it will actually benefit 4K faster, since downsampling 8K to 4K will be really good, and it will probably push the price down on 4K while leaving 8K high-end until an 8K transmission and media format becomes available and mainstream. Once we have the right content, the technology will be in demand; then, with mass adoption, prices will fall. We really need that 4K/8K killer app! I can say now I WANT 8K, even if only for digital photography reasons. It's really amazing if you demo the correct content on it. Those countries still slogging along at less than 1080p streaming, cable, TV, or pay-per-view movie services would probably jump straight to 4K, or maybe consider 8K infrastructure instead, since that cost will also allow scaling down and set them up nearly future-proof (until 16K is released). It would not make any business sense to go from SD to Full HD when everyone knows better formats are in development.
Oh please, Mr. Hilbert, please do 3/4 Titan X's (overclocked) with this monitor and post some benches, oh my, JEEZUS. LOL, 8K would bury alive any video card we currently have.
Quote: "I explained why people thought that 60Hz CRTs were close to 'flicker free' (they used artificial pixel light persistence). You can read more here: http://xcorr.net/2011/11/20/whats-the-maximal-frame-rate-humans-can-perceive/"
That link is absolute garbage and a half. It concludes that roughly 30fps, according to some questionable study done at ultra-low resolution in 2006, is good enough for FPS games. What a joke; the difference in feel between 60fps and 120 is night and day, especially in fighting games and FPS games. And yes, I know fighting games are typically capped at 60fps. It's not just feel; the extra visual information is obvious. You can see more frames super clearly, such as when a character turns. Even something as lag-infested (super high ping, no eastern server) as League of Legends benefits from high frame rates; character movement looks choppy at 60fps versus 200+, especially turning. 60fps is the absolute minimum, and that crap tries to conclude that 30 is fine for humans. Maybe for 90-year-olds with serious degenerative problems, but I can perfectly consistently time my responses to 1 frame at 60fps, both now that I have 20-30ms total lag and before, when I had close to zero. I can see individual frames just fine at a rate of 60, and I'm not even young.
It is, and it is not. They took many studies and presented their conclusions. While one concluded that human eye nerves are quite insensitive to flicker over 60Hz, there is also the part (which most of us know to be true) about rotary RGB filters in DLP projectors. Which is proof that you can actually perceive a difference even for stuff running at 300 fps.
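The DLP figures quoted from the article are a simple product: the color wheel spins at an integer multiple of the 60Hz input signal. A tiny sketch:

```python
# Color-wheel rotation rate for a single-chip DLP projector: the wheel
# spins at an integer multiple of the input frame rate.
def wheel_rate_hz(frame_rate=60, multiplier=4):
    return frame_rate * multiplier

print(wheel_rate_hz(60, 4))  # common 4x wheel
print(wheel_rate_hz(60, 5))  # faster 5x wheel
```

This reproduces the 240Hz and 300Hz figures from the quoted article, the rates at which the color fringes are still perceivable.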
That article concluded 30fps is fine. Therefore the article is garbage.
Remarkable, what technology offers today. Only you barely get any 1080p source material these days (besides PC gaming, of course), so why bother with 4K TVs, let alone 8K TVs? This all reads like the technology is there, but there's little way to use it.
+1 on this and your source quote. Has anyone seen 4K in action beside a really good OLED? Most demos of 4K involve static images with very little movement (think maybe a boat on the water in front of a city scene, or a person walking in a garden). When 4K is in motion and the viewer gets immersed in the subject matter, your eyes quickly adjust and any visible gains are quickly forgotten (filtered out). 4K, or 8K for that matter, DOES look sharper and more detailed on larger displays, even in motion, but with the average display being 46", I'd rather get me an OLED. Anyone who claims there is a noticeable difference on cellphones/tablets needs to consider that it is their brain telling them it's better, as the human eye cannot see that fine a level of detail on such a small display. :banana:
While it's nice that there's progress being made, I think 8K is useless outside cinemas right now. I think 4K TVs are useless, for that matter, until you start moving up to an 80-inch screen or bigger. My cable provider has yet to move up from 720p/1080i to full HD. That being said, I've just registered at Guru3D after years of being a passive reader of the comment section, to announce that I just ordered a 28-inch 4K monitor!