A 500Hz refresh rate NVIDIA G-Sync compatible gaming LCD is in the works
Picolete
We are getting really close to 600Hz, which is the golden number for many, since you can evenly divide it by 120, 60, 50, 40, 30, 25, and 24.
aufkrawall2
120Hz would be sufficient if bright OLED, VRR, and strobing were properly combined; such a dumb development...
iNerd
nizzen
I'm using 360Hz while playing Quake. For me there is a big difference between 144Hz and 360Hz.
Many people here are like my wife:
she doesn't see a difference between 60Hz, 144Hz, and 360Hz in games, and she couldn't care less 😀
For her it's just another monitor taking up space 😛
iNerd
:D
Hahaha, same here! My wife once told me she would be surprised if she got a bunch of colored RJ45 cables from me instead of flowers 😀
But of course it depends on what you are playing.
There's surely no point in playing with unlimited frames, hitting the 700fps mark, on a 144, 240, or 360Hz monitor... that's why I always cry when I see a question under a desktop PC build like "how much fps can I get in Apex?"
And there are also the type of people who buy Apple Pro notebooks just to read social media.
I've learned to have understanding for the needs of other people 😀
mdrejhon
In the next ~10 years, NVIDIA is working on future versions of DLSS that will amplify frame rate by 4x-10x by the 2030s, enabling 100fps UE5 to be converted to 1000fps UE5 in a visually flawless and lagless manner (hint: it's not classic interpolation).
A finite framerate is an artificial humankind invention to imperfectly simulate analog real life motion.
It generates problems such as motion blur as well as stroboscopic effects, as covered in The Stroboscopic Effect of Finite Frame Rates and other articles in the Blur Busters Research Portal. I'm now cited in more than 25 peer-reviewed research papers (indexed on Google Scholar, ResearchGate, Academia, etc.), including some very prestigious ones.
The bottom line is that more than 90% of the population sees major motion blur differences between a 1/120sec-shutter photograph and a 1/1000sec-shutter photograph. Thus, most humans (even grandma) can tell the difference between 120Hz and 1000Hz at framerate=Hz for things like browser scrolling or panning maps. That's why retina-refresh-rate blind tests in research need to compare dramatic differences in Hz. Instead of testing humans on 240Hz-vs-360Hz (which is only a 1.5x motion blur difference, diminished to ~1.1x by slow GtG and jitter effects), researchers test 120Hz-vs-500Hz, or 240Hz-vs-1000Hz, for the 4x blur difference -- as seen in
Properly Designing A Blind Test That >90% Of Humans Can See 240Hz-vs-1000Hz (non-Game Use Cases Too!)
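To make the blind-test ratios above concrete, here is a minimal sketch. The simple additive persistence+GtG model and the `blur_ratio` helper are my own illustration, not Blur Busters' exact math:

```python
# Hedged sketch: why small Hz gaps are hard to see in blind tests.
# Persistence blur on a sample-and-hold display scales with the frame
# time (1000/Hz milliseconds); a crude additive model (an assumption)
# adds a fixed GtG smear to both sides of the comparison.

def blur_ratio(hz_low, hz_high, gtg_ms=0.0):
    """Approximate visible-blur ratio between two refresh rates."""
    low = 1000.0 / hz_low + gtg_ms    # per-frame exposure + GtG smear, ms
    high = 1000.0 / hz_high + gtg_ms
    return low / high

print(blur_ratio(240, 360))            # ideal GtG=0: a 1.5x blur difference
print(blur_ratio(240, 360, gtg_ms=3))  # ~3ms GtG smear shrinks the gap toward ~1.1x
print(blur_ratio(240, 1000))           # the ~4x gap that >90% of people can see
```

With a few milliseconds of GtG smear, the 240-vs-360 comparison collapses toward invisibility while 240-vs-1000 stays dramatic, which is the stated rationale for testing big jumps.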
Right now, this is very important for future VR. Many people (even ~5-10% of the population is a lot) can't use VR today because they get flicker eyestrain: headsets use flicker (strobing) to eliminate motion-blur headaches, but some people get flicker headaches instead. Strobing is a humankind band-aid that will never achieve five-sigma ergonomics for the entire population.
To get five-sigma of the population comfortable with a holodeck/VR (no flicker headaches, no blur headaches), the only fix is infinite frame rate at infinite Hz -- or at least numbers above human detection thresholds. In other words, ultra-high frame rates at ultra-high Hz.
The retina refresh rate for a 16K 180-degree VR headset is well over 10,000Hz. Even 8000 pixels/sec motion on a sample-and-hold 8K screen -- a slow one-screenwidth-per-second pan -- generates 8 pixels of persistence-based motion blur at 1000fps 1000Hz. So you need to go more than an order of magnitude above that for a 16K VR headset (since 8K is no longer retina when stretched across a 180-degree FOV).
The bottom line is that refresh-rate incrementalism (throttled by GtG) is junk. That's why many in the media say Hz is worthless, while the smarter media has finally recognized the benefits of Hz for non-gaming humankind too.
Fortunately, 1000Hz at GtG=0 (e.g. a 1080p 24" OLED) will get us reasonably close to the retina refresh rate for desktop monitors, although for ultrafast motion it still isn't retina (e.g. the TestUFO Panning Map at 3000 pixels/sec will still have 3 pixels of motion blur).
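The persistence rule of thumb used in the examples above can be sketched in a few lines. The helper name is illustrative, and it assumes an ideal sample-and-hold display with GtG=0:

```python
# Minimal sketch of the persistence-blur rule of thumb:
# on a sample-and-hold display with instant pixel response (GtG=0),
# motion blur in pixels ~= eye-tracking speed (px/s) / refresh rate (Hz),
# because each frame is held on screen for 1/Hz seconds while the eye moves.

def persistence_blur_px(speed_px_per_s, refresh_hz):
    return speed_px_per_s / refresh_hz

print(persistence_blur_px(8000, 1000))  # 8K-wide pan at 1000Hz -> 8.0 px of blur
print(persistence_blur_px(3000, 1000))  # TestUFO panning-map example -> 3.0 px
print(persistence_blur_px(3000, 120))   # the same pan at 120Hz -> 25.0 px
```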
It's a function of angular resolving resolution, the difference in sharpness between a stationary and a moving image (of the same image), and the human's fastest eye-tracking speed for the available FOV. That's why higher resolutions and wider FOVs amplify refresh-rate limitations more than smaller-FOV, lower-resolution displays.
The new best practice for non-esports upgraders is to upgrade refresh rate by 2x-4x, e.g. jump a 60Hz monitor directly to 240Hz, to see browser smooth-scrolling benefits in non-game use cases. Average Joe users need much bigger Hz jumps to see useful animation-fluidity differences outside games.
For esports players, who see smaller differences in Hz, incrementalism is still the way to go because they need the tiniest edge to stay ahead of the competition -- metaphorically like being a millisecond ahead in a 100-meter Olympic sprint -- see The Amazing Human Visible Feats Of The Millisecond (even if you don't feel the millisecond).
But... for average users:
In other words, if you are not in esports, don't upgrade your refresh rate more slowly than 60 -> 120 -> 240 -> 480 -> 1000 -> ..., and preferably take 2.5x-4x steps such as 60 -> 240 -> 1000, or even 60 -> 144 -> 360 -> 1000. For non-gaming use cases like scrolling/panning, punching through the diminishing curve of returns requires more dramatic jumps -- the Hz equivalent of 720p -> 4K.
Just as it's harder to see the difference between 1/120sec and 1/240sec SLR camera shutter motion blur in sports photos, it's vastly easier to see the difference between two photographs taken at 1/120sec versus 1/1000sec shutter. There's actually an exact motion-blur equivalence (for continuous-motion framerate=Hz material).
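That equivalence can be sketched directly (helper names and the example tracking speed are mine): a full-persistence sample-and-hold frame behaves like a camera exposure of 1/Hz seconds, so the blur widths match.

```python
# Hedged sketch of the camera-shutter equivalence claim: for
# framerate=Hz continuous motion, blur width = speed * exposure time,
# and a sample-and-hold display "exposes" each frame for 1/Hz seconds.

def shutter_blur_px(speed_px_per_s, exposure_s):
    return speed_px_per_s * exposure_s

def sample_and_hold_blur_px(speed_px_per_s, hz):
    return speed_px_per_s * (1.0 / hz)  # each frame is held for 1/Hz seconds

speed = 2400  # px/s, an arbitrary example tracking speed
print(shutter_blur_px(speed, 1 / 120))   # 1/120s shutter: ~20 px of blur
print(sample_and_hold_blur_px(speed, 120))  # 120Hz hold: the same ~20 px
print(shutter_blur_px(speed, 1 / 1000))  # 1/1000s shutter: ~2.4 px, the ~8x gap
```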
The two main artifacts seen on high-Hz sample-and-hold displays are persistence blur and the stroboscopic effect:
https://blurbusters.com/wp-content/uploads/2019/05/display-persistence-blur-equivalence-to-camera-shutter.png
https://blurbusters.com/wp-content/uploads/2019/09/crosshairs-stroboscopic-arrowed-example-animate.apng
With small Hz differences it's harder to see; with big Hz differences it's much easier -- even in non-gaming cases (e.g. smooth scrolling / smooth panning).
Many grandmas couldn't tell apart 720p-vs-1080p or VHS-vs-DVD, but they could tell 720p-vs-4K, or more extreme, VHS-vs-4K, much more easily. The same is true with refresh rates above the flicker fusion threshold (>70Hz), where you need more dramatic jumps to see the different artifacts that finite frame rates produce.
Even beyond 2x on this diminishing curve on a sample-and-hold display, some humans need a 4x-8x jump (e.g. 120Hz -> 1000Hz) before 90-99% of non-gaming humans notice it during continuous-motion material (material without camera shutter blur) -- i.e. scrolling/panning that reaches CRT motion clarity without the need for strobing.
Correct. Most non-gamers don't care -- they aren't bothered by scrolling motion blur.
However, this is still relevant science/research, and there are visible effects outside of games. Decades from now, 1000Hz may be a freebie inclusion with no image degradation, much like 4K doesn't cost much more than 720p -- and it can be used optionally. 120Hz is already slowly being commoditized (phones, tablets, consoles, etc.), and once that's complete, the next domino (240Hz) falls, and so on. You get the picture of what will happen over the course of this century...
Besides, 1000Hz conveniently behaves as per-pixel VRR (you can play 24p, 25p, 48p, and 59.94p simultaneously in 4 video windows with zero pulldown judder), so 1000Hz+ has benefits for low frame rates too, making VRR and strobing obsolete, and making VSYNC ON have almost identical lag to VSYNC OFF no matter what framerate you run. So there's a ton of non-ultra-high-framerate benefit too. Manufacturers have now retina'd out spatially, so it's time to retina out temporally over the long term.
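A quick sketch of why a 1000Hz fixed grid behaves almost like VRR (the helper and frame counts are my own illustration, assuming ideal frame scheduling): quantize each source frame's ideal presentation time to the refresh grid and measure the worst miss.

```python
# Hedged sketch: how far each frame's ideal presentation time can land
# from the nearest refresh slot. At 1000Hz the grid is 1ms, so any
# cadence (24p, 25p, 48p, 59.94p) misses by at most 0.5ms -- effectively
# judder-free. At 60Hz, 24p misses by up to ~8.3ms (visible pulldown judder).

def worst_frame_error_ms(content_fps, display_hz, frames=600):
    """Max distance (ms) from a frame's ideal time to its nearest refresh slot."""
    period = 1000.0 / display_hz
    worst = 0.0
    for i in range(frames):
        ideal = i * 1000.0 / content_fps
        nearest = round(ideal / period) * period
        worst = max(worst, abs(ideal - nearest))
    return worst

for fps in (24, 25, 48, 59.94):
    print(fps, worst_frame_error_ms(fps, 60), worst_frame_error_ms(fps, 1000))
```

The worst-case error is bounded by half a refresh interval, so raising Hz shrinks judder for every cadence at once -- no per-stream VRR signaling needed.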
Mark Rejhon
Founder, Blur Busters / Inventor of TestUFO
Peer-Reviewed Display Researcher Cited in Over 25 Papers
Research Portal: www.blurbusters.com/area51
I agree. Adding more framerate is sometimes dumb, but it's the only way to keep reducing motion blur without strobing.
fantaskarsef
Dr.Puschkin
@mdrejhon, Amen brother. Been following your work for years, basically since the first 120Hz displays came out.
Never stop doing what you do, it's appreciated very much.
yasamoka
Neo Cyrus
Mohammad Hassan
I will stick to 240Hz unless there is a 1000Hz display.
FlawleZ
The refresh rate on my 2013 plasma TV is 600Hz. It's a shame that tech was abandoned, given its unparalleled color reproduction and black levels.
Venix
@yasamoka I was talking about the contribution to input latency,
so yeah, that's where I focused. Yes, more Hz means increased smoothness, but the higher you go, the closer you get to the point of diminishing returns. If you're buying 500Hz to get rid of the blurriness... don't bother; besides, I find it extremely unlikely that a panel can deliver a REAL 2ms or lower response to actually do 500Hz for real (I hope I'll be proven wrong on that one). OLED eliminates the blurriness completely while providing better colors and black levels, and it's ultra smooth.
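For context on that "REAL 2ms" figure (a quick sketch; the helper name is illustrative): the refresh interval is simply 1000/Hz milliseconds, so a panel whose GtG is slower than its interval smears transitions across multiple frames.

```python
# Hedged sketch: refresh interval per frame in milliseconds.
# A 500Hz panel delivers a new frame every 2ms, so GtG slower than
# ~2ms means pixels are still transitioning when the next frame arrives.

def refresh_interval_ms(hz):
    return 1000.0 / hz

for hz in (144, 240, 360, 500):
    print(hz, refresh_interval_ms(hz))  # 500Hz -> 2.0 ms per frame
```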
KissSh0t
What would be the purpose of such a high refresh rate?
Catspaw
There will probably be a few people who get tricked into buying this, but most of the world is going to slowly move toward OLED.
Once you go past 165Hz, it's more about the quality of the panel than the refresh rate (remember that the panel actually has to be fast enough to take advantage of the 500Hz).
I am afraid you got it wrong. Blur Busters claims there is a law of persistence that is potentially applicable up to 1000fps; they did not prove there is one.
What is the difference?
They assume "everything else being perfect". That is NOT how monitors work now, it's not how monitors worked in the past, and it is unlikely they will work that way in the future.
The claim is only valid in a vacuum: "everything else being equal and scaling linearly", a higher frame rate is better for reducing motion blur. It's that "scaling linearly" part that matters.
You have to consider undershoot and overshoot (the higher the refresh rate, the harder these become to keep in check), which introduce blur -- the very thing ultra-fast refresh rate monitors are trying to solve.
Catspaw
MonstroMart