A 500Hz refresh rate NVIDIA G-Sync compatible gaming LCD is in the works
NVIDIA introduced a G-Sync compatible LCD panel with a 500Hz refresh rate at its COMPUTEX TAIPEI 2022 keynote.
ASUS will reveal the "ROG Swift 500Hz Gaming Monitor," built around an "Esports TN Panel" designed specifically for esports. It also offers an "NVIDIA G-SYNC Esports Mode" and supports "Esports Vibrance," which improves target visibility. Thanks to the 500Hz refresh rate, ghosting is greatly reduced compared to standard models, allowing for more accurate targeting. As before, it includes "NVIDIA Reflex Analyzer," which lets you measure system latency when paired with an NVIDIA Reflex compatible mouse and a GeForce series GPU.
Acer has introduced the 28-inch "Predator X28 G-SYNC" gaming LCD, compatible with 4K / 152Hz, as well as the Cooler Master "MM310" and "MM730" gaming mice, as products compatible with "NVIDIA Reflex Analyzer."
Update: added ASUS press release
ASUS Republic of Gamers (ROG) today announced the ROG Swift 500Hz, the world’s first 500 Hz refresh-rate esports gaming monitor. The Swift 500Hz features a 24.1-inch FHD (1920 x 1080) display that utilizes Esports-TN panel (E-TN) technology to produce 60% shorter response times than standard TN LCD displays, making it the fastest LCD display ever. The Swift 500Hz includes NVIDIA® G-SYNC®, and the enhanced Esports Vibrance mode — specifically tuned for esports — built directly into the monitor firmware. It allows more light to travel through the LCD crystals, giving colors new levels of vibrancy. With latency a crucial factor in esports gaming, the Swift 500Hz also includes NVIDIA Reflex Analyzer, allowing gamers to measure latency with just a single click.
ASUS ROG has always pushed the boundaries of display technology. A decade ago, ASUS introduced the world’s first 144 Hz 1080p gaming monitor. In 2017, ROG introduced the first-ever NVIDIA G-SYNC 240 Hz gaming monitor. And in 2020, ROG unleashed the first 360 Hz gaming monitor.
Pushing the limits of display technology
"When we introduced the first 144 Hz monitor in 2012, people said the human eye can only perceive 60 frames per second,” explains Gavin Tsai, Display Product Manager for ASUS. “Then, when we introduced our 240 Hz monitor, they said the human can’t perceive the difference,” continues Tsai. “Today, in a market where 144 Hz and 240 Hz gaming monitors are common and standard specs, we are breaking entirely new ground with the incredibly fast ROG Swift 500Hz.”
With reduced motion blur, improved visuals and lower input latency, the ROG Swift 500Hz is designed to give professional esports gamers an advantage in tournaments. One of these impressive features – the extraordinarily low latency – is made possible in part through a vital partnership with NVIDIA. “The ROG Swift 500Hz with NVIDIA G-SYNC technology provides gamers the lowest latency available of any monitor on the market,” said Seth Schneider, Esports product manager at NVIDIA. “And with NVIDIA Reflex Analyzer on board, gamers can measure their latency with one click, ensuring the fastest response times for the most intense games.”
Unregistered
I set a custom refresh rate on an Acer S271HL 1080p monitor, from 60Hz to 75Hz, and can say the same. It's a minor improvement, but I can still tolerate 60Hz fine.
Differences in refresh rate are a lot more noticeable in VR for me. Going from 90Hz to 72Hz, I can easily see the screen blinking at the lower refresh rate. Even 72Hz to 80Hz is a big enough difference to go from uncomfortable to tolerable.
I saw a laptop with a 300Hz G-Sync display and it was amazing. I don't know how much flat screens can really benefit from higher rates than that, but I can see it being useful for VR displays.
Senior Member
Posts: 128
Joined: 2007-08-28
My credentials (as Chief Blur Buster) have massively improved over the years, as I have discovered more and more weak links in current tests utilized by mainstream media.
Firstly, are you aware I am already cited in more than 25 research papers? Also, have you seen this comment?
In a blind ABX test? Because I've never seen anyone pass a blind test after 144Hz, but I have seen a lot of people fail it.
Mainstream media are not following the scientific variables published here for maximizing human-visible tests, under which 90% of the mainstream population can see 240Hz-vs-1000Hz.
We recently discovered limitations of the CS:GO test (software jitter, mouse jitter, sync-technology jitter, GPU limitations) that max out below the refresh rates of currently available max-Hz monitors. In addition, these factors, along with nonzero LCD GtG, blot out more than 80% of the 240Hz-vs-360Hz difference.
Trained individuals can tell 240Hz-vs-360Hz, but it's ultra subtle, so not for the majority of the population. We recently solved the limitations of that test, however, and discovered use cases where >90% of the population can tell 240Hz-vs-1000Hz in a blind test of well-designed forced-eye-tracking framerate=Hz material (like attempting to read sideways-scrolling text -- the nametags above RTS players, or the street-name labels of an infinitely-scrolling map that never stops scrolling, etc.). Also, VR exercises parallel use cases far more often than a stop-flick-stop-flick shooter like CS:GO does.
We additionally discovered that these weak links are solvable via new framerate=Hz technologies on 0ms-GtG displays, once control-device jitter and software jitter are fixed. That's why we recently found that 120Hz-vs-240Hz is much more human-visible on OLED than on LCD, now that we've feasted our eyes on OLEDs at DisplayWeek.
Also, 0ms-GtG 480Hz is available in some technologies like the Christie DLP E-Cinema projector, and there are other lab 0ms-GtG displays at quadruple-digit refresh rates already, which start to follow Blur Busters Law linearly, unburdened by GtG, mouse jitter, software jitter, VSYNC OFF jitter, etc. This is hugely relevant to VR research.
Running forced-eyetracking tests (like www.testufo.com/map) on 0ms-GtG displays amplifies refresh rate differences massively more than they currently show on triple-digit-Hz LCDs.
We have access to 1000Hz+ prototypes. The bottom line is that many everyday use cases (and future use cases such as VR & Holodecks) show massively more visible improvements, capable of near-perfect blind-test passes at refresh rates far higher than many researchers originally expected. Improved/newer software makes a massive difference when it uses the perfect-frame-pacing technology that had to be invented for VR headsets, which further amplifies Hz differences (especially on near-0ms-GtG displays like OLED). Also, many esports players don't eyetrack in crosshaired games, so Hz differences actually show up more in certain crosshairless games and other use cases (e.g. VR). Forced-eyetracking tests made a huge difference in Hz-vs-Hz blind tests.
Given sufficient Hz differentials to punch through the diminishing curve of returns (4x-8x Hz differences, like 240Hz-vs-1000Hz), sufficiently fast pixel response (0ms GtG), and sufficiently fast forced-eyetracked material (e.g. trying to identify detail inside a blur), reliably passing blind tests among non-gamers in the refresh rate stratosphere has recently started to happen in early internal tests at many companies.
Read onwards.
I am afraid you got it wrong. BlurBusters claims there is a law of persistence that is potentially applicable up to 1000FPS; they did not prove there is one.
What is the difference?
That they assume "everything else being perfect". That is NOT how monitors work now, it's not how monitors worked in the past, and it is unlikely they will work that way in the future.
The claim is only valid in a vacuum, meaning "everything else being equal and scaling linearly," higher frame rate is better for reducing motion blur. It is that "scaling linearly" part that is important.
It's fine to dispute claims, but the evidence is overwhelming now....
In addition, there are already many TestUFO tests that prove the claim at 60, 120, 240, and 480Hz, so it extrapolates easily.
Also, quadruple-digit refresh rates already exist as laboratory prototypes.
I also had temporary access to multiple 1000-2000 Hz prototypes, including 1440 Hz vision-research projectors (Viewpixx already sells one), and the science still scales correctly.
It may be best to reply to my earlier comment, if you see any flaws.
My high-Hz-versus-low-Hz PowerPoints often land see-for-yourself mic drops with CEOs, project managers, engineers, etc., some of whom were formerly dubious of the refresh rate race.
In a blind ABX test? Because I've never seen anyone pass a blind test after 144Hz, but I have seen a lot of people fail it.
Mainstream media are not following the scientific variables published here for maximizing human-visible tests.
CS:GO blind tests are useful for CS:GO only; they are not the same thing as testing for the "maximum Hz of humankind benefit" (e.g. VR, etc.) because of multiple CS:GO limitations such as game jitter, mouse jitter, and VSYNC OFF jitter (even 1-pixel jitter is worse than the motion blur difference of 144Hz-vs-165Hz at moderate motion speeds!). You also need to test large Hz differences to compensate for a lot of weak links -- like 60Hz-vs-240Hz, 120Hz-vs-360Hz, or 240Hz-vs-1000Hz (at framerate=Hz).
Let's give an example of a different simplistic test. Trying to read the street-map labels at www.testufo.com/map passes a lot more blind tests when comparing 120Hz-vs-360Hz (a 3x Hz difference), because the test avoids a lot of error margins that are specific to CS:GO. The great news is that use cases such as virtual reality reproduce these differences much better than CS:GO does.
One great example is that refresh rate doesn't improve CS:GO nearly as much as it does a scrolling RTS game (like DOTA2 tweaked for better scrolling fluidity to match TestUFO -- there are tweaks available) running motion at framerate=Hz, because of the www.testufo.com/map effect. It's easier to read nametags and identify enemies when doubling the Hz halves the motion blur (excluding the GtG error margin).
Also, read below:
240Hz-vs-360Hz is almost invisible in CS:GO (a 1.5x blur difference throttled to a 1.1x blur difference due to slow LCD GtG and high-frequency microjitter caused by software, mouse hardware, mousepad, and the chosen sync technology).
Remember, there are now over a million different TestUFO demos just from the combination of 30 tests multiplied by their customizable parameters -- so I can scientifically show off a lot of display concepts. For example, just look at the TestUFO stutter-to-blur continuum animation. High-frequency stutter (of sample-and-hold) or jitter (of erratic framerates or mouse jitter) can blend into motion blur.
(Demo of high-frequency stutter blending into blur. Stare at the bottom UFO for 15 seconds.)
It is already mathematically proven that if you framepace perfectly at framerate=Hz on a 0ms-GtG display (like tests recently done on 60Hz, 120Hz, and 240Hz OLEDs), you get an MPRT(100%) persistence of 1/refreshtime, i.e. 1/frametime worth of motion blur. The only way to reduce blur further without strobing is to raise the refresh rate and frame rate.
Now, if you can't framepace perfectly (e.g. erratic stutter or jitter), even 70 erratic microstutters per second on a 360Hz or 500Hz display vibrate so fast (like a fast-vibrating string -- see the TestUFO demo for scientific proof) that they become extra motion blur, worse than the maximum clarity the Hz could otherwise afford. So 240Hz-vs-360Hz is diminished further by all kinds of jitter sources (the game itself, the mouse itself, etc.), on top of slow GtG. Instead of the proper 1.5x blur difference, it's more like a 1.1x blur difference in many real-world games -- not visible.
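To make that 1.5x-to-1.1x collapse concrete, here is a quick illustrative Python sketch using the persistence formula above (blur in pixels = tracking speed divided by refresh rate). The additive GtG + jitter term and its millisecond values are simplified assumptions for demonstration only, not Blur Busters' measured model:

```python
# Illustrative sketch only: approximate perceived blur as ideal persistence blur
# plus a lump-sum GtG + jitter smear. The additive model and the millisecond
# values below are simplified assumptions for demonstration, not measured data.

def persistence_blur_px(speed_px_per_sec, refresh_hz):
    """Ideal sample-and-hold blur: MPRT(100%) = 1/Hz, blur = speed * persistence."""
    return speed_px_per_sec / refresh_hz

def total_blur_px(speed_px_per_sec, refresh_hz, gtg_ms=5.0, jitter_ms=2.0):
    """Toy model: persistence blur plus smear from nonzero GtG and microjitter."""
    return persistence_blur_px(speed_px_per_sec, refresh_hz) \
        + speed_px_per_sec * (gtg_ms + jitter_ms) / 1000.0

speed = 960  # pixels/sec, a typical TestUFO panning speed

for label, blur in (("ideal 0ms-GtG", persistence_blur_px),
                    ("with GtG+jitter", total_blur_px)):
    b240, b360 = blur(speed, 240), blur(speed, 360)
    print(f"{label:>16}: 240Hz = {b240:.1f}px, 360Hz = {b360:.1f}px, "
          f"ratio = {b240 / b360:.2f}x")
# The ideal ratio is 1.50x; the GtG+jitter version collapses to roughly 1.1x,
# matching the "1.5x throttled to 1.1x" point above.
```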
So to overcome this, we have to oversample (overkill) refresh rates even more geometrically, e.g. 4x refresh rate differences, especially if we're sticking to LCDs and 1000Hz mice (which have more jitter than 2000Hz+ mice). There's a research paper on mouse jitter showing that 1000Hz is not enough, due to the jitter of mouse Hz beating against display Hz:


Research paper DOI: https://dl.acm.org/doi/10.1145/3472749.3474783
*IMPORTANT NOTE: When using more stringent test variables, including display MPRT shorter than the mouse poll interval, the 4000 Hz boxes can become red squares. However, sufficiently bright desktop displays with 0.25ms MPRT are still a fair time away. 0.25ms = 1/4000sec persistence, translating to 1 pixel of motion blur at 4000 pixels/sec, or 2 pixels of motion blur at 8000 pixels/sec. Very subtle though -- not relevant to 1080p (too fast to eyetrack) but relevant for future 0.25ms-MPRT 4K, 8K, and VR displays where 4000 pixels/sec motion is easy to eyetrack. Also, it's worth noting ultralow MPRTs are already on the market -- the Oculus Quest is already at 0.3ms MPRT, via strobing.
None of this dismisses the fact that internal demonstrations of 240Hz-vs-1000Hz, under parameters normally used for VR (VR is always VSYNC ON at perfect framerate=Hz, which amplifies Hz-vs-Hz differences), have silenced a lot of non-gamers.
Also, on OLED, 120Hz-vs-240Hz is much more visible than on LCD because of OLED's fast GtG. (Practically zeroing out GtG proved Blur Busters Law even further!)
Remember, a 240Hz OLED and 500Hz LCDs were shown off at public booths at DisplayWeek 2022, which I attended.

To close this out: it is common for laypeople to miss the Hz forest for its refresh-rate trees. So I will crosspost the different human-vision thresholds, since people sometimes fixate on things like the flicker-fusion threshold (a low Hz).
There are many different effects caused by the multiple weak links of finite frame rates -- a human invention for simulating analog moving images -- and this science is relevant when using displays to try to perfectly match real life (e.g. VR).
---crosspost---
Many people misunderstand the different sensitivity thresholds -- claims such as "humans can't see above 75Hz" refer only to a flicker threshold. The purpose of this post is to show that there are very different orders of magnitude that refresh rate upgrades do address.
Even in a non-gaming context, one thing many people forget is that there are many thresholds of detectable frequencies.
These are, roughly in increasing order of magnitude:
The threshold where discrete frames start to feel like motion. This is a really low threshold, around 10 frames per second; several research papers indicate 7 to 13 frames per second. This doesn't mean stutter disappears (yet), it just means it now feels like motion rather than slideshow playback.
Example order of magnitude: 10
The flicker detection threshold, also known as the "flicker fusion threshold". A common value is 85 Hz (for CRTs). Variables such as duty cycle (pulse width) and whether there's fade (e.g. phosphor fade) can shift this threshold. This also happens to be the rough threshold where stutter completely disappears on a perfect sample-and-hold display.
Example order of magnitude: 100
The persistence (display motion blur) threshold. Flicker-free (sample-and-hold) displays always have a guaranteed minimum amount of display motion blur, even with instant 0ms GtG, due to eye-tracking blur (animation demo). The higher the resolution and the larger the FOV of the display, the easier it is to see display motion blur as a difference in sharpness between static and moving imagery -- blurry motion despite blur-free frames (e.g. rendered frames or fast-shutter frames).
Example order of magnitude: 1000
The stroboscopic-effect threshold, where a moving mouse pointer becomes continuous motion instead of a trail of gapped positions. This is where higher display Hz helps (reducing the distance between the gaps) and higher mouse Hz helps (reducing the variance of the gaps). Mouse Hz needs to massively oversample the display Hz to avoid mouse jitter (aliasing effects). If you move a mouse pointer at 4000 pixels per second, you need 4000Hz to turn the pointer into a smooth blur (without adding an unwanted GPU blur effect) -- see the sketch just below.
Example order of magnitude: 10,000
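Here is a quick illustrative Python sketch of that pointer-gap arithmetic, using the 4000 pixels/sec example from above (the script itself is just a demonstration):

```python
# Illustrative sketch of the pointer-gap arithmetic above (numbers from the post):
# each refresh advances the pointer by speed/Hz pixels, so that is the visible
# gap between successive pointer positions on a sample-and-hold display.

pointer_speed = 4000  # pixels per second, as in the example above

for display_hz in (60, 144, 360, 1000, 4000):
    gap_px = pointer_speed / display_hz
    print(f"{display_hz:5d} Hz -> {gap_px:6.1f} px between pointer positions")

# 60Hz leaves ~67px gaps (obvious stepping); only around 4000Hz does the spacing
# shrink to ~1px, where the pointer finally reads as continuous motion/blur.
```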
An example test of stroboscopic lights:

(From a lighting-industry paper, but this has also been shown to hold for stroboscopic effects on large displays, including VR displays intended to mimic the real world.)
More information can be found in Research Section of Blur Busters.
Please vet me, rebut me, and peer review me -- I am fully prepared with science, with Ph.D researchers working with me too. The integrity of the refresh rate race depends on people trying to find flaws in scientific theory, so that research can be further improved, to find out more genuine real use cases for Hz.
Also, as a reminder, remember that LCD-vs-LCD comparisons (120Hz vs 240Hz) are muddied by LCD GtG. 120Hz-vs-240Hz is much more visible on the OLEDs I've seen on the exhibit floor at DisplayWeek. Parroting self-experience on LCDs overlooks the fact that I've seen thousands of laboratory, prototype, and unreleased displays in various places.
Remember -- due to the limitations of the old-codebase flick-shooter CS:GO, which is otherwise the go-to benchmark for Hz -- the refresh rate race becomes more human-visible in stutterless, jitterless & crosshairless apps and games. That means software that forces you to eye-track ultrasmooth motion at framerate=Hz. That means many use cases other than CS:GO, such as virtual reality (where framerate=Hz is an absolute eye-health, headache-free necessity), map panning, and other ultrasmooth-pursuit motion use cases that amplify Hz visibility. There are great examples of both game and non-game use cases.
New public papers will be coming (by the mid-2020s), especially as ultra-Hz near-0ms-GtG displays commercialize (witness the newly announced 240Hz OLEDs) and start reaching researchers worldwide for tests that move incrementally closer and closer to retina refresh rate testing.
The goal of Blur Busters is the "highest Hz of human-visible benefit for the most extreme use cases" -- like a Star Trek Holodeck (VR headsets) or other use cases that amplify non-strobed sample-and-hold Hz differences much more massively. Perfectly matching real life requires ultrafast GtG (0ms) combined with analog-like motion simulated by retina refresh rates. Then, once we finally have 0ms GtG, one additionally needs to compare 4x-8x Hz differences, very far up the diminishing-returns curve, before finally discovering the vanishing point of that curve. That is the proper intent of correctly designing a refresh rate blind test for such use cases.
Senior Member
Posts: 1992
Joined: 2013-06-04
That's gonna add 10-15C to the GPU and CPU, which will ramp up fans, no thank you

I want quiet/silent operation, and I don't have issues with my eyes at 60Hz.
Having a good CPU cooler and a good case with airflow will reduce that effect.
Also, I always use a voltage offset on my CPU (a -0.1V undervolt), and my GPU is undervolted too.
If your PC is noisy, maybe the fans you use are poop or you have it badly configured.
Also, don't sit next to the case, put it on the floor?
PS: if you're more focused on the noise your PC makes than playing/enjoying content, what the hell are you playing?
PS2: you don't have issues, yet.
Senior Member
Posts: 1784
Joined: 2012-10-07
I agree. Adding more framerate is sometimes dumb, but it's the only way to keep reducing motion blur without strobing.
NVIDIA is working on future versions of DLSS that will amplify frame rate by 4x-10x by the 2030s, enabling 100fps UE5 to be converted to 1000fps UE5 in a visually flawless and lagless manner (hint: it's not classic interpolation).
A finite framerate is an artificial human invention that imperfectly simulates analog real-life motion.
It generates problems such as motion blur as well as stroboscopic effects, as seen in The Stroboscopic Effect of Finite Frame Rates and other articles in the Blur Busters Research Portal. I'm now cited in more than 25 peer-reviewed research papers indexed at places like Google Scholar, ResearchGate, Academia, etc. -- including very prestigious papers.
The bottom line is that more than 90% of the population sees major motion blur differences between a photograph taken with a 1/120sec camera shutter and one taken with a 1/1000sec shutter. Thus, most humans (even grandma) can tell the difference between 120Hz and 1000Hz at framerate=Hz, for things like browser scrolling or panning maps. That's why retina-refresh-rate blind tests in research need to compare dramatic differences in Hz. Instead of testing humans on 240Hz-vs-360Hz (which is only a 1.5x motion blur difference, diminished to a 1.1x motion blur difference by slow GtG and jitter effects), researchers test 120Hz-vs-500Hz, or 240Hz-vs-1000Hz, for the 4x blur difference.
Right now, this is very important for future VR. Many people (even ~5-10% of the population is a lot) can't use current VR because they get flicker eyestrain -- headsets use flicker (strobing) to eliminate motion blur headaches, but some people get flicker headaches instead. Strobing is a band-aid that will never achieve five-sigma ergonomics for the entire population.
To get five sigmas of the population comfortable with a holodeck/VR (no flicker headaches, no blur headaches), the only fix is infinite frame rate at infinite Hz -- or at least numbers above human detection thresholds. In other words, ultra-high framerates at ultra-high Hz.
The retina refresh rate for a 16K 180-degree VR headset is well over 10,000 Hz. This is because even 8000 pixels/sec motion on a sample-and-hold 8K screen -- a slow one-screenwidth-per-second pan -- still generates 8 pixels of persistence-based motion blur at 1000fps 1000Hz. So you need to go more than an order of magnitude above that for a 16K VR headset (since 8K isn't retina anymore when stretched across a 180-degree FOV).
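For the arithmetic behind those numbers, here is a small illustrative Python sketch (the one-screenwidth-per-second tracking speeds are the example values from above, not measurements):

```python
# Illustrative sketch of the retina-refresh-rate arithmetic above: sample-and-hold
# blur = eye-tracking speed / refresh rate, so the Hz needed for <=1 pixel of
# blur equals the tracking speed in pixels/sec. Panel widths follow the post's
# one-screenwidth-per-second example.

def blur_px(track_speed_px_per_sec, refresh_hz):
    return track_speed_px_per_sec / refresh_hz

speed_8k = 8000    # ~one screenwidth/sec on an 8K-wide panel
speed_16k = 16000  # ~one screenwidth/sec on a 16K-wide panel

print(f"8K panel @ 1000Hz: {blur_px(speed_8k, 1000):.0f} px of persistence blur")
print(f"16K panel needs ~{speed_16k} Hz for <=1 px of blur")  # i.e. well over 10,000 Hz
```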
The bottom line is that refresh rate incrementalism (further throttled by GtG) is junk. That's why many in the media say Hz is worthless, while smarter outlets have finally recognized the benefits of Hz for non-gaming humankind too.
Fortunately, 1000Hz at GtG=0 (e.g. a 1080p 24" OLED) will get us reasonably close to retina refresh rate for desktop monitors, although for ultrafast motion speeds it still isn't retina (e.g. the TestUFO Panning Map at 3000 pixels/sec will still have 3 pixels of motion blur).
It's a function of angular resolving resolution, the difference in sharpness between a stationary and a moving version of the same image, and the human's fastest eye-tracking speed within the available FOV. That's why higher resolutions and wider FOVs amplify refresh rate limitations more than smaller-FOV, lower-resolution displays do.
The new best practice for non-esports upgraders is to upgrade by 2x-4x in refresh rate, e.g. go from a 60Hz monitor directly to 240Hz, to see browser smooth-scrolling benefits in non-game use cases. Average Joe users need much bigger Hz jumps to see useful animation-fluidity differences outside of games.
For esports players, who see smaller differences in Hz, incrementalism is still the way to go because they need the tiniest edge to stay ahead of the other player -- metaphorically like being a millisecond ahead in a 100-meter Olympic sprint -- The Amazing Human Visible Feats Of The Millisecond (even if you don't feel the millisecond).
But... for average users:
In other words, if you are not in esports, don't upgrade your refresh rate in smaller steps than 60 -> 120 -> 240 -> 480 -> 1000 -> ..., and preferably take 2.5x-4x steps such as 60 -> 240 -> 1000, or even 60 -> 144 -> 360 -> 1000. For non-gaming use cases like scrolling/panning, punching through the diminishing curve of returns requires more dramatic jumps -- the refresh-rate equivalent of going 720p -> 4K.
Just as it's hard to see the motion blur difference between 1/120sec and 1/240sec SLR camera shutter speeds in sports shots, it's vastly easier to see the difference between two photographs taken at 1/120sec versus 1/1000sec shutter. There's actually an exact motion-blur equivalence (when it comes to continuous-motion framerate=Hz material).
The two main artifacts seen on a high-Hz sample-and-hold display are persistence blur and the stroboscopic effect:

With small Hz differences, it's harder to see. But with big Hz differences, it's much easier to see -- even in non-gaming cases (e.g. smooth scrolling / smooth panning).
Many grandmas couldn't tell apart 720p-vs-1080p or VHS-vs-DVD, but they could tell 720p-vs-4K or, more extreme, VHS-vs-4K much more easily. The same is true of refresh rates above the flicker fusion threshold (>70Hz), where you need more dramatic jumps to notice the various artifacts that finite frame rates produce.
Even beyond 2x on this diminishing curve of a sample-and-hold display, a 4x-8x jump (e.g. 120Hz -> 1000Hz) is needed for far more than 90-99% of non-gaming humans to notice during continuous-motion material (that doesn't have camera shutter blur) -- i.e. scrolling/panning/etc. that reaches CRT motion clarity without the need for strobing.
Correct. Most non-gamers don't care -- they aren't bothered by scrolling motion blur.
However, this is still relevant science/research, and there are visible effects outside of games. Decades from now, 1000Hz may be a freebie inclusion with no image degradation, much like 4K doesn't cost much more than 720p -- and it can be used optionally. 120Hz is already slowly being commoditized (phones, tablets, consoles, etc.), and once that's complete, the next domino (240Hz) falls, and so on. You get the picture of what will happen over the course of this century...
Besides, 1000Hz conveniently behaves like per-pixel VRR (you can play 24p, 25p, 48p and 59.94p simultaneously in four video windows with essentially zero pulldown judder, since each refresh slot is only 1ms), so 1000Hz+ has other benefits for low frame rates too: it makes VRR obsolete, makes strobing obsolete, and makes "VSYNC ON" have almost identical lag to "VSYNC OFF" no matter what framerate you run. So there's a ton of non-ultra-high-framerate benefit too. Manufacturers have now retina'd out spatially, so it's time to retina out temporally over the long term.
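Here is a quick back-of-the-envelope Python sketch of why the pulldown judder becomes negligible at 1000Hz (a simplified quantization model, for illustration only):

```python
# Back-of-the-envelope sketch (simplified model): on a fixed-Hz display, each
# video frame is held for a whole number of refresh cycles, so its on-screen
# duration can deviate from the ideal cadence by up to one refresh period.
# At 1000Hz that worst case is ~1ms; at 60Hz it can be an entire 16.7ms repeat.
import math

def worst_case_judder_ms(content_fps, refresh_hz):
    ideal_ms = 1000.0 / content_fps      # ideal frame duration
    refresh_ms = 1000.0 / refresh_hz     # one refresh cycle
    low = math.floor(ideal_ms / refresh_ms) * refresh_ms   # frame shown too briefly
    high = math.ceil(ideal_ms / refresh_ms) * refresh_ms   # frame shown too long
    return max(ideal_ms - low, high - ideal_ms)

for fps in (24, 25, 48, 59.94):
    print(f"{fps:6.2f}p: worst-case cadence error "
          f"{worst_case_judder_ms(fps, 60):5.2f} ms @ 60Hz vs "
          f"{worst_case_judder_ms(fps, 1000):4.2f} ms @ 1000Hz")
```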
Mark Rejhon
Founder, Blur Busters / Inventor of TestUFO
Peer-Reviewed Display Researcher Cited in Over 25 Papers
Research Portal: www.blurbusters.com/area51
Hi, could I get some monitor purchase advice off you? Also, congratulations on your website & the work you do -- I've been using your discoveries from the "G-sync 101" articles since about 2015 (capping framerate below max refresh rate). Mostly I play fps multiplayer games, and I found upgrading from 75Hz to 144Hz an amazing difference (back in 2015), and I noticed an improvement overclocking to 180Hz, albeit a subtle one. I've got this old G-sync monitor (G2460PG https://pcmonitors.info/reviews/aoc-g2460pg/ ).
If I was considering a monitor upgrade in a quest for less motion blur, then based on what you've been saying it seems important for me to focus on how quick the GtG transition of the monitor is, as well as the Hz... and if I've understood the gist of your post, I should probably not settle for anything less than 360Hz to notice a proper difference. Companies don't really state their monitors' GtG accurately, do they? What kind of GtG transition time would I need to look for on a 360Hz monitor to notice a proper improvement vs my current monitor when it comes to blur reduction, and how much blur reduction could I expect from that upgrade?
OLED looks like it would be the best of all worlds, but that's gonna be super expensive even if it does exist at 360Hz... What advice would you have for me re other technologies like TN vs IPS vs VA? Traditionally my understanding is that TN screens offer the best blur-free experience of those three technologies (albeit with other weaknesses such as viewing angles / colour reproduction)? Perhaps my questions might help other gamers on here upgrade sensibly too.
EDIT: it also seems, from the table you showed, that a 4000Hz mouse would reduce jitter when paired with a 360Hz monitor. Is that an applicable factor in fps-style games, or were you referring more to other usage scenarios?
EDIT #2: is there a formula a person can use to combine refresh rate with GTG time when comparing two monitors to work out the motion blur reduction?
Senior Member
Posts: 1784
Joined: 2012-10-07
There probably will be a few people that get tricked into buying this, but most of the world is going to slowly move towards OLED.
Once you go past 165Hz, it's more about the quality of the panel than the refresh rate (remember that the panel actually has to be fast enough to take advantage of the 500Hz).
I am afraid you got it wrong. BlurBusters claims there is a law of persistence that is potentially applicable up to 1000FPS; they did not prove there is one.
What is the difference?
That they assume "everything else being perfect". That is NOT how monitors work now, it's not how monitors worked in the past, and it is unlikely they will work that way in the future.
The claim is only valid in a vacuum, meaning "everything else being equal and scaling linearly," higher frame rate is better for reducing motion blur. It is that "scaling linearly" part that is important.
You have to consider undershoot and overshoot (the higher the refresh rate, the harder it becomes to keep these in check); these introduce blur -- the very thing that ultra-fast refresh rate monitors are trying to solve.
(I didn't really get it wrong)