A 500Hz refresh rate NVIDIA G-Sync compatible gaming LCD is in the works

We are getting really close to 600Hz, the golden number for many, since you can evenly divide it by 120, 60, 50, 40, 30, 25, and 24.
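A quick sanity check of that divisibility claim (a minimal sketch in Python; the frame rates are the ones listed above):

```python
# 600Hz divides evenly by the common video/game frame rates, so each
# source maps to a whole number of refresh cycles (no pulldown judder).
for fps in (120, 60, 50, 40, 30, 25, 24):
    cycles = 600 / fps
    assert cycles == int(cycles), f"{fps} fps would judder"
    print(f"{fps:>3} fps -> one frame every {int(cycles)} refreshes at 600Hz")
```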
120Hz would be sufficient with bright OLED and VRR & strobing properly combined, such a dumb development...
Sylencer:

In one or two games that might be the case. But can you achieve those kinds of frame rates in other competitive games? Most of them aren't properly optimized and still don't come anywhere close to 200 fps.
That's true, but those are the only type of games I play. Believe me, there is no way I would otherwise stick to a 24/25-inch monitor in 2022 😀
Robbo9999:

These high refresh rate screens do matter and make a difference; Blur Busters have proven that there are benefits all the way up to 1000Hz in terms of motion clarity: https://blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/ Personally, I've found a massive increase in enjoyment and ability by going from 75Hz to 144Hz, and further benefits from overclocking to 180Hz. The higher the refresh rate, the easier I can track targets in FPS multiplayer games in close-quarters combat: quick mouse flicks onto enemy players when both players are close together and moving fast and unpredictably. You can simply see the screen with greater clarity during these fast movements, and of course each frame is updated within a shorter time span, so you get more "snapshots" to gauge what fast-moving close targets are doing and read their direction and intention. I found you can play in totally different ways and styles on a 144Hz+ monitor vs., say, 60Hz. It is difficult, though, to imagine GPUs & CPUs reaching 1000fps in games, and I'd think 360Hz is probably the most sensible limit in today's landscape of technology & games.
True again. It's also where I saw a difference in shooter games after upgrading from 240 to 360Hz: when another player comes in from the side, I didn't see such details as clearly at 240Hz. There was also a season in Fortnite where you could dive under the sand... at 240Hz I still had short lags when moving under the sand; at 360Hz that movement was soooo smooth and not laggy at all. But I have to admit that as a network engineer I'm very, very sensitive to latency, whether we're talking about routing latency or reaction time... Still, I can't imagine the difference from 360 to 500; I can't wait for some detailed hands-on reviews. Guru3D, can you get one from Asus to do a review? 🙂 This community has a lot of potential buyers! 😎
I'm using 360Hz while playing Quake. For me there is a big difference between 144Hz and 360Hz. Many people here are like my wife: they don't see a difference between 60Hz, 144Hz, and 360Hz in games. She couldn't care less 😀 For her it's just another monitor taking up space 😛
Venix:

Yeah, that's the theory, but the frame-time difference from 240Hz to 500Hz is 4ms to 2ms, on top of the input delay. Two thousandths of a second... I can't see that being perceivable to anyone, even MLG 1337 players. That said, for those who compete for big fat prizes, sure, why not! Non-MLG players shouldn't really care or worry about it.
believe me… 2ms makes a difference…
nizzen:

I'm using 360Hz while playing Quake. For me there is a big difference between 144Hz and 360Hz.
Of course there is; that's less than half of the LCD hold-type motion blur. But there would be ~0 total motion blur (so even less) with a strobed OLED at just 120Hz, and you would only need to achieve a third of the fps (which should be much more stable). With a proper in-game limiter/Reflex, lag would also only be slightly higher. Killing issues with more fps is just dumb (but yes, sometimes it's the only way in practice).
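For reference, a minimal sketch of the persistence math behind that claim (the 2000 px/s pan speed and the 0.5ms strobe pulse are assumed figures for illustration; real strobe lengths vary):

```python
# Eye-tracking motion blur on a display scales with persistence: how long
# each frame stays lit. Sample-and-hold persistence = frame time;
# strobed persistence = strobe pulse width, independent of refresh rate.
def blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
    """Approximate blur trail width in pixels while eye-tracking motion."""
    return speed_px_per_s * persistence_ms / 1000.0

speed = 2000  # px/s pan speed (assumed)
print(blur_px(speed, 1000 / 144))  # 144Hz hold: ~13.9 px of blur
print(blur_px(speed, 1000 / 360))  # 360Hz hold:  ~5.6 px of blur
print(blur_px(speed, 0.5))         # 120Hz strobed @ 0.5ms pulse: 1.0 px
```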
:D
nizzen:

I'm using 360Hz while playing Quake. For me there is a big difference between 144Hz and 360Hz. Many people here are like my wife: they don't see a difference between 60Hz, 144Hz, and 360Hz in games. She couldn't care less 😀 For her it's just another monitor taking up space 😛
Hahaha, same here! My wife once told me she would be surprised if she got a bunch of colored RJ45 cables from me instead of flowers 😀 But of course it depends on what you're playing. There's surely no point in playing with unlimited frames, hitting the 700fps mark, on a 144, 240, or 360Hz monitor... That's why I always cry when I see a question under a desktop PC build like "how much fps can I get in Apex..." And there are also the type of people who buy Apple Pro notebooks to read social media. I'm learning to have understanding for other people's needs 😀
aufkrawall2:

Of course there is; that's less than half of the LCD hold-type motion blur. But there would be ~0 total motion blur (so even less) with a strobed OLED at just 120Hz, and you would only need to achieve a third of the fps (which should be much more stable). With a proper in-game limiter/Reflex, lag would also only be slightly higher. Killing issues with more fps is just dumb (but yes, sometimes it's the only way in practice).
I agree. Adding more framerate is sometimes dumb, but it's the only way to keep reducing motion blur without strobing. Over the next ~10 years, NVIDIA is working on future versions of DLSS that will amplify frame rate by 4x-10x by the 2030s, enabling 100fps UE5 to be converted to 1000fps UE5 in a visually flawless and lagless manner (hint: it's not classic interpolation).

A finite framerate is an artificial human invention to imperfectly simulate analog real-life motion. It generates problems such as motion blur as well as stroboscopic effects, as covered in The Stroboscopic Effect of Finite Frame Rates and other articles in the Blur Busters Research Portal. I'm now cited in more than 25 peer-reviewed research papers at places like Google Scholar, ResearchGate, Academia, etc. -- including very prestigious papers.

The bottom line is that more than 90% of the population sees major motion blur differences between a photograph taken at a 1/120sec camera shutter and one taken at 1/1000sec. Thus, most humans (even grandma) can tell the difference between 120Hz and 1000Hz at framerate=Hz for things like browser scrolling or panning maps. That's why retina refresh rate blind tests in research need to compare dramatic differences in Hz. Instead of testing humans on 240Hz-vs-360Hz (which is only a 1.5x motion blur difference, diminished to 1.1x by slow GtG and jitter effects), researchers test 120Hz-vs-500Hz or 240Hz-vs-1000Hz for the 4x blur difference -- as seen in Properly Designing A Blind Test That >90% Of Humans Can See 240Hz-vs-1000Hz (non-Game Use Cases Too!).

Right now, this is very important for future VR. Many people (even ~5-10% of the population is a lot) can't use current VR because they get flicker eyestrain: headsets use flicker to eliminate motion blur, but some people get flicker headaches instead. Strobing is a humankind band-aid that will never achieve five-sigma ergonomics for the entire population. To get five sigmas of the population comfortable with a holodeck/VR (no flicker headaches, no blur headaches), the only fix is infinite frame rate at infinite Hz -- or at least numbers above human detection thresholds. In other words, ultra high framerates at ultra high Hz.

The retina refresh rate for a 16K 180-degree VR headset is well over 10,000Hz. This is because even 8000 pixels/sec motion on a sample-and-hold 8K screen is a slow one-screenwidth-per-second motion, yet it generates 8 pixels of persistence-based motion blur at 1000fps 1000Hz. So you need to go more than an order of magnitude above that for a 16K VR headset (since 8K isn't retina anymore when stretched to a 180-degree FOV).

The bottom line is that refresh rate incrementalism (throttled by GtG) is junk. That's why many in the media say Hz is worthless, while the smarter media has finally recognized the benefits of Hz to non-gaming humankind too. Fortunately, 1000Hz at GtG=0 (e.g. a 1080p 24" OLED) will get us reasonably close to the retina refresh rate for desktop monitors, although for ultrafast motion speeds it isn't retina (e.g. the TestUFO Panning Map at 3000 pixels/sec will still have 3 pixels of motion blur). It's a function of angular resolving resolution, the difference in sharpness between the stationary and moving versions of the same image, and the human's fastest eye-tracking speed for the available FOV. That's why higher resolutions and wider FOVs amplify refresh rate limitations more than smaller-FOV, lower-resolution displays.

The new best practice for non-esports upgraders is to upgrade by 2x-4x in refresh rate, e.g. upgrade a 60Hz monitor directly to 240Hz, to see browser smooth-scrolling benefits in non-game use cases. Average Joe users need much bigger Hz jumps to see useful animation-fluidity differences in non-game use cases. For esports players, who can act on smaller differences in Hz, incrementalism is still the way to go because they need the tiniest edge to stay ahead of the other person -- metaphorically like being a millisecond ahead in a 100-meter Olympic sprint -- see The Amazing Human Visible Feats Of The Millisecond (even if you don't feel the millisecond).

But for average users: if you are not in esports, don't upgrade your refresh rate more slowly than 60 -> 120 -> 240 -> 480 -> 1000 -> ..., and preferably take 2.5x-4x steps such as 60 -> 240 -> 1000, or even 60 -> 144 -> 360 -> 1000. For non-gaming use cases like scrolling and panning, punching through the diminishing curve of returns requires more dramatic jumps, much like 720p -> 4K. Just as it's harder to see the difference between 1/120sec and 1/240sec SLR shutter motion blur in sports photos, it's vastly easier to see the difference between photographs at 1/120sec versus 1/1000sec shutter. There's actually an exact motion blur equivalence (for continuous-motion framerate=Hz material). The two main things seen at high-Hz sample-and-hold are persistence blur and the stroboscopic effect: https://blurbusters.com/wp-content/uploads/2019/05/display-persistence-blur-equivalence-to-camera-shutter.png https://blurbusters.com/wp-content/uploads/2019/09/crosshairs-stroboscopic-arrowed-example-animate.apng

With small Hz differences it's harder to see; with big Hz differences it's much easier -- even in non-gaming cases (e.g. smooth scrolling / smooth panning). Many grandmas couldn't tell apart 720p-vs-1080p or VHS-vs-DVD, but they could tell 720p-vs-4K, or more extreme, VHS-vs-4K, much more easily. The same holds for refresh rates above the flicker fusion threshold (>70Hz), where you need more dramatic jumps for the different artifacts that finite frame rates produce. Even beyond 2x on this diminishing curve on a sample-and-hold display, some humans need a 4x-8x jump (e.g. 120Hz -> 1000Hz) for far beyond 90-99% of non-gaming humans to notice during continuous motion material (without camera shutter blur) -- i.e. scrolling/panning that approaches CRT motion clarity without the need for strobing.

Correct: most non-gamers don't care -- they aren't bothered by scrolling motion blur. However, this is still relevant science and research, and there are visible effects outside of games. Decades from now, 1000Hz may be a freebie inclusion with no image degradation, much like 4K doesn't cost much more than 720p -- and it can be optionally used. 120Hz is already slowly being commoditized (phones, tablets, consoles, etc.), and once that's complete, the next domino (240Hz) falls, and so on. You get the picture of what will happen over the course of this century...

Besides, 1000Hz conveniently behaves as per-pixel VRR (you can play 24p, 25p, 48p, and 59.94p simultaneously in four video windows with zero pulldown judder), so 1000Hz+ has benefits for low frame rates too, making VRR obsolete and strobing obsolete, and giving "VSYNC ON" almost identical lag to "VSYNC OFF" at any framerate. So there's a ton of non-ultra-high-framerate benefit as well.

Manufacturers have now retina'd-out spatially, so it's time to retina-out temporally over the long term.

Mark Rejhon
Founder, Blur Busters / Inventor of TestUFO
Peer-Reviewed Display Researcher Cited in Over 25 Papers
Research Portal: www.blurbusters.com/area51
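A minimal sketch of the persistence arithmetic in that post, using the same numbers it quotes (one 8K screenwidth per second, and the 3000 px/s TestUFO panning example):

```python
# On sample-and-hold, eye-tracked motion blur ~= motion speed x frame time.
# For blur to stay under ~1 pixel, the refresh rate (at framerate=Hz)
# must meet or exceed the motion speed in pixels per second.
def hold_blur_px(speed_px_per_s: float, hz: float) -> float:
    return speed_px_per_s / hz

print(hold_blur_px(8000, 1000))  # one 8K screenwidth/sec @ 1000Hz -> 8.0 px
print(hold_blur_px(3000, 1000))  # TestUFO panning map @ 1000Hz    -> 3.0 px

def retina_hz(speed_px_per_s: float, max_blur_px: float = 1.0) -> float:
    """Refresh rate needed to keep hold-type blur below max_blur_px."""
    return speed_px_per_s / max_blur_px

print(retina_hz(8000))  # -> 8000Hz just for one 8K screenwidth per second
```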
Venix:

Yeah, that's the theory, but the frame-time difference from 240Hz to 500Hz is 4ms to 2ms, on top of the input delay. Two thousandths of a second... I can't see that being perceivable to anyone, even MLG 1337 players. That said, for those who compete for big fat prizes, sure, why not! Non-MLG players shouldn't really care or worry about it.
Yes, that's true. Then again, I'm not a pro player either. There I could imagine the difference between 60Hz and 200Hz or above mattering, but their reaction time alone is maybe a tenth of a second faster, which beats any high-fps advantage anyway.
Sylencer:

Well, I rock the lowest possible/competitive (graphics) settings on all multiplayer games I play (capping frames at 144 due to my monitor), and maxed graphics capped at 60fps for all single-player/non-competitive co-op games. Works just fine for me. 144Hz @ 1440p or even 1080p is plenty nowadays for gaming. If you want eye candy and don't play competitive games you can go higher in resolution, but more refresh is just not worth it.
That's exactly how I'm handling things.
mdrejhon:

I agree. Adding more framerate is sometimes dumb, but it's the only way to keep reducing motion blur without strobing. [...] Manufacturers have now retina'd-out spatially, so it's time to retina-out temporally over the long term.
Good to see you guys are still out and about doing your busting thing
@mdrejhon, Amen brother. Been following your work for years, basically since the first 120Hz displays came out. Never stop doing what you do; it's very much appreciated.
Venix:

Yeah, that's the theory, but the frame-time difference from 240Hz to 500Hz is 4ms to 2ms, on top of the input delay. Two thousandths of a second... I can't see that being perceivable to anyone, even MLG 1337 players. That said, for those who compete for big fat prizes, sure, why not! Non-MLG players shouldn't really care or worry about it.
That's not how it works. 2ms and 4ms can seem insignificant to you, but you're not comparing input latency here or anything similar; you're comparing how long each frame stays on your display. You're seeing hundreds of frames every second, so it's completely irrelevant that 2ms and 4ms don't feel different to you. Why not extend that logic to 16.67ms (60fps) and 8.33ms (120fps)? Neither is something we can measure ourselves, yet the difference in motion smoothness, blur, and responsiveness is drastic. For what it's worth: that's 4ms of motion blur being cut down to 2ms, half the eye-tracking motion blur. ULMB does 1-2ms strobe lengths, which is the equivalent motion blur of 500-1000Hz. This monitor has the same motion blur as using ULMB at a 2ms strobe length, which is a massive achievement, especially given that strobing without side effects (crosstalk) and with a variable refresh rate (G-Sync / FreeSync) is very, very hard.
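A minimal sketch of the equivalence described above, converting hold-type frame time and strobe pulse length into comparable persistence numbers:

```python
# Persistence (how long each frame is visible) drives eye-tracking blur.
# Sample-and-hold: persistence = 1000/Hz ms per frame.
# Strobed backlight (e.g. ULMB): persistence = strobe pulse length.
def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

def equivalent_hold_hz(strobe_ms: float) -> float:
    """Sample-and-hold refresh rate with the same persistence as a strobe."""
    return 1000.0 / strobe_ms

print(frame_time_ms(240))       # ~4.17 ms per frame
print(frame_time_ms(500))       #  2.00 ms per frame
print(equivalent_hold_hz(2.0))  # 2ms strobe ->  500Hz-equivalent blur
print(equivalent_hold_hz(1.0))  # 1ms strobe -> 1000Hz-equivalent blur
```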
aufkrawall2:

120Hz would be sufficient with bright OLED and VRR & strobing properly combined, such a dumb development...
While normally I'm all in favour of pushing for higher refresh rates, I have to agree that after 120Hz other aspects become far more important. True blacks/real HDR, VRR + strobing working properly at the same time, along with accuracy, and the lowest possible overshoot are FAR more important than higher refresh rate once you're already at 120. I bought a 240Hz monitor not too long ago, and while yes it's noticeably smoother than 120Hz, it's such severely diminishing returns that I'd definitely favour other improvements over just raw refresh rate. Honestly, to me the point of "good enough" is roughly 110-120 fps, beyond that it's "that's nice, but whatever".
I will stick to 240Hz unless there is a 1000Hz display.
The refresh rate on my 2013 plasma TV is 600Hz. It's a shame that tech was abandoned, given its unparalleled color reproduction and black levels.
@yasamoka I was talking about the contribution to input latency, so yeah, that's where I focused. Yes, more Hz means increased smoothness, but the higher you go, the closer you get to the point of diminishing returns. If you're getting 500Hz to get rid of the blurriness, don't bother; aside from the fact that I find it extremely unlikely for a panel to deliver REAL 2ms or lower response times to actually do 500Hz for real (I hope I'll be proven wrong on that one), OLED eliminates the blurriness completely while providing better colors and black levels, and it's ultra smooth.
What would be the purpose of such a high refresh rate?
There will probably be a few people who get tricked into buying this, but most of the world is going to slowly move towards OLED. Once you go past 165Hz, it's more about the quality of the panel than the refresh rate (remember that the panel actually has to be fast enough to take advantage of 500Hz).
Robbo9999:

These high refresh rate screens do matter and make a difference; Blur Busters have proven that there are benefits all the way up to 1000Hz in terms of motion clarity: https://blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/ Personally, I've found a massive increase in enjoyment and ability by going from 75Hz to 144Hz, and further benefits from overclocking to 180Hz. The higher the refresh rate, the easier I can track targets in FPS multiplayer games in close-quarters combat: quick mouse flicks onto enemy players when both players are close together and moving fast and unpredictably. You can simply see the screen with greater clarity during these fast movements, and of course each frame is updated within a shorter time span, so you get more "snapshots" to gauge what fast-moving close targets are doing and read their direction and intention. I found you can play in totally different ways and styles on a 144Hz+ monitor vs., say, 60Hz. It is difficult, though, to imagine GPUs & CPUs reaching 1000fps in games, and I'd think 360Hz is probably the most sensible limit in today's landscape of technology & games.
I am afraid you got it wrong. Blur Busters claim there is a law of persistence that is potentially applicable up to 1000fps; they did not prove there is one. What is the difference? They assume "everything else being perfect". That is NOT how monitors work now, it's not how monitors worked in the past, and it is unlikely they will work that way in the future. The claim is only valid in a vacuum: "everything else being equal and scaling linearly", a higher frame rate is better for reducing motion blur. It is that "scaling linearly" part that is important. You have to consider undershoot and overshoot (the higher the refresh rate, the harder these become to keep in check), which will introduce blur, the very thing ultra-fast refresh rate monitors are trying to solve.
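To make that objection concrete, here is a minimal sketch using a deliberately crude additive model (my assumption for illustration, not an established law): treat effective persistence as frame time plus the unfinished GtG transition time.

```python
# Crude illustrative model (an assumption, not an established law):
# when GtG response does not shrink along with the frame time, the
# unfinished pixel transition smears on top of hold-type persistence.
def effective_blur_ms(hz: float, gtg_ms: float) -> float:
    return 1000.0 / hz + gtg_ms

# With a hypothetical 3ms real-world GtG (assumed figure):
print(effective_blur_ms(240, 3.0))  # ~7.2 ms
print(effective_blur_ms(500, 3.0))  # ~5.0 ms -- far less than the 2x
                                    # improvement the raw Hz jump implies
```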
iNerd:

I can play Fortnite at a stable 340-360fps without big fps drops on a 5900X / 3080 Ti, and not with everything set to low. It runs fluently, but no RTX on and all that. It even works with low settings like low shadows on DX11, not only in Performance mode. A 360Hz PG259QN is already overkill; I don't understand what I would need 500Hz for. Really not. But I can see a difference between 240 and 360 🙂
In a blind ABX test? Because I've never seen anyone pass a blind test above 144Hz, but I have seen a lot of people fail it.
Venix:

I am on 75Hz and playing most things at 75fps... honestly, I cannot tell it apart from 60Hz. That said, I might be able to if I jumped to 120 or 144. 60 to 75Hz, I guess, is not much of a difference!
Personally, I can definitely see the difference between 100fps and 60fps. When my games drop into the 60s I can tell without checking the fps counter. I can't tell the difference between anything over 80-ish fps; if I'm at 120fps and drop to 85fps I won't notice, or at least not enough to care. But if I drop from 100 to around 60 I'll definitely notice it in a bad way; it just doesn't feel as smooth.