ASUS ROG Strix XG248Q Adaptive Sync 240 Hz Monitor

Pointless unless you have eyes like a fighter pilot
Sixtyfps:

Pointless unless you have eyes like a fighter pilot
That's true, but what really makes it pointless is that AOC and Dell already offer the exact same thing. I'm sure that once ASUS releases it, it will be more expensive than the AOC and Dell models we've already had for the past half a year.
Sixtyfps:

Pointless unless you have eyes like a fighter pilot
You can see the difference between 144 Hz and 240 Hz even on the desktop. It's similar to the 60 Hz vs 90 Hz difference. Many people claim there is no difference; that lasts only until they use the higher refresh rate for a while and then accidentally go back to the lower refresh rate setting.
Fox2232:

You can see the difference between 144 Hz and 240 Hz even on the desktop. It's similar to the 60 Hz vs 90 Hz difference. Many people claim there is no difference; that lasts only until they use the higher refresh rate for a while and then accidentally go back to the lower refresh rate setting.
Oh, I'm sure you can see/notice it, but paying for the hardware to run games at 144/240 fps with high/ultra settings is the reason I'm still rocking an old 60 Hz monitor. It's pretty much a competitive monitor; if you're not making your money through it, you'd better be very fanatical about CS:S/CS:GO, because that's about the only thing that'll comfortably crunch out 240 fps while not looking like dirt.
anub1s18:

Oh, I'm sure you can see/notice it, but paying for the hardware to run games at 144/240 fps with high/ultra settings is the reason I'm still rocking an old 60 Hz monitor. It's pretty much a competitive monitor; if you're not making your money through it, you'd better be very fanatical about CS:S/CS:GO, because that's about the only thing that'll comfortably crunch out 240 fps while not looking like dirt.
Ultra settings are mostly for fools today. You often have a hard time spotting the difference when a setting is moved from High to Ultra, even in a screenshot, but the drop in fps is noticeable. In heavier games, I go setting by setting and note the fps impact of each step. Then I step down all the big offenders that make almost no visual difference, and test them one by one again to see whether some options became easier on the GPU as a result of changing another option.
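One way to read that procedure as a rough sketch, with purely hypothetical helpers: apply_and_benchmark() stands in for applying a config and running a repeatable benchmark pass, all options are assumed to start at "Ultra", and the "almost no visual difference" judgment stays a manual screenshot comparison. This is only an illustration of the method, not anyone's actual tool.

def apply_and_benchmark(settings):
    # Hypothetical: apply the settings in-game, run a fixed benchmark scene,
    # return the average fps of that run.
    raise NotImplementedError("wire this up to your own benchmark pass")

def tune(settings, options, min_gain_fps=5.0):
    """settings: dict of option -> level, with everything in `options`
    assumed to start at "Ultra". Returns a tuned copy of the dict."""
    baseline = apply_and_benchmark(settings)

    # 1) Measure the fps gained by dropping each option to High, in isolation.
    gains = {}
    for name in options:
        trial = {**settings, name: "High"}
        gains[name] = apply_and_benchmark(trial) - baseline

    # 2) Step down the big offenders (the visual-difference check is manual).
    for name, gain in gains.items():
        if gain >= min_gain_fps:
            settings = {**settings, name: "High"}

    # 3) Re-test one by one: an option that was expensive on its own can
    #    become cheap after others were lowered, so try restoring lowered
    #    options to Ultra when the fps cost is now small.
    baseline = apply_and_benchmark(settings)
    for name in options:
        if settings[name] == "High":
            trial = {**settings, name: "Ultra"}
            if baseline - apply_and_benchmark(trial) < min_gain_fps:
                settings = trial
                baseline = apply_and_benchmark(settings)

    return settings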
Fox2232:

Ultra settings are mostly for fools today. You often have a hard time spotting the difference when a setting is moved from High to Ultra, even in a screenshot, but the drop in fps is noticeable. In heavier games, I go setting by setting and note the fps impact of each step. Then I step down all the big offenders that make almost no visual difference, and test them one by one again to see whether some options became easier on the GPU as a result of changing another option.
Honestly, there have been games where I feel like the Ultra settings are almost a graphical downgrade compared to High: too much chromatic aberration or blurring... usually the post-process effects, when overdone, make the game look worse to me. So it's like a double bonus to turn them off.
Denial:

Honestly, there have been games where I feel like the Ultra settings are almost a graphical downgrade compared to High: too much chromatic aberration or blurring... usually the post-process effects, when overdone, make the game look worse to me. So it's like a double bonus to turn them off.
Motion blur, lens flares, chromatic aberration, depth of field, bloom: those are the ones I almost always turn completely OFF just to have a clearer image. What is a screen with fast GtG for if I just blur everything anyway? Those blurry effects are good for people who have low fps or would notice stutter without them.
Sixtyfps:

Pointless unless you have eyes like a fighter pilot
FrostNixon:

That's true, but what really makes it pointless is that AOC and Dell already offer the exact same thing. I'm sure that once ASUS releases it, it will be more expensive than the AOC and Dell models we've already had for the past half a year.
False. Apples versus oranges. What a fighter pilot sees has nothing to do with refresh rate, because that's a glimpse test of a single flash that isn't brightness-compensated and is about object identification. It's tantamount to flashing a single frame at a single Hertz and asking a user to identify the object in it.

I am the inventor of TestUFO and of the pursuit camera technique used by many reviewers -- I have a peer-reviewed conference paper on it coauthored with NIST.gov, NOKIA and Keltek. I was the world's first person to measure the input lag of G-SYNC (in late 2013) and the world's first person to test 480 Hz. So, as an authority on this topic: there are OTHER mainstream benefits of high Hz completely unrelated to fighter pilots.

1. Higher Hz means fewer stroboscopic effects. https://www.blurbusters.com/wp-content/uploads/2017/08/project480-mousearrow-690x518.jpg
2. Higher Hz means less motion blur (without needing a flickering strobe backlight like ULMB). https://www.blurbusters.com/wp-content/uploads/2014/03/motion_blur_from_persistence.png
3. Display motion blur goes down with higher Hz when using flicker-free mode. See the animations www.testufo.com/eyetracking and www.testufo.com/persistence as examples.
4. Input lag benefits. Higher Hz has less scanout-related latency. We already easily tell apart 125 Hz mice and 1000 Hz mice. The difference between a 120 Hz display and a 1000 Hz display is confirmed to be a similar latency improvement in scientific experiments on laboratory displays. Also, at 1000 Hz, even the input lag of perfect VSYNC ON becomes very small (3 frames of latency is only 3 ms).

For more reading, please educate yourself with the mathematics & science: Blur Busters Law: The Amazing Journey To Future 1000Hz+ Displays. Blurless sample-and-hold (strobeless ULMB, flickerless ULMB) is only possible via ultra-high Hz. Virtual reality scientists agree -- many of them! As one example of many, here is an NVIDIA scientist's tweet: https://www.blurbusters.com/wp-content/uploads/2018/06/img_5b116df080ebb.png (& many other 1000 Hz confirmations by many scientists).

Many websites need to stop writing "X Hz is worthless" articles full of non-scientific misinformation. Maybe it isn't important to you, or it doesn't show visual benefits due to display limitations (fake Hz or slow response), but until you've actually seen good, proper, scientific high-Hz displays -- do not claim as such. It is TOTALLY WRONG and does a DISSERVICE to the industry to spread this wrong information. I help display manufacturers engineer their computer monitors. Future GPUs with "frame rate amplification technology" will overcome the GPU-side problem over the coming decade, too. And panel manufacturers have often delayed Hz progress because they doubted -- until they realized they were wrong. Eventually, Blur Busters plans to begin calling out all non-scientific media websites that still perpetuate these falsehoods (possibly with paid advertisements to shame those websites if they post future articles about the worthlessness of "X Hz").

Strobeless ULMB/LightBoost can only be accomplished via 500 Hz-1000 Hz to achieve zero-flicker blur reduction (blurless sample-and-hold requires ultra-short refresh cycles; 1 ms persistence requires 1 ms refresh cycles to stay flicker-free). New tests by scientists have already confirmed that the vanishing point of the diminishing-returns curve doesn't arrive until far beyond 1000 Hz. And higher-Hz displays make lower-Hz displays cheaper!

Do you really want to do the equivalent of telling Intel and AMD not to manufacture faster and cheaper CPUs? Thank you for reading this important public service post. Much appreciated. "X Hz is useless to everyone" needs to die like the "humans can't tell 30fps vs 60fps" crap. 😉 Cheers, Chief Blur Buster
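To put rough numbers on two of the points above (on a flicker-free sample-and-hold display the persistence equals the refresh period, and a fixed 3-refresh VSYNC ON pipeline shrinks to 3 ms at 1000 Hz), here is a small worked-arithmetic sketch. It is my own illustration of the stated assumptions, not Blur Busters code.

def persistence_ms(refresh_hz):
    """Frame visibility time on an idealized flicker-free sample-and-hold display."""
    return 1000.0 / refresh_hz

def vsync_latency_ms(refresh_hz, frames_of_buffering=3):
    """Latency of a pipeline that always costs a fixed number of refreshes."""
    return frames_of_buffering * 1000.0 / refresh_hz

for hz in (60, 120, 240, 480, 1000):
    print(f"{hz:>4} Hz: persistence ~{persistence_ms(hz):5.2f} ms, "
          f"3-frame VSYNC lag ~{vsync_latency_ms(hz):5.2f} ms")

At 1000 Hz this prints roughly 1.00 ms persistence and 3.00 ms of 3-frame latency, matching the figures quoted in the post.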
@mdrejhon I've been a Guru3D and Blur Busters reader for many years, but I just registered to reply to this post, just to say thank you for all your work (and your team's). It has been so useful on all my displays, letting me squeeze the last drop of Hz out of them. And it is so clear to me how important Hz is and how promising the future is with lower persistence.
You are welcome!
A screen for CS:GO players. I can't imagine many people feeding a 1080p display with 240 fps besides professional gamers. Strictly hardware-wise, that is; not that it wouldn't make sense.
fantaskarsef:

A screen for CS:GO players. I can't imagine many people feeding a 1080p display with 240 fps besides professional gamers. Strictly hardware-wise, that is; not that it wouldn't make sense.
There are many games where 240+ or at least 200+ fps is possible. With your card, it should be achievable in so many... Last time we had this type of discussion, I took Overwatch and put it on almost the lowest details. I was still GPU-bottlenecked, but fps was around 250 in intense fights. I was encoding that gameplay just in case, but I did not feel like posting it just for the sake of triggering someone fanatical. Payday 2 does 120+ fps on very high details; with your GPU, you must get much more. Killing Floor (2) as well. Older games like Dead Island hit the cap too. @mdrejhon: Thanks for jumping in, but do not get carried away over a few blind people, or those who talk about something they have not experienced.
Fox2232:

There are many games where 240+ or at least 200+ fps is possible. With your card, it should be achievable in so many... Last time we had this type of discussion, I took Overwatch and put it on almost the lowest details. I was still GPU-bottlenecked, but fps was around 250 in intense fights. I was encoding that gameplay just in case, but I did not feel like posting it just for the sake of triggering someone fanatical. Payday 2 does 120+ fps on very high details; with your GPU, you must get much more. Killing Floor (2) as well. Older games like Dead Island hit the cap too. @mdrejhon: Thanks for jumping in, but do not get carried away over a few blind people, or those who talk about something they have not experienced.
Oh yes, I understand; it's just that not everybody's running a 1080 Ti or something along those lines 😉 Like I meant, it's not easy to keep a constant 240 Hz / fps. I still see very few people configuring or aiming for this, strictly by usage, not possibility. Not everybody's as stupid as me, shelling out €800 for a GPU, which I fully understand. I personally bought it for eye candy... I can still play Overwatch maxed out at a fixed 142 fps (aside from internal rendering scale), because I'm not that competitive. But I already noticed how much smoother fixing the fps felt in terms of gameplay (input lag, which Blur Busters greatly helped me understand!)
fantaskarsef:

Oh yes, I understand; it's just that not everybody's running a 1080 Ti or something along those lines
There is work being done on "frame rate amplification technologies" that will raise framerate fairly cheaply (more framerate per dollar) without adding input lag. Oculus' spacewarp tech is one of these (45fps -> 90fps) -- it's very rough and Wright-brothers at the moment compared to what will come. Tomorrow's frame rate amplification tech will be built more directly into silicon, and will artifactlessly and laglessly achieve 100fps -> 1000fps amplification. Imagine midrange GPUs doing 1000fps cheaply, within our lifetimes. This is in overdrive at the moment because of virtual reality, and the benefits will eventually filter down over the coming decade to midrange cards for desktop gaming monitors.

Consoles are finally jumping on HFR (120 Hz Xbox), and LG demonstrated HFR streaming (120fps movies) at CES 2018. It takes time for those things to become much more widespread. Thanks to VR, there is incredible lab stuff, but it's going to take years, since some of the focus is on increasing pixel count: higher framerates at 4K and 8K. Also, it's a vicious circle -- higher resolution massively amplifies the motion clarity limitations of Hz. A 4K 120 Hz LCD degrades motion clarity, in relative-percentage terms, more than a 1024x768 60 Hz LCD. For one-screen-width-per-second horizontal panning motion, measured as the length of the TestUFO blur trail behind the moving UFO objects:

--> 1024x768 at 60fps: motion blurs across 1024/60ths of a screen width (a blur trail about 17 pixels long -- roughly 17x blurrier than stationary graphics), a 17:1 sharpness degradation between motion and stationary.
--> 3840x2160 at 120fps: motion blurs across 3840/120ths of a screen width (a blur trail 32 pixels long -- roughly 32x blurrier than stationary graphics), a 32:1 sharpness degradation between motion and stationary.

So you see, higher resolution amplifies the visibility of sample-and-hold motion blur. Naturally de-blurring retina graphics in fast motion (stroblessly, since real life doesn't strobe or flicker) will require extremely high frame rates at extremely high refresh rates. The more "retina" a display becomes, the lower the persistence you need to completely eliminate display motion blur, and the only way to do strobeless low persistence is ultra-high fps at ultra-high Hz.

Certainly, you need bigger jumps up to see human-visible benefits, e.g. 60Hz -> 120Hz -> 240Hz -> 480Hz -> 960Hz. Milking the diminishing-returns curve requires progressively bigger jumps upwards to see any benefit. Obviously, the first time a high Hz is achieved it's often not fully efficient (e.g. response time limitations), but perfect efficiency means a perfect halving of motion blur for each Hz doubling (for comfortably flicker-free sample-and-hold). So saying "240Hz is garbage because I can't tell apart 120Hz vs 144Hz" does not acknowledge how the curve behaves. Each next jump upwards gets increasingly difficult, without losing the ability to have retina resolutions. Strobe-based blur reduction (ULMB/LightBoost) is wonderful, but strobing ain't Holodeck Star Trek final-frontier stuff -- blurless + flickerless + strobeless (an analog, refresh-rate-less display) is more natural to the human eye, but we can't go analog, so we have to go ultra-high Hz to simulate analog naturalness. Eventually the resolution race will be over when everything is "retina", and one of the last remaining races becomes temporal resolution (which requires raising Hertz for people who don't like input lag or strobing).

We're worsening our motion blur degradation deltas (clarity of stationary versus motion) by going to higher resolutions, and that will eventually create more pressure to go beyond 60 Hz in the longer term. Yesterday, plasma cost an arm and a leg. Today, 4K TVs are almost the same price as 1080p HDTVs (Best Buy has almost stopped selling 1080p TVs now). The same thing may happen to 1000 Hz, when it costs only a few dollars more than 120 Hz in, let's imagine, 50 years from now. Who knows, when consoles play at 120-240 Hz and competitive players are already in the 1000 Hz leagues? The progress is much slower than SSDs and CPUs, but there's already a Hertz Moore's Law starting up, where Hz doubles approximately every 5-10 years, which is currently being observed.

Sure, 1000 Hz is "long term" and "in a decade or few" stuff, obviously. The first 1000 Hz indie displays will arrive in the early 2020s. Indies came out with 240 Hz in 2013 -- several years before manufacturers did. And it has happened again with 480 Hz, where Zisworks (whom I helped with strobe backlight stuff) launched 480 Hz well before mainstream manufacturers did. I've got contacts with indies that are experimenting with 1000 Hz stuff already -- expect this to happen in the early half of the 2020s. For example, it costs only $200 of FPGA modifications to a $700 DLP projector to make it output motion-flawless 1000 Hz, though there is a severe loss of color depth, and of course you need FPGA skills. But there are CHEAP indie/maker/hacker ways to run 1000 Hz experiments, and they will bear fruit by the early 2020s. It may take until the 2030s before cheap 1000 Hz occurs, but we do try to move the needle before the mainstream does 😀

Even if it's not important to you -- we don't need that "30fps vs 60fps" luddite stuff (like "X Hz is worthless") misinforming people in the interim. That stuff belongs in the garbage bin -- any of it will be unceremoniously shot down by Blur Busters. 😉 Cheers, Chief Blur Buster
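As a quick sanity check of the blur-trail figures quoted above, here is the same arithmetic in a few lines. It is only an illustration of the stated assumptions (flicker-free sample-and-hold, panning at exactly one screen width per second), not an official Blur Busters tool.

def blur_trail_px(horizontal_resolution, refresh_hz):
    """Blur trail length for one-screen-width-per-second panning on a
    flicker-free sample-and-hold display: the distance the image travels
    during one refresh is smeared across the eye-tracked retina."""
    pixels_per_second = horizontal_resolution      # one screen width per second
    return pixels_per_second / refresh_hz          # pixels travelled per refresh

print(round(blur_trail_px(1024, 60)))    # ~17 px trail at 1024x768, 60 Hz
print(round(blur_trail_px(3840, 120)))   # 32 px trail at 3840x2160, 120 Hz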
fantaskarsef:

Oh yes, I understand; it's just that not everybody's running a 1080 Ti or something along those lines 😉 Like I meant, it's not easy to keep a constant 240 Hz / fps. I still see very few people configuring or aiming for this, strictly by usage, not possibility. Not everybody's as stupid as me, shelling out €800 for a GPU, which I fully understand. I personally bought it for eye candy... I can still play Overwatch maxed out at a fixed 142 fps (aside from internal rendering scale), because I'm not that competitive. But I already noticed how much smoother fixing the fps felt in terms of gameplay (input lag, which Blur Busters greatly helped me understand!)
Since you are playing Overwatch at higher fps, a little advice: unlock the game's internal fps cap to 400 via the config file, and use RTSS or the driver to limit fps to whatever you like. Overwatch has a horrid internal limiter; you can see it at work in the main menu, where it is used to produce 60 fps. Just start RTSS and watch its frametime graph. I had never seen such bad behavior, so I started a thread about it in the Benchmark section. I found out that even when running settings that allow me to pull over 340 fps, the moment fps goes above ~220-230, frametimes become progressively unstable because the game's internal timers are bad. So I would not play Overwatch at more than 220 fps, as I do not fancy a sawtooth frametime graph.
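For what it's worth, the pacing idea behind an external limiter can be sketched in a few lines. This is only a toy illustration of frame pacing, not RTSS's actual algorithm; render_frame() is a placeholder, and the 220 fps cap is just the value suggested above.

import time

TARGET_FPS = 220                 # cap chosen below the unstable ~220-230 fps zone
FRAME_TIME = 1.0 / TARGET_FPS    # ~4.55 ms per frame

def render_frame():
    pass                         # placeholder for actual game/render work

def run(frames=100):
    next_deadline = time.perf_counter()
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        next_deadline += FRAME_TIME
        # Sleep most of the remaining time, then busy-wait the last ~1 ms
        # for steadier frametimes than sleep() alone can give.
        while True:
            remaining = next_deadline - time.perf_counter()
            if remaining <= 0:
                break
            if remaining > 0.001:
                time.sleep(remaining - 0.001)
        print(f"frametime: {(time.perf_counter() - start) * 1000:.2f} ms")

if __name__ == "__main__":
    run(20)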
mdrejhon:

There is work being done on frame rate amplification technologies that will raise framerate fairly cheaply (more framerate per dollar). [...] Cheers, Chief Blur Buster
Thanks for the lengthy post and explanation. It's just that I doubt the average Joe PC user tunes his games so they run at 240 fps; that's what I meant. Again, I didn't say it wouldn't make sense, just that the real-world scenarios, like you said yourself, take time to become common ground. Right now people are indeed mostly aiming to keep 60 fps, or at least that's my impression. The only group that actively aims for anything in the range of 200+ fps seems to be gamers playing shooters... I know this is not a technical issue but a usage issue. Maybe you misunderstood me in my post before.
Fox2232:

Since you are playing Overwatch at higher fps, a little advice: unlock the game's internal fps cap to 400 via the config file, and use RTSS or the driver to limit fps to whatever you like. Overwatch has a horrid internal limiter; you can see it at work in the main menu, where it is used to produce 60 fps. Just start RTSS and watch its frametime graph. I had never seen such bad behavior, so I started a thread about it in the Benchmark section. I found out that even when running settings that allow me to pull over 340 fps, the moment fps goes above ~220-230, frametimes become progressively unstable because the game's internal timers are bad. So I would not play Overwatch at more than 220 fps, as I do not fancy a sawtooth frametime graph.
Thanks, I'll give that a try.
fantaskarsef:

Again, I didn't say it wouldn't make sense, just that the real-world scenarios, like you said yourself, take time to become common ground. Right now people are indeed mostly aiming to keep 60 fps, or at least that's my impression. The only group that actively aims for anything in the range of 200+ fps seems to be gamers playing shooters...
Indeed. I predict:
Yesterday (1998): mainstream clamoring for 30fps and high end clamoring for 60fps (3Dfx SLI).
Today (2018): mainstream clamoring for 60fps and high end clamoring for 240fps (GTX TITAN).
Tomorrow (2038): mainstream clamoring for 120fps and high end clamoring for 1000fps.
mdrejhon:

Indeed. Yesterday, it was mainstream clamoring for playable framerates (20fps-30fps) and high end wanting 60fps. Today it's mainstream clamoring for playable framerates (60fps) and high end wanting 240fps. Tomorrow, it's mainstream clamoring for playable framerates (120fps) and high end wanting 1000fps.
Yes, that's how I see it too. Again, I was in no way saying it's stupid; I just don't see the average Joe making much use of such a monitor. I learned a lot from Blur Busters, and that's the reason I even got a monitor above 60 Hz in the first place, so I couldn't agree with you guys more.
mdrejhon:

Indeed. I predict:
Yesterday (1998): mainstream clamoring for 30fps and high end clamoring for 60fps (3Dfx SLI).
Today (2018): mainstream clamoring for 60fps and high end clamoring for 240fps (GTX TITAN).
Tomorrow (2038): mainstream clamoring for 120fps and high end clamoring for 1000fps.
If we are still here in 20 years without a 3rd world war, 120 Hz will be the entry-level refresh rate, because a screen controller capable of that will be penny-cheap, and low persistence will come from the screen technology itself (xOLED, microLED, ...). You will have screens everywhere instead of billboards, and they will want to put that "invisible" frame into your brain. Tech moves fast. Six years ago I paid more for a 120 Hz BenQ screen with awful firmware than I paid this year for a 240 Hz screen with firmware allowing great color calibration, dynamic profile switching, higher contrast and maximum brightness, faster response and lower lag, and FreeSync on top of it all. My next upgrade will be a 240 Hz screen too, and I know better ones will come, because HDR is just a little something that lets them keep prices up and differentiate. Once that is standard, they'll have to offer more again. Resolutions are already covered, and 3D screens came and went in their own way. (I did love passive 3D, and if there were a reasonably priced 4K passive screen with at least 120 Hz... That's not gonna happen, because by the time it would come, VR will deliver the same thing with better immersion.) So, how are they gonna compete? 1080p 480 Hz; that's what DP 1.4 can do.
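A quick back-of-the-envelope check of that DP 1.4 claim, with my own numbers: it assumes 8-bit RGB (24 bpp), no DSC, and a hypothetical tight reduced-blanking timing of 2000 x 1111 total pixels per frame; real modelines may differ.

H_TOTAL, V_TOTAL = 2000, 1111      # assumed total pixels per frame incl. blanking
REFRESH_HZ = 480
BITS_PER_PIXEL = 24                # 8 bits per channel, RGB, no DSC

pixel_clock = H_TOTAL * V_TOTAL * REFRESH_HZ          # pixels per second
required_gbps = pixel_clock * BITS_PER_PIXEL / 1e9    # video payload needed

# DisplayPort 1.4 (HBR3): 8.1 Gbit/s per lane x 4 lanes, 8b/10b encoding
dp14_effective_gbps = 8.1 * 4 * (8 / 10)              # ~25.92 Gbit/s payload

print(f"needed : {required_gbps:.2f} Gbit/s")
print(f"DP 1.4 : {dp14_effective_gbps:.2f} Gbit/s")
print("fits" if required_gbps <= dp14_effective_gbps else "needs DSC or chroma subsampling")

Under those assumptions it comes out to roughly 25.6 Gbit/s needed against about 25.9 Gbit/s available, so 1080p 480 Hz just barely fits without compression.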
There are very few engines where even a TITAN could hold a constant 240. Only the very efficient Doom 2016 engine comes to mind; that was putting out massive framerates even on my GTX 1080. Anyway, I've grown more "casual", but I'm still very nitpicky about latency, so I've settled somewhere around 120 fps & 120 Hz... until 120 fps becomes mainstream and 240 becomes way more affordable. Personally, I wish NVIDIA/ATI also came up with some scaling mechanism so that 720p looks crisp on a 1440p display (as 1 pixel of 720p would fit exactly within a 2x2 square of 1440p pixels). That way I could play my casual games at 1440p and my more serious stuff at 720p.
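That kind of integer scaling is simple to sketch: each 720p pixel maps onto an exact 2x2 block of 1440p pixels, so nothing gets interpolated or smeared. A toy illustration, with NumPy standing in for whatever a GPU scaler would actually do:

import numpy as np

def integer_upscale(frame, factor=2):
    """Repeat every pixel 'factor' times along height and width (nearest-neighbor)."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

frame_720p = np.random.randint(0, 256, size=(720, 1280, 3), dtype=np.uint8)
frame_1440p = integer_upscale(frame_720p, 2)
print(frame_1440p.shape)   # (1440, 2560, 3) -- exact 2x2 pixel blocks, edges stay crisp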