Frametime and latency performance
The charts below plot frame time and frame pacing measurements, exposing graphics anomalies such as stutters and glitches.
- FPS mostly measures performance: the number of frames rendered per second.
- Frametime (aka Frame Experience) recordings mostly measure and expose anomalies - here we look at how long it takes to render one frame. Measure that chronologically and you can see anomalies as peaks and dips in a plotted chart, indicating something could be off.
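The relationship between the two metrics is a simple reciprocal. As a minimal sketch (not our capture tooling), the helper below converts a frame time in milliseconds to the instantaneous FPS it corresponds to; the sample values are illustrative only:

```python
def frametime_to_fps(frametime_ms):
    """Instantaneous FPS for a single frame time in milliseconds."""
    return 1000.0 / frametime_ms

# Example samples in ms: ~60 FPS, ~59 FPS, and a slow ~30 FPS frame.
for ft in [16.7, 17.0, 33.4]:
    print(f"{ft:5.1f} ms -> {frametime_to_fps(ft):5.1f} FPS")
```

A steady 16.7 ms per frame is what a 60 FPS average looks like frame by frame; the charts simply plot that per-frame number over time.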
We have a detailed article (read here) on the methodology behind it all. Basically, the time it takes to render one frame can be monitored and tagged with a number; this is the frame latency. One frame can take, say, 17 ms (roughly 59 FPS). Higher latency can indicate a slow framerate, and weird latency spikes indicate stutters, jitter, and twitches; basically, anomalies that are visible on your monitor. What these measurements show are anomalies like small glitches and stutters that you can sometimes (and please do read that well: sometimes) see on screen. Below I'd like to run through a couple of titles with you. Bear in mind that average FPS often matters more than frametime measurements.
Please understand that a lower frame time means a higher FPS, so for these charts, lower = better. Huge spikes indicate stutters, thick line segments indicate bad frame pacing, and a gradual drift up or down reflects framerate variation.
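To make "huge spikes" concrete, here is a crude sketch of how such outliers could be flagged programmatically: any frame taking more than some multiple of the median frame time is marked as a potential stutter. The threshold factor and sample values are illustrative assumptions, not part of our measurement pipeline:

```python
import statistics

def find_stutters(frametimes_ms, factor=2.0):
    """Flag frames whose time exceeds `factor` times the median frame
    time - a rough stand-in for the spikes visible in the charts."""
    median = statistics.median(frametimes_ms)
    return [(i, ft) for i, ft in enumerate(frametimes_ms) if ft > factor * median]

# One 48.5 ms frame amid ~17 ms frames would show up as a visible spike.
samples = [16.7, 16.9, 17.1, 48.5, 16.8, 17.0]
print(find_stutters(samples))
```

A run of such flagged frames is what reads as a thick, jagged band in the plots, whereas an isolated hit is a single stutter.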
- Formula 1 2020 Codemasters
- Watch Dogs: Legion
- Shadow of the Tomb Raider (SOTTR)
- Far Cry: New Dawn
As you might have observed, we're experimenting a bit with our charts and methodology. Below the games at 2560x1440 (WQHD), with image quality settings as used throughout this review. These result sets are based on the reference review and serve as an indication of the performance of cards in the same class.
Above: Watch Dogs: Legion
Above: Hitman 3
Above: Shadow of the Tomb Raider
Above: Assassin's Creed Valhalla
Above: Formula 1 2020 Codemasters
Above: Far Cry New Dawn