G-SYNC 101: G-SYNC vs. V-SYNC OFF w/FPS Limit


At the Mercy of the Scanout

Now that the FPS limit required for G-SYNC to avoid V-SYNC-level input lag has been established, how do G-SYNC + V-SYNC and G-SYNC + V-SYNC “Off” compare to V-SYNC OFF at the same framerate?

[Charts: “Blur Buster’s G-SYNC 101: Input Latency & Optimal Settings” — input lag results for the three methods, one chart per tested refresh rate]

The results show a consistent difference between the three methods across most refresh rates (240Hz is nearly equalized in any scenario), with V-SYNC OFF (and G-SYNC + V-SYNC “Off,” to a lesser degree) appearing to have a slight edge over G-SYNC + V-SYNC. Why? The answer is tearing…

With any vertical synchronization method, the delivery speed of a single, tear-free frame (barring unrelated frame delay caused by many other factors) is ultimately limited by the scanout. As mentioned in G-SYNC 101: Range, the “scanout” is the total time it takes a single frame to be physically drawn, pixel by pixel, left to right, top to bottom, on-screen.

With a fixed refresh rate display, both the refresh rate and the scanout remain fixed at their maximum, regardless of framerate. With G-SYNC, the refresh rate is matched to the framerate; the scanout speed remains fixed, but the refresh rate controls how many times the scanout is repeated per second (60 times at 60 FPS/60Hz, 45 times at 45 FPS/45Hz, etc.), along with the duration of the vertical blanking interval (the span between the previous and next frame scan), in which G-SYNC calculates and performs all overdrive and synchronization adjustments from frame to frame.
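As a worked sketch of that relationship (idealized numbers; the `gsync_timing` helper below is hypothetical and ignores porch/pixel-clock details that real panels add), the scanout stays fixed while the blanking interval stretches to fill each frame interval:

```python
# Sketch: idealized G-SYNC timing. Inside the G-SYNC range, the scanout itself
# always takes 1/max_refresh seconds; the vertical blanking interval (VBI)
# stretches to absorb whatever time remains until the next frame arrives.

def gsync_timing(framerate_fps: float, max_refresh_hz: float) -> tuple[float, float]:
    """Return (scanout_ms, blanking_ms) for one frame interval."""
    frame_interval_ms = 1000.0 / framerate_fps    # time between frames
    scanout_ms = 1000.0 / max_refresh_hz          # fixed, set by max refresh rate
    blanking_ms = frame_interval_ms - scanout_ms  # VBI absorbs the remainder
    return scanout_ms, blanking_ms

for fps in (45, 60, 100, 140):
    scan, blank = gsync_timing(fps, 144)
    print(f"{fps} FPS on a 144Hz panel: scanout ~{scan:.1f} ms, blanking ~{blank:.1f} ms")
```

At 45 FPS, for example, a 144Hz panel still scans each frame out in roughly 6.9 ms, then simply sits in blanking for about 15.3 ms until the next frame arrives.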

The scanout speed itself, on both fixed and variable refresh rate displays, is dictated by the current maximum refresh rate of the display:

[Diagram: Blur Buster’s G-SYNC 101: Scanout Speed]

As the diagram shows, the higher the refresh rate of the display, the faster the scanout speed becomes. This also explains why V-SYNC OFF’s input lag advantage, especially at the same framerate as G-SYNC, is reduced as the refresh rate increases; single frame delivery becomes faster, and V-SYNC OFF has less of an opportunity to defeat the scanout.
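To put rough numbers on that (an idealized sketch; in practice a small slice of each cycle is spent in blanking, so the visible scanout is marginally faster than the full refresh cycle):

```python
# Approximate single-frame delivery (full scanout) time by max refresh rate.
for hz in (60, 120, 144, 240):
    print(f"{hz}Hz: ~{1000.0 / hz:.1f} ms to scan one complete frame")
# 60Hz: ~16.7 ms | 120Hz: ~8.3 ms | 144Hz: ~6.9 ms | 240Hz: ~4.2 ms
```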

V-SYNC OFF can defeat the scanout by starting the scan of the next frame(s) within the previous frame’s scanout anywhere on screen, and at any given time:

[Diagram: Blur Buster’s G-SYNC 101: Input Lag & Optimal Settings]

This results in the simultaneous delivery of more than one frame scan in a single scanout (tearing), but also a reduction in input lag; the amount is dictated by the position and number of tearline(s), which is in turn dictated by the refresh rate/sustained framerate ratio (more on this later).
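A back-of-the-envelope model can illustrate how that ratio governs both tearline count and potential lag savings (a simplified sketch, not a measurement; the `vsync_off_sketch` helper below is hypothetical, ignores the blanking interval, and assumes a perfectly steady framerate above the refresh rate):

```python
# Hypothetical model: with V-SYNC OFF and FPS above the refresh rate, buffer
# flips land mid-scanout, so on average each scanout contains about fps/hz
# frame slices, separated by tearlines.

def vsync_off_sketch(fps: float, hz: float) -> tuple[float, float]:
    scanout_ms = 1000.0 / hz     # time to draw one full refresh
    frametime_ms = 1000.0 / fps  # time between new frames
    avg_tearlines = fps / hz     # average flips (tearlines) per scanout
    # The freshest on-screen slice can be up to one scanout minus one frametime
    # newer than a tear-free frame delivered from the top of the scanout:
    max_saving_ms = scanout_ms - frametime_ms
    return avg_tearlines, max_saving_ms

for hz in (60, 144, 240):
    tears, saving = vsync_off_sketch(300, hz)
    print(f"300 FPS @ {hz}Hz: ~{tears:.1f} tearlines/scan, up to ~{saving:.1f} ms fresher")
```

Note how the potential savings collapse at 240Hz in this model, consistent with the near-equalized 240Hz results above.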

As noted in G-SYNC 101: Range, G-SYNC + V-SYNC “Off” (a.k.a. Adaptive G-SYNC) can have a slight input lag reduction over G-SYNC + V-SYNC as well, since it will opt for tearing instead of aligning the next frame scan to the next scanout when sudden frametime variances occur.

To eliminate tearing, G-SYNC + V-SYNC is limited to completing a single frame scan per scanout, and it must follow the scanout from top to bottom, without exception. On paper, this can give the impression that G-SYNC + V-SYNC carries a latency increase over the other two methods. However, the delivery of a single, complete frame with G-SYNC + V-SYNC is actually the lowest possible, or neutral, speed, and the advantage seen with V-SYNC OFF is a reduction below that neutral delivery speed, made possible by its ability to defeat the scanout.

Bottom line: within its range, G-SYNC + V-SYNC delivers single, tear-free frames to the display as fast as the scanout allows; any faster, and tearing would be introduced.



3696 Comments For “G-SYNC 101”


tearxinnuan (Member):

Thank you very much for your article and tutorial! I’ve set up the appropriate settings according to your article, but I still have some questions I’d like to ask!

First, my current settings are:
NVCP: G-SYNC + V-SYNC on, LLM off,
In Game: Reflex on + boost, V-SYNC off

I believe this setup is optimal for G-SYNC usage. I don’t limit my frame rate with any external software or NVCP. When I enable Reflex in-game, it automatically caps my frame rate at 260 FPS (my monitor is 280Hz). I think relying solely on Reflex to limit my frame rate is more straightforward than setting a separate limit, and it may also avoid the conflicts and instability caused by multiple frame limiters. Secondly, I’ve personally tested the games I play, and Reflex takes precedence over both the in-game and NVCP frame limits; no matter how much I limit my frame rate, once Reflex is enabled, it caps it at 260 FPS.

I primarily play competitive games like Valve, APEX, and Overwatch, but I also occasionally play other single-player games. The competitive games I play all have Reflex, so can I completely abandon all external frame-limiting methods and rely solely on Reflex?

Also, regarding LLM in NVCP, should I set it on or off, or even set it to Ultra? I’m not sure if there are any advantages or disadvantages to turning LLM on, even though Reflex takes over a lot of the processing. There’s a lot of controversy online about LLM, and even NVIDIA officially claims that setting LLM to Ultra will minimize V-SYNC latency.

Looking forward to your answers!

dimacbka (Member):

Hi. I really liked this article, but I have a couple of questions. I have a new PC that gets 800 FPS in CS2. How do I set up this G-SYNC + V-SYNC + Reflex bundle correctly? My monitor is 280Hz. I’m confused; do I need to limit frames via the NVIDIA panel? Yesterday I turned on low latency at Ultra and Reflex + Boost. In the game, the framerate was around 260, with the fps_max parameter at 0.

mike-lesnik (Member):

Hello, jorimt! My question is more about input delay than G-SYNC, but I decided to ask it here because I really like your style of response: simple and clear.
I don’t quite understand what role frametime plays in input delay. It is often written that frametime is the time needed to create a frame, but 60 frames of 16.6 ms each can be created by either an underloaded or an overloaded GPU. On-screen, we see the same framerate and frametime in both cases, but the resulting input delay will be different…
So is frametime not “the time it took the system (CPU, OS, engine, GPU) to create the frame,” but rather “the time allotted for displaying the frame before the next one appears”?

dpawelcz (Member):

I’m having an awful time trying to get Street Fighter 6 feeling good on my Zephyrus G14 gaming laptop. It has a 120Hz OLED screen. I swear in-game it doesn’t feel like it’s getting 120Hz, and it feels input laggy.
The game is locked at 60 FPS, and it feels as if it’s running at 60Hz. Outside the game, I’ve confirmed I’m running at 120Hz on the display. I have G-SYNC ON and V-SYNC ON in the NVIDIA Control Panel. I’ve also noticed that no matter what, SF6 starts with V-SYNC on in its settings, and I have to turn it off manually every time. I suspect that might be the issue.
Any tips would be greatly appreciated.

anthony3192 (Member):

When I activate V-SYNC from the NVIDIA app per profile, that is, from game to game (as you advised me), some games like The Witcher, Wukong, and Fortnite are limited to 225 FPS (I have a 240Hz monitor), while other titles like Valorant and COD have an unlocked framerate, so they are not limited. How come?
