There is no denying that we are in a Golden Age of technology. Personal computers and consoles continue to become more powerful with each successive generation, resulting in gamers being able to experience content at eye-watering resolutions on displays with triple-digit refresh rates. Most modern gamers grasp the concept that more fps (frames per second) usually results in a better gaming experience, which is one of the major focal points for the typical PC vs Console argument that occurs at least 500 times per day.
Modern gamers are also learning that monitors with higher refresh rates are conducive to a better gaming experience, which is by no means a new development. But the plethora of terms that consumers are bombarded with nowadays can cause significant confusion: V-Sync, FreeSync™, G-SYNC®, LFC, VRR, etc. The aim of this article is to explore these technologies and hopefully clear up some of the confusion, but we need to discuss the underlying concepts first.
The relationship between frames per second and refresh rate has always been important to gaming, but these values were born long before the advent of video games. Let us begin with a trip down memory lane.

Pictures in Motion
You may have heard the old urban myth that the human eye cannot see above 24/30/60 fps (the value varies depending on the storyteller). Of course, the myth does not mean we cannot physically see things at high frame rates, but rather that we cannot discern the difference when media is displayed at these higher values. There have even been explanations suggesting that the recovery time of the rods and cones in our retina is the limiting factor, whereas in reality our eyes and brain are capable of processing beyond 1000 fps.
But it is the lower end of the spectrum that is more interesting: the human eye can register 10-12 frames per second as individual static images, but beyond that rate (13 fps and higher) our brains interpret what we are seeing as motion. This optical illusion is often referred to as the persistence of vision. Early silent films (1891-1926) had frame rates ranging between 14 and 26 frames per second (the cameras were hand-cranked back then, hence the range); enough to provide a sense of motion, albeit somewhat jerky. It was not until the late 1920s that the technology existed to synchronise sound with a motion picture, with the first ‘talkie’, The Jazz Singer (1927), running at 24 frames per second. This was the slowest frame rate that could produce intelligible sound, as audio was synced by printing an optical track alongside the image on the physical filmstrip itself.
Film was an expensive medium back then, so creating a movie with video and audio above 24 fps would have been a costly endeavour requiring more physical film; hence 24 fps became the industry standard for movies. It has only been in recent years that some films have attempted higher frame rates, mostly to a mixed reception.

The 1950s Changed Everything
This particular decade was of course when the television gained popularity. Anyone over a certain age will no doubt remember the large and bulky CRT (Cathode Ray Tube) displays that used to sit in our homes, whether as a television or as a computer monitor. This is also when refresh rates as we know them became a factor. Early CRT displays relied on vacuum tubes, small bulb-like devices that have since been replaced by modern-day transistors; although the audiophiles out there will most likely be familiar with modern amplifiers that still use vacuum tubes (the sound is smoother, warmer, and cleaner).
The limits of vacuum tube technology at the time resulted in CRT displays refreshing their image at AC line frequency, which was 60 Hertz in the U.S. and 50 Hertz in Europe. Neither of these values lined up well with the 24 fps of film, so something different had to be implemented. The U.S. utilised the NTSC format (30 fps interlaced, aka 60i) whilst Europe utilised PAL (25 fps interlaced, aka 50i). Interlacing doubles the perceived rate by splitting each frame into two fields (the odd and even scan lines) that are drawn one after the other.

Interlacing was a clever way to improve the perceived motion of video and reduce flicker, without needing to increase bandwidth (which was considerably limited at the time). The underlying idea was nothing new however, as film projectors used a similar trick: with a two-bladed shutter, a 24 fps film was actually projected at 48 Hz, each frame being flashed twice before the film advanced.
In our homes however, some interesting technical magic had to occur in order for anything shot on film to display correctly on our televisions. If you lived in a country that used PAL, the film was simply accelerated by roughly 4%. This resulted in the picture being silky smooth, but also caused a small shift in the pitch of the audio. For those in NTSC regions, the film was converted using a technique called 3:2 pulldown; simply put, every second film frame was held on screen for an extra field (an extra ‘half frame’). This meant the audio was unchanged, but it resulted in slight judder and unsmooth playback.
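For the technically curious, both conversions boil down to a little arithmetic. The sketch below (in Python, using the nominal 24/25/60 figures rather than NTSC’s exact 29.97 rate) shows the PAL speed-up and the 3:2 pulldown field pattern; it is purely illustrative.

```python
import math

# Rough sketch of the two film-to-TV conversions described above,
# using nominal frame/field rates (ignoring NTSC's exact 29.97 rate).

FILM_FPS = 24

# PAL: run the film ~4% fast so it matches 25 fps (50 fields per second).
PAL_FPS = 25
speedup = PAL_FPS / FILM_FPS                     # ~1.042, i.e. ~4% faster
pitch_shift = 12 * math.log2(speedup)            # resulting audio pitch shift
print(f"PAL speed-up: {(speedup - 1) * 100:.1f}% (audio rises ~{pitch_shift:.2f} semitones)")

# NTSC: 3:2 pulldown spreads every 4 film frames across 10 interlaced fields,
# holding frames for 3 and 2 fields alternately -> 24 fps becomes 60 fields/s.
def pulldown_3_2(film_frames):
    """Return the sequence of fields produced by 3:2 pulldown."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

print(pulldown_3_2(["A", "B", "C", "D"]))        # 10 fields from 4 film frames
# 24 film frames per second -> 6 groups of 4 -> 6 * 10 = 60 fields per second
```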
It is also worth noting that the two formats had different forms of colour correction (manual versus automated) as well as slightly different resolutions: NTSC – 720×480 and PAL – 720×576, known as 480i and 576i respectively.

Not Enough Hertz Actually Hurts
Well, not anymore; but in the days of gaming on a CRT it most certainly did, and the discomfort was directly linked to the size of the display itself. CRT monitors larger than 17 inches required a refresh rate of at least 72Hz, otherwise the user would experience mild discomfort such as eye-strain and headaches. This is why VESA (Video Electronics Standards Association) decreed that a monitor required a refresh rate of 75Hz or above in order to be classified as ‘flicker-free’. Whilst the average PC user tended to have a 15 inch or even smaller monitor, many ‘hardcore’ gamers utilised 19 or even 21 inch CRT displays.
As the sizes of monitors increased, so did their resolutions. Keep in mind that the initial VGA standard only offered a pitiful resolution of 640×480, but over time things improved. SVGA upped the resolution to 800×600, which was the most popular resolution all the way up to the year 2000 (56% market share). But eventually XGA (1024×768) became the gaming standard. If we draw a parallel to modern gaming, XGA can be thought of as the ‘1080p’ of the 00s; this is the resolution that most PC gamers used. However, much like the higher resolutions that are available today (1440p and 4K), there were also two higher resolutions that those with beefier hardware commonly utilised back then: 1280×1024 (SXGA) and 1600×1200 (UXGA). I remember being blown away playing games at UXGA on my 21″ CRT many years ago.
Sure, there were other factors such as colour depth back then, but resolution is a vital component when calculating a CRT’s maximum refresh rate; hence the digression.
Due to the way CRT monitors worked, setting the resolution to a value lower than the display’s maximum did not result in image degradation like it does on modern flat panels. Lowering the resolution did however increase the maximum refresh rate of the monitor (something that does not happen on modern flat panels). We will leave the mathematics to the short sketch further below, but here is an example of how the maximum refresh rate of a Sony CPD-E430 increased as the resolution was lowered:
- 1600 x 1200 @ 78Hz
- 1280 x 1024 @ 91Hz
- 1024 x 768 @ 121Hz
- 800 x 600 @ 155Hz
The limiting factor back then was of course that not all graphics cards would necessarily support those high refresh rates; but if you were fortunate enough to have one that did, then you were in for a treat. For reference, I used to play RPGs at 1600×1200 for the eye-candy but played games like Tribes 2 and Unreal Tournament at 1024×768 for the improved response time and triple-digit frame rates.
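The mathematics we skipped is essentially just a division: a CRT’s electron gun can only draw a limited number of scan lines per second (its horizontal scan frequency), so the fewer lines in a frame, the more frames it can draw. Below is a minimal sketch; the ~97 kHz scan limit and ~5% blanking overhead are my own rough estimates reverse-engineered from the list above, not figures from Sony’s spec sheet.

```python
# Very rough model of why lowering a CRT's resolution raised its maximum
# refresh rate: the electron gun can only draw so many scan lines per second
# (its horizontal scan frequency), so fewer lines per frame = more frames per second.
# The ~97 kHz limit and ~5% blanking overhead are estimates inferred from the
# CPD-E430 figures above, not values from an official spec sheet.

H_SCAN_LIMIT_HZ = 97_000      # assumed maximum horizontal scan frequency
V_BLANK_OVERHEAD = 1.05       # assumed extra scan lines for vertical blanking

def max_refresh(vertical_resolution: int) -> float:
    total_lines_per_frame = vertical_resolution * V_BLANK_OVERHEAD
    return H_SCAN_LIMIT_HZ / total_lines_per_frame

for width, height in [(1600, 1200), (1280, 1024), (1024, 768), (800, 600)]:
    print(f"{width} x {height}: ~{max_refresh(height):.0f} Hz")
# Prints roughly 77, 90, 120 and 154 Hz -- close to the figures listed above.
```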
Gaming at high refresh rates is nothing new, but due to the shift from CRT to LCD/LED technologies we have effectively started from scratch, with modern high-refresh-rate displays only appearing in the last few years.

High Refresh Rates Are Not New
The trip down memory lane certainly became a much longer journey than intended, but it is important to understand that there has been a regression in refresh rates with the advent of newer technologies. Our high-end gaming PCs of 15-20 years ago were able to push well into triple-digit fps values, whilst CRT monitors could provide high refresh rates to match; albeit at lower resolutions. But when the shift to flat panels occurred, we took a significant step backwards in terms of refresh rates, with the new standard becoming 60Hz. Since our average frames per second had also dipped due to the demanding nature of modern AAA games (the new goal was to achieve 60fps), it was not so much of an issue.
Moving on, now would be a good time to explore V-Sync (vertical synchronisation).
We have covered the basics: frames per second is the number of images your graphics card can generate each second, whilst refresh rate is how many times per second your monitor can redraw its image. In a perfect world your fps will equal the refresh rate of your monitor, as anything higher will cause tearing whilst anything lower will cause visible stuttering. Simply put: stuttering is when your graphics card is not up to the task, whilst tearing is when the monitor is not. In order to understand tearing and stuttering, we need to introduce what is referred to as the frame buffer.
The frame buffer is where images are temporarily stored in the video memory of a graphics card before being sent to the monitor. There are usually two buffered images being held by your graphics card at a time: the first is the image actually being displayed on your monitor (the primary buffer), whilst the second is the one that will be displayed next (the secondary buffer). When it is time for the next frame to be displayed, the secondary buffer becomes the primary and a new image is loaded into the secondary buffer. Your graphics card is constantly doing this as fast as it can in order to provide you with a solid gaming experience; the better the video card, the faster it can process each image. It is somewhat similar to the old animated flipbooks: the faster you move your thumb, the faster the perceived motion.
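As a deliberately simplified illustration (conceptual Python, not how any real driver is written), double buffering boils down to rendering into one buffer while the other is on screen, then swapping the two:

```python
# Deliberately simplified illustration of double buffering: the monitor always
# reads the primary buffer while the GPU draws the next image into the
# secondary buffer; at the next refresh the two swap roles.

class DoubleBuffer:
    def __init__(self):
        self.primary = "nothing yet"   # the frame currently on screen
        self.secondary = None          # the frame being rendered

    def render(self, frame):
        """The GPU finishes drawing a new frame into the secondary buffer."""
        self.secondary = frame

    def swap(self):
        """At the next refresh, the secondary buffer becomes the displayed one."""
        self.primary, self.secondary = self.secondary, self.primary

buffers = DoubleBuffer()
for n in range(1, 4):
    buffers.render(f"frame {n}")
    buffers.swap()
    print("monitor is showing:", buffers.primary)   # frame 1, frame 2, frame 3
```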

When your graphics card generates more frames per second than your monitor is capable of displaying, the contents of the buffer can change partway through a refresh. The monitor then ends up drawing parts of two (or more) different frames within a single refresh, and the visible seam between them is a tear. By enabling V-Sync, the graphics card is instructed to wait until the monitor is ready before sending a new image, thereby limiting the fps to the refresh rate of the monitor. As long as your hardware can consistently produce more frames per second than the refresh rate of your monitor, V-Sync is excellent: your hardware will consume less power, run cooler, and your game-play will be silky smooth (stutter and tear free). It is worth noting however that many gamers experience input-lag when V-Sync is enabled.
But if your graphics card cannot maintain an fps higher than the refresh rate, then enabling V-Sync can potentially reduce your frame rate to 75% or even 50% of the refresh rate. This happens when the monitor asks for the next image but the secondary buffer has not yet finished rendering it; the refresh cycle restarts and the monitor simply continues displaying the current image. Even if the next image is ready an instant later, the graphics card cannot send it until the monitor has completed that refresh cycle and ‘asked’ for the next frame.
The result is a horrible stutter.
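To see where that stutter comes from, here is a toy simulation of double-buffered V-Sync on a 60Hz monitor (the render times are invented purely for illustration): whenever a frame misses the ~16.7ms refresh deadline, the previous frame is simply shown again and the effective frame rate halves.

```python
# Toy simulation of double-buffered V-Sync on a 60 Hz monitor.
# The render times below are invented purely for illustration.

REFRESH_INTERVAL_MS = 1000 / 60                 # ~16.7 ms between refreshes

def simulate_vsync(render_time_ms, refreshes=8):
    """Return which frame number is on screen at each refresh."""
    displayed = []
    frame = 0                                   # 0 = whatever was on screen before
    next_frame_ready_at = render_time_ms        # GPU starts rendering at t = 0
    for r in range(1, refreshes + 1):
        refresh_time = r * REFRESH_INTERVAL_MS
        if next_frame_ready_at <= refresh_time:
            frame += 1                          # new frame finished in time: swap it in
            next_frame_ready_at = refresh_time + render_time_ms
        displayed.append(frame)                 # otherwise the old frame is shown again
    return displayed

print(simulate_vsync(render_time_ms=10))   # [1, 2, 3, 4, 5, 6, 7, 8] -> smooth 60 fps
print(simulate_vsync(render_time_ms=20))   # [0, 1, 1, 2, 2, 3, 3, 4] -> frames held twice: ~30 fps
```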
So that seems straightforward enough: enable V-Sync for games where the fps never drops below the monitor’s refresh rate, and turn it off for games that dip under. You would be forgiven for thinking that, but with V-Sync turned off the graphics card will send frames to the monitor whenever it can, regardless of where in its refresh cycle the monitor currently is. This means that with V-Sync off you will experience tearing; but at least there is no V-Sync induced input-lag.
So now you might be thinking that we basically have to choose between stuttering or tearing right?
Not anymore.

Variable Refresh Rate: Adaptive Sync Technologies
It was important to establish the basics before moving on to the modern technologies that you more than likely came here to read about in the first place. We now know that a graphics card pushes images to a monitor as fast as it can, but monitors refresh their image at a set rate: a 60Hz monitor for example refreshes every 1/60th of a second. When your graphics card delivers frames faster or slower than that, you experience screen tearing. V-Sync can help eliminate that, but can introduce stuttering and input lag; as your graphics card is ‘waiting’ to deliver a new frame until the monitor is ready for it.
Adaptive sync technology eradicates those issues by synchronising the refresh rate of your monitor with the frames per second of the graphics card, up to the maximum refresh rate of the monitor of course. Remember that whole ordeal with the overlapping of frame buffers? Not an issue. When the video card generates a new frame, an adaptive sync monitor immediately displays it. The result, even at sub-60fps (depending on the particular technology, see below), is a silky-smooth gaming experience. So purchasing a monitor and graphics card with adaptive sync should be a fairly simple decision then, as all gamers should want this type of technology.
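Conceptually, the behaviour is easy to express: as long as the frame rate falls within the panel’s supported window, the refresh rate simply follows it. The sketch below is my own simplification (not any vendor’s actual logic), and the 48-144Hz range is purely an example:

```python
# Conceptual sketch of a variable refresh rate panel: within its supported
# window the panel refreshes the moment a new frame arrives, so the refresh
# rate tracks the frame rate. The 48-144 Hz range is an arbitrary example.

VRR_MIN_HZ, VRR_MAX_HZ = 48, 144

def panel_behaviour(fps: float) -> str:
    if fps > VRR_MAX_HZ:
        return f"capped at {VRR_MAX_HZ} Hz -- the GPU is outrunning the panel again"
    if fps >= VRR_MIN_HZ:
        return f"panel refreshes at {fps:.0f} Hz, in step with the GPU"
    return "below the VRR window -- stutter/tearing returns (unless LFC steps in, see below)"

for fps in (200, 144, 90, 60, 30):
    print(f"{fps:>3} fps -> {panel_behaviour(fps)}")
```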
The problem is that there are two main types of adaptive sync: G-SYNC® and FreeSync™. Whilst they aim to achieve the same goal, their implementation is considerably different.

In the Green Corner
G-SYNC® is a proprietary technology from Nvidia which has been around since the Kepler architecture of their graphics cards (GeForce GTX 600 series). The science behind the technology is fascinating: Nvidia effectively built a special collision avoidance feature into their graphics cards in order to prevent lag or stutter. Collision avoidance is a concept utilised heavily in computer networking, helping to ensure that data packets do not collide and get lost (CSMA/CA); so the application of this technique within the context of gaming, particularly the graphical aspect, is fairly amazing (for a computer geek at least).
G-SYNC® is a stunning VRR (Variable Refresh Rate) technology, with the newest implementation being known as G-SYNC® HDR. The downside is that it requires a G-SYNC® monitor, which must contain a special hardware module within the display itself. This unfortunately results in G-SYNC® monitors starting at considerably higher prices than those using the VESA Adaptive-Sync standard, which was introduced with DisplayPort 1.2a. The benefit however is a VRR technology that works across the monitor’s entire refresh range (the module re-displays frames when frame rates get very low), provides minimal input-lag, and operates just as well in windowed mode as it does in full screen. Another benefit is that there are very few ‘budget’ G-SYNC® monitors, so the level of quality is usually high if you go the G-SYNC® route.
You are of course limited to only being able to use a graphics card made by Nvidia, which is by no means a bad thing.

In the Red Corner
For FreeSync™ however, AMD opted to ‘piggyback’ onto the VESA Adaptive-Sync standard. Whilst the monitor maker still needs to include a compatible display scaler, it is far less expensive than the special module that G-SYNC® requires; plus there are no royalties or licensing costs that a monitor maker would need to pay AMD in order to produce a FreeSync™ compatible monitor. Whilst FreeSync™ monitors could only be used with AMD Radeon graphics cards in the past, Nvidia recently ‘flipped the switch’, enabling their graphics cards (10 series and newer) to utilise FreeSync™ displays. Although they are only guaranteed to work with those certified as G-SYNC® compatible, most decent FreeSync™ monitors appear to be working just fine with Nvidia graphics cards.
So regardless of whether you support #AMD or #Nvidia you should just save some money and buy a FreeSync™ display right? Well no, not exactly.
Whilst FreeSync™ monitors are significantly more affordable than their G-SYNC® counterparts (depending on brand and features of course), they are also much more of a mixed bag. FreeSync™ monitors only support adaptive sync within a specified frame rate range; my own monitor, for example, has a range of 48Hz to 100Hz. That range varies from monitor to monitor (often with the price), and FreeSync™ only works within it; drop below it and you are back to stuttering gameplay. Fortunately there is a technology called Low Framerate Compensation (LFC) which duplicates frames when the frame rate drops below the bottom of that range; so if the fps dips to 30, LFC shows each frame twice and the display runs at 60Hz, thereby keeping things smooth.
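The idea behind LFC is simple enough to sketch: when the frame rate falls below the bottom of the VRR window, each frame is shown two (or more) times so the panel itself keeps refreshing inside that window. A rough illustration using the 48-100Hz range mentioned above (the logic is my own simplification, not AMD’s actual implementation):

```python
# Rough sketch of Low Framerate Compensation (LFC) for a 48-100 Hz FreeSync range.
# Below 48 fps each frame is displayed an integer number of times so the panel
# keeps refreshing inside its supported window. (Not AMD's actual implementation.)

VRR_MIN_HZ, VRR_MAX_HZ = 48, 100

def lfc(fps: float):
    """Return (times each frame is shown, resulting panel refresh rate)."""
    multiplier = 1
    while fps * multiplier < VRR_MIN_HZ and fps * (multiplier + 1) <= VRR_MAX_HZ:
        multiplier += 1                          # show each frame one more time
    return multiplier, fps * multiplier

for fps in (30, 40, 45, 60):
    times, hz = lfc(fps)
    print(f"{fps} fps -> each frame shown {times}x -> panel runs at {hz:.0f} Hz")
# 30 fps -> 2x -> 60 Hz, just as described above; 60 fps needs no compensation.
```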
But not all FreeSync™ displays come with LFC, especially not those on the lower end of the price spectrum, so keep an eye out when comparing different brands. Without LFC, the transition between frame rates within the FreeSync™ range and those below it is considerably noticeable, regardless of whether you are using an AMD or an Nvidia graphics card. This is why the monitor you purchase should be heavily influenced by the graphics card that you have.
If you purchase a beautiful 21:9 ultrawide with a 3440×1440 resolution and a FreeSync™ range of 55-144Hz, then you will need to ensure you have a high-end graphics card to match; otherwise you may be less than impressed with the results.

So What About V-Sync?
As the refresh rates of modern monitors increased, V-Sync became the ‘red-headed step-child’ of PC gaming; especially when graphics cards could not keep up with these higher refresh rates. V-Sync just caused too much stuttering, so it became more important to have a display with a variable refresh rate; V-Sync be damned. But nowadays the best hardware (provided you can afford it) is certainly capable of exceeding the maximum refresh rate of a typical gaming monitor, depending on the game and resolution of course; thereby operating outwith the range of adaptive sync technologies and reintroducing us to our old friend: screen tearing.
So why is V-Sync still avoided by so many? Part of the problem is input lag, the other part is that old habits die hard.
Higher frame rates do indeed reduce input lag, even if the refresh rate of the monitor is limited to only 60Hz. The margins however are usually minuscule: mere milliseconds of difference that would only affect the most competitive of gamers (or more likely eSports pros). For example, in a game that is capable of over 2000 fps, let us say that the input latency averages 20ms with V-Sync off and the fps uncapped. If we cap the fps to 1000, there is no noticeable difference. If we cap the fps to 180 on a 60Hz monitor (three times the refresh rate), the input lag only rises by around 2ms. Capped at 120 fps, another 2ms; and if we cap it to 60 fps, our average input lag will sit around 31ms. Turn on V-Sync and that alone adds another 8ms of input lag, for a total delay of around 39ms.
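A quick bit of arithmetic shows why the penalties stay so small until the cap gets low: the time budget per frame is simply 1000 divided by the fps, and classic double-buffered V-Sync can hold a finished frame for up to one full refresh interval before it is shown. The exact latency figures above will of course vary by game and hardware; this is just the frame-time side of the equation.

```python
# Frame time budget at various fps caps, plus the worst-case extra wait that
# classic double-buffered V-Sync can add on a 60 Hz monitor (up to one refresh).

REFRESH_HZ = 60
max_vsync_hold_ms = 1000 / REFRESH_HZ            # ~16.7 ms

for fps_cap in (2000, 1000, 180, 120, 60):
    frame_time_ms = 1000 / fps_cap
    print(f"{fps_cap:>4} fps cap: {frame_time_ms:6.2f} ms per frame "
          f"(V-Sync can hold a finished frame up to {max_vsync_hold_ms:.1f} ms longer)")
```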

So yes, if you happen to play at the level of, say, the teams within the Rainbow Six: Siege pro league, then you will more than likely want to play with V-Sync off; as you probably care more about the prize money and your MMR than screen tearing. But for normal gamers it is not going to make much of a difference, and the benefits only become noticeable if you are able to push frame rates significantly higher (2x, 3x, etc.) than your monitor’s refresh rate. Many turn their noses up at V-Sync due to the reputation it had for increased input lag; but nowadays V-Sync has improved a great deal and you should certainly give it a try.
Just as AMD and Nvidia have two different VRR technologies, they also have new and improved versions of vertical synchronisation which are unique to each brand (even though they technically do the same thing). AMD offers Enhanced Sync, whilst Nvidia offers Adaptive VSync. Both technologies basically turn on their ‘new and improved’ V-Sync when fps is higher than the monitor’s refresh rate, thereby providing a tear-free experience; but do so with considerably less input lag than traditional V-Sync (approximately 34% lower on average). When the fps is within the adaptive sync range of the monitor, this improved V-Sync is disabled as it is not needed.
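Put together, the decision being made for each frame looks something like the sketch below. This is my own conceptual summary of the behaviour described above, not the actual logic inside AMD’s or Nvidia’s drivers, and the 48-144Hz range is again just an example:

```python
# Conceptual summary of combining an adaptive sync display with an enhanced
# sync mode (Enhanced Sync / Adaptive VSync). Not actual driver logic; the
# 48-144 Hz VRR range is just an example.

VRR_MIN_HZ, VRR_MAX_HZ = 48, 144

def frame_delivery(fps: float) -> str:
    if fps > VRR_MAX_HZ:
        return "enhanced sync: show the newest completed frame at each refresh (no tearing)"
    if fps >= VRR_MIN_HZ:
        return "adaptive sync: the panel refreshes as each frame arrives (enhanced sync idle)"
    return "below the VRR window: rely on LFC (if supported) to keep things smooth"

for fps in (240, 100, 60, 30):
    print(f"{fps:>3} fps -> {frame_delivery(fps)}")
```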
It is a common misconception that if you are using FreeSync™ or G-SYNC® then you should disable V-Sync entirely. That is only half correct: you ideally do not want to use traditional V-Sync, but you should instead enable the manufacturer’s enhanced version, as they are specifically designed to complement their respective adaptive sync technologies. The combination of the two provides low latency, no stuttering, and minimal tearing.
Unfortunately, merely reading about VRR technologies does not really do them justice; you have to experience them first-hand to truly grasp how much of a game-changer they are.
But once you do, you will never be able to go back; trust me.