Learn the Basics of Nvidia G-Sync and AMD FreeSync Monitor Technologies

Nvidia, the dominant supplier of graphics cards for PCs and just about everything to do with making games look good, recently announced plans to open up its proprietary “G-Sync” display technology to monitors that support a competing variable refresh rate technology, AMD FreeSync.

It’s okay if that doesn’t make sense to you. Few gamers or computer geeks learn the nuances of display technology just to impress friends and family. However, if you want to take your PC gaming to the next level, it would be unwise to ignore G-Sync and FreeSync completely. The benefits – if you can take advantage of them – can genuinely enrich your gaming experience by making gameplay smoother and lowering input lag.

If you’re still scratching your head, don’t worry. To save you a trip down the research rabbit hole, here’s a quick tour of the basics of adaptive display technology so you can better understand G-Sync and FreeSync – and why you might want them when shopping for your next gaming monitor.

What can G-Sync and FreeSync do for my games?

G-Sync and FreeSync are brand names, but the technology behind both the Nvidia and AMD implementations is generically known as “adaptive synchronization,” or “adaptive sync” for short. When enabled, G-Sync and FreeSync match the output of your graphics card to the refresh rate of your display, ensuring that every frame your graphics card generates is displayed on your monitor – nothing more, nothing less. This minimizes lag and prevents the annoying display glitches that can occur when your graphics card sends more (or fewer) frames than your monitor’s native refresh rate can handle.

Let’s unpack that a little. When you run a video game, your computer calculates and redraws everything happening on screen – the state of every moving part in the game, including the player, enemies, and the environment – dozens of times per second. In the same way that cartoons are made up of many similar drawings with minor changes between them, your computer sends each of these “frames” to your monitor, which creates the movement and animation you perceive. The number of frames a game produces each second is called its “frame rate,” measured in frames per second (fps).

Each monitor, meanwhile, has a limit on the number of animation frames it can display – its refresh rate, or the number of times it can redraw the image every second, measured in hertz (Hz). When your computer sends more frames than your monitor can handle, you get screen tearing, with parts of two different frames displayed at the same time. Conversely, a PC that can’t render enough frames produces stutter and a noticeable delay between your input and what appears on screen – an annoying lag. (The sketch below puts some example numbers to this.)
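To make those numbers concrete, here’s a rough back-of-the-envelope sketch in Python. The refresh rate and frame rates are assumed example figures, not measurements from any particular setup:

```python
# Rough illustration with assumed example numbers, not a benchmark:
# how a game's frame rate lines up against a monitor's refresh rate.

refresh_hz = 144                          # a common gaming-monitor refresh rate
refresh_interval_ms = 1000 / refresh_hz   # time between screen redraws
print(f"A {refresh_hz} Hz monitor redraws every {refresh_interval_ms:.2f} ms")

for game_fps in (60, 144, 200):
    frame_time_ms = 1000 / game_fps       # time the game spends on one frame
    if game_fps > refresh_hz:
        note = "GPU outpaces the monitor: risk of tearing"
    elif game_fps < refresh_hz:
        note = "monitor outpaces the GPU: risk of stutter and input delay"
    else:
        note = "perfectly matched"
    print(f"{game_fps} fps = one frame every {frame_time_ms:.2f} ms ({note})")
```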

This video shows you screen tearing and how FreeSync helps prevent it.

Most games combat this problem by offering a feature known as vertical sync, or “V-Sync,” a software solution that prevents your PC from sending more frames than your monitor can handle. This fixes the screen tearing issue, but it isn’t ideal.

Games generally don’t run at a constant frame rate: depending on the game and the power of your computer, your frame rate can swing by dozens of frames per second at any moment. With V-Sync enabled, if your PC can’t output a frame rate that matches or exceeds your monitor’s refresh rate, it gets locked to an even lower one – typically a divisor of the refresh rate, so a 60Hz monitor can drop you from 60fps straight down to 30fps. The sketch below shows the arithmetic.
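Here’s a minimal sketch of why that happens. It models classic double-buffered V-Sync, where every finished frame has to wait for the monitor’s next redraw; the numbers are illustrative, and real drivers offer options (like triple buffering) that soften this effect:

```python
# Minimal model of classic double-buffered V-Sync: each frame waits for the
# next vertical blank, so frame time rounds up to whole refresh intervals.

def vsync_effective_fps(render_fps: float, refresh_hz: float) -> float:
    """Displayed frame rate when every frame must wait for a refresh."""
    refresh_interval = 1.0 / refresh_hz        # e.g. ~16.7 ms at 60 Hz
    render_time = 1.0 / render_fps             # time the GPU needs per frame
    intervals = -(-render_time // refresh_interval)  # ceiling division
    return refresh_hz / intervals

for fps in (70, 60, 55, 40, 25):
    shown = vsync_effective_fps(fps, 60)
    print(f"GPU renders {fps} fps -> 60 Hz monitor displays {shown:.0f} fps")
```

Render even slightly too slowly (55fps on a 60Hz panel) and you fall all the way to 30fps – exactly the kind of stutter adaptive sync was invented to avoid.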

Adaptive sync solves these problems in all but the most extreme cases by giving you flexibility in both directions: it caps the frame rate so it never exceeds your monitor’s capabilities, and when your frames per second fall short of your monitor’s maximum refresh rate, the monitor simply slows its refresh rate to match whatever your graphics card delivers.
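Here’s a conceptual sketch of that behavior. The variable-refresh range below is an assumed example (real ranges vary by monitor), and the actual logic lives in the display’s scaler and the graphics driver, not in user code:

```python
# Conceptual sketch only: real adaptive sync is implemented in the display
# scaler and GPU driver. The supported range below is an assumed example.

VRR_MIN_HZ, VRR_MAX_HZ = 48, 144   # assumed variable-refresh range

def adaptive_refresh(render_fps: float) -> float:
    """Refresh rate the monitor would settle at for a given render rate."""
    # Within the supported range the monitor refreshes whenever a new frame
    # arrives, so refresh rate tracks frame rate exactly. Outside the range
    # it pins to the boundary (real monitors also use tricks like frame
    # doubling below the minimum, which this sketch ignores).
    return max(VRR_MIN_HZ, min(render_fps, VRR_MAX_HZ))

for fps in (30, 48, 90, 144, 200):
    print(f"GPU at {fps} fps -> monitor refreshes at {adaptive_refresh(fps):.0f} Hz")
```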

What’s the difference between G-Sync and FreeSync?

The simplest answer is that G-Sync is a proprietary technology exclusive to Nvidia and its monitor partners, while FreeSync is an open, royalty-free standard created by AMD that any monitor manufacturer can support.

However, that’s not the whole story. Before CES 2019, G-Sync was strictly a hardware standard: monitors that support G-Sync contain a dedicated processor chip that communicates directly with an Nvidia graphics card to adjust the refresh rate. If you don’t have an Nvidia graphics card, G-Sync won’t work for you.

FreeSync monitors, which don’t require a dedicated scaler chip, should theoretically work with any graphics card, but until now you could only take advantage of FreeSync with a compatible AMD graphics card.

As we mentioned earlier, Nvidia announced this week that a software version of G-Sync, due out January 15, will enable adaptive sync on a small number of pre-approved “G-Sync Compatible” monitors that don’t have built-in Nvidia scalers. Owners of other FreeSync monitors (and Nvidia graphics cards) should eventually be able to enable adaptive sync as well, but the details are still unclear. For now, we also can’t say how the new software-based G-Sync will stack up against FreeSync or the original chip-based version.

How do I know if my monitor supports G-Sync or FreeSync?

If you’re buying a new monitor, most companies make it pretty clear whether the display supports G-Sync, FreeSync, or (now) both. It will be called out in the product description at any online store you shop from, and there should be a logo on the box. (Most of the time, it’s right in the product name.)

If you want to check a monitor you already own, or just want to be completely sure, here are the links to every G-Sync monitor and FreeSync monitor. Nvidia has also prepared a little cheat sheet of the monitors that will become G-Sync compatible next week.

How do I use them?

G-Sync and FreeSync are enabled by default if you’re using a compatible GPU and monitor. You can check whether the feature is turned on in the Nvidia Control Panel app for G-Sync, or in the AMD Catalyst Control Center for FreeSync.

While most people can turn on G-Sync or FreeSync and forget about it, some gamers may find that certain games perform better without it – those chasing the lowest possible input lag in competitive first-person shooters, for instance. For a more detailed breakdown of how to optimize G-Sync, including how to turn it off for individual games, check out this guide. AMD has a similar guide for FreeSync monitors.

Which adaptive sync technology is best?

Unless you’re buying a new graphics card and monitor at the same time, choosing between G-Sync and FreeSync mostly comes down to picking the hardware that best matches what you already own. There’s no point in buying a G-Sync monitor, for example, if you have an AMD graphics card, and I wouldn’t recommend running out to buy a FreeSync monitor to pair with an Nvidia GPU – at least until we see how well Nvidia’s approved FreeSync displays work with G-Sync.

There are many other considerations when buying a new monitor: the panel type (TN? IPS?); its maximum refresh rate, as well as the range of refresh rates over which G-Sync or FreeSync operates; its resolution, and whether your video card can actually run games well at that resolution; and how adjustable the display is, just to name a few.

Whether a display supports G-Sync or FreeSync is just the first question you should ask when buying a new gaming monitor. And even then, you may want to hold off on buying something new for a while. “FreeSync 2” and “G-Sync Ultimate” displays are still in their infancy, but they promise to look even better than the best current monitors – for a hefty price tag at first, though prices will hopefully cool down over time.
