The best graphics cards these days have much more to offer than just raw performance. Features like ray tracing and DLSS have taken the PC gaming scene by storm, but it's simpler features like adaptive sync that matter most for everyday play. Screen tearing is a major issue when your monitor's refresh rate and your in-game frame rate aren't synchronized. Fortunately, many of the best gaming monitors use AMD's FreeSync or Nvidia's G-Sync to combat this problem. While FreeSync and G-Sync will work with both AMD and Nvidia GPUs, there are a couple of caveats.
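To make that mismatch concrete, here's a quick back-of-the-envelope sketch in Python. The 60 Hz panel and 52 fps game are made-up numbers for illustration: a fixed-rate display scans out every ~16.7 ms regardless of when the GPU finishes a frame, so with vsync off, most buffer flips land partway through a scanout.

```python
# Illustrative arithmetic only; the 60 Hz panel and 52 fps game are hypothetical.
refresh_hz = 60
frame_rate = 52                      # game running slower than the display

refresh_ms = 1000 / refresh_hz       # ~16.7 ms per scanout cycle
frame_ms = 1000 / frame_rate         # ~19.2 ms per rendered frame

# With vsync off, the GPU flips buffers the instant a frame finishes.
# See where the first few flips land within the display's scanout cycle:
for n in range(1, 6):
    flip = n * frame_ms
    offset = flip % refresh_ms       # 0 would mean the flip hit vblank cleanly
    print(f"frame {n}: flip at {flip:5.1f} ms, {offset:4.1f} ms into a scanout")

# Every nonzero offset is a buffer flip mid-scanout: the top of the screen
# shows the old frame and the bottom shows the new one, a visible tear.
```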
Back in 2013, Nvidia introduced a new technology called G-Sync to eliminate screen tearing and stuttering effects and reduce input lag when playing PC games. The company accomplished this by tying your display's refresh rate to the actual frame rate of the game you were playing, and similar variable refresh-rate (VRR) technology has become a mainstay even in budget monitors and TVs today.
The issue for Nvidia is that G-Sync isn't what has been driving most of that adoption. G-Sync has always required extra dedicated hardware inside displays, increasing costs for both users and monitor manufacturers. The VRR technology in most low-end to mid-range screens these days is usually some version of the royalty-free AMD FreeSync or the similar VESA Adaptive-Sync standard, both of which provide G-Sync's most important features without requiring extra hardware. Nvidia more or less acknowledged in 2019 that the free-to-use, cheap-to-implement VRR technologies had won when it announced its "G-Sync Compatible" certification tier for FreeSync monitors. The list of G-Sync Compatible screens now vastly outnumbers the list of G-Sync and G-Sync Ultimate screens.
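The underlying idea all of these technologies share is simple enough to sketch: the display effectively stretches its vertical blanking interval until the GPU says a frame is done, within the panel's minimum and maximum refresh limits. Here's a rough Python sketch of that scheduling logic; the timing constants and the next_scanout helper are illustrative assumptions, not anything from Nvidia's or VESA's actual specifications.

```python
# A minimal sketch of the VRR idea, not Nvidia's or VESA's actual signaling:
# the display holds its vertical blank until the GPU delivers a frame, then
# starts the next scanout. The panel's refresh limits here are hypothetical.

MIN_INTERVAL_MS = 1000 / 144   # fastest the hypothetical panel can refresh
MAX_INTERVAL_MS = 1000 / 48    # longest it can hold vblank before repeating

def next_scanout(last_scanout_ms: float, frame_ready_ms: float) -> float:
    """Return the time at which the display starts scanning out a new frame."""
    earliest = last_scanout_ms + MIN_INTERVAL_MS   # panel can't refresh faster
    latest = last_scanout_ms + MAX_INTERVAL_MS     # or hold vblank any longer
    if frame_ready_ms > latest:
        # The frame missed the VRR window entirely: repeat the previous frame
        # once and wait again (a simplified stand-in for low-framerate handling).
        return next_scanout(latest, frame_ready_ms)
    # Otherwise start scanout as soon as the frame is ready (but no sooner
    # than the panel allows). The flip always lands in vblank, so nothing
    # tears, and the GPU never waits out a full fixed refresh as with vsync.
    return max(earliest, frame_ready_ms)
```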
Today, Nvidia is announcing a change that's meant to keep G-Sync alive as its own separate technology while eliminating the requirement for expensive additional hardware. Nvidia says it's partnering with chipmaker MediaTek to build G-Sync capabilities directly into scaler chips that MediaTek is creating for upcoming monitors. G-Sync modules ordinarily replace these scaler chips, but they're entirely separate boards with expensive FPGA chips and dedicated RAM.