Over the decades, hardware and software developers have been refining gaming graphics technologies to improve the player's visual experience by increasing the smoothness of rendering. Quite recently it seemed that with technologies such as Full HD and 4K, display and TV refresh rates climbing to 120-144Hz on some models, and ever more powerful GPUs able to drive those resolutions and render complex 3D scenes, we had reached the peak of perfection, both for games and for watching movies. Unfortunately, things turned out to be not quite so rosy.
Existing 3D and video rendering issues in games
The majority of modern monitors have a refresh rate of 60Hz, whereas the GPU renders frames at a rate that varies quite dramatically with the constantly changing complexity of the graphical scene and the GPU's load. In other words, because of hardware limitations there is no synchronization between the GPU's frame rate and the display's refresh rate. This causes an artefact known as "screen tearing": a very visible tear line across the screen, which makes for a rather unpleasant experience, especially in fast-paced action games.
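The mismatch is easy to see in a quick back-of-the-envelope simulation. The sketch below (plain Python, not a real graphics API; the function name and numbers are illustrative) swaps the frame buffer the moment each frame is done and counts how many swaps land in the middle of a scan-out, i.e. produce a tear:

```python
# Minimal sketch (illustrative, not a real graphics API): with V-Sync off,
# the GPU swaps the frame buffer as soon as a frame is ready. If that swap
# lands mid-scan-out, the screen shows parts of two different frames.
REFRESH_HZ = 60
SCANOUT = 1.0 / REFRESH_HZ  # one refresh cycle, ~16.7 ms

def tears(frame_times):
    """Count buffer swaps that land mid-scan-out (not on a refresh boundary)."""
    count = 0
    t = 0.0
    for render_time in frame_times:
        t += render_time
        # Position within the current refresh cycle at the moment of the swap:
        phase = t % SCANOUT
        if 0.001 < phase < SCANOUT - 0.001:  # swap not aligned with a refresh
            count += 1
    return count

# GPU frame times fluctuating around 13 ms: almost every swap tears.
print(tears([0.013, 0.011, 0.016, 0.014, 0.012]))  # -> 4
```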
At the time, the established remedy was a software technique called V-Sync (vertical synchronization), which eliminates screen tearing by forcing the GPU to hold back an already rendered frame until the display begins a new refresh cycle. Although V-Sync resolved the tearing, it brought two other problems onto the scene: stutter and input lag. Stutter occurs whenever the GPU's frame rate drops below the display's refresh rate. Input lag comes from the very delay V-Sync introduces, felt as a noticeable gap between a button press and the corresponding change on the screen. Not to mention that persistent V-Sync stutter can leave players with eyestrain and headaches.
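The source of both side effects can be sketched in a few lines. In the toy model below (illustrative, not a real driver API), a finished frame is held until the next refresh boundary: a frame that finishes just after a boundary waits almost a full cycle (the input lag), and a frame that takes slightly longer than one cycle is shown a whole cycle late (the stutter):

```python
import math

# Toy model of V-Sync (illustrative, not a real driver API).
REFRESH_HZ = 60
SCANOUT = 1.0 / REFRESH_HZ  # one refresh cycle, ~16.7 ms

def vsync_display_time(finish_time):
    """A frame finished at finish_time is held until the next refresh boundary."""
    return math.ceil(finish_time / SCANOUT) * SCANOUT

# Input lag: a frame done 1 ms into a cycle still waits ~15.7 ms to be shown.
lag = vsync_display_time(0.001) - 0.001

# Stutter: a frame taking 17 ms (just over one cycle) is shown a full cycle
# late, so the effective frame rate halves from 60 to 30 fps.
shown_at = vsync_display_time(0.017)

print(round(lag * 1000, 1), round(shown_at * 1000, 1))  # -> 15.7 33.3
```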
The next evolutionary step was Adaptive V-Sync, which resolved the stutter problem by enabling vertical synchronization when the GPU's frame rate is higher than the display's refresh rate and disabling it when it is lower. However, the input-lag issue remained unresolved, which was quite troublesome for most gamers and absolutely unacceptable for e-sports players. Even the arrival of displays with 120 and 144Hz refresh rates did not eliminate all the issues, although it minimized them quite substantially.
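The decision rule Adaptive V-Sync applies is, at its core, a one-liner. A minimal sketch with illustrative names (this is not nVidia's actual driver logic):

```python
# Adaptive V-Sync decision rule (illustrative, not nVidia's actual driver code):
# synchronize only while the GPU keeps up with the display; otherwise switch
# V-Sync off, trading a little tearing for the absence of stutter.
def vsync_enabled(gpu_fps, display_hz=60):
    return gpu_fps >= display_hz

print(vsync_enabled(75, 60))  # GPU ahead of the display -> True (sync on)
print(vsync_enabled(45, 60))  # GPU behind -> False (sync off, no stutter)
```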
nVidia G-Sync comes into play
In 2013, nVidia proudly announced that it had completed development of a new technology called G-Sync and was shipping it in actual products. The core principle of G-Sync is the reverse of V-Sync: the display's refresh rate is synchronized with the GPU's frame rate, not the other way around. And this time it was not just a software improvement but a hardware one as well: nVidia developed a chip, built into the monitor, that receives commands from the GPU to keep the two in lockstep. The result was not merely revolutionary; it took the game development industry and the gaming community by storm. No one had experienced such smoothness, responsiveness and clarity in games before. Screen tearing, stutter, input lag: all gone. Moreover, the overall perception of a game became much better, with scenes appearing on screen instantly and the image looking crisper and brighter. Professional players reported that gameplay acquired unmatched dynamics and responsiveness, allowing them to react rapidly and gain a significant competitive advantage.
Nevertheless, there are some limitations: G-Sync does not operate below 30fps, and the refresh rate the GPU drives the display at is capped by the panel's physical maximum. For 60Hz displays, the working range is therefore 30-60fps; for 120-144Hz displays, it is 30-144fps. In practice, this is a fairly minor drawback: modern GPUs are powerful enough that a game dropping below 30fps is a rare situation.
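The operating range described above amounts to a simple clamp. A minimal sketch, with an illustrative function name and the panel limits as parameters:

```python
# Sketch of the G-Sync operating range (illustrative): the panel refreshes
# when the GPU delivers a frame, but only within its supported window.
def effective_refresh(gpu_fps, panel_min=30, panel_max=144):
    """Clamp the GPU's frame rate into the panel's supported refresh range."""
    return max(panel_min, min(gpu_fps, panel_max))

print(effective_refresh(90))   # within range: panel follows the GPU -> 90
print(effective_refresh(160))  # capped at the panel's physical maximum -> 144
print(effective_refresh(20))   # below range: held at the 30 fps floor -> 30
```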
The most substantial downside of the technology is arguably the need to buy a new monitor with the G-Sync chip built in. That said, development of chips that can be retrofitted into some display models from the pre-G-Sync era has already begun.
G-Sync in laptops?
Using G-Sync in laptops is a bit more challenging. For laptops built before G-Sync, it is simply not possible today: the required hardware upgrade cannot be performed, given how little room laptops leave for upgrades. However, nVidia has announced that it is working on adapting G-Sync for laptops that lack the hardware support: a technology called G-Sync Mobility, implemented purely in software (drivers) and relying on the technical capabilities of the embedded DisplayPort (eDP) connection to the laptop's built-in panel.
As for new laptops, the situation is much better: quite a few companies are already manufacturing lines of high-end gaming laptops with built-in G-Sync support, for example the MSI GT72G, Clevo 4K, Asus G751 and Aorus X5. All of them carry the latest G-Sync-capable nVidia GPUs: the GTX 960M, 965M, 970M or 980M. Clevo even raised its display's refresh rate to 75Hz, enabling unrivaled gameplay quality at higher frame rates.
And while the cost of these laptops is certainly high compared to conventional gaming machines, for true players and enthusiasts of the gaming world that would hardly be a real obstacle, would it?