When comparing the performance of two custom gaming PCs, we often look at the frame rates each computer is capable of producing in a certain game at the same resolution and graphics quality. Frame rates are measured in FPS, or Frames per Second. Most people know that higher FPS is better, but let’s clear up some common misconceptions about FPS and refresh rates.
First, what is a frame, and what determines the frame rate? A frame is a single still image; displayed in a rapid slideshow with other still images, each slightly different, it creates the illusion of natural motion. The frame rate is how many of these images are displayed in one second. To produce, or render, a new frame, your CPU and GPU work together: they determine the actions of the AI, the physics, and the positions and textures of the objects in the scene, and produce an image. Then your GPU rasterizes this image into pixels at the resolution you set and sends it to the display. The more powerful your CPU and GPU, the more frames they are able to generate per second.
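As a rough sketch of the loop described above (the `simulate` and `render` functions are hypothetical stubs standing in for the real CPU and GPU work):

```python
import time

def simulate(dt):
    """Placeholder for the CPU's share of each frame:
    AI decisions, physics, object positions (hypothetical stub)."""
    pass

def render():
    """Placeholder for the GPU's share: turning the scene into
    pixels at the chosen resolution (hypothetical stub)."""
    pass

# A minimal game loop: each iteration produces one frame, so the
# achievable FPS is limited by how fast simulate() + render() finish.
frames = 0
start = time.perf_counter()
while time.perf_counter() - start < 0.2:  # run for a fifth of a second
    simulate(dt=1 / 60)
    render()
    frames += 1
print(f"frame rate: {frames / 0.2:.0f} FPS")
```

With empty stubs the loop runs absurdly fast; in a real game, heavier simulation and rendering work per iteration is exactly what drags the FPS down.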
Your monitor or display is where refresh rates come in. Refresh rate is measured in hertz (Hz) and is the number of times per second your monitor can redraw the screen. A refresh rate of 85Hz means that your display can redraw the entire screen 85 times in one second.
Does that mean that your frame rate is limited by your screen’s refresh rate? No; they are two separate things. Remember that FPS is how many frames your custom gaming PC is producing or drawing, while the refresh rate is how many times the monitor is refreshing the image on the screen. The refresh rate (Hz) of your monitor does not affect the frame rate (FPS) your GPU will be outputting. However, if your FPS is higher than your refresh rate, your display will not be able to show all of the frames your computer is producing, so although the refresh rate doesn’t technically limit the frame rate, it does effectively set a cap.
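The "effective cap" can be summed up in one line, sketched here as a tiny helper (the function name is just for illustration):

```python
def displayed_frames_per_second(gpu_fps: float, refresh_hz: float) -> float:
    """The monitor can show at most one new frame per refresh cycle,
    so the rate you actually see is capped by the refresh rate."""
    return min(gpu_fps, refresh_hz)

# GPU renders 120 FPS, but an 80Hz panel only ever shows 80 of them:
print(displayed_frames_per_second(120, 80))   # -> 80
# When the GPU is the bottleneck, the refresh rate doesn't matter:
print(displayed_frames_per_second(45, 144))   # -> 45
```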
It’s also important to remember that even if your gaming PC is capable of generating 90 FPS in your favorite game at your preferred settings, and even if your monitor supports 90Hz, 120Hz or more, you could still be capped by the lower refresh rate capabilities of the ports on your graphics card and display. Read our blog post on DisplayPort vs HDMI vs DVI vs VGA to learn about the pros, cons and limitations of the different types of connections. For example, some gaming monitors feature 120Hz refresh rates but have HDMI 1.4 and DisplayPort 1.4. This means you can only take advantage of the 120Hz refresh rate if you use DisplayPort; you’ll be stuck at 60Hz if you use HDMI.
Frame rate is typically used as a gaming benchmark for measuring the performance of hardware, drivers, games and APIs like Vulkan and DirectX. In this case the monitor’s refresh rate doesn’t matter, because the frame rate is just a number used to measure gaming performance, and higher is better. However, when you’re actually playing a game, the display’s refresh rate does effectively limit the frame rate: if you have an 80Hz display and your computer is capable of outputting 120 FPS, your screen will still only be able to show 80 different images per second.
If the frame rate your computer is producing is different (either higher or lower) than the refresh rate of your monitor, you may experience a glitch known as screen tearing, where information from two or more frames is shown in a single screen draw. It is important to note that screen tearing does no damage to a display or graphics card.
To prevent screen tearing, you can enable a feature called Vertical Synchronization, or VSync. This tells your GPU to synchronize its actions with the display, forcing it to render and send a new frame only when the monitor is ready to redraw the screen. This caps your frame rate at exactly the refresh rate. For example, if your refresh rate is 60Hz, VSync will cap your frame rate at 60 FPS. If your GPU is capable of producing higher frame rates than the VSync cap, you can take advantage of its leftover capacity to increase the resolution, draw distance, or other graphics quality settings. If your graphics card can’t outpace the refresh rate of your display, then enabling VSync won’t help much, but you may be able to lock your GPU to a lower frame rate, like 30 FPS, that matches up with your monitor. Common display refresh rates include 120Hz, 60Hz and 30Hz, all divisible by 30, so you won’t get screen tearing, but you may get stutter, as each frame will stay on the screen for multiple refresh cycles.
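The "frame stays on screen for a couple of cycles" point can be made concrete with a little arithmetic, sketched here (the helper name is illustrative):

```python
def refresh_cycles_per_frame(refresh_hz: int, fps_cap: int) -> int:
    """With VSync on and the frame rate locked below the refresh rate,
    each rendered frame stays on screen for a whole number of refresh
    cycles, provided the cap divides the refresh rate evenly."""
    assert refresh_hz % fps_cap == 0, "cap should divide the refresh rate"
    return refresh_hz // fps_cap

print(refresh_cycles_per_frame(60, 30))   # -> 2: each frame held 2 cycles (visible stutter)
print(refresh_cycles_per_frame(120, 60))  # -> 2
print(refresh_cycles_per_frame(60, 60))   # -> 1: a new frame every cycle (smooth)
```

When the cap does not divide the refresh rate evenly, frames get held for an uneven mix of cycle counts, which is where the characteristic judder comes from.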
If the monitor and graphics card in your custom computer both support NVIDIA G-SYNC, you’re in luck. With this technology, a special chip in the display communicates with the graphics card. This lets the monitor vary its refresh rate to match the frame rate of the NVIDIA GTX graphics card, up to the maximum refresh rate of the display. Frames are displayed as soon as they are rendered by the GPU, eliminating screen tearing and reducing stutter whether the frame rate is higher or lower than the refresh rate of the display. This makes it perfect for situations where the frame rate varies, which happens a lot when gaming. Today, you can even find G-SYNC technology in gaming laptops!
AMD has a similar solution called FreeSync. However, it doesn’t require a proprietary chip in the monitor. Instead, FreeSync relies on DisplayPort’s Adaptive-Sync specification, a royalty-free industry standard. The difference between them is that with G-SYNC, the proprietary module in the monitor handles the communication between the devices, while with FreeSync, the AMD Radeon driver and the display firmware handle it. AMD has demonstrated that FreeSync can work over HDMI, but it requires custom drivers from AMD and the monitor’s manufacturer. Currently G-SYNC only works with DisplayPort, but that may change. Generally, FreeSync monitors are less expensive than their G-SYNC counterparts, but gamers often prefer G-SYNC because FreeSync may cause ghosting, where old images leave behind artifacts. However, this may change as both technologies are relatively new.
Well, thanks to this article I have a better understanding of what I will do. I have been reading articles and watching tech videos and was on the verge of getting cross-eyed. Seriously. Hopefully I understood this and will not be making a mistake with my impending monitor purchase. To summarize, per some benchmarks I’ve read and this article, a monitor will only put out up to its maximum refresh rate. I believe you used the term “cap”. So, yeah, what’s the point of a game running at 90 FPS if my monitor can’t go past 60Hz, or even worse, is less than 60Hz (what a nightmare).
Lastly, I also noticed that the higher the resolution, the lower the FPS. I’m hoping to find a sweet-spot monitor that will provide the best of both worlds, but that is arguably impossible to find just yet. Thank you so much for this article. I feel a bit more confident moving forward.
If I got it wrong please let me know as I am always open to learning. :-)
Sounds like you’ve got a good understanding! There is definitely a trade-off between high resolution and high frame rates, not only in monitors, but also in what your hardware can do. There’s only so much graphics power available, and you have to choose between higher frame rates at lower resolutions, higher resolutions at lower frame rates, or a sweet spot.
Is all this true with 4K displays? I need to know because I’m looking into getting a 4K TV that I also intend to use as a computer monitor, and I’d like to know if it makes any real difference between a TV with a 120Hz refresh rate and one with 240Hz for playing video/computer games, seeing as how the former is cheaper and has more varied brand options.
It’s true for 4K, too. One important note is that DisplayPort 1.4 (the latest version) only supports 4K at 120Hz. HDMI 2.1 also supports 4K at 120Hz, but it was just announced this month and hasn’t been released yet.
Anyway, it depends on your gaming PC. You only have so much graphics processing power. You know the saying, you can only pick two: good, cheap or fast? It’s the same thing here: you can prioritize resolution (1080p, 1440p, 4K), frame rate/refresh rate (60, 120, etc.) or graphics quality (low, medium, high, ultra). The extent to which you have to compromise between the three depends on the hardware in the system.
What graphic cards do you have? What resolution and refresh rate do you currently play at?
Go with the 120Hz TV. Because of the current limitations in HDMI and DisplayPort, I’m not too sure how you’ll be getting 240Hz content to the TV…
Thanks. I’ve got an NVIDIA GTX 970 as my graphics card. My monitor’s resolution is 1080p and I’m not sure what its refresh rate is. It’s a ViewSonic I got a few years ago with the rest of my PC.
EDIT: Just looked it up. It’s a 24 inch monitor with a horizontal refresh rate of 83 kHz and a vertical refresh rate of 73Hz. I’ve never had any issues playing games with that. The 4K TVs I’m looking at have a screen size in the 40 inch range, if it makes a difference.
Running on a 4K TV without being capped by the refresh rate is great, but I’m wondering if you’re lagging? Or if you’re even able to get an FPS past 60?
Let me know, I’m curious because I’m considering buying something similar.
So far I’ve had no problem with using my 50 inch 4K TV as a computer monitor.
There is an advantage to tearing over VSync when frame rates are higher than your monitor’s refresh rate: you get bits of newer frames, so you see some things a bit earlier than with VSync, which shows older frames. That’s why pros always had VSync off back in the days when 60Hz LCDs were standard issue for pros.
Higher refresh rate monitors also have a few advantages for low-end systems, like smoother scrolling in normal programs, clearer cursor movement, etc. High refresh rates aren’t just good for gaming; that’s why the new iPad Pros have 120Hz screens rather than 60Hz ones.
Note to self: always use VSync. There’s no point in having FPS run the CPU and GPU through the roof when there’s no human-interface benefit in it.
Actually, almost. What’s better is VSync “off” (it adds input delay, or mouse/controller ‘lag’) or “Fast” (on NVIDIA 9 or 10 series cards), with a frame rate limiter “on”, set 1 lower than your Hz rating to further reduce input lag. Ex: 60Hz monitor, set it to 59; 144Hz monitor, set it to 142; 120Hz monitor, set it to 119. To get a frame rate limiter, use the MSI Afterburner utility for overclocking your GPU. Whether or not you overclock, the utility includes RivaTuner, which lets you set up on-screen stats as well as a frame limiter. All of this took a while to learn, but I promise you it is the superior method.
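For the curious, the core idea of a frame limiter is simple deadline-based pacing: sleep until the next frame slot is due. This is a minimal sketch of that idea, not what RivaTuner actually does internally (which hooks into the graphics pipeline and is far more precise); the function name is hypothetical:

```python
import time

def frame_limited_loop(target_fps: float, duration_s: float = 0.5) -> int:
    """Render frames no faster than target_fps by sleeping until each
    frame's deadline. Returns the number of frames produced."""
    frame_time = 1.0 / target_fps
    frames = 0
    next_deadline = time.perf_counter()
    end = next_deadline + duration_s
    while time.perf_counter() < end:
        # ... render the frame here ...
        frames += 1
        # Advance the deadline by a fixed step so timing errors
        # don't accumulate, then sleep off any remaining time.
        next_deadline += frame_time
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)
    return frames

# Limiting to 59 FPS on a 60Hz monitor, per the trick above:
print(frame_limited_loop(59))  # roughly 30 frames in half a second
```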
Thanks for that. I couldn’t figure out why my 60 Hz monitor kept showing up as 59 Hz with my new 1080Ti card.
You misunderstood. He’s saying to set your frame rate = monitor refresh rate − 1.
Ex: for a 60Hz monitor, cap your frame rate at 59.
You’re welcome. It’s a great trick, huh? Works so smoothly.
Gonna try that in Overwatch. You can set a custom FPS limit there. 59 it is for me.
The human brain and eyes can detect flicker up to 500Hz, but the perception of moving objects updates somewhere between 35-50Hz, depending on your genes and age. However, your brain is affected and will tire more quickly the lower the monitor’s refresh rate is. So the screen should have as high a refresh rate as possible, but FPS is more or less pointless above 80. It might seem that you don’t need more than 50 FPS, but the thing is that the frames are not sent in a steady pattern. Sometimes one frame is delayed, and that is compensated by faster posting of frames right after. When a game or benchmark tool displays an FPS value, it is actually the mean FPS, not the lowest or highest value. A mean of 80 FPS normally fluctuates between 50 and 100 FPS. So to always stay above your eyes’ limit for detecting movement, 80 FPS should do it.
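The point about mean FPS hiding the dips is easy to demonstrate. Here is a small sketch with made-up frame times (the numbers are illustrative, not measurements):

```python
# Hypothetical per-frame render times, in milliseconds:
frame_times_ms = [10, 12, 11, 10, 25, 10, 11, 30, 10, 12]

def mean_fps(times_ms):
    """Average FPS over the capture: total frames / total time."""
    return len(times_ms) / (sum(times_ms) / 1000.0)

def worst_frame_fps(times_ms):
    """Instantaneous FPS of the slowest frame, i.e. what a dip feels like."""
    return 1000.0 / max(times_ms)

print(round(mean_fps(frame_times_ms), 1))         # -> 70.9
print(round(worst_frame_fps(frame_times_ms), 1))  # -> 33.3
```

A benchmark would report this capture as roughly 71 FPS, yet the slowest frame dipped to the equivalent of 33 FPS, which is why headroom above the bare perceptual minimum matters.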
Above 60Hz tearing isn’t so bad; below 60Hz it emphasizes your desktop’s limitations quite heavily, although it never actually makes me want to upgrade any sooner than with a FreeSync screen, since I only ever upgrade when input lag becomes too high at decent settings. Tearing’s annoying, but input lag can make or break your gaming experience. Usually you don’t want dips below 30 FPS for most games, and racing/FPS games shouldn’t dip below 40-45 FPS.
Since my post I purchased a gaming monitor with a 144Hz refresh rate. I keep it at 120Hz and it works fine. However, I recently experienced tearing for the first time. While it only happened during a specific point in the game and not the entire time, it was very distracting. It has happened only once and with only one game. V-Sync was turned off because the monitor is a G-SYNC monitor. Honestly, I am clueless why this would happen.
I have a 144Hz iiyama monitor with G-SYNC support paired with an Xbox 360. Is that OK, or is it a bottleneck?
Looks great yo. The resolution matters though: the higher it is, the more your graphics card has to render, and thus the lower the FPS.
What I don’t get is why I can’t watch 60 FPS videos or play games at 60 FPS on my 60Hz monitor. I feel like I’m living in the stone age. u.u
My screen is 60Hz and I play at 60 FPS. In case of an FPS drop (ex: a drop to 30 FPS), wouldn’t it be better to set the FPS cap to 90, so that a drop only brings it down to 60 FPS?
Wow, awesome info! Thanks.