As usual, the video version can be found at the bottom of the post, or directly on my YouTube Channel.
The refresh rate is simply the number of times a display refreshes the image it shows per second. You can compare it to frame rates in the film industry: if a film is shot at 24 frames per second, the source content only contains 24 different images per second. Similarly, a 60Hz display refreshes 60 times per second. Those refreshes aren't really frames, though, because the display will redraw itself 60 times each second even if not a single pixel changes.
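To put those numbers in perspective, here is a quick sketch (Python, purely for illustration) that converts a refresh rate into how long the display spends on each refresh:

```python
# Convert a refresh rate (Hz) into the time between refreshes.
# 1 second = 1000 milliseconds, so a 60Hz display redraws every ~16.7 ms.
for hz in (24, 60, 120, 144):
    frame_time_ms = 1000 / hz
    print(f"{hz:>3} Hz -> one refresh every {frame_time_ms:.2f} ms")
```

At 60Hz that works out to roughly 16.7 ms per refresh; at 120Hz it drops to about 8.3 ms.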
A problem that can sometimes occur in games is that frames are rendered faster than the rate at which the display refreshes. For example, you have a 60Hz display but your game runs at 90 FPS. In scenarios like this you may experience something called screen tearing: because the frame rate and the refresh rate don't match, the display can end up showing parts of two different frames at once, which looks like a horizontal tear across the image. When this happens you can limit the frame rate or use a setting called VSync, which synchronizes the game's frame delivery with the display's refresh.
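Limiting the frame rate is conceptually simple: the game loop just waits until the current frame's time budget has passed before starting the next one. Here's a rough sketch of that idea in Python, where render_frame and the 60 FPS cap are placeholders I've made up for illustration:

```python
import time

TARGET_FPS = 60                     # assumed cap matching a 60Hz display
FRAME_BUDGET = 1.0 / TARGET_FPS     # ~16.7 ms per frame

def render_frame():
    """Placeholder for the game's actual rendering work."""
    pass

for _ in range(300):                # a few seconds' worth of frames for the demo
    start = time.perf_counter()
    render_frame()
    # Sleep away whatever remains of this frame's budget so frames are
    # never pushed out faster than the display can show them.
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)
```

VSync achieves a similar effect at the driver level: instead of sleeping for a fixed budget, the GPU holds the finished frame until the display's next refresh.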
A newer technology, adaptive refresh rate, is becoming more popular nowadays. Nvidia calls it G-SYNC and AMD calls it FreeSync, but the basic concept is the same: a display with this capability adjusts its refresh rate to match how quickly the GPU delivers frames, which eliminates screen tearing.
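To make the difference concrete, here's a toy simulation (Python, with made-up frame times) of when a frame actually appears on a fixed 60Hz display with VSync versus an adaptive-refresh display that follows the GPU:

```python
import math

# Made-up GPU frame times in milliseconds, i.e. a frame rate that fluctuates.
frame_times_ms = [14.0, 22.0, 18.0, 30.0, 12.0]

REFRESH_MS = 1000 / 60   # a fixed 60Hz display refreshes every ~16.7 ms

gpu_clock = 0.0
for ft in frame_times_ms:
    gpu_clock += ft  # the moment this frame finishes rendering

    # Fixed 60Hz with VSync: the frame waits for the next scheduled refresh.
    fixed_ms = math.ceil(gpu_clock / REFRESH_MS) * REFRESH_MS

    # Adaptive refresh (the G-SYNC / FreeSync idea): the display refreshes
    # as soon as the frame is ready, within the panel's supported range.
    adaptive_ms = gpu_clock

    print(f"frame ready at {gpu_clock:6.1f} ms | "
          f"fixed 60Hz shows it at {fixed_ms:6.1f} ms | "
          f"adaptive shows it at {adaptive_ms:6.1f} ms")
```

The fixed schedule either delays frames (with VSync) or tears (without it), while the adaptive display simply follows whatever pace the GPU manages.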
Do keep in mind that if your hardware cannot go over 65 FPS and you buy a monitor capable of 120Hz, you will not see the full benefits of that display. In situations like these, I'd rather stick with the 60Hz display and switch over to a better monitor once I have the hardware to drive it.
What about newer TVs advertising 400Hz refresh rates or similar? That's not the real refresh rate. Those sets use post-processing techniques such as motion interpolation to make motion appear smoother. This can work well for video, but not for gaming, because the post-processing adds input lag, and input lag is exactly what gamers try to avoid.