In the world of gaming, two features that affect the smoothness of a game's visuals are frame rate (FPS) and refresh rate. Even though these two features are distinct, most people do not fully understand the difference between them. Well, in this post, we will provide an in-depth yet concise comparison of refresh rate vs frame rate.
Frame rate is an attribute of the graphics card and the CPU. This means your frame rate depends on the power of your GPU and CPU, not on your monitor. To better understand what frame rate means, we would like to highlight that a video is simply a collection of still pictures (frames) shown in quick succession. The frame rate of your GPU therefore tells you how many frames the GPU can output to the monitor each second. This means the higher the frame rate of your GPU, the smoother the motion your monitor can potentially display. We will explain why we used "potentially" when we get to the definition of refresh rate.
FPS (frames per second), on the other hand, is the unit of measurement of a GPU's frame rate, and it tells you how many frames per second the GPU is capable of processing and pushing out. So, basically, the higher the FPS of your graphics processor, the more fluid your videos will appear.
So, simply put, the frame rate shows how many images the GPU can render in one second. Therefore, if your GPU can render more images in one second, it will produce smoother-looking motion than one that renders fewer.
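To put some rough numbers on this, here is a minimal Python sketch (the frame rates below are hypothetical examples, not measurements) showing how a frame rate translates into the amount of time the GPU has to finish each frame:

```python
# Convert a frame rate (frames per second) into the time the GPU
# has to render each individual frame, in milliseconds.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

# Hypothetical frame rates, purely for illustration.
for fps in (30, 60, 144):
    print(f"{fps:>3} FPS -> roughly {frame_time_ms(fps):.1f} ms to render each frame")

# Output:
#  30 FPS -> roughly 33.3 ms to render each frame
#  60 FPS -> roughly 16.7 ms to render each frame
# 144 FPS -> roughly 6.9 ms to render each frame
```

At 144 FPS the GPU has under 7 ms to finish every frame, which is why frame rate depends so heavily on how powerful the GPU (and CPU) is.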
Refresh rate, by contrast, is a feature of your monitor. Once the GPU renders the frames, it sends them straight to the monitor, and the monitor's refresh rate determines how often the screen updates to display new information. In other words, the refresh rate determines how quickly the monitor can pick up and display each frame coming from the GPU.
This means the higher the refresh rate of your monitor, the smoother and more responsive the displayed video will appear.
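The same back-of-the-envelope arithmetic applies on the monitor side. Here is a simplified sketch (again with hypothetical refresh rates) showing how the refresh rate translates into the time between screen updates:

```python
# Convert a refresh rate (Hz) into the time between screen updates,
# in milliseconds. This is the monitor-side counterpart of the
# frame-time calculation above.
def refresh_interval_ms(hz: float) -> float:
    return 1000.0 / hz

# Hypothetical refresh rates, purely for illustration.
for hz in (60, 70, 144):
    print(f"{hz:>3} Hz -> the screen updates about every {refresh_interval_ms(hz):.1f} ms")

# Output:
#  60 Hz -> the screen updates about every 16.7 ms
#  70 Hz -> the screen updates about every 14.3 ms
# 144 Hz -> the screen updates about every 6.9 ms
```

A 144 Hz panel updates roughly every 7 ms, while a 70 Hz panel updates only about every 14 ms, which is where the difference in perceived smoothness comes from.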
To make this explanation clearer, let's look at an example:
So, assume your GPU is putting out 144 FPS and you connect it to two monitors: one with a refresh rate of 144 Hz or higher, and the other with a refresh rate of 70 Hz. The monitor with the higher refresh rate (144 Hz or more) is able to pick up and display all 144 frames the GPU produces every second. The 70 Hz monitor, however, can only display 70 of those frames each second; the rest are effectively dropped, so motion looks noticeably less smooth (and you may see screen tearing).
Similarly, assume your GPU is only putting out, say, 48 FPS and you connect it to the same two monitors: one at 144 Hz and one at 70 Hz. Both monitors will display all 48 frames the GPU produces, so motion looks largely the same on each; the 144 Hz monitor simply refreshes more often, so each new frame can appear on screen with slightly less delay.
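To tie the two examples together, here is a deliberately simplified Python model of both scenarios. It assumes the monitor simply cannot show more frames per second than its refresh rate and quietly drops the excess; real setups with VSync or adaptive sync behave a bit differently, so treat this as a sketch rather than an exact description:

```python
# Simplified model: the number of frames actually shown each second
# is capped by whichever is lower, the GPU's frame rate or the
# monitor's refresh rate.
def displayed_fps(gpu_fps: int, monitor_hz: int) -> int:
    return min(gpu_fps, monitor_hz)

# The hypothetical pairings from the two examples above.
scenarios = [
    (144, 144),  # 144 FPS GPU on a 144 Hz monitor
    (144, 70),   # 144 FPS GPU on a 70 Hz monitor
    (48, 144),   # 48 FPS GPU on a 144 Hz monitor
    (48, 70),    # 48 FPS GPU on a 70 Hz monitor
]

for gpu_fps, monitor_hz in scenarios:
    shown = displayed_fps(gpu_fps, monitor_hz)
    print(f"GPU at {gpu_fps:>3} FPS + {monitor_hz:>3} Hz monitor -> "
          f"{shown:>3} frames shown per second ({gpu_fps - shown} dropped)")
```

Whichever number is lower sets the ceiling, which is exactly the rule of thumb in the next paragraph.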
Therefore, regardless of how many frames the GPU produces, the refresh rate of your monitor determines how many of those frames are actually displayed. This is why it is recommended that you purchase a monitor whose refresh rate is at least as high as the FPS your GPU can deliver. So, if your GPU outputs around 60 FPS, you want the refresh rate of your monitor to be 60 Hz or more.
FPS (frames per second) is the unit of measurement for the frame rate of your GPU, while hertz (Hz) is the unit of measurement for the refresh rate of your monitor.
Frame rate determines how smooth motion can appear to the eye, while refresh rate determines how many of those frames the monitor can actually show (and affects input lag).
To upgrade your FPS, you will have to move to a higher-spec GPU (or CPU), while to upgrade the refresh rate, you will have to change your monitor.
The bottom line is that neither one matters more than the other; as stated earlier, just make sure the two match. If you have a monitor with a high refresh rate and a GPU with a lower FPS, the monitor will only display the frames the GPU manages to produce. Similarly, if you have a high FPS and a monitor with a lower refresh rate, the monitor will only display the frames it is capable of picking up and will discard the rest.
So, if one is higher than the other, the only thing that will happen is that the lower one will put a cap on the higher one; this is why you should match the two.
That being said, take note of the following: