FPS and refresh rate are pervasive terms in the gaming community. However, many people don't actually know what they mean - they just know that higher is better.
FPS
FPS (frames per second) is how many images the GPU renders and sends to the monitor per second. Each “frame” is one image, and more is better. Why is that? Well, the more FPS, the faster your key presses register in the game. However, a high FPS number doesn't always mean you are actually seeing, say, 100 frames per second.
Hz
Hertz (refresh rate) is how many times the display refreshes per second. If the display is 60Hz (generally the standard for most monitors), it shows a new image every 1/60 of a second. If the display is, say, 300Hz, it shows a new image every 1/300 of a second. Now, obviously, the more images you see, the better you will game: you will spot things faster than others might, as in a multiplayer game. When FPS is lower than the refresh rate, the monitor has no new frame to show, so it just repeats the previous one. However, when the FPS and the refresh rate match, you will get, say, 240 frames per second, and you will actually SEE them. If you buy a 2080 Ti and game on your 10-year-old monitor, chances are you will only see some of the frames the GPU spits out. However, if you pair it with a 240Hz monitor, you will probably see all of them. Simply put, FPS measures how well the GPU performs, and refresh rate measures how well the monitor performs.
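The relationship above can be sketched in a few lines of Python. This is a simplified model (the function name is my own, not a real API): the number of distinct frames that actually reach your eyes each second is capped by whichever is lower, the GPU's FPS or the monitor's refresh rate.

```python
def visible_frames_per_second(gpu_fps: float, refresh_hz: float) -> float:
    """Whichever is lower limits what you actually see (simplified model)."""
    return min(gpu_fps, refresh_hz)

# A 2080 Ti pushing 240 FPS on an old 60Hz monitor:
print(visible_frames_per_second(240, 60))   # only 60 frames reach your eyes
# The same GPU paired with a 240Hz monitor:
print(visible_frames_per_second(240, 240))  # all 240 frames are shown
```

In practice things like frame pacing and tearing complicate this, but as a rule of thumb the lower of the two numbers wins.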
The common refresh rates for monitors are:
- 40Hz (rare, not recommended)
- 60Hz (most non-gaming monitors)
- 75Hz (overclocked 60Hz)
- 120Hz (quite common in gaming)
- 144Hz (quite common in gaming)
- 165Hz (rare, usually 144Hz or 240Hz instead)
- 240Hz (common with serious gamers)
- 300Hz (expensive, but available)
- 360Hz (expensive, but available)
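To put those numbers in perspective, each refresh rate corresponds to a frame time of 1/Hz seconds, as described earlier. A quick sketch that prints the frame time for each common rate:

```python
# Frame time = 1000 ms divided by the refresh rate:
# a 60Hz panel draws a new image every ~16.67 ms, a 360Hz panel every ~2.78 ms.
for hz in [40, 60, 75, 120, 144, 165, 240, 300, 360]:
    print(f"{hz}Hz -> new image every {1000 / hz:.2f} ms")
```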
FPS is different in every game and with every GPU. Obviously, the more powerful the GPU, the more FPS. However, the game itself (i.e. how detailed it is, how many things it needs to process) will also affect FPS.