There is some misinformation out there about frame rate affecting the sharpness of moving objects. To some extent it can be true, if you let the camera handle exposure times fully automatically. But any camera worth having will let you limit the exposure time (shutter speed) independently of the frame rate, so when it really matters, you set the exposure limit and don't run fully auto. For example, you could force a camera to never expose longer than 1/120th of a second, and then run it at whatever frame rate you want without affecting motion blur.
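A quick back-of-the-envelope sketch makes the point. The numbers below (object speed, pixel scale) are purely illustrative assumptions, not from any specific camera; the only real relationship used is that blur distance equals speed times exposure time:

```python
# Illustrative sketch: motion blur depends on exposure time, not frame rate.
# SPEED and the FPS values below are assumed example numbers.

def blur_pixels(speed_px_per_s: float, exposure_s: float) -> float:
    """Distance (in pixels) an object travels while the shutter is open."""
    return speed_px_per_s * exposure_s

SPEED = 600.0        # assumed: object crossing the frame at 600 px/s
EXPOSURE_CAP = 1 / 120  # shutter capped at 1/120 s, as in the example above

for fps in (5, 15, 30, 60):
    frame_interval = 1 / fps
    # the exposure can never exceed the time between frames
    effective_exposure = min(EXPOSURE_CAP, frame_interval)
    blur = blur_pixels(SPEED, effective_exposure)
    print(f"{fps:>2} FPS -> blur = {blur:.1f} px")
```

With the cap in place, every frame rate from 5 to 60 FPS produces the same 5-pixel blur, which is the whole argument: frame rate only changes blur when exposure is left to run longer than the frame interval allows.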
The main reason for a high frame rate like 60 FPS is to catch very quick movements. If you were monitoring poker tables and needed the camera to be able to prove someone is cheating, it could make all the difference. But for general home or business video monitoring, there is very little practical value.
Here is a video that compares common security camera frame rates. I'm fairly sure they cheated and simply captured at 30 FPS, then dropped frames in post-processing to produce the lower-FPS streams, but that is beside the point.
You'll notice that beyond about 7 FPS you don't really gain much value for video surveillance, and 15 FPS is generally considered a good compromise between smoothness and practicality. 30+ FPS may be nice to look at, but it should not be a priority for video surveillance.
You may notice some blurred frames in the YouTube video above, but they are not caused by the frame rate. They are caused by the exposure time; you just notice them more in the low-FPS streams because each frame is shown for longer.