Yeah, I have to agree. On a few specific objects I perceived slight blur, more so at 30 FPS.
Exactly what I have been saying all along LOL
Now of course lots of other variables do come into play - the field of view and what the camera is trying to display can impact it, along with the capabilities of the camera itself - some don't allow higher bitrates, so that should be taken into consideration when determining the FPS to run.
With some cameras and fields of view it may not be as obvious as with others. Certainly one's eyes and monitor come into play as well.
For example, a 4K budget cam shoved onto a 1/2.8" sensor where the camera is limited to an 8,192 kbps bitrate will perform better at a lower FPS because of this - that capped bitrate gets spread across more frames at 30 FPS, so each frame gets fewer bits and more compression.
Conversely, a 4MP camera on a 1/1.8" sensor that can run, say, 8,192 kbps at 15 FPS and 16,384 kbps at 30 FPS may not show as obvious a difference, provided the camera isn't being maxed out in other ways like too many AI rules or ROI or some other CPU-intensive feature.
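Just to put rough numbers on it, here's a quick back-of-the-envelope sketch of how many bits each frame gets in those two examples. The bitrates and frame rates are only the illustrative figures from above, not specs for any particular camera, and this averages the budget evenly across frames, ignoring how I-frames and P-frames actually split it:

```python
# Average bits available per frame: a capped bitrate divided across more
# frames per second leaves fewer bits (i.e. more compression) per frame.
def kbits_per_frame(bitrate_kbps: float, fps: int) -> float:
    """Average kilobits available to encode each frame."""
    return bitrate_kbps / fps

# Illustrative scenarios from the examples above (not real camera specs).
scenarios = [
    ("4K cam capped at 8,192 kbps, 15 FPS", 8192, 15),
    ("4K cam capped at 8,192 kbps, 30 FPS", 8192, 30),
    ("4MP cam, 8,192 kbps at 15 FPS", 8192, 15),
    ("4MP cam, 16,384 kbps at 30 FPS", 16384, 30),
]

for label, kbps, fps in scenarios:
    print(f"{label}: ~{kbits_per_frame(kbps, fps):.0f} kbits per frame")
```

The capped camera drops from roughly 546 to 273 kbits per frame when you double the FPS, while the one that can also double its bitrate keeps the same per-frame budget - which is why the first one tends to look softer on freeze frames at 30 FPS.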
But I think the takeaway is similar to what we say on determining what bitrate to use - start at the benchmarks we mention and then go up and down until you find the sweet spot for your field of view. If your eyes and monitor cannot distinguish between 8,192 kbps and 16,384 kbps, then there's no reason to use the higher bitrate.
So the same with FPS - try different rates, then do a freeze frame and see if you notice a difference.
And also try digital zoom - you may get by with more digital zoom at lower FPS.
As I mentioned, my neighbor and I both have the same older 2MP camera model; we see essentially the same field of view and have the cameras set up identically, except he runs 30 FPS and I run 12 FPS, and his freeze frames are blurrier than mine - I think for this very reason.