I commonly reduce the frame rate of recordings in an attempt to save a bit of storage and/or bandwidth, particularly when the camera is just looking for humans walking at a moderate distance. By contrast, when I want to capture fast-moving vehicles (LPR in particular) or wildlife running/flying by, I keep the frame rate as high as possible.
For the times when I do reduce the frame rate, on the main stream and almost always on the substream, I have always reduced it BOTH on the camera AND in Blue Iris. My theory was that reducing it on the camera would lessen the bandwidth needed to stream to Blue Iris. Given that, I probably didn't need to reduce the BI max fps recording value as well, but for some reason I got into the habit of doing that too. I think it just gave me confidence that if either setting were accidentally reset, I'd still get a similar end result.
For purposes of discussion, I generally set the camera's i-frame interval equal to my frames per second, so that I get a key frame once per second. I commonly use 5 fps with a matching i-frame interval of 5.
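The relationship is just: key-frame spacing in seconds = i-frame interval / fps. A quick sanity check in Python (illustrative only, not anything camera-specific):

```python
def keyframe_period_seconds(iframe_interval: int, fps: int) -> float:
    """Seconds between key frames for a given i-frame interval and frame rate."""
    return iframe_interval / fps

# My usual setting: 5 fps with an i-frame interval of 5 -> one key frame per second.
print(keyframe_period_seconds(5, 5))    # 1.0
# A 15 fps stream would need an interval of 15 for the same one-per-second spacing.
print(keyframe_period_seconds(15, 15))  # 1.0
```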
This has worked and continues to work fine for me, but I can't help but wonder: by reducing the frame rate in both Blue Iris and the camera, might Blue Iris periodically miss i-frames due to some synchronization issue? Would there be any video quality improvement if I used a higher frame rate on the camera and then let Blue Iris reduce the frame rate of the recording? I assume BI keeps the camera's raw i-frame interval when it records direct-to-disc without transcoding, but is there any other nuance to the i-frame interval setting if BI is also limiting the frame rate of the recording?
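To make my worry concrete, here's a toy Python sketch of the phase problem. This is purely hypothetical (a naive "keep every Nth frame" limiter) and not a claim about how Blue Iris actually decimates frames; it just shows why dropping frames without regard to key-frame position could go badly:

```python
# Toy model: a 15 fps stream with a key frame every 15 frames,
# naively decimated to 5 fps by keeping every 3rd frame.
# Whether the key frames survive depends entirely on the starting offset.
SOURCE_FPS = 15
IFRAME_INTERVAL = 15            # one key frame per second on the camera
TARGET_FPS = 5
STEP = SOURCE_FPS // TARGET_FPS  # keep every 3rd frame

# Two seconds of frames: "I" = key frame, "P" = predicted frame.
frames = ["I" if n % IFRAME_INTERVAL == 0 else "P" for n in range(60)]

for offset in range(STEP):
    kept = frames[offset::STEP]
    print(f"offset {offset}: kept {kept.count('I')} of {frames.count('I')} key frames")
# offset 0 keeps every key frame; offsets 1 and 2 keep none,
# leaving only P-frames that reference data the limiter discarded.
```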
I realize some of this discussion might be purely theoretical or academic, but I'd rather be doing what "might be best" than something that "likely is not the best." Does anyone have an opinion they can back up with either theory or a practical result they've seen?