It seems that whatever I set the "max bitrate" to, the camera will use every bit of it whether I have the framerate set to 1 fps or 20 fps. Logic would dictate that 1/20th the framerate would use roughly 1/20th the bitrate, but that is not the case. I realize why, but the fact that they don't tell you what the optimal bitrate is for any given FPS is pretty stupid, especially considering that the default bitrate selected when you initialize the camera is apparently already constrained. You end up in a situation where you can dramatically lower the framerate and still not get any more storage out of the SD card.
So what is the actual bitrate being used by a 2688x1520 video feed?
How can I calculate this based on the number of pixels, the codec, the quality setting, and the fps?
It would be nice if Hikvision would just give you this info instead of making you work backwards through the Blue Iris stats page.
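
For what it's worth, the only approach I've found is a rough bits-per-pixel rule of thumb, not anything Hikvision actually publishes. Something like the sketch below, where the per-pixel factors are just guesses I've seen floated for H.264/H.265, not official numbers:

```python
# Rough bitrate estimate using a bits-per-pixel (BPP) rule of thumb.
# The BPP values below are guesses, not Hikvision specs -- real encoders
# vary a lot with scene complexity, GOP length, and the "quality" setting.

BPP_GUESS = {
    "h264": 0.10,   # assumed bits per pixel per frame for H.264 at a "medium" quality
    "h265": 0.06,   # assumed H.265 is noticeably more efficient than H.264
}

def estimate_bitrate_kbps(width, height, fps, codec="h264"):
    """Naive estimate: pixels per frame * frames per second * bits per pixel."""
    bits_per_second = width * height * fps * BPP_GUESS[codec]
    return bits_per_second / 1000  # convert to kbps

# Example: 2688x1520 @ 20 fps, H.264
print(estimate_bitrate_kbps(2688, 1520, 20, "h264"))  # ~8,172 kbps
# Same resolution at 1 fps
print(estimate_bitrate_kbps(2688, 1520, 1, "h264"))   # ~409 kbps
```

Of course, that kind of linear scaling with fps is exactly what the camera apparently doesn't do once you've set a max bitrate, which is why I'm asking what it's actually using.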
Camera is a Hikvision 2CD2442FWD-IW, btw.
Thanks very much!