BI changes Frame Rate settings on its own

nejakejnick

Aug 30, 2015
Several times now I've had to check every camera and reset the proper frame rate in BI, because it had changed on some cameras, e.g. from 15 to 17.5, or from 6 to 7.5. An improper frame rate results in motion artifacts in recordings, AFAIK.
Does this happen to anyone else?

DS-2CD2032F-I
 
What version of BI are you running? Blue Iris auto-adjusts the frame rate; setting it in Blue Iris does nothing. Make sure your camera's I-frame interval matches your FPS.
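The interval-vs-FPS check suggested above can be sketched outside Blue Iris: given per-frame timestamps and keyframe flags (which you might collect from the stream with a tool such as ffprobe), estimate the effective FPS and the GOP length and flag a mismatch. This is an illustrative sketch; the function and variable names are assumptions, not Blue Iris or camera APIs.

```python
# Hypothetical sketch: estimate effective FPS and I-frame (GOP) interval
# from a stream's per-frame timestamps and keyframe flags, and check the
# common advice that I-frame interval == FPS (one keyframe per second).

def analyze_stream(timestamps, is_keyframe):
    """timestamps: seconds per frame; is_keyframe: parallel list of bools."""
    duration = timestamps[-1] - timestamps[0]
    fps = (len(timestamps) - 1) / duration if duration > 0 else 0.0
    key_idx = [i for i, k in enumerate(is_keyframe) if k]
    # Frames between consecutive keyframes = I-frame (GOP) interval.
    gaps = [b - a for a, b in zip(key_idx, key_idx[1:])]
    gop = sum(gaps) / len(gaps) if gaps else 0.0
    matched = abs(gop - fps) < 1
    return round(fps, 2), gop, matched

# Example: a 15 fps stream with a keyframe every 15 frames (matched),
# versus one with a keyframe every 30 frames (mismatched).
ts = [i / 15 for i in range(46)]
keys = [i % 15 == 0 for i in range(46)]
print(analyze_stream(ts, keys))
keys2 = [i % 30 == 0 for i in range(46)]
print(analyze_stream(ts, keys2))
```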
 
Today I noticed it on 4.2.8. The frame rate does matter, though I'm not sure exactly when; perhaps only when using a variable bitrate...
 
I emailed support about this when I noticed it. We're talking about the frame rate in the Video tab, right? From what I gather from their responses, this is a cosmetic issue. It seems to reset to the highest FPS value it receives from the camera (how my camera is sending more than what I set it to, I don't know). And even when the FPS goes back to normal, the value in the Video tab seems to show the high-water mark of the FPS received from the camera.

If you set it to a lower value (10), you should see it move up to 15 (if that's what your camera is sending). If you set it to 30, it will stay at 30 and not reset lower.
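The behavior described above (the field rising to match incoming FPS but never dropping on its own) amounts to a high-water mark. A toy Python model of that guess, purely illustrative and not actual Blue Iris code:

```python
# Toy model of the Video-tab FPS field as a high-water mark: it rises to
# the highest measured FPS from the camera and only drops when the user
# explicitly sets it lower. Names are illustrative assumptions.

class FpsField:
    def __init__(self, configured):
        self.shown = configured

    def on_measured(self, fps):
        # Camera delivers frames at some rate; field only ever ratchets up.
        self.shown = max(self.shown, fps)

    def user_set(self, fps):
        # A manual edit can move the field in either direction.
        self.shown = fps

f = FpsField(10)
f.on_measured(15)
print(f.shown)   # rises to 15 even though the user set 10

f.user_set(30)
f.on_measured(15)
print(f.shown)   # stays at 30; never resets lower on its own
```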
 
I'm pretty sure that when I had some weird frame rate (yes, in the Video tab), my recordings were messed up. Not always; probably only with certain bad combinations of camera FPS and the BI frame rate. I will try to reproduce it and email support...
 
I had distorted recordings as well. I assumed it was the higher FPS I saw in the Video tab, but it doesn't seem to be directly related. Sometimes, when it's higher, recordings are fine. When it gets weird (something like artifacting/ghosting), it seems to "fix" itself when I drop that FPS back to what it should be.

I ran a test recently when it did that: I left the FPS alone (17 when it should be 15) and just made a setting change somewhere else and saved it. BI then shows the bars for the camera before showing the feed again. Recordings were good again, and the FPS setting still showed that high-water mark of 17. I'm now testing without direct-to-disc to see if it still happens.
 
Two things to check:
1) Match the I-frame interval to the FPS.
2) Increase the receive buffer in Blue Iris: camera properties > Video > Configure... and set it to 20 MB.
 
2) increase the receive buffer in blue iris. camera properties>video>configure...set it to 20mb

Hey Fenderman, thanks for all the useful help on this site. Question for you on your quote...is there any documentation on this? I can't find any other than in "What's New" from 2013:


  • The network IP camera receive buffer size setting is now used for both Windows receive buffer as well as the software's own frame buffer. If you have a camera that produces very large key frames (beyond 768k) you should set this value to be twice the maximum key frame size that is anticipated. This applies to multi-megapixel cameras set for extremely high quality.

What are the recommended settings for the receive buffer with regard to camera resolution, bitrate, etc.? Does setting the buffer higher use significantly more resources? I can of course experiment, but if you have any insight, it'd be greatly appreciated. BTW, all my cameras are set to use the max of 20 MB.
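The sizing rule from the quoted "What's New" note (buffer at least twice the largest anticipated key frame) can be sketched as back-of-the-envelope arithmetic. The key-frame size here is estimated from bitrate and I-frame interval, and the 0.5 weighting (key frames carrying roughly half of a GOP's bits) is an assumption for illustration, not a Blue Iris figure:

```python
# Rough sketch of the "twice the maximum key frame size" rule. All
# inputs and the keyframe_share weighting are illustrative assumptions.

def recommended_buffer_bytes(bitrate_kbps, fps, gop_frames, keyframe_share=0.5):
    gop_seconds = gop_frames / fps                    # length of one GOP
    gop_bytes = bitrate_kbps * 1000 / 8 * gop_seconds # bits -> bytes per GOP
    max_keyframe = gop_bytes * keyframe_share         # assumed keyframe share
    return 2 * max_keyframe                           # 2x rule from the docs

# e.g. a camera at 8192 kbps, 15 fps, keyframe every 15 frames:
print(recommended_buffer_bytes(8192, 15, 15) / 1e6, "MB")
```

By this estimate a typical stream needs far less than 20 MB, which would fit the posters' experience that simply maxing the setting out costs nothing noticeable.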
 
I was unable to reproduce my problem. My I-frame interval matches the FPS, and now I've also increased the buffer. I'll just wait for it to occur again...
 
It's based on posts I have seen over the years, I don't recall the origin. I have not noticed any issues when setting it at the max.
 

I tried reducing the buffer size and saw no change in CPU usage. I imagine the buffer becomes a factor during recording of motion-intense scenes; clip write errors, ghosting, and artifacts would be symptoms of a buffer shortage. Thanks.
 
I've seen the same thing happen with recent releases: the I-frame interval is matched to the FPS, the buffer is 20 MB, etc. I set 15 FPS and I get 17.50 in BI. I've just left it alone, though.