BI New camera limit

davisbacon (n3wb), May 6, 2018
Hello, I'm trying to add a ninth camera and find that I have to disable one of my other cameras to view the new one in the viewing panel.
Is this a known limit of the camera viewer or a hardware resource limitation?

Thank you
 
There is no such limitation. What happens when you try to add it?
 
Am using the Nvidia NVDEC

BI5 broke Nvidia hardware acceleration. I've seen the same behavior on multiple systems and NVidia GPUs. Systems that could do a couple dozen cameras on v4 would behave like you're describing on v5 with a half dozen.

Ken (the Blue Iris developer) said that he switched to newer codecs and it was up to the codec developers to resolve the problem.

Until they do, NVidia hardware acceleration is pretty much useless. Switch to Intel or use Blue Iris v4.
 
Why would you need to use bi4? Just use 5.2.1.4
 
BI5 did not break Nvidia hardware acceleration. This symptom (an additional camera failing to change out of "no signal") has been exactly what happens when you exceed the Nvidia decoder's capabilities since day one.
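One way to check whether you're actually hitting the decoder's ceiling is to watch NVDEC utilization (not overall GPU load) while the cameras are connected, for example with "nvidia-smi dmon -s u". Here is a rough sketch of the same check using the nvidia-ml-py (pynvml) bindings; it only reads utilization counters and doesn't interact with Blue Iris at all:

```python
# Sample NVDEC (video decode engine) utilization once per second while the
# cameras are streaming; if it sits near 100%, added cameras will fail to decode.
# Requires the nvidia-ml-py package: pip install nvidia-ml-py
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first (or only) Nvidia GPU

try:
    for _ in range(30):  # sample for about 30 seconds
        decoder_util, _period_us = pynvml.nvmlDeviceGetDecoderUtilization(handle)
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"NVDEC {decoder_util:3d}%   VRAM used {mem.used / 2**20:6.0f} MiB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```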
 
I never had the problem in BI4. After upgrading to BI5, systems that had a few dozen cameras on Nvidia couldn't handle more than 7 or 12 or whatever depending on the card. I even tried upgrading to newer/higher RAM cards. Ken confirmed he changed the decoder between v4 and v5 and said the lower usable number of cameras was out of his hands.

As I don't have easy physical access to the larger systems or hours and hours and hours of time to waste periodically installing video cards that used to work fine, I've just given up on NVidia and either dialed back camera loads or thrown better Intel hardware at it. Which is a shame, because I had fewer problems with NVidia than Intel.

So, no, not "since day one". Or at least I never hit a limit on v4 but crashed into one with v5. Upgrading was such a cluster fuck that I've still got around a dozen systems with paid v5 licenses running v4 because I just haven't wanted to deal with it.

I'm going to define "broken" as "worked with 50 cameras but now works with 25 or 7 or whatever the number was". Do you disagree that such a drastic drop in camera count is "broken"? Do you consider that "working fine"? And if the limit changed between versions, how can it have been this way "since day one"?

I really appreciate all you've done with UI3, so, if you'd like to communicate directly, I'd be happy to address the questions you seem to have regarding my credibility. Because either I'm making things up, or there's a problem you aren't aware of. Feel free to send me a PM and I'll reply back with my contact info.
 
Why would you need to use bi4? Just use 5.2.1.4

For whatever reason, the NVidia decoder in v4 can handle a lot more cameras than the decoder in v5.

Trying out a few different video cards, I couldn't find a pattern to it. Didn't seem related to memory or GPU. I eventually gave up and threw bigger computers at it and used the overloaded ones elsewhere.
 
@Zanthexter, I will admit that I did not do extensive testing before disagreeing that "BI5 broke Nvidia hardware acceleration." I had only messed around with one camera a few days earlier for the purpose of collecting data on which methods of hardware acceleration worked and which did not. During that test, I saw GPU utilization at expected levels.

So just now I did more testing to see if I also have reduced Nvidia decoding capacity, and it looks like maybe I do, but only just barely.

I still run the same BI server hardware as I did during my initial tests of the Nvidia hardware acceleration feature in BI4. Back then I was able to run a particular set of 11 cameras with Nvidia decoding, and that was the limit of my GT 1030 card. So just now I tried enabling Nvidia decoding for the same 11 cameras in BI 5.2.3.2, Nvidia driver 432.00. Two of those cameras have since been upgraded from 4MP to 8MP, so for this test I swapped them out for two different 4MP cams. The result is that I was able to get only 10 cameras running on Nvidia, down from 11. If I swap the 11th cam (4MP) for a 2MP cam, I can get an 11th camera going. So I am seeing possibly a small amount of reduced capacity. Or it could just be the result of changes in camera frame rates that I have made since 2018, which my test just now did not account for. Certainly I am not seeing anything resembling a loss of 50+% of capacity as you guys have.
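For anyone who wants to compare camera sets like this apples-to-apples, a crude yardstick is total pixel rate (resolution × frame rate, summed across cameras), since swapping a 4MP cam for an 8MP cam or changing FPS moves the load even when the camera count stays the same. A quick sketch; the resolutions and frame rates below are placeholders, not the actual settings from this test, and it ignores codec and bitrate differences:

```python
# Rough comparison of total decode load across a camera set, in megapixels/second.
# The resolutions and frame rates below are illustrative placeholders, not the
# actual settings from this thread.
cameras = [
    # (name, width, height, fps)
    ("4MP cam A", 2688, 1520, 15),
    ("4MP cam B", 2688, 1520, 15),
    ("8MP cam",   3840, 2160, 15),
    ("2MP cam",   1920, 1080, 30),
]

for name, w, h, fps in cameras:
    print(f"{name}: {w}x{h} @ {fps} fps = {w * h * fps / 1e6:7.1f} MP/s")

total = sum(w * h * fps for _, w, h, fps in cameras) / 1e6
print(f"Total decode load: {total:.1f} MP/s")
```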

During these tests, I saw a change in how BI now handles overloading the Nvidia decoder. Instead of showing a "No Signal" screen for the cam, it logged "HW VA not compatible: -542398533" and automatically set HWVA to "no" for that camera. On occasion it also logged "FORCE quitting pXServer thread" for the same camera. Whenever it did that, the camera would get stuck in the colored bars (loading) state until I restarted the Blue Iris service.
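Since the only recovery for the stuck colored-bars state was restarting the Blue Iris service, something like the following watchdog could paper over it by tailing the log for that message. Treat it as a sketch only: the log path and the service name below are assumptions about a typical install, not verified defaults, and it has to run from an elevated prompt to restart the service:

```python
# Watchdog sketch: tail a Blue Iris log file and restart the service when the
# "FORCE quitting pXServer thread" message appears, since that was the state
# that required a manual service restart.
# ASSUMPTIONS: the log path and Windows service name vary by install; both
# values below are guesses to adjust, not documented defaults.
import subprocess
import time

LOG_PATH = r"C:\BlueIris\log\current.txt"   # assumption: point this at your BI log
SERVICE_NAME = "BlueIris"                   # assumption: check services.msc for the real name
TRIGGER = "FORCE quitting pXServer thread"

def restart_blue_iris():
    # net stop/start must run from an elevated prompt.
    subprocess.run(["net", "stop", SERVICE_NAME], check=False)
    subprocess.run(["net", "start", SERVICE_NAME], check=False)

with open(LOG_PATH, "r", errors="ignore") as log:
    log.seek(0, 2)  # only watch entries written after the script starts
    while True:
        line = log.readline()
        if not line:
            time.sleep(5)
            continue
        if TRIGGER in line:
            print("Trigger seen, restarting the Blue Iris service...")
            restart_blue_iris()
            time.sleep(60)  # give the service time to come back up before resuming
```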