BI5: Nvidia Decode Limited to 4 Cameras

eroji

Young grasshopper
Joined
Jul 10, 2015
Messages
36
Reaction score
3
Just upgraded to BI5 and everything seems to be working with the exception of NVDEC. It seems to be limited to 4 cameras now, whereas before it was working for all 8 at the same time. Anyone else having this problem or know of a fix?
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,666
Reaction score
14,007
Location
USA
Check the utilization of the Nvidia GPU in Task Manager. When it runs out of free processing time, new cameras just kind of break.
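Task Manager's default GPU view doesn't always surface the video-decode engine. If `nvidia-smi` (installed with the Nvidia driver) is on your path, something like this shows decoder load and memory directly; this is a general Nvidia tool, not anything BI-specific:

```shell
# Per-second engine utilization; the "dec" column is the NVDEC engine.
nvidia-smi dmon -s u

# One-shot check of video memory, which is what appears to run out here.
nvidia-smi --query-gpu=memory.used,memory.total --format=csv
```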
 

eroji

Young grasshopper
Joined
Jul 10, 2015
Messages
36
Reaction score
3
The GPU utilization is definitely not maxed out. However, it seems to reserve 20% or so of graphics memory for each camera with NVDEC enabled, and the allocation then fails on the 5th camera. I am not sure how it behaved previously, but it did work with all 8. Moreover, I noticed that playback doesn't work in the Android app if the camera is set to use the Nvidia GPU.
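The arithmetic above is easy to sanity-check: if each NVDEC-enabled camera reserves a fixed fraction of VRAM, the count of cameras that fit is just how many reservations stack below 100%. A minimal sketch (the 20%-per-camera figure is the estimate from this post, not a documented BI value):

```python
def cameras_that_fit(vram_fraction_per_camera: float, headroom: float = 1.0) -> int:
    """Count how many fixed-size VRAM reservations fit within the headroom."""
    count = 0
    used = 0.0
    while used + vram_fraction_per_camera <= headroom:
        used += vram_fraction_per_camera
        count += 1
    return count

# A reservation a bit over 20% leaves room for only 4 cameras,
# matching the failure on the 5th described above.
print(cameras_that_fit(0.21))  # -> 4
```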
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,666
Reaction score
14,007
Location
USA
I wonder if you used to be running 32-bit BI and now run 64-bit? Just trying to figure out why the video memory consumption would change like that. I have no clue.
 

Zanthexter

Getting the hang of it
Joined
Aug 5, 2016
Messages
96
Reaction score
39
Ran into the same problem, but with a larger camera allowance, probably because it was a higher-end card.

Instead of Nvidia decoding just getting slower, it just "stops". I had to switch a bunch of cameras to CPU only (since Intel is such a nightmare) to free up not just enough headroom for the remaining cameras, but also for other decoding tasks.
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,897
Reaction score
21,250
Ran into the same problem, but with a larger camera allowance, probably because it was a higher-end card.

Instead of Nvidia decoding just getting slower, it just "stops". I had to switch a bunch of cameras to CPU only (since Intel is such a nightmare) to free up not just enough headroom for the remaining cameras, but also for other decoding tasks.
Intel works great! You just have the wrong driver.
 

Zanthexter

Getting the hang of it
Joined
Aug 5, 2016
Messages
96
Reaction score
39
Intel works great! You just have the wrong driver.
You're correct. I also took steps to keep the supposedly memory-leak-free drivers protected from updates, etc. I still sometimes had issues, mostly on larger setups.

For me, stability matters more than trimming someone else's power bill and I got tired of dealing with Intel issues. Now that Nvidia is having issues, I may revisit that.
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,897
Reaction score
21,250
You're correct. I also took steps to keep the supposedly memory-leak-free drivers protected from updates, etc. I still sometimes had issues, mostly on larger setups.

For me, stability matters more than trimming someone else's power bill and I got tired of dealing with Intel issues. Now that Nvidia is having issues, I may revisit that.
This is usually a result of improperly using Intel plus VPP, or of pushing the Intel hardware acceleration past its limits, which is why Blue Iris lets you manage hardware acceleration on a per-camera basis. You don't have to deal with anything; I have over 20 machines running Intel hardware acceleration with zero issues. You just need to read the directions in the wiki.
 