BI5: Nvidia Decode Limited to 4 Cameras

Discussion in 'Troubleshooting' started by eroji, Jun 3, 2019.

  1. eroji

    eroji n3wb

    Just upgraded to BI5 and everything seems to be working with the exception of NVDEC. It seems to be limited to 4 cameras now, whereas before it was working for all 8 at the same time. Anyone else having this problem, or know of a fix?
     
  2. looney2ns

    looney2ns IPCT Contributor

  3. eroji

    eroji n3wb

    Nvidia hardware decoder.
     
  4. bp2008

    bp2008 Staff Member

    Check the utilization of the Nvidia GPU in Task Manager. When it runs out of free processing time, new cameras just kind of break.
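
A way to get the same counters programmatically is NVML. A minimal sketch, assuming the pynvml bindings (pip install nvidia-ml-py) and a single NVIDIA GPU at index 0:

```python
# Query NVDEC utilization and VRAM usage via NVML, as an alternative
# to eyeballing Task Manager's Video Decode graph.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

# Decoder (NVDEC) utilization in percent over NVML's sampling period.
decoder_util, _period_us = pynvml.nvmlDeviceGetDecoderUtilization(handle)

# Video memory: if this is nearly exhausted, a new stream can fail to
# open even though the decoder itself still has cycles to spare.
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)

print(f"NVDEC utilization: {decoder_util}%")
print(f"VRAM used: {mem.used / 2**20:.0f} MiB of {mem.total / 2**20:.0f} MiB")

pynvml.nvmlShutdown()
```

If VRAM is nearly full while decoder utilization stays low, the bottleneck is memory rather than decode throughput, which matches the behavior described in the next post.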
     
  5. eroji

    eroji n3wb

    The GPU utilization is definitely not maxed out. However, it seems to reserve 20% or so of graphics memory for each camera with NVDEC enabled, which then fails to allocate on the 5th camera. I am not sure how it behaved previously, but it did work with all 8. Moreover, I noticed that playback doesn't work in the Android app if the camera is set to use the Nvidia GPU.
     
    Last edited: Jun 5, 2019
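
A back-of-envelope check of the numbers above: if each NVDEC-enabled camera pins a fixed ~20% slice of VRAM on top of some baseline usage, the fifth allocation simply has no room, even with the decoder itself mostly idle. Only the 20% figure comes from the post; the card size and baseline below are assumptions for illustration:

```python
# All figures are illustrative; only the ~20%-per-camera reservation
# comes from the report above.
vram_total_mib = 4096                    # assumed 4 GB card
baseline_mib = 500                       # assumed desktop/driver overhead
per_stream_mib = 0.20 * vram_total_mib   # ~20% of VRAM per NVDEC camera

free_mib = vram_total_mib - baseline_mib
max_streams = int(free_mib // per_stream_mib)
print(max_streams)  # 4 -- the fifth camera's allocation fails
```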
  6. bp2008

    bp2008 Staff Member

    I wonder if you used to be running 32-bit BI and now run 64-bit? Just trying to figure out why the video memory consumption would change like that. I have no clue.
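
One way to settle the 32-bit vs 64-bit question is to read the executable's PE header instead of guessing. A sketch; the install path is an assumption (the usual default) and may differ on your system:

```python
# Report whether an .exe is a 32- or 64-bit PE image by reading the
# Machine field of its COFF header.
import struct

exe_path = r"C:\Program Files\Blue Iris 5\BlueIris.exe"  # assumed default path

with open(exe_path, "rb") as f:
    f.seek(0x3C)                          # e_lfanew: offset of the PE header
    pe_offset = struct.unpack("<I", f.read(4))[0]
    f.seek(pe_offset + 4)                 # skip the 4-byte "PE\0\0" signature
    machine = struct.unpack("<H", f.read(2))[0]

# IMAGE_FILE_MACHINE: 0x014C = x86 (32-bit), 0x8664 = x64 (64-bit)
print({0x014C: "32-bit", 0x8664: "64-bit"}.get(machine, hex(machine)))
```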
     
  7. Zanthexter

    Zanthexter Getting the hang of it

    Ran into the same problem, but with a larger camera allowance, probably because it was a higher-end card.

    Instead of Nvidia decoding just getting slower, it just "stops". I had to switch a bunch of cameras to CPU-only (since Intel is such a nightmare) to free up enough headroom not just for the remaining cameras but also for other decoding tasks.
     
  8. fenderman

    fenderman Staff Member

    Intel works great! You just have the wrong driver.
     
  9. Zanthexter

    Zanthexter Getting the hang of it

    You're correct. I also took steps to keep the supposedly memory-leak-free drivers protected from updates, etc. I still sometimes had issues, mostly on larger setups.

    For me, stability matters more than trimming someone else's power bill, and I got tired of dealing with Intel issues. Now that Nvidia is having issues, I may revisit that.
     
  10. fenderman

    fenderman Staff Member

    This is usually the result of improperly using Intel + VPP or pushing the Intel hardware acceleration past its limits, which is why Blue Iris lets you manage hardware acceleration on a per-camera basis. You don't have to deal with anything; I have over 20 machines running Intel hardware acceleration with zero issues. You just need to read the directions in the wiki.
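
In Blue Iris the per-camera choice is made in each camera's Video settings; no scripting interface is implied here. Purely to illustrate the capacity-planning idea behind that advice, a greedy sketch that assigns cameras to decoders while keeping each device under a headroom budget, with every name, load, and budget hypothetical:

```python
# Hypothetical planner: spread estimated per-camera decode load across
# decoders without exceeding a headroom budget on any one device.
cameras = {                 # estimated decode load per camera, percent
    "driveway": 22, "porch": 22, "garage": 18,
    "backyard": 18, "side": 12, "doorbell": 10,
}
budgets = {"nvdec": 80, "quicksync": 80, "cpu": 100}  # keep ~20% headroom

assignment = {}
used = {dev: 0 for dev in budgets}
for cam, load in sorted(cameras.items(), key=lambda kv: -kv[1]):
    # Heaviest streams first: large loads are hardest to place later.
    for dev in budgets:
        if used[dev] + load <= budgets[dev]:
            assignment[cam] = dev
            used[dev] += load
            break

for cam, dev in sorted(assignment.items()):
    print(f"{cam}: {dev}")
```

Placing the heaviest streams first leaves the small ones to fill the gaps, the same intuition as moving a few cameras off an overloaded decoder by hand.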