Blue Iris does not seem to be using GPU

kolt_

Getting the hang of it
Joined
Jul 2, 2021
Messages
37
Reaction score
58
Location
United States
I currently have Blue Iris installed on a Windows 10 VM on Proxmox with a Tesla P4 passed through. I set hardware accelerated decode to Nvidia NVDEC, and even set hardware decode to the same thing for each camera, yet my CPU usage is still around 60-70% with lots of frame loss. Is this setup even possible, or am I missing something? I also installed the latest drivers for the P4 and rebooted. I just want all the decoding to happen on the GPU, but it doesn't seem to be working.
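One way to confirm whether NVDEC is actually doing any decoding is to poll the driver's own counters while the cameras are live. Below is a minimal sketch using the nvidia-ml-py (pynvml) package; this is just my assumption about how you might check, not anything built into Blue Iris. Run it inside the VM:

import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the P4 is GPU 0

name = pynvml.nvmlDeviceGetName(handle)
if isinstance(name, bytes):  # older pynvml versions return bytes
    name = name.decode()

# Whole-GPU load vs. the dedicated NVDEC decoder block.
rates = pynvml.nvmlDeviceGetUtilizationRates(handle)
dec_util, _period_us = pynvml.nvmlDeviceGetDecoderUtilization(handle)

print(f"{name}: GPU {rates.gpu}%, decoder {dec_util}%")
pynvml.nvmlShutdown()

If the decoder reads 0% while the cameras are streaming, BI is not handing frames to NVDEC at all. Watching the dec column of nvidia-smi dmon should show the same thing.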

Anyone had similar issues?
 

wittaj

IPCT Contributor
Joined
Apr 28, 2019
Messages
25,189
Reaction score
49,083
Location
USA
Do not use the graphics card for hardware acceleration as that has been problematic with newer versions of BI for many people.

Around the time AI was introduced in BI, many here had their systems become unstable with hardware acceleration on (even if they weren't using DeepStack or CodeProject). Some have also been fine. I started to see that kind of instability myself when I was using hardware acceleration.

This hits everyone at a different point: some had their system go wonky immediately, for some it was after a specific update, and some still don't have a problem. The trend, though, is that running hardware acceleration results in a problem at some point.

However, now that substreams have been introduced, the CPU% spent shuttling video to and from a GPU is more than the CPU% saved by decoding on the GPU. Especially beyond about 12 cameras, CPU usage actually goes up when using a GPU and hardware acceleration.
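To make the substream point concrete, here is some back-of-the-envelope arithmetic (a sketch with illustrative resolutions and frame rates, not numbers from this thread). Decoding a small substream costs a fraction of the pixels of the main stream, which is why the CPU can usually handle it on its own:

# Rough pixel-rate arithmetic for substreams vs. main streams.
# All figures below are illustrative assumptions, not measurements.
cams = 12
fps = 15
main = 2560 * 1440              # 1440p main stream
sub = 640 * 360                 # typical low-res substream

main_rate = cams * fps * main   # pixels/second decoding every main stream
sub_rate = cams * fps * sub     # pixels/second decoding only substreams

print(f"main streams: {main_rate / 1e6:.0f} Mpx/s")
print(f"substreams:   {sub_rate / 1e6:.0f} Mpx/s ({main // sub}x less)")

With these example numbers the substreams are 16x cheaper to decode, so the fixed overhead of pushing frames to a GPU can easily exceed whatever the GPU saves.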

My own CPU % went down once I stopped offloading to the GPU.

It is best to just use the GPU now for AI and use substreams for BI.
 