Hardware decoding just increases GPU usage?

TomBowyer

Hi,
Can anyone help explain what looks to me like very strange CPU/GPU usage behaviour when switching hardware decode to Intel?

With the exact same camera/settings/system, setting hardware decode to Intel increases CPU usage by ~10%, and GPU usage goes from 0 to ~70%. There is no noticeable difference in camera FPS, quality or anything else... I feel like I'm either being incredibly stupid and missing something, or my system has gone haywire.
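
(Side note in case it helps anyone reproducing this: Task Manager's headline GPU graph doesn't always reflect the Video Decode engine specifically. A rough Python sketch for watching that engine via Windows performance counters - the counter path is an assumption and may vary by Windows build:)

Code:
# Rough sketch: sample the iGPU's Video Decode engine utilisation so you
# can see whether Quick Sync is actually doing the decoding.
# Assumption: the "GPU Engine" counter path below exists on your Windows build.
import subprocess

COUNTER = r"\GPU Engine(*engtype_VideoDecode)\Utilization Percentage"

# Take five one-second samples; typeperf prints CSV rows to stdout.
subprocess.run(["typeperf", COUNTER, "-sc", "5"], check=True)

If Quick Sync is engaged, that counter should jump while BI is decoding.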

CPU/GPU usage with Hardware Decode set to "no":
[screenshots: 1653902937915.png, 1653902996616.png]


CPU/GPU usage with Hardware Decode set to "Intel":
[screenshots: 1653903138637.png, 1653903077783.png]


Specs / Settings:
i7-9700 @ 3.00 GHz, 8 cores
32 GB RAM
Samsung SSD for the OS and 2x WD Purple drives for the cameras.

[screenshot: 1653904333429.png]

Thanks!
Tom.
 
It's likely not related, but you have Smart Codec set to on. You need to turn it off because it is not compatible with Blue Iris - with it on, the camera generates very few key frames.
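
If you want to verify what the camera is actually sending, ffprobe can count key frames over a short sample. A minimal sketch, assuming ffprobe is on the PATH and using a hypothetical Dahua-style RTSP URL (swap in your camera's real main-stream URL):

Code:
# Count key frames in the first 200 frames of the stream to estimate the
# key frame ratio. The URL below is a made-up example, not from this thread.
import subprocess

RTSP_URL = "rtsp://user:pass@192.168.1.108:554/cam/realmonitor?channel=1&subtype=0"

def key_frame_ratio(url: str, frames: int = 200) -> float:
    """Key frames divided by total frames over the first `frames` frames."""
    values = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "frame=key_frame",
         "-read_intervals", f"%+#{frames}",
         "-of", "csv=p=0", url],
        capture_output=True, text=True, check=True,
    ).stdout.split()
    return values.count("1") / max(len(values), 1)

print(f"key frame ratio: {key_frame_ratio(RTSP_URL):.3f}")

The usual advice is to set the camera's I-frame (key frame) interval equal to its FPS, i.e. about one key frame per second; with Smart Codec on you will typically see far fewer than that.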
 
There has been a trend away from hardware acceleration "on" with newer versions of BI, since about ver. 5.3 or 5.4 or thereabouts - from what I've been told, anyway.
I turned it off globally in the "Cameras" tab on my 18-cam system after an upgrade to 5.4.7.11.
That's also when I began using substreams on 16 of 18 cameras.
It's an i5-8500 and it usually shows CPU usage at 12-21% depending on motion activity.
FPS is about 8-12 across the cameras, even my LPR.
 
The home system is streaming 9 cameras to BI with HA off, and the cams are at 8 FPS, except for the Amcrest AD110 doorbell. I don't recall if you can limit the FPS on that thing.
[screenshots: Screenshot 2022-05-30 091116.jpg, Screenshot 2022-05-30 091315.jpg]
 
Yep, Smart Codec is a no-no with BI.

As @Flintstone61 points out, around the time DeepStack was introduced, many here had their systems become unstable with hardware acceleration on (even if they weren't using DeepStack). Others have been fine.

However, now that substreams have been introduced, the CPU% spent shuttling video to and from the GPU is often more than the CPU% saved by decoding there. In fact, hardware acceleration isn't even shown in the optimization wiki anymore.
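
You can see this trade-off outside of BI, too. A rough sketch (Unix-only, since it uses the resource module, and assuming an ffmpeg build with Quick Sync support) that compares the CPU time spent decoding the same clip in software vs. with QSV:

Code:
# Compare child-process CPU time for software decode vs. Intel Quick Sync.
# "sample_main_stream.mp4" is a hypothetical clip exported from the camera.
import resource
import subprocess

CLIP = "sample_main_stream.mp4"

def child_cpu_seconds(cmd):
    """Run cmd and return the CPU seconds its child process consumed."""
    before = resource.getrusage(resource.RUSAGE_CHILDREN)
    subprocess.run(cmd, check=True, capture_output=True)
    after = resource.getrusage(resource.RUSAGE_CHILDREN)
    return (after.ru_utime - before.ru_utime) + (after.ru_stime - before.ru_stime)

sw = child_cpu_seconds(["ffmpeg", "-v", "error", "-i", CLIP, "-f", "null", "-"])
hw = child_cpu_seconds(["ffmpeg", "-v", "error", "-hwaccel", "qsv", "-i", CLIP, "-f", "null", "-"])
print(f"software decode: {sw:.1f} CPU-s, qsv decode: {hw:.1f} CPU-s")

On a low-resolution substream, software decode is already so cheap that the overhead of copying frames to and from the GPU can eat up the savings.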

As mentioned above, also make sure you are doing every optimization in the wiki. You are not using substreams, and they are a game changer. I do not know how many cameras you have, but if you are seeing that kind of CPU usage with only a handful of cameras, something is seriously wrong with your system. Members here run 50 cameras on a 4th-gen CPU at 30%.

Do not be concerned about substream quality (that seems to be a big stumbling block for people) - if substreams resulted in worse performance, nobody here would be using them.
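
If you want to convince yourself first, probe both streams and look at what BI would actually be decoding. Another small sketch with hypothetical Dahua-style URLs (subtype=0 main, subtype=1 sub) - check your camera's manual for the real paths:

Code:
# Print resolution and frame rate of the main stream vs. the substream.
# Both URLs are made-up examples; substitute your camera's real ones.
import subprocess

STREAMS = {
    "main (subtype=0)": "rtsp://user:pass@192.168.1.108:554/cam/realmonitor?channel=1&subtype=0",
    "sub (subtype=1)": "rtsp://user:pass@192.168.1.108:554/cam/realmonitor?channel=1&subtype=1",
}

for name, url in STREAMS.items():
    info = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=width,height,avg_frame_rate",
         "-of", "csv=p=0", url],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(f"{name}: {info}")  # e.g. 2560,1440,15/1 vs 704,480,15/1

BI watches the small stream for motion and only touches the main stream when it needs to record, which is where the CPU savings come from.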
 
Hey - just posting to say thanks for the replies and suggestions. I thought I had gone through all the settings and knew my stuff, but clearly I haven't and don't. For example, I turned off substreams because I thought they were just extra load (I only want the main stream, so why would I need a substream?) - but now I see they help with motion detection etc.

I will go through the suggestions and post back the results. Thanks!
 