image corruption

PeteJ

This is from a Uniview camera on H.264, 10 fps, 1/5 key frame. I don't see this when looking at the camera's web interface, but when I look at BI, it is pretty common at night and happens from time to time during the day. Any idea what might be causing this? I am using decode acceleration in BI (Intel + VPP); not sure if that might be the reason?

[Attachment: Screenshot from 2025-04-06 11-40-26.png]
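One way to narrow this down is to rule out the stream itself: pull the same RTSP feed with a plain software decoder and save a few frames. If those come out clean while BI shows the artifacts, the hardware decode path (Quick Sync/VPP) is the likely suspect. A minimal Python/OpenCV sketch; the RTSP URL and path below are placeholders, not the camera's actual address:

```python
import cv2

RTSP_URL = "rtsp://user:pass@camera-ip:554/media/video1"  # placeholder, not the real path

cap = cv2.VideoCapture(RTSP_URL, cv2.CAP_FFMPEG)  # FFmpeg backend = software decode
if not cap.isOpened():
    raise SystemExit("Could not open stream -- check the URL/credentials")

saved = 0
while saved < 10:                        # save ten frames for a visual check
    ok, frame = cap.read()
    if not ok:                           # read/decode failure
        print("frame read failed, stopping")
        break
    cv2.imwrite(f"sw_decode_{saved:02d}.jpg", frame)
    saved += 1
    for _ in range(30):                  # skip ~3 s at 10 fps between saves
        if not cap.grab():
            break

cap.release()
```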
 
Disabling hardware accel worked, and it doesn't seem to make the CPU work any harder either. The quality does seem poorer though; I'm going to flip it back on and see if the quality improves.
 
Quick Sync/hardware acceleration isn't really needed anymore, and it can be an energy hog.

Around the time AI was introduced in BI, many here had their systems become unstable with hardware acceleration (hardware decode / Quick Sync) turned on, even if they weren't using DeepStack or CodeProject. Some have also been fine. I started to see errors with hardware acceleration several updates after AI was added.

This hits everyone at a different point. Some had their system go wonky immediately, for some it was after a specific update, and some still don't have a problem, but the trend shows that running hardware acceleration will result in a problem at some point.

However, now that substreams have been introduced, the CPU% needed to offload video to a GPU (internal or external) is more than the CPU% saved by the offload itself. Especially past about 12 cameras, CPU usage actually goes up with hardware acceleration. The wiki points this out as well.

Plus, substreams open up the possibility for older machines to be just fine, along with non-Intel computers.

My CPU % went down by not using hardware acceleration.
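If you want to quantify the before/after, you can log CPU% for a few minutes with hardware acceleration on, then again with it off. A rough Python/psutil sketch; the process name "BlueIris.exe" is an assumption, so check Task Manager for the actual name on your machine:

```python
import time
import psutil

PROCESS_NAME = "BlueIris.exe"   # assumption -- check Task Manager for the real name

def find_proc(name):
    """Return the first process whose name matches, or None."""
    for p in psutil.process_iter(["name"]):
        if p.info["name"] and p.info["name"].lower() == name.lower():
            return p
    return None

proc = find_proc(PROCESS_NAME)
if proc:
    proc.cpu_percent(None)              # prime the per-process counter

print("time, system_cpu_pct, bi_cpu_pct")
for _ in range(60):                     # ~5 minutes at 5-second intervals
    sys_cpu = psutil.cpu_percent(interval=5)
    bi_cpu = proc.cpu_percent(None) / psutil.cpu_count() if proc else float("nan")
    print(f"{time.strftime('%H:%M:%S')}, {sys_cpu:.1f}, {bi_cpu:.1f}")
```

Run it for a stretch under each setting and compare the averages.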

Here is a recent thread where someone turned off hardware acceleration based on my post; their CPU dropped 10-15% and BI became stable.

But if you use HA, use plain Intel and not the variants.

Some still don't have a problem, but it may eventually catch up with them too.

Here is a sampling of recent threads where turning off HA fixed the issues people were having...

No hardware acceleration with subs?


Hardware decoding just increases GPU usage?


Can't enable HA on one camera + high Bitrate


And as always, YMMV.
 
Thanks @wittaj, I always learn a lot from your detailed replies. :)

I just tried plain Intel hw decode (I was using Intel + VPP) and I was still getting the corruption, so I switched to Intel Beta just to see what happens.

Yeah, I was having some BI issues, and one of the first things support tells you to do is disable hw decode...
 