Lowering BI webcast JPEG quality increases CPU usage?

djangel
Pulling my weight
Joined: Aug 30, 2014 · Messages: 336 · Reaction score: 149
I was playing with BI settings, trying to reduce some of my computer's CPU usage, and changed the JPEG quality from the default 85% to 50% for all my cams, and my CPU usage actually increased. Is that normal? Once I restored the settings, CPU went back down to the original readings. Anyone else with a similar issue? I'm wondering if this is to be expected, or if it could be a software bug or a software/encoding issue with my computer. Any input will be greatly appreciated.

bp2008
Staff member
Joined: Mar 10, 2014 · Messages: 12,681 · Reaction score: 14,043 · Location: USA
That is to be expected. JPEG streaming works by sending one image at a time: when one image has fully transferred, the next image transfer begins. Lower image quality means smaller file sizes and faster transfers. With faster transfers, the browser is able to request images more often, and that means Blue Iris spends more time encoding JPEG images!
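To make that feedback loop concrete, here is a minimal Python sketch of the pull model. The host, port, path, and "q" quality parameter are assumptions for the sketch, not necessarily Blue Iris's actual snapshot API:

```python
import time
import urllib.request

# Illustrative snapshot URL: the host, port, path, and "q" quality
# parameter are assumptions for this sketch, not a documented BI API.
URL = "http://bi-server:81/image/cam1?q=50"

def poll_jpegs(url, seconds=10):
    """Request JPEG frames back to back: the next request only starts
    once the previous image has fully downloaded."""
    frames = 0
    total_bytes = 0
    end = time.time() + seconds
    while time.time() < end:
        data = urllib.request.urlopen(url).read()  # one full image per request
        frames += 1
        total_bytes += len(data)
    avg_kb = total_bytes / frames / 1024
    print(f"{frames} frames in {seconds}s "
          f"({frames / seconds:.1f} fps, avg {avg_kb:.0f} KB/frame)")

# Smaller JPEGs finish downloading sooner, so the loop comes back around
# faster, the server is asked to encode more frames per second, and its
# CPU use goes up even though each individual encode is cheaper.
poll_jpegs(URL)
```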

Encoding a single image costs roughly the same CPU regardless of the quality setting. The CPU usage difference is a result of the browser being able to request images more often.
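Putting rough, assumed numbers on it (a sketch, not measurements):

```python
# Back-of-envelope: over a link that can move ~2 MB/s to one viewer, the
# browser can pull roughly (throughput / image size) frames per second.
# All figures below are assumed for illustration, not measured.
throughput = 2_000_000      # bytes per second available to the stream (assumed)
size_q85 = 200_000          # ~200 KB per frame at 85% quality (assumed)
size_q50 = 80_000           # ~80 KB per frame at 50% quality (assumed)

for label, size in [("q=85", size_q85), ("q=50", size_q50)]:
    fps = throughput / size
    print(f"{label}: ~{fps:.0f} JPEG encodes per second requested of Blue Iris")

# q=85 -> ~10 encodes/s, q=50 -> ~25 encodes/s: more than twice as many
# encodes, so total CPU rises even though each encode is a bit cheaper.
```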