Update: The problem was solved until I upgraded to 5.3.7.13 and then 5.3.8.3. After those upgrades my CPU spiked to 100% with no other changes. The spiking occurred when connecting with a web client; without a web client connected, CPU came down to around 80%. No matter what changes I made to the web server, the camera settings, or the Blue Iris settings, using all the published guidance on reducing CPU utilization, nothing brought it down: 100% with a web client connected, around 80% without.
This never happened prior to upgrading to 5.3.7.13 and 5.3.8.3. Oddly, downgrading back to 5.3.6.7 didn't change the behavior (I assume the upgrades beyond 5.3.7.12 changed something persistent that the downgrade didn't revert). This is so frustrating, and it feeds my love/hate relationship with Blue Iris!
After hours and hours of checking and changing settings to get the CPU back to the 67% range while using a web client, I gave up because nothing worked. With no other options left, I decided to follow sebastiantombs' suggestion of configuring and using sub-streams in Blue Iris. After doing so with my same Blue Iris configuration (see the first post in this thread for the details), on version 5.3.8.3 my CPU utilization is down to 16% on average, even with a Remote Desktop session running the thick Blue Iris client, plus several web clients and the iOS client connected to the Blue Iris server.
My own self-assessment is that I didn't listen to sebastiantombs initially because I've become so wary of making any changes to Blue Iris: they always seem to hurt CPU utilization or break something. I had finally gotten my system operating at 67% through the updates up to 5.3.7.12, so I was hesitant to make the changes sebastiantombs suggested for fear of making things even worse than 100% CPU utilization. I also don't think I fully understood how sub-streams would work. After doing my homework on sub-streams, forced to consider them once everything else failed to bring CPU utilization back to the nominal range after the upgrades to 5.3.7.13 and above, I implemented them in Blue Iris and my CPU utilization immediately dropped from 100% to between 16% and 20%, with no other changes.
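For anyone else making this change: as I understand it, sub-streams cut CPU because Blue Iris decodes only the low-resolution sub-stream for continuous motion detection and multi-camera live view, and touches the full-resolution main stream mainly when recording or viewing a single camera. Before entering a sub-stream path on a camera's Video tab, it can help to confirm the camera is actually serving a low-resolution second stream. Here's a minimal sketch using ffprobe (assumes ffmpeg is installed and on PATH; the Hikvision-style RTSP paths and credentials are placeholders, so substitute your camera's real ones):

```python
# Minimal sketch: confirm a camera's sub-stream is really low resolution
# before pointing Blue Iris at it. Requires ffprobe (part of ffmpeg).
import subprocess

def stream_resolution(rtsp_url: str) -> str:
    """Return WIDTHxHEIGHT of the first video stream at rtsp_url."""
    result = subprocess.run(
        [
            "ffprobe", "-v", "error",
            "-rtsp_transport", "tcp",        # TCP is usually more reliable than UDP
            "-select_streams", "v:0",        # first video stream only
            "-show_entries", "stream=width,height",
            "-of", "csv=s=x:p=0",            # print as WIDTHxHEIGHT
            rtsp_url,
        ],
        capture_output=True, text=True, timeout=15,
    )
    return result.stdout.strip() or result.stderr.strip()

# Placeholder URLs -- replace user, password, IP, and paths with your camera's.
main = "rtsp://user:pass@192.168.1.50:554/Streaming/Channels/101"
sub  = "rtsp://user:pass@192.168.1.50:554/Streaming/Channels/102"
print("main stream:", stream_resolution(main))  # e.g. 2560x1440
print("sub stream: ", stream_resolution(sub))   # e.g. 640x480
```

If the sub-stream reports something small like 640x480, it's a good candidate for the sub-stream field in Blue Iris; if it comes back at the same resolution as the main stream, the camera's secondary stream needs to be reconfigured in the camera first.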
I love the developers of Blue Iris because Blue Iris is truly great, very enabling software, with lots of functionality at a very reasonable price. But I would gladly pay $100 more a year if the SDLC process had better quality control and produced more stable production releases, reducing the need for daily x.x.x.x-level updates that fix bugs introduced by the previous update, which was itself fixing a bug from the version before that. I'm not saying bugs can be totally eliminated, given the effectively infinite combinations of OS updates, camera firmware, processors, GPUs, etc., or that bugs shouldn't be addressed and resolved promptly. Rather, I'm suggesting that better quality control (read: testing) would reduce the number of bugs introduced in each successive version.

If I counted the cost of my time dealing with Blue Iris issues, including lots of web searches, reading, and configuration changes across almost 30 cameras, the actual cost of Blue Iris is excessive. Developers of Blue Iris: please take this as constructive criticism from a user who wants Blue Iris to be the best IP camera monitoring software. Why not include an assessment utility that analyzes a Blue Iris configuration and suggests the configuration changes needed to solve an issue?
I hope this post helps others who are suffering CPU spiking after upgrading to 5.3.7.13 and above.