BI CPU Usage tech discussion

I have been reading all weekend about the proper system specs to really make BI work well. I didn't start this post to bash the developers or anything; I am genuinely curious about the technology at work.

So here are my main questions:

  • If the camera is set up for internal motion detection, is outputting an H.264/H.265 stream, and direct-to-disk is selected, then what is Blue Iris actually doing (other than writing data to the disk) that requires so much processing time? I've read that people see 60-70% CPU consumption even with the GUI turned off. It doesn't seem to me that just writing data to disk would need much CPU.
  • I read that NVRs are actually substantially more power efficient due to DSP chips. This brings me back to my first question: if the video stream is already encoded, what is the DSP chip doing that makes the NVR more efficient? Is it possible to get these chips on a PCI card or another server-connected device?
  • Is there anything about the data stream or GUI that could be GPU accelerated? Like transcoding if you aren't using direct-to-disk? Are there any plans to use GPUs in the future?
  • Will the software ever be improved so that Intel iGPUs are not mandatory, and competitors' video cards will work?
  • Is it possible for the cameras to sit in a low-frame-rate, low-bitrate state, and then switch to full performance when motion is detected?

And one question not CPU related, but I don't want to make another thread. I know it's possible to use an HDMI-over-Ethernet extender to send a video signal long distances, but does anyone know if it's possible to use some sort of networking switch to duplicate that signal and send it to 4 or 5 different TVs? Or is it possible to see/interact with the GUI over Ethernet (directly, without remote desktop)?


I know a few of these have been answered before, but I've not seen really detailed answers about the underlying mechanism of the software, so I'm just curious.
 

bp2008

Staff member
Hi again.

If the camera is set up for internal motion detection, is outputting an H.264/H.265 stream, and direct-to-disk is selected, then what is Blue Iris actually doing (other than writing data to the disk) that requires so much processing time? I've read that people see 60-70% CPU consumption even with the GUI turned off. It doesn't seem to me that just writing data to disk would need much CPU.
Blue Iris always decodes all incoming video streams, even if there's nothing actually needing to consume the decoded video. This is the primary CPU consumer for most well-tuned BI installations. Realizing this, Blue Iris fairly recently added a "Limit decode" feature that can be turned on for individual streams which will limit Blue Iris to decoding only the keyframes. Since keyframes typically come every 1-4 seconds (user-configurable in the camera's web interface), this reduces CPU usage drastically. This is a complex feature, though, and users who enable it without understanding it run into trouble eventually. That is discussed on this page: Optimizing Blue Iris's CPU Usage | IP Cam Talk
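To put rough numbers on that savings (a back-of-envelope sketch with example values, not Blue Iris internals): with a camera sending 15 fps and a keyframe every 2 seconds, "Limit decode" leaves only about 1 frame in 30 to decode.

```python
# Back-of-envelope estimate of what "Limit decode" saves.
# The fps and keyframe interval below are example numbers;
# set your camera's actual values from its web interface.
fps = 15                 # camera frame rate
keyframe_interval_s = 2  # keyframe period configured in the camera

frames_per_keyframe = fps * keyframe_interval_s  # frames per GOP
decoded_fraction = 1 / frames_per_keyframe       # share of frames BI still decodes

print(f"Decoding only keyframes processes {decoded_fraction:.1%} of frames")
# → "Decoding only keyframes processes 3.3% of frames"
```

Actual CPU savings won't track this ratio exactly (keyframes are larger and costlier to decode than the frames in between), but it shows why the feature is so effective.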

I read that NVRs are actually substantially more power efficient due to DSP chips. This brings me back to my first question: if the video stream is already encoded, what is the DSP chip doing that makes the NVR more efficient? Is it possible to get these chips on a PCI card or another server-connected device?
NVRs have dedicated hardware for working with H.264/H.265 video that requires minimal CPU interaction so they can run on low power budgets. There are PC equivalents, most commonly found in graphics cards and integrated graphics on CPUs. Intel's version is called Quick Sync. It is the only type of hardware acceleration supported by Blue Iris, and at this time only H.264 decoding utilizes it.

Is there anything about the data stream or GUI that could be GPU accelerated? Like transcoding if you aren't using direct-to-disk? Are there any plans to use GPUs in the future?
Sure, most GPUs do have dedicated video decoding and encoding modules built-in, but Blue Iris currently only has the most minimal support for this (H.264 decoding via Quick Sync). H.265 decoding support (also via Quick Sync) is planned and partially implemented, just not working yet.

Will the software ever be improved so that Intel iGPUs are not mandatory, and competitors' video cards will work?
Eventually, probably. You can still run without hardware acceleration if you throw more raw CPU horsepower at the problem. It just won't be as efficient.

Is it possible for the cameras to sit in a low-frame-rate, low-bitrate state, and then switch to full performance when motion is detected?
Not really. The closest thing to this is the "Limit decode" option previously mentioned.

And one question not CPU related, but I don't want to make another thread. I know it's possible to use an HDMI-over-Ethernet extender to send a video signal long distances, but does anyone know if it's possible to use some sort of networking switch to duplicate that signal and send it to 4 or 5 different TVs?
Some HDMI extenders use multicast packets and are compatible with multiple receivers from one sender. Here's one such example: http://a.co/7IDHbOk

This type of extender lowers video quality and increases input lag substantially and usually does not have a method for sending input back to the source.
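For the curious, the one-sender/many-receivers behavior of those multicast extenders can be sketched with ordinary UDP multicast sockets. This is a minimal Python illustration, not the extenders' actual protocol; the group address and port are arbitrary example values, and the traffic is pinned to the loopback interface so it runs on a single machine.

```python
import socket

GROUP, PORT = "239.255.0.1", 5007  # arbitrary example multicast group/port

def make_receiver():
    """One 'TV': a socket joined to the multicast group on loopback."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)  # allow shared port
    s.bind(("", PORT))
    # Join the group on 127.0.0.1 so delivery stays on this machine.
    mreq = socket.inet_aton(GROUP) + socket.inet_aton("127.0.0.1")
    s.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    s.settimeout(5)
    return s

receivers = [make_receiver() for _ in range(2)]

# One 'transmitter': a single send reaches every joined receiver.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_IF,
                  socket.inet_aton("127.0.0.1"))
sender.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_LOOP, 1)
sender.sendto(b"frame-data", (GROUP, PORT))

# Each receiver gets its own copy of the single packet sent.
frames = [r.recvfrom(1024)[0] for r in receivers]
print(frames)
```

The switch never duplicates anything itself in the naive unicast sense; IGMP-aware switches forward each multicast packet only to ports with a subscribed receiver, which is why one HDMI sender can feed several receivers without multiplying the bandwidth at the source.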

Or is it possible to see/interact with the GUI over Ethernet (directly, without remote desktop)?
No. There are the Android and iOS apps and the UI3 web interface, but no way to get the full functionality of the local console besides using the actual local console on the server's desktop.
 