- May 6, 2018
I have 13 cameras (12 Axis and 1 Dahua) in my business. The Axis cameras were purchased years ago, before I found this site (and Andy). The one Dahua I just bought from Andy a couple of weeks ago. I had been running the standard, free Axis Camera Companion software for years, just letting the cameras record to a NAS VM I set up on my virtualization server. While this works, the ACC software is pretty terrible and very limited: it takes ages to load the cameras, and looking up clips, fast-forwarding, and rewinding through recordings is painfully slow. It was time to find another way...
So I found this site and started doing some reading. Blue Iris seems to be universally recommended. I quickly spun up a test Win 10 VM, installed the demo version of BI, and attached a couple of cameras to the software. However, I was a bit disappointed in the performance. With two cameras attached, the VM was using about 50% of the two vCPUs I had allotted to it. The virtualization host was running a very old Ivy Bridge i3 chip with only 2C/4T, which I had whiteboxed together many years ago. Doing some further digging, I found that the high CPU utilization was because the VM guest has no access to Intel Quick Sync, which Blue Iris can use for hardware H.264 decoding. I knew my host CPU would not be able to handle 13 cameras on top of the other VMs I run on the host for my other uses.
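As an aside, a quick way to check whether a Linux machine (the xcp-ng host in my case, which is CentOS-based) actually has a GPU exposed is to look for DRM render nodes under `/dev/dri`. This is just a sketch of that check, not anything Blue Iris itself uses; on the Windows guest side you'd look in Device Manager instead:

```python
import os

def render_nodes(dri_path="/dev/dri"):
    """List DRM render nodes (renderD*) under dri_path.

    An empty result means no GPU is exposed at that path, so
    hardware H.264 decoding (e.g. Intel Quick Sync) is unavailable
    to anything running there.
    """
    if not os.path.isdir(dri_path):
        return []
    return sorted(n for n in os.listdir(dri_path) if n.startswith("renderD"))

if __name__ == "__main__":
    nodes = render_nodes()
    if nodes:
        print("GPU render nodes visible:", ", ".join(nodes))
    else:
        print("No GPU exposed here - hardware decode unavailable")
```

In a typical VM guest without GPU passthrough, this comes back empty, which is exactly why BI couldn't use Quick Sync in my test VM.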
I had two options: dedicate a physical machine to just BI, or rebuild the virtualization host into a beefier setup and keep one host for everything. Building a dedicated machine just for camera usage was not a path I wanted to go down, so I opted for the latter. The i3 host was getting a bit long in the tooth for some of the other services I host anyway, so I thought this was the perfect time to do a "forklift" upgrade on the whole system.
New Virtualization host:
- xcp-ng (Hypervisor)
- Intel I9 9900 (8C/16T)
- 32 GB RAM
- 2 x 500GB SSDs in RAID 1 (they hold the VM storage repos plus the xcp-ng OS)
- 12 TB Exos X 7200 drive (Re-used from the older i3 host)
- 6 TB WD Red 5400 drive (Re-used from the older i3 host)
I began adding cameras to the demo version of BI and was extremely surprised and happy with how the new host handled the VM load. Here is a screenshot of the VM load with all 13 cameras:
Those four vCPUs hover around 30% CPU usage with all 13 cameras recording. Very impressed with this setup! For some reference:
- Six of the cameras are set up for continuous recording @ 720p 25fps, 24x7x365.
- The remaining seven cameras are set up for motion-only recording @ 720p 25fps.
- I have BI set to "Direct to disc" recording for all cams.
- I have BI set for "No overlays" on all cams.
- I have BI set to record to the 12TB drive (New) first, until full.
- I have BI set to move clips to the 6TB drive (Stored) after New gets full. That gives 18TB of clip storage.
- I have BI set to run as a service.
- I have BI's "Limit decoding unless required" option unchecked for all cams.
- The host is headless. I use RDP to configure the BI software/Win 10. I view the cameras and clips only through the web/mobile interfaces.
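Out of curiosity, here's a back-of-the-envelope estimate of how many days of footage that 18TB holds. Every number besides the drive sizes and camera counts is an assumption on my part (roughly 2 Mbit/s per 720p/25fps H.264 stream, and a guessed ~15% motion duty cycle for the motion-only cams):

```python
# Back-of-the-envelope retention estimate for the storage layout above.
# The bitrate and motion duty cycle are guesses, not measured values.

TOTAL_TB = 18          # 12 TB "New" + 6 TB "Stored"
CONTINUOUS_CAMS = 6    # recording 24x7x365
MOTION_CAMS = 7        # motion-only cams
DUTY_CYCLE = 0.15      # assumed fraction of the day motion triggers recording
BITRATE_MBPS = 2.0     # assumed per 720p/25fps H.264 stream

def retention_days(total_tb, continuous, motion, duty, mbps):
    """Days of footage the drives hold before the oldest clips roll off."""
    bytes_per_cam_day = mbps / 8 * 1_000_000 * 86_400
    bytes_per_day = (continuous + motion * duty) * bytes_per_cam_day
    return total_tb * 1e12 / bytes_per_day

if __name__ == "__main__":
    days = retention_days(TOTAL_TB, CONTINUOUS_CAMS, MOTION_CAMS,
                          DUTY_CYCLE, BITRATE_MBPS)
    print(f"Estimated retention: ~{days:.0f} days")
```

With those assumptions it works out to roughly four months of clips, though your cams' actual bitrates will move that number a lot.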
It's amazing how snappy the web/mobile interfaces are. Cameras pull up extremely quickly, and moving through clips is an absolute breeze.
At any rate, I wanted to share that, sure, it's possible to run BI in a VM as long as you have enough horsepower under the hood. I'm not sure I'd recommend the setup to just anyone; it's pricey, of course. I think I threw ~$1K at it, and that is with re-using the large spinning disks I already had. If you need to host other VMs, then it's a no-brainer to consolidate everything onto one host. But if all you are doing is running some cams at home, then a dedicated physical host for BI is probably the way to go.
FYI, I had considered the newer 3rd-gen Ryzen CPUs. The Ryzen 7 3700X, for example, has the same core count as the i9 9900 for about $100 less. However, my hypervisor (xcp-ng) is based off of CentOS, and it's been my experience that Intel plays nicer with the various flavors of *nix than AMD does. So that is why I went the Intel route.
If you have any questions please ask. I just wanted to share a success story with virtualizing BI since most stories are about failures when going this route.