Blue Iris, why not virtualized?

bestbeast

Young grasshopper
Joined
Dec 4, 2019
Messages
85
Reaction score
8
Location
UK
Hello guys,

I have spent some time looking around the forums at posts about why virtualization is not recommended for Blue Iris, or for any other CCTV system, and I want to put forward my own point of view on this.
As far as I can tell, virtualization is mostly discouraged because you lose the ability to use hardware acceleration. But that doesn't seem to be entirely true. Here are 3 examples of virtualized setups I have been thinking about:
1. It seems that on various hypervisors you can pass the Intel iGPU through to a virtual machine and use Intel Quick Sync for hardware acceleration. This does create one issue: it can take away your ability to manage the hypervisor through VGA console access, but that seems easy enough to solve by adding another cheap internal GPU for the console, so it doesn't look like a big problem. (A small verification sketch follows this list.)
2. Another setup I have seen around is buying a Quadro P2000, which supports multiple simultaneous transcoding streams, and again passing it through to a VM, leaving the Intel GPU to handle the console.
3. The last setup I have been seeing around is unlocking consumer GTX GPUs with hacky drivers, to allow multiple simultaneous streams, and passing them through to the VMs as in the previous example.
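For reference, here is a minimal sketch of how I would check from inside a Windows guest that the passed-through GPU is actually visible and has a driver loaded (it assumes Python plus the third-party wmi package installed in the VM):

```python
# Quick check from inside the Windows guest: is the passed-through GPU
# visible, and is a driver loaded for it?
# Assumes Python plus the third-party "wmi" package (pip install wmi).
import wmi

c = wmi.WMI()
for gpu in c.Win32_VideoController():
    # Name, DriverVersion and Status are standard Win32_VideoController properties.
    print(gpu.Name, gpu.DriverVersion, gpu.Status)
```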

Talking about electricity costs, the first example seems to be the cheapest one. So I just wanted to ask here: why should we run dedicated hardware, which can end up using more electricity, rather than virtualize, if we already have a device capable of handling the job?
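To show the kind of comparison I have in mind, here is a rough sketch (every wattage and the electricity price are pure assumptions on my part, not measurements):

```python
# Rough annual running-cost comparison: the extra load a BI VM puts on a
# server that is already running 24/7, vs a separate always-on BI box.
# Every figure here is an assumption, not a measurement.
PRICE_PER_KWH = 0.18          # GBP, assumed UK tariff
HOURS_PER_YEAR = 24 * 365

extra_watts_vm = 25           # assumed marginal draw added by the BI VM
dedicated_box_watts = 50      # assumed draw of a separate dedicated machine

def annual_cost(watts):
    return watts / 1000 * HOURS_PER_YEAR * PRICE_PER_KWH

print(f"BI as a VM on the existing server: ~£{annual_cost(extra_watts_vm):.0f}/year")
print(f"BI on a separate dedicated box:    ~£{annual_cost(dedicated_box_watts):.0f}/year")
```

With numbers like these the VM comes out ahead, but of course it all depends on what your server actually draws under the extra load.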

Kind regards
 

aristobrat

IPCT Contributor
Joined
Dec 5, 2016
Messages
2,982
Reaction score
3,180
There are some people recently who have reported good results with using virtualization.

IMO whether it works well or not seems to hinge around the MB/s of video their cameras are pumping into the VM, their host hardware setup (getting QuickSync to work for most doesn’t seem to be an easy task from reading many threads here IIRC), and most importantly how strong their virtualization and camera tuning skills are. Setting a camera system to 15 FPS, VBR and h.264 is way more likely to be successful than 30 FPS with a high CBR and h.265.
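To give a very rough illustration of what I mean by the MB/s the cameras pump in, here's a sketch; the per-camera bitrates are made-up numbers, not measurements from any real setup:

```python
# Back-of-the-envelope version of the workload figure: the sum of what the
# cameras push into BI. The per-camera bitrates below are assumed examples.
cameras_mbit = {
    "driveway (4MP, 15 FPS, H.264)": 4.0,   # Mbit/s, assumed
    "porch (2MP, 15 FPS, H.264)":    2.0,   # Mbit/s, assumed
    "garden (4MP, 30 FPS, H.265)":   6.0,   # Mbit/s, assumed
}

total_mbit = sum(cameras_mbit.values())
total_mbyte = total_mbit / 8  # megabits per second -> megabytes per second
print(f"{total_mbit:.1f} Mbit/s is ~{total_mbyte:.2f} MB/s for BI to process")
```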

IMO there are more threads where people have been unsuccessful with virtualization than successful, which is why I think the general advice is that it’s not the best way to run BI *for most people*.
 

bestbeast

Young grasshopper
Joined
Dec 4, 2019
Messages
85
Reaction score
8
Location
UK
There are some people recently who have reported good results with using virtualization.

IMO whether it works well or not seems to hinge around the MB/s of video their cameras are pumping into the VM, their host hardware setup (getting QuickSync to work for most doesn’t seem to be an easy task from reading many threads here IIRC), and most importantly how strong their virtualization and camera tuning skills are. Setting a camera system to 15 FPS, VBR and h.264 is way more likely to be successful than 30 FPS with a high CBR and h.265.

IMO there are more threads where people have been unsuccessful with virtualization than successful, which is why I think the general advice is that it’s not the best way to run BI *for most people*.
Yes, in the end it comes down to how easy it is to pass these GPUs through to the VMs, but once the passthrough is done, Quick Sync performance inside a VM should be the same as outside. I have been checking, and passing the GPU through with Proxmox, for example, might be a little complex, but with ESXi it seems to be pretty easy. I will be testing it myself next week, so let's see how it goes.
Now, as for H.265 and FPS, that depends more on the hardware acceleration itself; I don't think it has anything to do with VMs.
 

aristobrat

IPCT Contributor
Joined
Dec 5, 2016
Messages
2,982
Reaction score
3,180
Now, as for H.265 and FPS, that depends more on the hardware acceleration itself; I don't think it has anything to do with VMs.
IMO it has to do with hardware (either physical or virtual) in the sense that those camera settings will directly affect how much processing power BI needs to be able to run smoothly.

Please keep the thread updated with your progress next week.

Also, from within BI (at the bottom of the Status > Cameras tab), please share your Totals:. IMO that's one of the very few ways to directly compare BI workloads. Here's what my camera system generates (and BI has to process):
IMG_0659.jpg
 

bestbeast

Young grasshopper
Joined
Dec 4, 2019
Messages
85
Reaction score
8
Location
UK
IMO it has to do with hardware (either physical or virtual) in the sense that those camera settings will directly affect how much processing power BI needs to be able to run smoothly.

Please keep the thread updated with your progress next week.

Also, from within BI (at the bottom of the Status > Cameras tab), please share your Totals:. IMO that's one of the very few ways to directly compare BI workloads. Here's what my camera system generates (and BI has to process):
View attachment 51640
I was going to try first with a different product I have, just to test the hardware acceleration. Because as far as I know, I am not able to use hardware acceleration with BI without purchasing it, right?
 

aristobrat

IPCT Contributor
Joined
Dec 5, 2016
Messages
2,982
Reaction score
3,180
Because as far as I know, I am not able to use hardware acceleration with BI without purchasing it, right?
I'm honestly not sure and I've never found a document that officially states what's disabled during the trials. And who knows if anything changed with the new BI V5 release. If you have another tool to test HA with, that's probably the best bet.
 

bestbeast

Young grasshopper
Joined
Dec 4, 2019
Messages
85
Reaction score
8
Location
UK
I'm honestly not sure and I've never found a document that officially states what's disabled during the trials. And who knows if anything changed with the new BI V5 release. If you have another tool to test HA with, that's probably the best bet.
Yep, don't worry, I will run plenty of tests and will let you know if it works.
 

rkn

Young grasshopper
Joined
May 8, 2017
Messages
41
Reaction score
9
When building a new server for BI I avoided virtualisation, even though I much prefer that approach, just so I could avoid passthrough issues and have simple access to hardware acceleration for BI.

Sadly, whenever I use QuickSync hardware acceleration I get ghosting on playback, despite trying various camera settings suggested on this forum as well as a number of different Intel graphics drivers. The end result is I may as well have gone with virtualisation - it's just lucky I built a fairly powerful PC, so it's easily able to cope with the load.

So, to add to the usual virtualised vs non-virtualised considerations, it's worth making sure that whatever HA you intend to use works reliably (and efficiently) for your cameras and CPU/GPU generally.
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,897
Reaction score
21,250
Sadly, whenever I use QuickSync hardware acceleration I get ghosting on playback, despite trying various camera settings suggested on this forum as well as a number of different Intel graphics drivers. The end result is I may as well have gone with virtualisation - it's just lucky I built a fairly powerful PC, so it's easily able to cope with the load.
This is likely a combination of a bad driver and/or running headless. Test by connecting a monitor to the system.
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,666
Reaction score
14,006
Location
USA
Besides the likely chance you wouldn't be able to get Intel hardware acceleration working properly, there's nothing "wrong" with using virtualization for Blue Iris. As long as you know what you are doing and are able to allocate resources appropriately. When a spike of high utilization occurs and Blue Iris doesn't have the compute / memory / bandwidth / etc to do its job in real-time, things go wrong. If you assign only 3 out of 8 vCPUs to a Blue Iris virtual machine and you eventually run into these problems, you have nobody to blame but yourself. So in general it is safer to dedicate an entire machine to Blue Iris.

Although Nvidia pass-through is great and typically works without issue, Nvidia hardware acceleration raises power consumption significantly compared to no hardware acceleration at all. So that only makes sense if reducing CPU usage is more important than saving power. FYI, Quadro cards aren't as necessary as you might think. GeForce cards only limit the number of NVENC (encoding) sessions, which are much less important in Blue Iris. NVDEC (decoding) sessions are not artificially limited.

I was going to try first with a different product I have, just to test the hardware acceleration. Because as far as I know, I am not able to use hardware acceleration with BI without purchasing it, right?
You can quickly determine if hardware acceleration is working by looking in task manager > Performance tab, and look at the GPU's "Video Decode" graph.
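If you'd rather script that check than eyeball Task Manager, something along these lines should read the same counters. This is only a sketch: it assumes a recent Windows 10 build that exposes the GPU engine performance counters, plus the third-party wmi Python package.

```python
# Rough scripted equivalent of Task Manager's "Video Decode" graph, using
# the GPU engine performance counters that recent Windows 10 builds expose.
# Assumes the third-party "wmi" package (pip install wmi); treat the class
# and property names as a sketch rather than a guaranteed API.
import wmi

c = wmi.WMI()
video_decode = 0.0
for eng in c.Win32_PerfFormattedData_GPUPerformanceCounters_GPUEngine():
    # Video-decode engines have "engtype_VideoDecode" in their instance name.
    if "engtype_VideoDecode" in eng.Name:
        video_decode += float(eng.UtilizationPercentage or 0)

print(f"GPU Video Decode utilization: {video_decode:.1f}%")
```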
 

bestbeast

Young grasshopper
Joined
Dec 4, 2019
Messages
85
Reaction score
8
Location
UK
Besides the likely chance you wouldn't be able to get Intel hardware acceleration working properly, there's nothing "wrong" with using virtualization for Blue Iris. As long as you know what you are doing and are able to allocate resources appropriately. When a spike of high utilization occurs and Blue Iris doesn't have the compute / memory / bandwidth / etc to do its job in real-time, things go wrong. If you assign only 3 out of 8 vCPUs to a Blue Iris virtual machine and you eventually run into these problems, you have nobody to blame but yourself. So in general it is safer to dedicate an entire machine to Blue Iris.
Well, that only applies if you limit the vCPUs assigned to that VM, and it depends on how much processing power you have and how much you need. For example, you may just have 2-4 cameras in a home environment, and it's a total waste of money to buy a whole new device to run BI on if you already have a powerful server. So if you have, say, a server with an i7 (8 cores / 16 threads), you could assign that VM a decent 8 vCPUs to cover whatever BI needs.


Although Nvidia pass-through is great and typically works without issue, Nvidia hardware acceleration raises power consumption significantly compared to no hardware acceleration at all. So that only makes sense if reducing CPU usage is more important than saving power. FYI, Quadro cards aren't as necessary as you might think. GeForce cards only limit the number of NVENC (encoding) sessions, which are much less important in Blue Iris. NVDEC (decoding) sessions are not artificially limited.
Hmmm, that seems weird, because I was using hardware acceleration with a GTX 1080 on my Windows PC when I was trying Blue Iris with some IP cameras, and I saw that the GPU usage % was not increasing as I added more and more cameras.
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,666
Reaction score
14,006
Location
USA
Check the "Video Decode" graph. Older versions of Windows might not have this graph. Blue Iris also causes usage on the "3D" engine, but only when you are rendering the local console (GUI) on a display connected to that GPU.
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,666
Reaction score
14,006
Location
USA
For example, you may just have 2-4 cameras in a home environment, and it's a total waste of money to buy a whole new device to run BI on if you already have a powerful server. So if you have, say, a server with an i7 (8 cores / 16 threads), you could assign that VM a decent 8 vCPUs to cover whatever BI needs.
A modern system with plenty of spare capacity is fine.

Lots of people come along with some old rack server that draws hundreds of watts at idle and sounds like a swarm of bees, and they want to run Blue Iris on it, not understanding that they could do better with a $100 USD workstation from ebay that would run silently and at under 50 watts.
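To put rough numbers on that (the 300 W idle draw, the $0.13/kWh rate, and the exact purchase price are illustrative assumptions):

```python
# Rough payback calculation for replacing an old rack server with a small
# used workstation. All figures are illustrative assumptions.
PRICE_PER_KWH = 0.13            # USD, assumed
HOURS_PER_YEAR = 24 * 365

rack_server_watts = 300         # assumed idle draw of the old rack server
workstation_watts = 50          # roughly the "under 50 watts" figure above
workstation_price = 100         # USD, roughly the ebay price above

saved_kwh = (rack_server_watts - workstation_watts) / 1000 * HOURS_PER_YEAR
saved_usd = saved_kwh * PRICE_PER_KWH
print(f"Savings: ~${saved_usd:.0f}/year; "
      f"payback in ~{workstation_price / saved_usd * 12:.1f} months")
```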
 

bestbeast

Young grasshopper
Joined
Dec 4, 2019
Messages
85
Reaction score
8
Location
UK
Check the "Video Decode" graph. Older versions of Windows might not have this graph. Blue Iris also causes usage on the "3D" engine, but only when you are rendering the local console (GUI) on a display connected to that GPU.
I'm currently using the latest Windows 10 version. So what do you suggest, minimizing BI and checking CPU usage?
But apart from that, what I was referring to is that CPU usage kept increasing each time I added more cameras to the system, while the GPU stayed at a more or less fixed %, as if it were limited somehow.

A modern system with plenty of spare capacity is fine.

Lots of people come along with some old rack server that draws hundreds of watts at idle and sounds like a swarm of bees, and they want to run Blue Iris on it, not understanding that they could do better with a $100 USD workstation from ebay that would run silently and at under 50 watts.
What type of workstations are you talking about? Also remember that not everyone has access to the same deals you might have in the USA.
 
Joined
Apr 26, 2016
Messages
1,090
Reaction score
852
Location
Colorado
I think the general guidance away from virtualization probably has more to do with
1. some core members on here having experience with many dozens of successful deployments on standalone hardware (it's cheap, easy to set up, you can buy whatever generation/efficiency you can afford, and complexity is lower), and
2. the resulting deluge of questions (what does this error mean, why doesn't this work anymore, I can't get my docker image to run, etc.) and complaint posts (Blue Iris performance is garbage in a VM, etc.) that usually occurs when people go off and try to do something in a more complex way.

Your point about already having an existing powerful server could definitely be valid, unless that existing powerful server is a Xeon several generations old that already pulls gobs of power from the wall, and increasing the load will further increase that draw (so yes, technically you are reusing an existing system, but over the long haul you are also paying more in electricity than if you had just bought a smaller, more energy-efficient system to handle this one task).

As for your point that you might be the person to figure this out, and to reinforce @bp2008's point: if you are willing to figure it out relying on your own capabilities, go for it, and maybe even write up how you did it to share... but then be ready for a large number of people asking for help when they try to do the same thing without the benefit of your experience/skill with VMs.
Is your reason for wanting to do this academic? Or do you just want a simple solution that works when you press the power button, chugs away in a hardware closet somewhere, and has the video there when you need it that one time a year?

Other examples:
  • VLANs can isolate cameras, and so can separate NICs -- if you are knowledgeable about VLANs, that seems like a pretty simple/obvious way to solve the internet access problem; if you aren't, keep it simple and go with a dual-NIC setup. Neither is technically the wrong way to do it.
  • Can you run a 10-camera system on a free/cheap/ebay Xeon server packing multiple CPUs and gobs of cores that are a few generations old (sure)? Could you also buy something that handles that same load, consumes a third of the power because it's newer-generation consumer grade, and costs under $350 (yup)?
 
Last edited:

bestbeast

Young grasshopper
Joined
Dec 4, 2019
Messages
85
Reaction score
8
Location
UK
I think the general guidance away from virtualization probably has more to do with
1. some core members on here having experience with many dozens of successful deployments on standalone hardware (it's cheap, easy to set up, you can buy whatever generation/efficiency you can afford, and complexity is lower), and
2. the resulting deluge of questions (what does this error mean, why doesn't this work anymore, I can't get my docker image to run, etc.) and complaint posts (Blue Iris performance is garbage in a VM, etc.) that usually occurs when people go off and try to do something in a more complex way.

Your point about already having an existing powerful server could definitely be valid, unless that existing powerful server is a Xeon several generations old that already pulls gobs of power from the wall, and increasing the load will further increase that draw (so yes, technically you are reusing an existing system, but over the long haul you are also paying more in electricity than if you had just bought a smaller, more energy-efficient system to handle this one task).

As for your point that you might be the person to figure this out, and to reinforce @bp2008's point: if you are willing to figure it out relying on your own capabilities, go for it, and maybe even write up how you did it to share... but then be ready for a large number of people asking for help when they try to do the same thing without the benefit of your experience/skill with VMs.
Is your reason for wanting to do this academic? Or do you just want a simple solution that works when you press the power button, chugs away in a hardware closet somewhere, and has the video there when you need it that one time a year?

Other examples:
  • VLANs can isolate cameras, and so can separate NICs -- if you are knowledgeable about VLANs, that seems like a pretty simple/obvious way to solve the internet access problem; if you aren't, keep it simple and go with a dual-NIC setup. Neither is technically the wrong way to do it.
  • Can you run a 10-camera system on a free/cheap/ebay Xeon server packing multiple CPUs and gobs of cores that are a few generations old (sure)? Could you also buy something that handles that same load, consumes a third of the power because it's newer-generation consumer grade, and costs under $350 (yup)?
Yes, I can understand your point about people trying to use shitty devices to achieve this. My device is indeed just an i7, but I want to use, let's say, 20-40% of its power for CCTV and the rest for the other stuff I have. And even if I only use 20-40% of its power, I want it to be properly optimized, with HA.

This is not for anything academic; I'm just trying to use what I have and not spend more money on something that isn't needed.

About the VLANs: yes, of course, this will run behind various VLANs and a firewall to keep it secure.
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,666
Reaction score
14,006
Location
USA
I'm currently using the latest Windows 10 version. So what do you suggest, minimizing BI and checking CPU usage?
But apart from that, what I was referring to is that CPU usage kept increasing each time I added more cameras to the system, while the GPU stayed at a more or less fixed %, as if it were limited somehow.
The Video Decode graph should be above zero if hardware acceleration is working, and it should go up just like CPU usage does when you add cameras. Remember hardware acceleration can only offload some of the work from the CPU, not all of it, so both CPU and GPU Video Decode would go up with a higher load.

What type of workstations are you talking about? Also remember that not everyone has access to the same deals you might have in the USA.
Stuff linked here: Choosing Hardware for Blue Iris

Yes, it is a lot easier to find deals on these in the USA.
 

bestbeast

Young grasshopper
Joined
Dec 4, 2019
Messages
85
Reaction score
8
Location
UK
The Video Decode graph should be above zero if hardware acceleration is working, and it should go up just like CPU usage does when you add cameras. Remember hardware acceleration can only offload some of the work from the CPU, not all of it, so both CPU and GPU Video Decode would go up with a higher load.
I will try again and let you know my results.

Stuff linked here: Choosing Hardware for Blue Iris

Yes, it is a lot easier to find deals on these in the USA.
Yep, I already read that link. But let's first see if I can figure out how to use my existing hardware without having to purchase yet another device.
 

TL1096r

IPCT Contributor
Joined
Jan 28, 2017
Messages
1,223
Reaction score
465
Last edited:

rkn

Young grasshopper
Joined
May 8, 2017
Messages
41
Reaction score
9
This is likely a combination of a bad driver and/or running headless. Test by connecting a monitor to the system.
Thanks, but it has a monitor connected all the time.

I've tried four versions of the Intel graphics driver (from the last 18 months) with no success, so I've parked this for now until I get some more time.
 