Blue Iris in VMWare ESXi 6.7 with passthrough of Intel integrated graphics

Caveman

n3wb
Joined
Apr 7, 2016
Messages
24
Reaction score
8
Hey

Just wanted to say that this is entirely possible and works fine in a Windows 10 1809 VM.
I've used my Intel NUC 7i5BNK, which has the Core i5-7260U CPU with Intel Iris Plus 640 graphics. It currently has 8GB DDR4 memory and a 128GB Intel 600p NVMe SSD.
ESXi 6.7 Update 1 runs from a SanDisk USB stick.

The key to getting this to work is:
1. Enable passthrough of the Intel graphics adapter, which usually sits at PCI address 0000:00:02.0 in ESXi, then reboot the ESXi host.
2. Add this GPU to the Windows 10 VM and reserve all memory dedicated to the VM.
3. Set these 2 values under VM Options - Advanced - Configuration Parameters - Edit Configuration:
pciHole.start=2048
SVGA.Present=FALSE
4. Make sure you have enabled RDP, TeamViewer or similar to access the VM. After you enable GPU passthrough you won't see the console through VMware anymore.
5. Start the VM and install the appropriate graphics drivers from Intel.
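Step 3 can also be done from the ESXi shell by appending the two parameters to the VM's .vmx file. This is only a sketch: the VM name and path below are hypothetical (on a real host the file lives under /vmfs/volumes/<datastore>/<vmname>/<vmname>.vmx), and the VM must be powered off before editing.

```shell
# Hypothetical .vmx location; adjust to your datastore layout.
VMX="${VMX:-./BlueIris.vmx}"
[ -f "$VMX" ] || touch "$VMX"   # placeholder for illustration only
cp "$VMX" "$VMX.bak"            # keep a backup before editing
# Append each parameter only if it is not already present.
grep -qi '^pciHole.start' "$VMX" || echo 'pciHole.start = "2048"' >> "$VMX"
grep -qi '^svga.present'  "$VMX" || echo 'svga.present = "FALSE"' >> "$VMX"
```

The grep guards make the edit idempotent, so running it twice won't duplicate the entries.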


Hope this helps :)
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,666
Reaction score
14,007
Location
USA
I have a few questions.

Are the two configuration parameters required? I tried this on a 4th-gen i7 box a couple years back and the Intel drivers wouldn't initialize. But I didn't change any configuration parameters like that.

When you say you won't see the console through vmware anymore, you mean the ESXi host client (web interface) can't show you the VM's console anymore? Mine remained fully functional after passing through the iGPU, however it did of course cause ESXi's own nearly worthless local console to disappear halfway through ESXi's bootup.
 

Caveman

n3wb
Joined
Apr 7, 2016
Messages
24
Reaction score
8
I have a few questions.

Are the two configuration parameters required? I tried this on a 4th-gen i7 box a couple years back and the Intel drivers wouldn't initialize. But I didn't change any configuration parameters like that.

When you say you won't see the console through vmware anymore, you mean the ESXi host client (web interface) can't show you the VM's console anymore? Mine remained fully functional after passing through the iGPU, however it did of course cause ESXi's own nearly worthless local console to disappear halfway through ESXi's bootup.
If the parameters are not present, the adapter will still appear inside the VM and you can install the driver. However, the GPU will show a "Code 43" error in Device Manager and won't initialize.

You can still see the VMware web console, but you won't be able to open the remote console session for this particular Win10 VM. Other VMs will work just fine, though.
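For reference, once saved the two parameters should appear in the VM's .vmx file roughly like this (key casing may differ from what you typed in the UI; this fragment shows only the two values named above, not the passthrough device entries ESXi adds on its own):

```
pciHole.start = "2048"
svga.present = "FALSE"
```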
 

Valiant

Pulling my weight
Joined
Oct 30, 2017
Messages
307
Reaction score
174
Location
Australia
Been a while since I used ESXi 5.5. Is the latest host version still free?

Multiple VMs on a recording box may not be wise, but good effort in getting it going.
 

Caveman

n3wb
Joined
Apr 7, 2016
Messages
24
Reaction score
8
Been a while since I used esxi 5.5. Is the latest host version still free?

Multiple VMs on a recording box may not be wise, but good effort in getting it going.
Yes, ESXi 6.7 is still free with a few limitations. Being able to run some Docker containers in a Linux VM on the same box is useful. I'm actually running Plex on the same VM as Blue Iris and it's working great so far. I have 4 Dahua Starlight turrets, but CPU is around 20% with Quick Sync in use and the GUI closed down.
 

wpiman

Pulling my weight
Joined
Jul 16, 2018
Messages
332
Reaction score
246
Location
massachusetts
Thanks for posting this....

I have ESXi 6.7 running on a SuperMicro motherboard with the following processor...

Intel(R) Xeon(R) CPU E3-1230 v3 @ 3.30GHz (v3 is Haswell)

I have a Windows VM running BI and Homeseer. I have a Linux server running Plex on the same box. All the VMs are stored on a similar custom-built NAS with 18 × 3 TB drives in RAID 5.

I have a single 2MP camera and 2 Doorbirds right now. I have 2 more 2 MP cameras that aren't installed yet. This spring I plan on installing them and adding a couple more.

I've thought about getting a dedicated Windows box to do BI, but the advantages of virtualization are too great. I have thought about getting a massive CPU with something like 20 cores and tackling the problem that way. It is costly, but so is having to maintain another box.

According to this, I should be able to do H264 decoding with Haswell. I did try to do the GPU passthrough- but I am not sure I tried the other configs you listed.

Intel Quick Sync Video - Wikipedia

I am currently sipping CPU at around 14%-25% on the VM. It was higher but the last update of BI really brought it down. I will double check my settings and see if I can get this to work. Blue Iris has been set and forget for me, so I don't remember all the switches I have set.

(Is there a way to tell if HW acceleration is actually working other than seeing if your CPU usage goes down? Maybe I set it up correctly before?)

I am hoping I can add these other cameras and not have to build a new white box or a separate Windows machine.
 

Caveman

n3wb
Joined
Apr 7, 2016
Messages
24
Reaction score
8
The E3-1230 doesn't have a GPU; you need a model ending in "5". You also need a motherboard with a C2x6 chipset, C2x2 won't work.
 

wpiman

Pulling my weight
Joined
Jul 16, 2018
Messages
332
Reaction score
246
Location
massachusetts
Xeon E-2144G only has 4 cores. How about this instead? Intel Xeon E-2126G Coffee Lake 3.30 GHz LGA 1151 80W CM8068403380219 Server Processor Intel UHD Graphics P630 - OEM - Newegg.com

(there are a bunch of more expensive, but higher-clocked 6 core Xeon E-2xxxG which should be good too)
That does indeed look better. Interesting that it is listed with only 6 threads. Do some of the Xeons not support hyperthreading?

I actually think what I am going to do is order an older v3 E3-1245 with the embedded graphics, which will be pin-compatible with my motherboard. For a couple of hundred bucks I should be able to get the acceleration and test out the setup; then, if that all works, I can do a new build later on equipped with better information (and possibly newer tech).

I appreciate all the information you guys have shared. Big help here.
 

Caveman

n3wb
Joined
Apr 7, 2016
Messages
24
Reaction score
8
The E-2146G is by far the best choice for cost/performance. It's 6 cores / 12 threads, with the integrated GPU, for 311 USD.
I've changed my Blue Iris over to a Xeon D-1541 platform for the moment. My Intel NUC would overheat on the NVMe drive and crash unless I set the fan to unacceptable levels.
I'm considering the GTX 1650, due on the 22nd of April, which hopefully has the newest NVENC. It would be used for Blue Iris and Plex.
 

myipcam

Getting the hang of it
Joined
Dec 23, 2018
Messages
132
Reaction score
33
Location
USA
Hey

Just wanted to say that this is entirely possible and works fine in a Windows 10 1809 VM.
I've used my Intel NUC 7i5BNK, which has the Core i5-7260U CPU with Intel Iris Plus 640 graphics. It currently has 8GB DDR4 memory and a 128GB Intel 600p NVMe SSD.
ESXi 6.7 Update 1 runs from a SanDisk USB stick.

The key to getting this to work is:
1. Enable passthrough of the Intel graphics adapter, which usually sits at PCI address 0000:00:02.0 in ESXi, then reboot the ESXi host.
2. Add this GPU to the Windows 10 VM and reserve all memory dedicated to the VM.
3. Set these 2 values under VM Options - Advanced - Configuration Parameters - Edit Configuration:
pciHole.start=2048
SVGA.Present=FALSE
4. Make sure you have enabled RDP, TeamViewer or similar to access the VM. After you enable GPU passthrough you won't see the console through VMware anymore.
5. Start the VM and install the appropriate graphics drivers from Intel.


Hope this helps :)
Yes, this helps! Thank you.
Also, I wonder if you ever compared performance of Win10 running under VMware Workstation (with Linux / Windows guests) vs. ESXi. Other than using containers, the OS layer(s) will be present with both solutions, so I am curious which runs better.
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,666
Reaction score
14,007
Location
USA
The i7-8700K is a great CPU, but if you are considering building new then you should definitely consider the i7-9700K and i9-9900K, each of which is better. But the truth is you need none of these unless you are planning to run a heavy load in BI.
 

Caveman

n3wb
Joined
Apr 7, 2016
Messages
24
Reaction score
8
Dedicating 4 cores of my Xeon D-1541 without any GPU acceleration, my CPU use is around 30% in the Windows 10 VM with 4 Dahua 5231 Starlights. An 8700K would be insane overkill unless running a lot of cameras. My CPU is Broadwell-based and boosts to 2.7GHz; the 8700K is newer and boosts to 4.7GHz. An i3-8100 is probably a better buy.
 

myipcam

Getting the hang of it
Joined
Dec 23, 2018
Messages
132
Reaction score
33
Location
USA
"... 8700K would be insane overkill unless running a lot of cameras ..."

The idea of the 8700K is not just for BI alone. Using virtualization, you can make other services available on the same hardware. IMHO, it just doesn't make any sense to have 10 different low-power units with their own HDDs, power supplies etc., providing 10 different services. Rather, get a decent machine and run everything on it. One argument then is: what if the hardware fails and you don't have recordings? That would also be true if an individual BI machine goes down. Just have separate HDDs for BI and dedicate them to its VM exclusively. Again, in my opinion, there is really no benefit to running separate machines. If you really want to separate them on the network, simply add a new network card. They are dead cheap anyway.
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,897
Reaction score
21,250
"... 8700K would be insane overkill unless running a lot of cameras ..."

The idea of the 8700K is not just for BI alone. Using virtualization, you can make other services available on the same hardware. IMHO, it just doesn't make any sense to have 10 different low-power units with their own HDDs, power supplies etc., providing 10 different services. Rather, get a decent machine and run everything on it. One argument then is: what if the hardware fails and you don't have recordings? That would also be true if an individual BI machine goes down. Just have separate HDDs for BI and dedicate them to its VM exclusively. Again, in my opinion, there is really no benefit to running separate machines. If you really want to separate them on the network, simply add a new network card. They are dead cheap anyway.
Blue Iris should be installed on bare metal, not in a VM. It makes lots of sense; it is the most stable setup. Or you can install it in a VM and then scratch your head and blame BI every time you have an issue. Install all your other 9 non-critical applications in VMs.
 

myipcam

Getting the hang of it
Joined
Dec 23, 2018
Messages
132
Reaction score
33
Location
USA
Blue Iris should be installed on bare metal, not in a VM. It makes lots of sense; it is the most stable setup. Or you can install it in a VM and then scratch your head and blame BI every time you have an issue. Install all your other 9 non-critical applications in VMs.
This is what I suggest implementing as well. On the other hand, for all intents and purposes, even Win10 only sees most of the hardware through its "drivers" etc., therefore running Win10 as a VM makes sense too. The only showstopper used to be Intel GPU passthrough. In the past this was not possible (or maybe I didn't look properly). If this is possible now, it makes sense to at least try it and see how it goes...

In that situation, it won't make sense to blame BI for not working as it should, because I know it works fine on bare metal. I would probably start to look at other issues surrounding it.
Anyway, this whole thing only makes sense if there are significant gains in performance & stability. If not, there is no point in undertaking such a migration. The only app that really needs the Intel GPU is BI, for H.264 acceleration, and it works fine as it is. All other apps (at least in my case) don't need GPU processing, so it's not really a big deal.
 

wpiman

Pulling my weight
Joined
Jul 16, 2018
Messages
332
Reaction score
246
Location
massachusetts
I think the bare-metal versus VMware ESXi question comes down to how many cameras you want to support. I am running a pretty light load (one 2 MP camera and two Doorbirds). If you are running 40 2 MP cameras, then the balance tips.

I have no stability issues with Windows on ESXi. I run Homeseer on a Windows VM, and that is arguably more critical than Blue Iris for our family. I run Windows 8.1. I also run Plex on the same server in another VM. The hypervisor is rock solid.

Also, server hardware tends to be more stable. My server has redundant power supplies, ECC memory and an IPMI-controlled motherboard. This is on its own circuit breaker with a UPS and a 45-kilowatt backup generator. I see lots of people trying to buy used servers on eBay, which is fine for most, but that just makes me nervous.

Obviously, one could run Windows directly on server hardware, which is where I will probably end up eventually if I install enough cameras. I don't ever want to have to physically touch the machine if it can be helped. Installing the OS from the console is a requirement for me.

Anyway, I installed an E3-1245 this morning and will work on the passthrough. Thanks again. Good topic for discussion.
 