Server/CPU recommendations for ESXi host for Blue Iris, Linux media/file/web server

I currently run an ESXi 6.7.0 host with 3 guests:

- Windows Server 2022 running Blue Iris NVR (v5)

- Lubuntu 20 for Logitech Media Server, Plex Media Server, Home Assistant, VPN, MediaWiki, web server and file server

- Windows 10 for PVR software & general testbed environment.

My current hardware is an HP MicroServer Gen8 with an Intel Xeon E3-1265L V2 2.5 GHz and 16GB RAM.

While the performance is just about OK, RAM cannot be upgraded beyond 16GB, and I'd like to at least double that. I've also found that Blue Iris can easily consume excessive amounts of CPU; when I tried to run it on the Windows 10 VM it consumed 80-100% CPU the whole time, rendering the host unusable. I don't get this problem on Windows Server 2022, which is pretty much the only reason that I still run that guest OS.

Can anyone recommend a good upgrade which has onboard graphics to support Blue Iris, and which will support at least 32GB RAM? I'm probably looking at used hardware. Budget not fixed, but probably up to £500-1K.

I'd also really like something with lights-out remote access. My server is in the attic, and I frequently use iLO to reboot it when necessary, or simply to get access to the console.
 
If remote management is a sticking point, then used server hardware (Dell, HP, Supermicro) is going to be pretty much all you will be able to get.

I’ve been running Proxmox for years off just consumer gear. BI runs great in a win 10 VM with only 6GB of RAM on Proxmox.
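
If it helps anyone, a Windows 10 VM along those lines can be created from the Proxmox shell roughly like this; the web GUI wizard does the same thing, and the VM ID, storage name, disk size and ISO name below are just placeholders:

# Create a Windows 10 VM with 6GB RAM and 4 cores (adjust to taste).
# You'll want the VirtIO driver ISO handy for the SCSI disk and NIC during install.
qm create 110 --name bi-win10 --ostype win10 --memory 6144 --cores 4 \
  --net0 virtio,bridge=vmbr0 --scsihw virtio-scsi-pci --scsi0 local-lvm:64 \
  --ide2 local:iso/Win10_x64.iso,media=cdrom

# Start it and install Windows + Blue Iris from the attached ISO.
qm start 110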
 
PiKVM is an option for remote management of nearly any PC. That is probably your safest option. Also, I wouldn't recommend keeping your eggs in the VMware basket, given that the free ESXi edition is dead (see "VMware vSphere ESXi free edition is dead | Hacker News"). I've been using Proxmox at work for several years now and it has been quite nice. Not perfect, but neither was ESXi when we used that.

There are specialized motherboards for AMD AM5 (Ryzen 7000, 9000) with built-in remote management (IPMI) and ECC memory support, but they are a bit expensive. Gigabyte and ASRock Rack both make some. Based on a little looking around this morning, it looks like ASRock Rack's have been more popular, but they appear to have had a lot of stability issues lately. Gigabyte's boards don't seem to have a firmware update (yet) with Ryzen 9000 support, so if you go this route it would probably be wise to just stick with a 7000 series CPU for now. Even the cheapest 7000 series CPU should run circles around the old Xeon E3-1265L V2.




 
More info on cheap(ish) AMD server boards:

 
Thanks for all the feedback. I've decided to go with a Ryzen 5700G, along with a BliKVM 4.0 for remote management. It looks like it will offer a significant performance boost over my current setup, while being considerably cheaper than some of the AM5 alternatives. I've also decided to migrate from ESXi to Proxmox (!!)

The 5700G does have an on-board GPU, but I suspect that it may be a life's work trying to pass it through to the Blue Iris/Windows Server VM, so I'll rely on the 5700G's CPU cores to do the heavy lifting.
 
The problem with most of these workstation/cheap server boards is that they only support ECC UDIMMs, which are not that cheap.

Also, most need a Pro or X-series Ryzen; G-series parts are mostly unsupported or have problems, and the G versions don't support ECC RAM.
 
I've also decided to migrate from ESXi to Proxmox (!!)

Last I checked, which was years ago, it is possible to migrate ESXi VMs to Proxmox, but it requires a bit of command line work. My notes mention using ovftool on the ESXi machine to export disk images in OVF format, and qm importovf on the Proxmox machine to import them, but the process may have improved since then. Migrate to Proxmox VE - Proxmox VE
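
Roughly, the commands from my notes looked like this; the hostname, VM name, VM ID and storage name below are placeholders, so treat it as a sketch rather than a recipe:

# On a machine with VMware OVF Tool installed, export the VM from the ESXi host
# (it will prompt for the ESXi root password):
ovftool vi://root@esxi-host/MyVM ./MyVM.ovf

# Copy the resulting .ovf/.vmdk files to the Proxmox host, then import them
# as a new VM ID onto whatever storage you use:
qm importovf 120 ./MyVM.ovf local-lvm

# The import doesn't bring over everything (e.g. no NIC), so review the config
# and add a network device before first boot:
qm config 120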

The 5700G does have an on-board GPU, but I suspect that it may be a life's work trying to pass it through to the Blue Iris/Windows Server VM, so I'll rely on the 5700G's CPU cores to do the heavy lifting.

A GPU isn't needed for Blue Iris unless you want it to output to a local display, or you need the GPU for hardware acceleration (for CodeProject.AI or something).
 
About the GPU passthrough, going a bit off topic here, but I do want a GPU for Blue Iris. I have a couple of HDMI extenders showing my cameras on 3 TVs in the house. I've had trouble with video card passthrough in the past; every one I've tried started glitching after a while. Two months ago I ordered a Cable Matters USB HDMI adapter and it's working great. The easy part is that in the Proxmox web GUI you can just pick the adapter. You do need to install a driver in Windows, but it's up and running within 5 minutes.
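
For reference, the CLI equivalent of picking the adapter in the web GUI should be roughly this (the VM ID and the vendor:product ID are made up; get the real ID from lsusb):

# Find the adapter's USB vendor:product ID on the Proxmox host.
lsusb

# Attach that specific device to the VM (add usb3=1 if it's a USB 3 adapter).
qm set 110 -usb0 host=1a2b:3c4d,usb3=1

# Alternatively, pass through a physical port (bus-port), so whatever is
# plugged into that port follows the VM.
qm set 110 -usb0 host=1-2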
 
Wow, passing through a USB display adapter to a VM is actually brilliant. I bet it has way higher CPU overhead than a "real GPU", but almost certainly better overall energy efficiency, and with simpler drivers it would be less likely to have a problem with passthrough. Nice.
 