I’m considering selling my gaming computer and buying a Dell R720XD/SuperMicro X10 server to run my VMs and hold 12-24 HDDs for Plex video storage, or adding something like a Dell PowerVault DAS to the HP 8300. Since the gaming computer holds 7 HDDs and I still have it, I figure I can work out what kind of RAID/redundancy I’d like, set it up now, and migrate it to a server once the gaming computer sells, or just keep it until I run out of HDD room.
That said, I decided to install Proxmox on the gaming computer, restore a Windows 10 VM backup from the 8300, and connect it to my switch so it sits behind the 8300 pfSense router. This way, if I decide to play with Plex, mess with Proxmox, and need to reboot, I’m not impacting anybody else in the house. I like this configuration much better, and since the 8300 consumes 30W, I’m probably going to find dedicated hardware for pfSense that uses 10W or less. At $0.13 per kWh, running 24x7x365, that’s $23-$27 in savings per year, so a dedicated device pays for itself in about 6 years and starts saving money thereafter, which makes sense to me.
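For anyone who wants to sanity-check the math, here’s the back-of-the-napkin calculation as a quick Python sketch. The $150 device price is just a placeholder I picked for the payback estimate, not an actual quote, and the 5W/10W figures are the assumed draw of the replacement box:

```python
# Back-of-the-napkin pfSense power math.
# Assumptions: $0.13/kWh, the HP 8300 drawing 30W, a replacement
# box drawing 5-10W, and a hypothetical $150 device price.

RATE = 0.13               # dollars per kWh
HOURS_PER_YEAR = 24 * 365

def annual_cost(watts: float) -> float:
    """Annual electricity cost for a device running 24x7x365."""
    kwh_per_year = watts / 1000 * HOURS_PER_YEAR
    return kwh_per_year * RATE

current = annual_cost(30)  # HP 8300 as the pfSense box: ~$34/yr
for replacement_watts in (5, 10):
    savings = current - annual_cost(replacement_watts)
    payback_years = 150 / savings  # hypothetical $150 device
    print(f"{replacement_watts}W box: saves ${savings:.2f}/yr, "
          f"pays back in {payback_years:.1f} years")
```

A 10W box saves about $23/year (6.6-year payback on $150) and a 5W box saves about $28/year (5.3-year payback), which is where the $23-$27 and roughly-6-years figures come from.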
But what’s really interesting to me is that running my gaming computer virtualized, rather than Windows 10 on bare metal, is only consuming 70W with the CPU at 13-15%. My other tests were around 4-5pm in daylight, and it’s dark now, so I suspect CPU usage will go up, but I’m really curious to see whether the power stays low, since the HP 8300 was averaging 60-65W and the HP 4770 was around 90W, just like this machine was before virtualizing. Does virtualizing lower power consumption? I’m going to check the CPU temp tomorrow and see how it compares to the HP 4770, just for fun.