I see a lot of people and businesses moving away from traditional dedicated servers toward virtual environments controlled by a hypervisor, but I don't really understand why this is so popular.
I can think of two reasons why you might want to run applications in different, relatively isolated operating systems on one machine.
1. Security. If each application runs in its own virtual environment, a compromise or crash in one can't easily affect the others (sketched below).
2. Compatibility. Sometimes you just can't run all your software on one version of one operating system.
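To make point 1 concrete, here's roughly the kind of setup I picture: one guest per application, each with its own kernel, disk, and memory. This is just a sketch assuming QEMU/KVM is installed; the image names and memory sizes are made-up placeholders.

```python
# Minimal sketch: one QEMU/KVM guest per application. Each guest is a
# separate process with its own virtual hardware, so a crash or
# compromise inside one guest stays inside that guest.
# Disk image filenames below are hypothetical.
import subprocess

GUESTS = {
    "nas":    {"disk": "nas.qcow2",    "mem": "2048"},
    "camera": {"disk": "camera.qcow2", "mem": "4096"},
}

def launch(name, cfg):
    return subprocess.Popen([
        "qemu-system-x86_64",
        "-enable-kvm",            # use hardware virtualization if available
        "-name", name,
        "-m", cfg["mem"],         # guest RAM in MiB
        "-drive", f"file={cfg['disk']},format=qcow2",
        "-nic", "user",           # simple NAT networking, no host bridge needed
    ])

if __name__ == "__main__":
    procs = [launch(n, c) for n, c in GUESTS.items()]
    for p in procs:
        p.wait()
```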
Is this really all there is to it? People just have a bunch of low-power applications that for whatever reason work best when isolated from the others by a layer of virtualization?
I could not effectively consolidate my servers at home if I tried. FreeNAS? Requires dedicated hardware. Blue Iris? Uses all the CPU and then some. I run all my low-demand stuff on an Intel NUC that takes up no space and costs me maybe $5 a year in electricity. Maybe I am just really far from being the target audience for this hypervisor stuff?