I'm looking to run Blue Iris on an older server that we already have, a Dell PowerEdge R610 with dual-socket Xeon X5650s. That's 12 physical cores total at 2.67 GHz, with 32 GB of RAM. I don't know much about BI CPU usage. At home, I run a single camera on a similar system in a VM with no problems, but I have not tried scaling to multiple cameras.
Here at work we would like to run about 8 cameras on it, with direct to disk recording 24/7 with no motion detection. The cameras are 4K Amcrest cameras (which we don't mind running at 2K or even 1080p if necessary). I would imagine that this server would be plenty powerful if it's not doing any re-encoding? BI wouldn't be doing much more than dumping the data stream to disk, right? Should leave enough CPU for remote access/viewing, etc.
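For what it's worth, here's a rough back-of-envelope sketch of the aggregate disk throughput and daily storage that 8 cameras recording direct-to-disk would generate. The per-camera bitrates are assumptions (typical H.264 main-stream values, not measured from the Amcrest cameras), so treat the numbers as ballpark only:

```python
# Rough estimate of disk write load for 24/7 direct-to-disk recording.
# Bitrates below are ASSUMED typical H.264 values, not measured figures.

CAMERAS = 8
BITRATE_MBPS = {"4K": 16, "2K": 8, "1080p": 4}  # assumed Mbit/s per camera

for res, mbps in BITRATE_MBPS.items():
    total_mbps = CAMERAS * mbps                   # aggregate stream bitrate
    mb_per_sec = total_mbps / 8                   # megabits -> megabytes per second
    tb_per_day = mb_per_sec * 86_400 / 1_000_000  # MB/s over 24h -> TB/day
    print(f"{res}: {total_mbps} Mbit/s total, ~{tb_per_day:.2f} TB/day")
```

Even at the assumed 4K bitrate that's only around 16 MB/s of sequential writes, which is trivial for the disks; the storage volume (roughly 1.4 TB/day at 4K) is the bigger planning item.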
I'm aware a modern i5 or i7 would definitely be better than 8 year old Xeons, but if this will work then we don't want to buy a new system for it. We don't particularly care about power consumption, either.
I'm just trying to figure out if anybody knows of some reason that this setup would take more CPU than I think it would, given my inexperience with the software. Thanks for any input!