Help me spend some money speccing out a 48-camera Blue Iris server and storage

Shark92651

Getting the hang of it
Joined
Oct 9, 2019
Messages
81
Reaction score
78
Location
Texas
I have a spare dual-Xeon Dell R320 rack server that I foolishly thought might be adequate for a BI system with 24 cameras. After some additional planning with my co-owner, those 24 cameras could now grow to 48. After learning that I really need a Core iX based system, I now want to gather information to help me put a system together, or buy one outright.

So it seems the best type of CPU and motherboard for a Blue Iris system is a "gaming" system using something like an Intel Core i7 or i9 processor. I also intend to run a couple of live-view remote systems, each showing a matrix of up to 16 camera feeds: one in the office and another out on the floor. For a system such as this, should I go ahead and get something like a Core i9-9900K?

I ran my idea for cameras through an online NVR storage calculator and I came up with a storage requirement of 34 TB: WD Surveillance Storage Capacity Estimator

48 Cameras, H.265
30 days retention
12 hours per day
2MP resolution
High video quality
Medium scene activity
18 fps
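As a rough sanity check on the calculator's number, here is the arithmetic with an assumed average bitrate. The ~4 Mbps per-camera figure is my assumption for a 2MP H.265 stream at high quality and 18 fps; the WD estimator's internal figure may differ, which would explain the gap to 34 TB.

```python
# Rough sanity check of the WD estimator's 34 TB figure.
# The 4 Mbps per-camera bitrate is an assumption, not the
# estimator's actual internal value.

def storage_tb(cameras, hours_per_day, days, mbps_per_camera):
    """Total storage in decimal terabytes (1 TB = 1e12 bytes)."""
    seconds = cameras * hours_per_day * days * 3600
    bytes_total = seconds * mbps_per_camera * 1_000_000 / 8
    return bytes_total / 1e12

print(round(storage_tb(48, 12, 30, 4.0), 1))  # ~31.1 TB
```

That lands in the same ballpark as the calculator's 34 TB, so the estimate looks plausible.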

So now I am wondering whether it is a better approach to put together a single machine based on the Core CPU and a chassis that supports a large RAID with hot-swap bays, or to keep the storage separate. I could use my R320, which has dual Xeon CPUs, 128 GB RAM, 8 HD bays, and a hardware RAID controller, as just a file share to host the camera data. The downside to this approach is that it is another Windows server that has to be maintained. Or is it better to use a separate dedicated NAS like a Synology?

Any input/opinions would be appreciated. Thanks
 
Last edited:

mech

Getting comfortable
Joined
May 18, 2019
Messages
326
Reaction score
427
Location
United States
The Core processors have the desirable Intel Quick Sync decoding, but at this time BI doesn't support H.265 with Quick Sync. If you want to use H.265 you will be doing straight-up CPU decoding, at which point the Xeons would be a contender.

I have a Core i9 with about 2/3 of the megapixels-per-second workload of what you're planning. Playing back 16 cams (H.264 with Quick Sync decoding acceleration, on a 1920 x 1200 monitor, using the economy "Fast" scaling rather than the CPU-intensive modes) will take the CPU up to 60% utilization at 1x forward playback speed. If I leave it playing back, the CPU will eventually hit 90%+. Storage throughput from a Seagate Exos helium drive hits about 40%, and Intel GPU usage goes up to 42%. Translation: short-term basic forward playback of 16 cams at 1x speed is feasible with this setup.

If I had $500 to throw at this system right now, the first thing I would do is smack a 2 TB NVMe SSD in there for short-term recordings, so playback of recent events doesn't have to pull from the mass-storage drive. Right now, the 10 TB drive has to record AND play simultaneously whenever playback is required. So as you think about your plan, factor in a big SSD, preferably NVMe, so you're not pulling recent playback from the same drive/array that's trying to record 48 cams.
 

mech

Getting comfortable
Joined
May 18, 2019
Messages
326
Reaction score
427
Location
United States
This will probably be of interest to you too:


You can sort according to MP/sec and/or number of cameras. Keep in mind that you want considerable performance headroom to account for playback situations, not just recording.
 

Shark92651

Getting the hang of it
Joined
Oct 9, 2019
Messages
81
Reaction score
78
Location
Texas
Thanks for the replies mech. I have not yet purchased the cameras so I could go H.264 if that is better given the BI support. As far as the remote live-view systems, do you know if it is possible to just pull the RTSP sub-streams directly from the cameras, thereby bypassing any additional load on the BI server itself? This is of course assuming the remote system is beefy enough to handle the multiple feeds.
 

SouthernYankee

IPCT Contributor
Joined
Feb 15, 2018
Messages
5,170
Reaction score
5,320
Location
Houston Tx
I am paranoid and also very conservative. For 48 cameras I would not put all my eggs in one basket if running BI. I would use a minimum of two systems and spread the cameras between them.
Most current cameras support H.264 and H.265.
A 1080p camera recording at:
fps: 15
Iframe: 15
Quality: 4
Max bit rate: 4096 Kb/Sec
bit rate type: VBR

Will use about 1.6 GB per hour of storage.

The amount of storage used depends on a lot of variables: amount of motion, amount of color changes, amount of sound, and the camera settings. So being conservative, 48 cameras at 12 hours per day and 2 GB/hour would be 96 GB/hour, about 1.2 terabytes per 12-hour day, and roughly 36 TB for 31 days. This leaves a lot of free space for overages.
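The conservative estimate above works out like this (using the rounded 2 GB/hour figure rather than the measured ~1.6 GB/hour):

```python
# Reproducing the conservative estimate: 48 cameras at 2 GB/hour
# each (measured usage was ~1.6 GB/hour), recording 12 hours per
# day for 31 days.
cameras = 48
gb_per_hour_each = 2
hours_per_day = 12
days = 31

gb_per_hour_total = cameras * gb_per_hour_each   # 96 GB/hour
gb_per_day = gb_per_hour_total * hours_per_day   # 1152 GB ~ 1.2 TB per day
tb_total = gb_per_day * days / 1000              # ~35.7 TB for 31 days
print(gb_per_hour_total, gb_per_day, round(tb_total, 1))
```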

I would recommend WD Purple disk drives. I would use four 10 TB disk drives (about $330 each).

You can pull multiple live feeds from most good cameras, and you can also set up and pull a lower-resolution substream from each camera. I do not know how to pull and display 16 cameras per screen directly. I guess you could run another copy of BI without recording and just display the console.
 
Last edited:
As an Amazon Associate IPCamTalk earns from qualifying purchases.

Shark92651

Getting the hang of it
Joined
Oct 9, 2019
Messages
81
Reaction score
78
Location
Texas
I would recommend WD Purple disk drives. I would use four 10 TB disk drives (about $330 each).
So do you recommend against using RAID for storage? Can BI be configured to write across multiple disk drives without RAID?
 

SouthernYankee

IPCT Contributor
Joined
Feb 15, 2018
Messages
5,170
Reaction score
5,320
Location
Houston Tx
RAID is a very bad idea for video storage. In real life, recovering a RAID is a major PIA. I have told this story before: I was working on a real-time messaging system, and my boss said that if he emptied his rifle into the computer room, all the data must survive.

Keep it real simple. Again, do not keep all your eggs in one basket. I assume everything will fail. I have a 64 GB memory card in each camera. I also have a separate NAS storing the video in near real time. In Blue Iris you can set up a clone camera to write to a NAS or another disk drive, with no additional inbound network traffic and very little BI PC CPU utilization.
 

mech

Getting comfortable
Joined
May 18, 2019
Messages
326
Reaction score
427
Location
United States
Thanks for the replies mech. I have not yet purchased the cameras so I could go H.264 if that is better given the BI support. As far as the remote live-view systems, do you know if it is possible to just pull the RTSP sub-streams directly from the cameras, thereby bypassing any additional load on the BI server itself? This is of course assuming the remote system is beefy enough to handle the multiple feeds.
It's likely your cams will support both H.264 and H.265 at your preference, so a quick check of their specs ought to verify that. I like the idea of spreading the work across two systems. You could pick up additional BI licenses for the viewing PCs, just not configure them to record, and then they could view the cams they need. Would that work?
 

Dan V

n3wb
Joined
Oct 13, 2019
Messages
2
Reaction score
0
Location
40258
I am in basically the same boat as Shark. I need to design a 48-camera system from scratch. I was looking for 3 MP to 5 MP resolution. I have had pros quote me $39k to $80k for 32 cameras, the $39k being TVI-based cameras. Unacceptable, so I am going to do it myself.

I am using the cameras in a 56,000 sq ft warehouse and a 6,500 sq ft two-story office space, with 5 outdoor cameras, 1 of which will be PTZ. I need help putting together this system. Should I use a server and a NAS? What about using multiple NVRs? What cameras should I use? I have been looking at Speco, Hikvision, and Dahua. I have only a basic knowledge of IP cameras, so this will be a difficult build for me.

I put a 10-camera system in my house a couple years back running BI and using Reolink cams, on a Dell with an i7-4770 chip and a 6 TB Purple drive. I run them through a 24-port Netgear PoE switch. Which brings up my next question: do I need to run switches? Obviously I do if I go NAS, but what about with NVRs? I will have runs of over 300 feet, so I know I will need extenders for those camera feeds. Most will be under 200-250 feet.

I will also need to be able to access any camera from a phone or tablet. I know BI does this, but I was never able to set it up so it actually worked. I will be running Cat6a cable for all cameras. My budget for this system is to stay under $30k. Please help.
 

SouthernYankee

IPCT Contributor
Joined
Feb 15, 2018
Messages
5,170
Reaction score
5,320
Location
Houston Tx
DAN V

The need for switches depends on the NVR and the distance to the cameras. If you have a 48-port PoE NVR then you may not need a switch. In general the max run length for a PoE cable is 100 meters, so if your cameras are more than 100 meters of cable distance from the NVR, you will need a switch.

If it were me, I would use multiple PoE switches so I did not have long cable runs. I run Blue Iris; I have two PoE switches in my house, on opposite sides of the house. I then have two cables from those switches running to a non-PoE switch which connects to my Blue Iris PC.

I recommend that if you use an NVR, the NVR and cameras come from the same manufacturer. It simplifies maintenance and configuration.
 

Dan V

n3wb
Joined
Oct 13, 2019
Messages
2
Reaction score
0
Location
40258
DAN V

The need for switches depends on the NVR and the distance to the cameras. If you have a 48-port PoE NVR then you may not need a switch. In general the max run length for a PoE cable is 100 meters, so if your cameras are more than 100 meters of cable distance from the NVR, you will need a switch.

If it were me, I would use multiple PoE switches so I did not have long cable runs. I run Blue Iris; I have two PoE switches in my house, on opposite sides of the house. I then have two cables from those switches running to a non-PoE switch which connects to my Blue Iris PC.

I recommend that if you use an NVR, the NVR and cameras come from the same manufacturer. It simplifies maintenance and configuration.

So would you place PoE switches in the ceiling out in the warehouse, make shorter cable runs to the cameras, and then run one wire from each PoE switch back to the NVRs?
What about using a server to access the video? The computer I use at home is pretty fast, and even with 10 cams using BI and optimizing it as much as you can with BI, it still runs a little slow. Also, should I apply static IPs to each cam? BI sometimes loses the IP addresses on my cams at home when they auto-reboot and doesn't reacquire the new ones, and I have to delete and re-add the cameras to get them back.
 

Shark92651

Getting the hang of it
Joined
Oct 9, 2019
Messages
81
Reaction score
78
Location
Texas
Also, should I apply static ips to each cam?
I think setting static IPs on cameras and servers is a must. I don't like to rely on DNS and I keep track of all my device names and IP addresses in a cloud-based spreadsheet.

I am also curious whether there is a performance penalty from using multiple switches that would justify the cost of multiple long cable runs. As long as the distance is not too great, for example, is it worth it to run 8 cables to individual cameras in an area vs. a single cable to an 8-port PoE switch that then has short runs to the cameras?
 

Dbirkett

n3wb
Joined
Sep 11, 2019
Messages
6
Reaction score
2
Location
Chicago
I currently run a 34-camera Blue Iris system on a Dell R730xd server. I chose the server to be able to support 48 cameras; I should have another 5 cameras online within the next month and can tell you what the CPU usage is then. Currently it is between 50 and 60%, running around 2800 MP/s, and consumes around 300 W. The server cost under $1200 for two 12-core Xeon processors with hyperthreading, 128 GB RAM, and redundant 1100 W PSUs. It has two "flex-bay" 2.5" drive bays in the rear, allowing for a RAID 1 SSD volume for the OS plus Blue Iris and its database, and I currently have 6 of the front drive bays populated with 10 TB drives in RAID 6, giving 40 TB of storage with redundancy for losing 2 drives. The PERC card supports adding drives and reconfiguring the array live, so adding drives in the future is not a problem, although it could take weeks or months for the RAID controller to rewrite the entire array to include the new disks. 10 TB Seagate Exos SAS drives can be had for under $300 each. I installed Windows 10 Pro on the server as a bare-metal install, no issues at all.
When choosing a server, keep in mind that Windows cannot support more than 64 CPU threads in a single processor group. A process cannot use more than one processor group, so having more than two 16-core hyperthreaded processors cannot increase your performance with Blue Iris.
I added a Cisco 3750 (WS-C3750-48PS) 48-port switch for under $50, plus an SFP transceiver for the uplink to the server; the camera ports are all 100 Mbit, and the uplink to the server is a gigabit SFP copper transceiver.
I put it all in a cheap Amazon open-frame server rack. Total system cost for server, drives, rails, and switch was under $4k. Cameras and wiring were pre-installed.
 

area651

Getting comfortable
Joined
Aug 18, 2018
Messages
471
Reaction score
411
Location
San Antonio/McAllen Texas
I am paranoid and also very conservative. For 48 cameras I would not put all my eggs in one basket if running BI. I would use a minimum of two systems and spread the cameras between them.
Most current cameras support H.264 and H.265.
A 1080p camera recording at:
fps: 15
Iframe: 15
Quality: 4
Max bit rate: 4096 Kb/Sec
bit rate type: VBR

Will use about 1.6 GB per hour of storage.

The amount of storage used depends on a lot of variables: amount of motion, amount of color changes, amount of sound, and the camera settings. So being conservative, 48 cameras at 12 hours per day and 2 GB/hour would be 96 GB/hour, about 1.2 terabytes per 12-hour day, and roughly 36 TB for 31 days. This leaves a lot of free space for overages.

I would recommend WD Purple disk drives. I would use four 10 TB disk drives (about $330 each).

You can pull multiple live feeds from most good cameras, and you can also set up and pull a lower-resolution substream from each camera. I do not know how to pull and display 16 cameras per screen directly. I guess you could run another copy of BI without recording and just display the console.

FWIW, keep in mind that I'm an anonymous voice in cyberspace, but I agree with everything SouthernYankee says here except when he talks about not using RAID. My personal view is that with this much work being put into it, I couldn't imagine NOT using a NAS for all storage needs. Then on top of that, I'd strongly consider having the NAS replicate either to the cloud or to another NAS somewhere. It just seems to me that if you just use bare drives and lose even one, you're losing a lot of video.

Good luck on everyone's mega builds. These will be entertaining and put my simple 9 camera setup to shame.
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,676
Reaction score
14,023
Location
USA
Hello @Shark92651 and @Dan V

Blue Iris may not be the best VMS for 48 modern cameras. It simply does not scale well when handling huge amounts of video.

However it may be usable depending on your needs.

I figure there are two main challenges.

1) Handling that much video

A little quick math to calculate megapixels per second:

(1920 x 1080) pixels per frame
x 18 frames per second
x 48 cameras = 1791590400 pixels per second

~ 1792 megapixels per second (MP/s)
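The arithmetic above, as a small helper you can reuse for other camera counts and frame rates:

```python
# Megapixels-per-second load for a set of identical cameras,
# matching the hand calculation above.
def megapixels_per_second(width, height, fps, cameras):
    return width * height * fps * cameras / 1_000_000

print(round(megapixels_per_second(1920, 1080, 18, 48)))  # ~1792 MP/s
print(round(megapixels_per_second(1920, 1080, 15, 48)))  # ~1493 MP/s at 15 FPS
```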

Blue Iris always decodes incoming video, even if you aren't viewing it or using Blue Iris's motion detection. A 1792 MP/s decoding load would bring an i9-9900K system to its knees, even using H.264 and Intel hardware acceleration. If you cut the frame rates back to 15 FPS for a load of 1493 MP/s, that would be better and should fit within the limits of the Quick Sync video decoder. Barely. The system would still be near its practical limit, but should have enough CPU left over for a few remote viewing instances, an occasional windows update, etc.

One feature you wouldn't be able to use reliably is multiple-clip synchronized playback via the timeline. Given your storage requirements, I assume you intend to record all cameras continuously. This means that playing all those recordings would double the MP/s load. It simply won't be possible to do multiple-clip playback except for very small groups of cameras.

So, Blue Iris has a feature called "Limit decoding unless required" which drastically reduces the CPU requirement for Blue Iris to take in video streams from cameras. What it does is make Blue Iris only decode the i-frames in the live video, while still recording all frames to disk, so your recordings remain at full frame rate. This could potentially reduce the live video processing/recording load enough to make it possible to use Blue Iris's timeline to play all clips at once. I can't really say for sure, as I don't run any BI systems at this scale.

The main downside to using the "Limit decoding" feature is the effect it has on live viewing. Instead of each camera in your camera grid showing smooth video, it will be like a slide show with one frame every 1+ seconds (depending on each camera's i-frame interval), and that video will be delayed by at least several seconds (maybe up to 20-30 seconds). You can maximize a camera and Blue Iris will temporarily decode all frames from that camera, so you can at least still view a live camera smoothly, if only one camera at a time. Clip playback will happen at full frame rate.

The other downside to "Limit decoding" is that Blue Iris's motion detector can only process frames which have been decoded, so its effectiveness will be reduced.

2) Large-scale storage

As noted, large-scale storage systems are complex.

One of the issues with running a RAID or RAID-style array of disks is what happens when a disk fails. An array in a degraded state (with a failed disk) generally runs slower, and resilvering takes a long time. Even longer if you are still writing 150+ Mbps of video to it while it tries to resilver. I couldn't tell you how long exactly, only that it would likely be measured in days.
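To put a rough lower bound on that: a rebuild must rewrite at least one full member disk. The rebuild speeds here are my assumptions (~150 MB/s as an idle best case, far less while the array is also ingesting dozens of video streams), not measured figures:

```python
# Rough lower bound on RAID rebuild time for one 10 TB member disk.
# Both MB/s figures are assumptions: ~150 MB/s idle best case,
# perhaps ~30 MB/s while the array is also recording 48 cameras.
disk_bytes = 10e12

def rebuild_hours(mb_per_sec):
    return disk_bytes / (mb_per_sec * 1e6) / 3600

print(round(rebuild_hours(150)))  # ~19 hours, idle best case
print(round(rebuild_hours(30)))   # ~93 hours (about 4 days) under load
```

Even the optimistic case is most of a day, which is why degraded-array rebuilds on a busy recorder end up measured in days.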

My recommendation

Here is how I would tackle a 48x 2MP camera system for continuous recording.

I would probably build two identical boxes:

* i9-9900K CPU
* 16 GB RAM (dual channel / two sticks in appropriate slots)
* 256 GB SSD for the OS, BI, and clip database
* Two 8 or 10 TB hard drives. No RAID array, just Blue Iris configured to record 12 cameras to one drive, and 12 cameras to the other drive.

I would use "Limit decoding unless required" on (almost?) all of the cameras, and make sure to uncheck "Require/decode all camera frames when streaming" here for all camera groups:




The resulting systems should run at under 10% CPU load most of the time and therefore not consume a ridiculous amount of power, and the CPU is powerful enough you should be able to use the timeline control to review all 24 cameras at once at full frame rate when needed.

The i9-9900K is actually overkill for 24x 2MP cameras at 15 FPS; however, it leaves room for future expansion, and buying a lesser CPU would not save a huge amount of money anyway. Cheap eBay systems are a bit underpowered for this load.

I am also curious if there is a performance penalty from using multiple switches which justifies the cost of multiple long cable runs. As long as the distance is not too great, for example is it worth it to run 8 cables to individual cameras in an area vs a single cable to an 8-port POE switch that then has short runs to the cameras?
No, there is not a significant performance penalty as long as you keep the total bandwidth on each network link well within the capabilities of the link. Which, if everything is gigabit, is no problem for an IP cam network of this size.
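A quick back-of-the-envelope check makes this concrete. The ~4 Mbps per-camera main-stream bitrate is an assumption; substitute your cameras' actual bitrate:

```python
# Link utilization for an 8-camera PoE switch with one gigabit
# uplink. The 4 Mbps per-camera bitrate is an assumption.
cameras_on_switch = 8
mbps_per_camera = 4
uplink_mbps = 1000

total_mbps = cameras_on_switch * mbps_per_camera   # 32 Mbps aggregate
utilization = total_mbps / uplink_mbps             # tiny fraction of the link
print(total_mbps, f"{utilization:.1%}")
```

At a few percent of a gigabit uplink, the aggregation switch costs essentially nothing in performance.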
 

SouthernYankee

IPCT Contributor
Joined
Feb 15, 2018
Messages
5,170
Reaction score
5,320
Location
Houston Tx
area651

Ever recovered from a RAID disk failure? I have seen a recovery on a live system take more than a day for a simple 10 TB drive setup in RAID 5 with a hardware controller. Also, a RAID can double the amount of writes on a local system, and some RAIDs increase the amount of reads a system does, increasing the wear on the drive system. Simply use the network and store the data in three separate places.

I never said to use a NAS for all storage needs. I said to use a NAS as the third-level storage.
 

Shark92651

Getting the hang of it
Joined
Oct 9, 2019
Messages
81
Reaction score
78
Location
Texas
Hello @Shark92651 and @Dan V

Blue Iris may not be the best VMS for 48 modern cameras. It simply does not scale well when handling huge amounts of video. …
Thanks for the detailed reply, I really appreciate it, and I am planning a 2-server system thanks to your input and that from SouthernYankee.

Another question, what sort of video card should I be looking at for these servers?
 
Last edited:

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,902
Reaction score
21,274
Thanks for the detailed reply, I really appreciate it and I am planning a 2 server system thanks to your input and that from SouthernYankee.

Another question, what sort of video card should I be looking at for these servers?
Just use a proper commercial VMS for something like this: DW Spectrum IPVMS (rebranded Network Optix Nx Witness). It's about 70 bucks a license with free lifetime upgrades. Nx just released v4.
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,676
Reaction score
14,023
Location
USA
Another question, what sort of video card should I be looking at for these servers?
Onboard graphics, unless you find that having the local console open adds too much CPU load. In that case you can put in something minimal like a GT 710 or GT 1030 to connect the monitor, and that will reduce the CPU load a bit for rendering the local console.
 

Shark92651

Getting the hang of it
Joined
Oct 9, 2019
Messages
81
Reaction score
78
Location
Texas
I just put together a parts list and ordered these components on Amazon for the first server. If it all works well, I will build a second system with the same components. The total is about $1800 before tax, which doesn't seem too bad to me for this system. I am putting it in a Rosewill rackmount chassis and went with a 256 GB M.2 drive for the OS and BI software. Apparently the Rosewill chassis comes with 3 crappy 80mm chassis fans with Molex connectors, so I ordered some replacement Noctua 80mm fans that I can plug into the fan controller on the Asus motherboard, which supports a total of 4 fans (2 for CPU and 2 for chassis). I will report back on my progress once I get the parts in and can assemble it.

Intel Core i9-9900K
ASUS TUF Z390M-Pro Gaming LGA1151
Corsair Vengeance LPX 16GB DDR4 3000
ADATA 256GB M.2
Western Digital Purple 10TB x 2
Seasonic FOCUS Plus 550 Gold
Thermalright Le Grand Macho RT Fan
Rosewill RSV-Z2600
Noctua NF-R8 redux-1800 80mm fan x 3
 
Last edited: