What high spec server would you choose for 50 cameras?

What are your specifications for a high-spec 'dream' camera server to run Blue Iris with 50 cameras in a business environment?

Or can anyone point me in the direction of a hardware calculator or another post that has tackled this?

There are so many options to choose from: Which processor? Should they be dual? Which graphics card? How many surveillance-grade HDDs, and should these be in the server or should the storage be on a NAS?

Some information:

Cameras: 50 planned, connected by Ethernet cable, optimised to 15 fps, and using substreams to view internal environments.

Environment: A shop and warehouse. A LAN is in place with two other servers; this server would be dedicated to Blue Iris and DeepStack only, with the SQL database and email server on the other machines.

Viewing: Mostly viewing on premises (might review remotely over the internet at some stage)

Storage requirement: 2 weeks preferred

Budget: £5000 or $6000 (just in case anyone suggests solid-state storage drives!)

Preferred brand: Dell using the latest Windows OS



Other notes:

Current setup: Two Synology NASes with ten cameras each and a mishmash of IP cameras. Mostly used for tracking down mistakes that happen at the tills.

My technical level: Beginner but I have access to IT pros!

Any suggestions or pointing in the right direction would be much appreciated!
 
Thanks for raising this. Yes, that is one I was going to look at after getting the server straightened out. Rather than one huge switch, it would probably be ten switches which could power the cameras by PoE (say five cameras on each switch).

But it would need to be worked out with an Ethernet backbone to get the traffic back to the server, so there may well be a central switch processing a lot of camera traffic from the other, more far-flung switches and then connecting to the server.

The other traffic on the network is POS and ERP software.

Note: The premises are historic buildings on the high street, so the shops are not particularly large, just a bit higgledy-piggledy (in the British sense of the word, not the American chain), with multiple storerooms in different rooms (which means longer cable runs) and more areas to cover with cameras.
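
A rough sketch of what that layout means for bandwidth, assuming placeholder per-camera bitrates (4 Mbps main stream plus 1 Mbps substream, purely illustrative and not measured from these cameras):

```python
# Back-of-the-envelope bandwidth check for the proposed layout:
# ten edge switches, five cameras each, aggregating to one central switch.
# The bitrates are illustrative assumptions, not measured values.
MAIN_STREAM_MBPS = 4.0      # assumed average main-stream bitrate per camera
SUB_STREAM_MBPS = 1.0       # assumed average substream bitrate per camera
CAMERAS_PER_SWITCH = 5
SWITCH_COUNT = 10
GIGABIT_MBPS = 1000.0

per_camera = MAIN_STREAM_MBPS + SUB_STREAM_MBPS
per_switch_uplink = CAMERAS_PER_SWITCH * per_camera    # each edge switch's uplink
server_link = SWITCH_COUNT * per_switch_uplink         # total arriving at the server NIC

print(f"Per-switch uplink: {per_switch_uplink:.0f} Mbps")
print(f"Aggregate at the central switch/server: {server_link:.0f} Mbps "
      f"({server_link / GIGABIT_MBPS:.0%} of a gigabit link)")
```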
 
With that many cameras you'd be looking at a 10 Gb backbone.
You'll be wanting substantial disk performance.
I'd be looking at SSDs as the primary recording disks and then archiving off to slower disks for longer-term storage.
Say a nightly transfer.
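
A minimal sketch of what that nightly transfer could look like, assuming hypothetical folder paths; Blue Iris can also move clips between storage folders natively, which is normally a better option than an external script:

```python
# Minimal sketch of a nightly "move to archive" job: shift clips older than
# 24 hours from a fast SSD recording folder to a larger HDD volume.
# The paths are made up for illustration.
import shutil
import time
from pathlib import Path

SSD_FOLDER = Path(r"D:\BlueIris\New")         # hypothetical SSD recording folder
ARCHIVE_FOLDER = Path(r"E:\BlueIris\Stored")  # hypothetical HDD archive folder
MAX_AGE_HOURS = 24

cutoff = time.time() - MAX_AGE_HOURS * 3600
ARCHIVE_FOLDER.mkdir(parents=True, exist_ok=True)

for clip in SSD_FOLDER.glob("*.bvr"):         # .bvr is Blue Iris's clip format
    if clip.stat().st_mtime < cutoff:
        shutil.move(str(clip), str(ARCHIVE_FOLDER / clip.name))
        print(f"archived {clip.name}")
```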
 

I disagree; a gigabit network will be OK. I've set up 150-camera networks with no 10 Gb switches. For high-performance storage, RAID 10 can be used. To optimize the network, simply have all cameras dedicated to a single NIC on the server and use a different NIC for the business network, similar to how most here set up their home systems.
 
You had 150 cameras running through a single gigabit switch?

Cameras were distributed across multiple switches but ultimately they aggregated via fiber to a single switch and to multiple recorders. Teaming can also be set up on the server(s); I didn't need to do that.
 
Thanks Valiant and Looktall for your views on this.

That’s a good point about SSD as primary recording. Although I was hoping to use large surveillance-grade HDDs (e.g. IronWolfs) and rely on the lowest frame rate possible and the latest compression technology to bring down the size, now I am wondering whether the server housing can fit all those drives! Might have to drop to ten days' storage.

Thanks for mentioning the NIC (network interface card). I guess we have to make sure we specify multiple NICs on the new server just to be safe. I think one slot is already taken by the DRAC (Dell Remote Access Controller) on each server so IT can get in if absolutely needed.

I think the cameras are on a different subnet to everything else so that might be a way to group it and tell it to use one NIC.

We are doing a refurbishment of the premises, so just to be sure we might have dedicated camera switches and backbone, and another set for the POS/ERP data. Might even colour code the network cables so that we don’t ‘cross contaminate’, as it were.

The camera footage is really not mission critical like our SQL database, so I guess we would be considering a low-performance RAID (or none at all?).

I’ll have a read on teaming as I have not heard of that.

Lots of ideas, thank you!
 
A few notes:

1) I wouldn't bother with a NAS. Configure the local disks as just a bunch of disks and distribute the cameras evenly over them. The number and size will be determined by how long you want to retain the video.

2) No need for an SSD for recording video. A bunch of disks will be plenty fast enough. On the other hand, you definitely want the BI database to be on an SSD. Before I switched, my daily database optimizations (rebuilds?) were taking 30+ minutes.

3) You will probably want two NICs in the computer. My network usage is 250 Mbps; three times this would be too much for a single 1 Gbps NIC. Unless, of course, you use lower frame rates or image quality.

4) A good switch should be able to handle the load with no problem. Traffic from the cameras through a switch and on to the BI computer should not interfere with any other traffic; switches are designed to allow full-speed switching without slowing down. But don't let the camera traffic span any Ethernet lines carrying other traffic. And don't try to use the router as a switch for the camera traffic; most routers can't handle the extra load. You said you have access to IT pros. Use them.

My system has 17-18 cameras, half 4K, all running at 20 Hz and very high quality. My typical CPU load is 10-15%, spiking to 100% when AI is examining images. Most cameras are using camera triggers, not BI. I have one internal 10 TB disk and two external 14 TB eSATA disks. This provides over 400 hours (16 days) of saved recordings. My LAN is set up as router --> 16-port switch --> everything local, including two PoE switches and about 50 devices.
 
CPU performance is not as big a priority as it used to be, thanks to sub stream support. Dual processors are absolutely unnecessary. It should be more than sufficient to pick a CPU with a cpubenchmark.net score of at least 20,000, with at least 2,500 on the single-threaded score. Even a low-tier i5 CPU from the latest Intel generation far exceeds that performance level. Just don't get screwed on CPU performance in order to have "server grade" hardware; I know the types of CPUs that Dell and the like put into entry-level servers.

I'd go with 32 GB of RAM. If you run AI stuff on this server, then the AI program is likely to use more than Blue Iris if my own instance is any indication. So maybe consider 64 GB if the RAM is cheap.

No need for a 10 Gbps network. Gigabit would be fine given the bandwidth constraints of most of today's cameras. Practically any gigabit switch has a fast enough switching fabric that you can saturate all the ports at the same time. But if you go that way you should consider having two NICs in the server so one can be dedicated to the camera network. You wouldn't want other traffic to interfere with camera streaming, or vice versa. It is also a great deal more secure if you do not have the IP cameras on a network that has internet access.

For storage for two weeks, you'll probably be depending on Blue Iris's ability to record sub streams continuously, and add in the main stream when motion is detected. I believe that is the "Continuous + Triggered" option in this menu:

[screenshot of the Blue Iris record settings menu showing the "Continuous + Triggered" option]

Imagine each sub stream averages 1 Mbps, which is probably reasonable; then with 50 cameras you have 50 Mbps. Multiply by 14 days and you have 7.56 terabytes. That would be your absolute minimum storage. You will frequently be recording main streams too, though, and main streams are a lot bigger. If you aim kind of high and assume there will be 250 Mbps of data being written on average, then the storage requirement is 37.8 TB (assuming about 10 Mbps average per main stream, that would allow for 20 main streams plus all 50 sub streams to be recording at all times).

However much storage you get, make sure you get surveillance-rated drives. And if you want to do a RAID array, consider that resilvering a large RAID that is undergoing a high rate of writes may be absurdly slow and may affect recording performance while it is going on. Me personally, I would rather run a totally independent second Blue Iris server than record continuously to a RAID array. You get far better redundancy that way.
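
For anyone who wants to rerun that arithmetic with their own numbers, here is the same calculation as a small sketch (decimal terabytes, the way drives are sold):

```python
# Reproduce the storage arithmetic above (decimal terabytes).
def storage_tb(avg_write_mbps: float, days: float) -> float:
    """TB written at a sustained average bitrate over a retention window."""
    bits = avg_write_mbps * 1e6 * days * 24 * 3600
    return bits / 8 / 1e12            # bits -> bytes -> TB

print(storage_tb(50, 14))    # 50 sub streams at 1 Mbps each -> ~7.56 TB minimum
print(storage_tb(250, 14))   # pessimistic 250 Mbps average  -> ~37.8 TB
```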

I would not use a NAS for storage.

But a high-endurance SSD for initial video storage, before moving the data to mechanical HDDs, is actually a really good idea, especially if you're going to use the timeline a lot to review recent video. The timeline works a lot better when loading clips from an SSD.

Windows Server editions are unnecessary, but should work if that is what you get. Turn off automatic updates both for Blue Iris and for Windows, unless you don't care about unexpected outages and problems cropping up randomly.
 
No need for a 10 Gbps network. Gigabit would be fine given the bandwidth constraints of most of today's cameras. Practically any gigabit switch has a fast enough switching fabric that you can saturate all the ports at the same time. But if you go that way you should consider having two NICs in the server so one can be dedicated to the camera network. You wouldn't want other traffic to interfere with camera streaming, or vice versa. It is also a great deal more secure if you do not have the IP cameras on a network that has internet access.

You mean different subnet, different VLAN or something else?
 
You mean different subnet, different VLAN or something else?

My take is it just means keeping the traffic separated; how it's done is debatable. It can be done using VLANs on existing switches (the IT department can allocate some ports across switches if they are PoE capable), or an easier way is to have unmanaged switches used exclusively for PoE/CCTV.

In both cases the IP addressing should be selected to ensure there is no overlap or confusion with existing network.
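
A quick way to sanity-check that, using example address ranges only (substitute the real ones):

```python
# Quick overlap check between the existing LAN and a proposed camera subnet.
# The address ranges here are examples only.
import ipaddress

business_lan = ipaddress.ip_network("192.168.1.0/24")   # assumed existing POS/ERP subnet
camera_lan = ipaddress.ip_network("192.168.50.0/24")    # proposed camera-only subnet

if business_lan.overlaps(camera_lan):
    print("Overlap - pick a different camera range.")
else:
    print(f"OK: {camera_lan} is separate from {business_lan} "
          f"and leaves room for {camera_lan.num_addresses - 2} hosts.")
```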
 
You mean different subnet, different VLAN or something else?

I mean, 50 cameras, main and sub streams, will use a significant amount of bandwidth on a gigabit link. So you might want to arrange things so the camera traffic all enters the Blue Iris server through a dedicated network interface; that way any other traffic in and out of the system won't need to share the same link.

You could do that by installing a separate physical network dedicated to the cameras (best for cybersecurity), or with VLANs, or just by assigning static IPs in a different subnet. In my own home I do the latter; I don't want to fuss around with running extra network cables or the complexity of VLANs. So my cameras are not truly isolated from the internet: for example, if they use IPv6 at all they could get to the internet that way, because most of the cameras I own don't even let me configure (or disable) IPv6.
 

Dell R350
8 cores / 16 threads Xeon
32 GB memory
2x 1 TB SSD (RAID 1) for the OS and very fast access to the footage of the last 24 hours
4x 4 TB 7200 RPM SATA disks (RAID 5) for the footage archive (after 24 h)
36 months on-site NBD support
Optional: Coral TPU as AI accelerator (only with Proxmox or ESXi as the host OS) - no Dell support
Optional: GPU as AI accelerator - limited Dell support

would be around $4,500 (via a Dell partner)



Dell R6515
16 cores / 32 threads AMD EPYC
64 GB memory
6x 2.4 TB 10k RPM SAS disks
60 months on-site NBD support
very limited GPU support (1U chassis)
Optional: Coral TPU as AI accelerator (only with Proxmox or ESXi as the host OS) - no Dell support

would be around $6,000 (via a Dell partner)
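
As a rough cross-check against the storage figures earlier in the thread (7.56 TB minimum, roughly 37.8 TB pessimistic), the usable capacity of the R350 layout works out as below, using the standard RAID 1 / RAID 5 capacity rules; the estimates are simply carried over from that earlier post.

```python
# Rough usable-capacity check of the R350 layout against the two-week
# storage estimates quoted earlier in the thread.
SSD_TIER_TB = 1.0                 # 2 x 1 TB in RAID 1 -> capacity of one disk
ARCHIVE_TIER_TB = (4 - 1) * 4.0   # 4 x 4 TB in RAID 5 -> one disk lost to parity

MIN_ESTIMATE_TB = 7.56            # substream-only minimum from the earlier post
HIGH_ESTIMATE_TB = 37.8           # pessimistic 250 Mbps average from the same post

total = SSD_TIER_TB + ARCHIVE_TIER_TB
print(f"Usable: {total:.0f} TB total "
      f"(clears the {MIN_ESTIMATE_TB} TB minimum, "
      f"well short of the {HIGH_ESTIMATE_TB} TB worst case)")
```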
 
Lmao. He would have to be clinically insane to spend that kind of money.
 
Like spending 12-15k USD on 50 Hikvision cameras when you can get the "same" for 2k USD from Reolink? ;)

It is a configuration within the given budget and quite reasonable for a business setup with 50+ cameras. Of course, you can save on storage and buy cheaper disks without on-site support. You might pay that back twice over later when you have to deal with everything yourself.
 
He can literally buy between six and eight Dell business systems that would handle his load for your price quote. You didn't even include the processor model number, which is the most important part of the equation.
 

He "literally" asked for a server and specified a budget.

The CPU is defined by the server model, and it does not matter whether you choose it at 2.4 or 2.8 GHz.
 
He "literally" asked for a server and specified a budget.

The CPU is defined with the server model and it does not matter if you choose it with 2.4 or 2.8ghz.
You don't seem to realize that clock speed is meaningless unless we know the exact processor model. How long have you been selling for Dell? A server can be any PC; it doesn't need Dell's server designation to meet the criteria. Also note that you recommend 16 TB of storage for 50 cams. Wow. FYI, most Dell business systems come with 3 years on-site support, or it's available cheap, not that it will ever be needed.
 