Hi. I was wondering if someone could help me understand network bandwidth, or more specifically, how the cameras communicate with other devices.
Do the cameras just broadcast on the network and you configure a device to 'hear' them, or do you configure a device to connect directly to each camera?
Here's an example of what I'm trying to figure out:
I have 19 cameras (FI8910W) that I inherited at a new client. I grabbed an older Dell Precision workstation, installed Blue Iris, and configured all of the cameras on it. It maxes out the CPU when viewing live video, but doesn't seem too bad (about 50% utilization) when not viewing live video and not recording any camera stream. The owner of the company has a 'buddy' who said we should put in another computer and use that one for viewing, leaving this one just for recording. When I look at Task Manager on the current computer, network bandwidth is at about 45 Mbps. I don't want to put in another computer, but of course I need to justify why not. So my question is: if I put in a second computer running Blue Iris and set up all the cameras on it, will I double the total network bandwidth from 45 Mbps to 90 Mbps, or will adding a second 'listening' computer not increase network bandwidth?
What I want them to do is just buy a killer i7 machine and run Blue Iris on that instead of this older Dell Precision 680, which has an Intel Xeon at 3.2 GHz with 4 logical processors (1 physical processor).
So I think I'm basically asking: if the cameras broadcast, adding another listening computer won't increase bandwidth; but if a separate connection is made for each listening device, I'll end up using twice the bandwidth, and 90 Mbps on a 100 Mbps network probably isn't a good option.
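Here's my rough back-of-envelope math, assuming (and this is the assumption I'm trying to confirm) that each viewing computer pulls its own copy of every camera's stream, and that all 19 cameras run at roughly the same bitrate:

```python
# Rough bandwidth estimate -- assumes each Blue Iris machine opens its own
# stream to every camera (per-client connections, not a shared broadcast).

cameras = 19
total_mbps_one_client = 45.0                      # what Task Manager shows today
per_camera_mbps = total_mbps_one_client / cameras # ~2.4 Mbps per camera

clients = 2                                       # recording PC + separate viewing PC
total_mbps = per_camera_mbps * cameras * clients  # ~90 Mbps

print(f"Per camera: {per_camera_mbps:.1f} Mbps")
print(f"With {clients} clients: {total_mbps:.0f} Mbps on a 100 Mbps network")
```

If that assumption is right, the second computer pushes us close to saturating the 100 Mbps network, which is exactly why I'd rather just replace the one machine.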
Thanks!