Hello. I am relatively new to using Blue Iris and IP cameras in general. I am having a strange bandwidth issue that I am hoping somebody might be able to help me understand...
I have 13 cameras (set to 1080p, 10 FPS, and a 1024 kbps bit rate) connected to an unmanaged switch. The switch has 2 gigabit ports: one is connected to the modem, and the other to the PC that is running Blue Iris and storing the video files.
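For reference, here is my back-of-envelope math on the total camera traffic (assuming the 1024 figure really is kbps per camera, which I believe is the unit Blue Iris uses):

13 cameras x 1024 kbps = 13,312 kbps, or roughly 13.3 Mbps combined

That should be trivial for a gigabit switch, although I do notice it is more than my 10 Mbps upload.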
The internet connection is 50 Mbps down, 10 Mbps up.
As I understand it, Blue Iris should not be using my modem's download bandwidth for much of anything, and should only use the upload bandwidth when I am remote viewing through the web server. However, when I enable the cameras one by one in Blue Iris and run an internet speed test, my download speed rapidly drops to the point that the internet becomes unresponsive for every device connected to that modem. This happens whether or not the web server is running.
Why is Blue Iris seemingly choking out my modem's download speed when (if I understand correctly) it should primarily affect only my upload speed?