Attn: Networking Gurus - Help me understand this...

I would suggest an additional test to narrow down the issue and the possible solution. Try controlling the PTZ cam from the BI PC rather than from the phone on WiFi, and watch for disruption of the TV streaming. My guess is it will not occur.

4K can be quite demanding on a WiFi connection, and the addition of the phone streaming video from the BI PC could be the straw that broke the camel's back.

In a perfect world you would hardwire the TV; a wired connection is superior in every way.

There are many different WiFi standards with many different bandwidth limits. As others have suggested, replacing the ISP-provided WiFi access point may well provide relief. That modem is capable of 802.11n and 802.11ac WiFi at 2.4 GHz and 5.0 GHz. 802.11ac is faster than 802.11n, and 5.0 GHz offers more bandwidth than 2.4 GHz. The catch with frequency is that 2.4 GHz penetrates walls and other obstructions better, while 5.0 GHz is faster if obstructions aren't an issue.

By default many WiFi access points give both the 5.0 GHz and 2.4 GHz networks the same name. Only by looking at the specific connection details on the TV can you determine which network you are actually using, and some TVs do not show this information. I would suggest changing the SSIDs/WiFi network names to reflect their frequency, such as TellMyWiFiLoveHer2 and TellMyWiFiLoveHer5 instead of both being called TellMyWiFiLoveHer. This way you can be sure which network the TV is connecting to. Once you are sure the TV is on 5.0 GHz, repeat the tests using the phone on WiFi and see if the issue persists.

Additionally, not all 4K streaming is created equal. Depending on the level of compression and the codecs involved, different 4K streams from different providers, and even different 4K titles from the same provider, can vary significantly in bandwidth usage.
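To put rough numbers on that variability, here is a quick Python sketch; the bitrates are illustrative guesses, not figures from any particular provider:

```python
# Back-of-envelope: what different 4K stream bitrates cost in sustained
# bandwidth and data volume. Bitrates are illustrative assumptions only.

example_streams_mbps = {
    "heavily compressed 4K (HEVC)": 15,
    "typical 4K streaming service": 25,
    "high-bitrate 4K title": 40,
}

for name, mbps in example_streams_mbps.items():
    gb_per_hour = mbps / 8 * 3600 / 1000  # Mbps -> MB/s -> GB per hour
    print(f"{name}: {mbps} Mbps sustained, ~{gb_per_hour:.1f} GB/hour")
```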
 

Thanks Smoothie, your consensus as well as the others' was spot on. The new 4K TV streaming appeared to push things over the edge. I just cut another Cat5 cable and hardwired the TV, and the issues seem to have gone away.

So with a gigabit network... is there really any bottleneck other than the BI server (CPU) that can limit the number of cameras before a performance issue presents itself?
 
Glad you were able to hardwire it and that seems to have solved the issue.

I vaguely recall seeing systems mentioned in posts here with 32 or 64 cameras. I cannot recall which cameras they had, and I am certain the details of their configs were not present, but you get the general idea. You can have many cameras before really running into limits that either cost significant money to overcome or simply cannot be overcome.

The first bottleneck would be CPU and/or RAM. Processing the camera feeds is taxing, and adding more cameras makes the CPU work harder and harder; the same is true to a lesser extent for RAM.
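One crude way to feel out the CPU side is to total up the pixels per second the PC has to decode. A quick sketch; the resolutions, frame rates, and counts are hypothetical examples:

```python
# Crude proxy for decode load: total pixels per second across all feeds.
# Resolutions, frame rates, and camera counts are hypothetical examples.

cameras = [
    # (label, width, height, fps, count)
    ("4MP cams", 2560, 1440, 15, 8),
    ("8MP cams", 3840, 2160, 15, 4),
]

total_px_per_sec = sum(w * h * fps * n for _, w, h, fps, n in cameras)
print(f"~{total_px_per_sec / 1e6:.0f} megapixels/second to decode")
```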

The second would probably be hard drive write speed, or perhaps bus bandwidth to get the data to the hard drive to write it. It depends on bit rate, FPS, and resolution, as well as whether you are using H.264 or H.265.
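As a rough sketch of the disk side (the camera count and per-camera bitrate below are assumed examples; substitute your own):

```python
# Sketch: aggregate continuous-recording write load on the BI drive.
# Camera count and per-camera bitrate are assumed examples.

num_cameras = 16
bitrate_mbps = 8  # assumed per-camera H.264 main-stream bitrate

total_mbps = num_cameras * bitrate_mbps
write_mb_per_sec = total_mbps / 8              # Mbps -> MB/s
gb_per_day = write_mb_per_sec * 86400 / 1000   # sustained for 24 hours

print(f"{total_mbps} Mbps total = {write_mb_per_sec:.0f} MB/s to disk")
print(f"~{gb_per_day:.0f} GB/day of continuous recording")
```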

Both of the above are likely to occur on all but the most powerful BI PC builds. Without knowing the exact specs of your BI PC we would just be guessing. It is much the same as asking "how fast can my car take this corner?" The answer varies greatly depending on the details of that corner and on whether your car is a 1984 Nissan Sentra or a 2019 Lamborghini Aventador.

The last bottleneck you would likely run into is the network card on the BI PC. It has all the camera feeds coming in, plus any viewing streams going out to a smart phone, PC, or TV. Depending on bit rate, FPS, resolution, etc., the cameras can consume a surprising amount of bandwidth. Plus you have all the normal network chatter that a Windows PC generates simply by existing.
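A similar sketch for the NIC (counts and bitrates again assumed), keeping in mind that a full-duplex link carries each direction independently:

```python
# Sketch: how much of a gigabit NIC the camera feeds and viewers use.
# All counts and bitrates are assumed examples.

link_mbps = 1000      # gigabit NIC
num_cameras = 16
cam_mbps = 8          # assumed per-camera bitrate (inbound)
viewer_mbps = 8       # assumed per-viewer stream (outbound)
num_viewers = 2

inbound = num_cameras * cam_mbps
outbound = num_viewers * viewer_mbps
print(f"inbound:  {inbound} Mbps ({inbound / link_mbps:.0%} of the link)")
print(f"outbound: {outbound} Mbps ({outbound / link_mbps:.0%})")
# Full duplex carries each direction separately, so the inbound camera
# feeds are normally the side that fills up first.
```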

Classic Ethernet is a CSMA/CD (Carrier Sense Multiple Access with Collision Detection) technology, though strictly speaking that only applies to half-duplex or hub-based links; a modern switched gigabit connection runs full duplex, so collisions are not a factor there and the practical limit is simply saturating the link. I cannot say how many cameras it would take to reach this bottleneck, but I would expect it to be a great many, even if they were all run at max settings.

Switches are superior to hubs because of scalability. On shared Ethernet only one device can talk at a time, and as the number of devices on a single collision domain increases, the likelihood of a collision goes up. A hub is essentially a single shared wire, while a switch gives you a dedicated segment between any two ports, so ports 1 and 2 can talk to each other while 3 and 4 talk to each other without one pair affecting the other.

The single cable between the switch and the BI PC is its own segment. As more and more cameras talk to the BI PC nearly continuously, all of that traffic funnels onto that one link. Think of it like many freeways merging into one: the traffic has to go somewhere. On old shared (hub-based) Ethernet, collisions typically started degrading throughput somewhere around 40-50% utilization, while a full-duplex link has no collisions at all and can be pushed much closer to line rate; that is one of the main reasons dedicated full-duplex links (long standard on fiber, and universal on switched gigabit copper) are preferred for servers. Ethernet is designed to accommodate collisions when they do occur, but if too many devices are trying to talk, the collisions cause retransmits, which can themselves collide, and so on. On a Blue Iris setup, I would guess the result would be skipping frames on the BI PC feeds and, if it got bad enough, loss of connectivity to a camera.

CSMA/CD essentially works like this: a device wishing to transmit listens to the wire to check whether anyone is talking. If the wire is clear, it transmits, then listens to find out whether a collision occurred. If not, the process is complete. If a collision does occur, each device that detects it waits a random number of slot times, doubling the possible range after each successive collision (a scheme called binary exponential backoff), and then resends the data. So it is largely a first-come, first-served system with a randomized tiebreaker when there is a collision. This works great until there are so many devices trying to talk that it becomes like a crowded room full of omnidirectional noise.
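For illustration, here is a toy model of the 802.3 backoff rule. This is a simplified sketch, not a full Ethernet simulation:

```python
import random

# Toy illustration of CSMA/CD binary exponential backoff (802.3).
# After the nth consecutive collision, a station waits a random number
# of slot times in [0, 2**min(n, 10) - 1] before retrying; after 16
# failed attempts the frame is dropped. Simplified sketch only.

def backoff_slots(collision_count: int) -> int:
    """Random wait, in slot times, after the nth consecutive collision."""
    exponent = min(collision_count, 10)
    return random.randint(0, 2**exponent - 1)

for n in range(1, 6):
    print(f"collision #{n}: wait {backoff_slots(n)} slots "
          f"(possible range 0..{2**min(n, 10) - 1})")
```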

A common solution to this is to add a second network card to the BI PC, connected to a separate switch or VLAN, and put some of the cameras on one network/VLAN and some on the other. If both network cards connect to the same network segment, there is no reliable way to force one card versus the other to handle the traffic from a particular camera.

Heavy traffic is often a problem for servers, in that many workstations may be trying to talk to the server simultaneously. This is why many higher-end servers have multiple network cards and support more complicated setups such as NIC teaming, trunking, and so on.

One other bottleneck, which is very situationally specific, is the PoE budget on your switch. Assuming you power your cameras using PoE and they are all connected to the same PoE-equipped Ethernet switch, eventually you will reach the PoE limit for that switch. Different manufacturers implement this differently. Most switches outside of high-end data centers cannot supply full PoE+ power to all 48 ports simultaneously; most commonly, only some of the ports are PoE capable, or the whole switch gets a PoE budget of, say, 384 W spread across all PoE ports. You would need to check the documentation for your switch to know that make and model's specific implementation and limits.
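A trivial sanity check against a budget-style limit; the budget and per-camera wattages below are assumed examples, so check your switch's documentation and your cameras' spec sheets:

```python
# Sketch: totaling camera power draw against a switch's PoE budget.
# Budget and per-camera wattages are assumed examples.

poe_budget_watts = 370.0  # hypothetical 48-port switch budget
cameras = {
    "fixed cams (802.3af)": (12, 5.0),            # (count, watts each)
    "PTZ cams with heaters (802.3at)": (4, 25.0),
}

total = sum(count * watts for count, watts in cameras.values())
print(f"total draw: {total:.0f} W of a {poe_budget_watts:.0f} W budget")
if total > poe_budget_watts:
    print("over budget: the switch will refuse or shed power on some ports")
```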
 