5.5.9 - June 28, 2022 - "direct to wire"

I'm seeing a bunch of lag with direct-to-wire as well. Technically, the stream slowly falls behind, which accumulates into a large lag.
If I solo a camera view in UI3 and then open the all-camera index in another window, I can watch the lag slowly accumulate until it's multiple seconds.

In the screenshot below, I have the camera open twice (once in solo view, once in index view). You can see 20+ seconds of difference in the timestamps (generated by the camera).
Did the same test. After 2 minutes, 1 second difference.
[Screenshot: the same camera open in two views, showing the timestamp difference]
 
Oh wow, in 2 minutes mine fell behind ~30 seconds. Wonder why the huge difference.
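If you want to put a number on the drift rather than eyeballing the camera's burned-in timestamp against a clock, here is a minimal sketch of that same comparison. It assumes the overlay sits in a known corner of the frame and uses a "YYYY-MM-DD HH:MM:SS" format; the crop coordinates, format string, and file name are placeholders, not anything from this thread.

```python
# Rough sketch: estimate how far a displayed frame lags the local clock by
# OCR'ing the camera's burned-in timestamp from a saved frame/screenshot.
# Requires opencv-python and pytesseract (plus the Tesseract binary).
from datetime import datetime

import cv2          # pip install opencv-python
import pytesseract  # pip install pytesseract

CROP = (0, 0, 600, 60)           # x, y, width, height of the timestamp overlay (assumed)
TS_FORMAT = "%Y-%m-%d %H:%M:%S"  # format the camera burns into the frame (assumed)

def estimate_drift(frame_path: str) -> float:
    """Seconds the frame's timestamp is behind the local clock.
    Grab the frame and run this as close together in time as possible."""
    frame = cv2.imread(frame_path)
    x, y, w, h = CROP
    roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    text = pytesseract.image_to_string(roi).strip()
    # strptime will raise if OCR misreads the overlay; clean up `text` as needed
    shown = datetime.strptime(text, TS_FORMAT)
    return (datetime.now() - shown).total_seconds()

if __name__ == "__main__":
    print(f"Drift: {estimate_drift('frame.png'):.1f} s")
```

Run it against two frames captured a couple of minutes apart and the difference between the two readings is how fast the stream is falling behind.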

Tried a few different OS/browser combinations to see if they lag:
Win10 Firefox: Yes
Win10 Chrome: Yes
MacOS Firefox: Yes
MacOS Safari: Yes
 
I'm using Windows Firefox.
 
Make sure you're on BI 5.5.9.2, as there were a few bugfixes related to direct-to-wire. If you continue to get delay, right-click the video, open Stats for nerds, and screenshot that.

As the developer of UI3, I am not interested in using Direct-to-wire personally because:
  • I make sure my BI servers have plenty of CPU time to spare for web server video encoding.
  • To my eyes, the 2160p(4K) streaming profile is basically indistinguishable from the source video.
  • Direct-to-wire streams load slower and can have more delay (this delay shouldn't be a thing, but right now I'm not sure why it happens).
  • Direct-to-wire doesn't work with H.265, clips, timeline, overlays, rotation.
  • Direct-to-wire may handle the aspect ratio incorrectly when playing a sub stream, I haven't tested it very extensively.
  • I don't tune my cameras' video streams for internet streaming, so the bit rates on some of them are too high for that.
 
I'm not seeing any delay in video, so far anyway.
 
I think direct-to-wire is a good feature if you have a use case for it, maybe streaming to another LAN device where the BI server has limited CPU/GPU encoding power, but personally I won't be using the feature, for some of the same reasons bp2008 listed.

More than anything, I use the BI local console 99% of the time and only check the odd alert remotely on my phone when out and about, so a low-bit-rate 720p-type stream is more than enough. I'm not sure my phone would handle some of my raw camera bit rates.

 
I keep seeing problems with DeepStack in any version later than 5.5.8.0. For some reason the number of python instances in memory increases roughly eightfold. I'm not the only one seeing this anomaly, and I have contacted BI support about it. My system, running 5.5.7.11, keeps six python instances in memory. With any of the versions that support SenseAI it jumps to 48 instances, which actually runs my system out of memory, with 32 GB of RAM. Additionally, there is no local control of DeepStack in any version later than 5.5.8.0; in other words, there is no start/stop of the DeepStack service, and the console shows no AI is running even though it is still sending snapshots to DeepStack and analyzing them. Needless to say, I'm back on 5.5.7.11 again.
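For what it's worth, the python-instance count is easy to capture without counting rows in Task Manager. Here is a minimal sketch using the psutil package; it assumes DeepStack's workers show up as processes whose names start with "python", which is an assumption rather than something confirmed in this thread.

```python
# Count python processes and their total resident memory, e.g. to compare
# DeepStack's footprint before and after a Blue Iris update.
import psutil  # pip install psutil

procs = [p for p in psutil.process_iter(["name", "memory_info"])
         if (p.info["name"] or "").lower().startswith("python")
         and p.info["memory_info"] is not None]
total_mb = sum(p.info["memory_info"].rss for p in procs) / 1024 ** 2
print(f"{len(procs)} python process(es), {total_mb:.0f} MB resident")
```

Running it before and after an update gives a before/after number that is easy to paste into a support ticket.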

I think Ken jumped the gun on this one. SenseAI may be good once some of the integration bugs are worked out, the documentation is improved, and SenseAI starts supporting offloading to a GPU.
 
SenseAI integration was way too soon/rushed. No custom models or GPU support is kind of a no-no... as those are the basics of a good AI system.


With 5.5.9.2 the delay in direct-to-wire seems fixed :)
 
Yes, direct to wire worked fine. The problems with DeepStack made everything else inconsequential though.
 
I'm holding off on migrating to SenseAI until it becomes more mature, since DeepStack is working well for me. That's odd, what you're seeing with the Python instances. I use DS on 14 cameras and 4 clones. I just checked and only have four instances of Python running in Task Manager.
 
So you're happy all around with 5.5.7.11? Stability, weirdness factors, etc.
I've stuck with 5.5.6.21 because it's been fairly good. I don't care about the AI parts.
 
Yeah, 5.5.7.11 has been pretty stable for me. Nothing odd happening with it that I can see. On the other hand, everything from 5.5.8.0 on has been a disaster for me. I use DeepStack on my clone cameras and have 17 of them using DS. DS eliminates almost every false trigger that BI might see and adds a label for what the trigger is, which is a big plus for me. Previously, eliminating problems with shadows and so on meant fooling with detection zones every change of season; DS eliminates all that entirely.
 

I think the latest update made a tweak where you don't have to always use direct-to-wire (like for your phone) and can apply it ONLY to UI3.
In UI3 it seems you can create a new "Streaming Profile" in which you enable direct-to-wire, so you don't have to keep "prefer direct-to-wire" checked all the time in the profile.
 

BI does have three streaming profiles server-side (Streaming 0, Streaming 1, Streaming 2). Streaming 0 is the default that all the apps and UI3 use as a baseline, but you can choose a different one from within the apps.
 
Green screen on UI3 since 5.5.9.1.
I changed the GPU HWA setting on the web server; same issue with Intel, Nvidia, or CPU.
I changed profiles in the web interface, resolution, bitrate...

[Screenshot: Capture.JPG showing the green screen in UI3]
H.264 & H.265 with HWA.
I don't use direct-to-wire.
Version 5.5.9.0 was working fine.

HWA has been problematic for many people ever since DeepStack was added to BI, and it has been hitting everyone differently on different versions. It may work for you today and then not so well on the next version.

With substreams, decoding is efficient enough that the CPU spent offloading video to the GPU is more than the CPU saved by having the GPU do the decode, so there really is no reason to use HWA anymore...
 
OK, but if I've got a GPU, I want to use it. :)