Hell Yeah! Direct Deepstack Integration - 5.4.0 - March 31, 2021

Anyone know what "Alert Cancelled (occupied)" means? I was happy to see "Alert Cancelled (potted plant)", but there are no potted plants in sight anywhere in or near our property.
 
Anyone know what "Alert Cancelled (occupied)" means?
From the help file:
Occupied state
The software will also attempt to cancel alerts if detected objects are determined to be in an occupied state. That is, they have not moved since the scene was last analyzed.
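
To make that help-file wording concrete, here is a minimal sketch of what a "has not moved since the last analysis" check could look like. The IoU comparison and the 0.9 threshold are assumptions for illustration, not Blue Iris internals:

```python
# Illustrative sketch only: treat a detection as "occupied" (static) if its
# bounding box overlaps a detection from the previous analysis almost exactly.
# The (x1, y1, x2, y2) box format and the threshold are assumptions.

def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1]) +
             (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union else 0.0

def is_occupied(box, previous_boxes, threshold=0.9):
    """True if the object barely moved since the scene was last analyzed."""
    return any(iou(box, prev) >= threshold for prev in previous_boxes)

# Example: a parked car detected in nearly the same spot twice gets cancelled.
print(is_occupied((100, 200, 300, 400), [(101, 199, 302, 401)]))  # True
```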
 
Installed this release (5.4.0.1); I was using vorlonCD + AITool prior to this.

Had to reinstall the CPU version of DeepStack for it to work.

Some cameras worked OK at the first go but some didn't, so I had to delete and re-add those cameras. Check the log to see which cameras need work.

Seems easy enough to get it working nicely. Thanks.
I noticed the same issue - one of my cams does not process the images while the others do. At first I thought this was related to ANPR being enabled previously, so I disabled/enabled/disabled ANPR AI, but it still does not work. Did anyone find a way to avoid having to reconfigure all camera settings to get DeepStack to work?
 
@fenderman That's interesting, too, because the clip involved had a pick-up pulling a trailer with a utility tractor on it. The view was broadside and he wasn't going fast at all.

I've upped the number of "picture" scans to three on all three of the cameras watching street traffic. It has raised the confidence level by 50% most of the time, at least during daylight.
 
I don't know DeepStack AI that well, but does it do any learning behind the scenes? As in training, where the more we use it the better it gets, that kind of thing?
 
I noticed the same issue - one of my cams does not process the images while the others do.
It somehow started to work automatically after a scheduled PTZ zoom event was triggered on the specific cam that did not work previously. Luckily, I did not have to reconfigure anything else.
 
@fenderman What help file was that in? I can be pretty dense at times.
 
Thanks. That's funny. I thought that's what you meant, but I did a search using "occupied" and it showed nothing. I'll fool around with it again later.
 
It is worth pointing out in this thread an issue that some folks will experience now that DeepStack is integrated with Blue Iris. People who have tried to use certain types of cameras with the other 3rd-party apps are already aware of this, or will become aware of it, LOL:

Here is why Blue Iris and Reolinks (and many other cheapo cameras) do not work well together and are a disaster for something like DeepStack.

This was a screenshot from a member here who had set these cameras to 15 FPS within the cameras (and I suspect they were missing motion that they did not know they were missing...):

[Screenshot: the member's Blue Iris camera status page showing substream FPS and key (iframe) rates]


Now look at the key value - that is the iframes. Blue Iris works best when the FPS and the iframe rate match. The key value is a ratio, so it should be 1.00 if the iframe rate matches the FPS. The iframes not matching (which you cannot fix or change on a Reolink) is why they miss motion in Blue Iris and why people have problems. This is mainly why people are having issues with these cameras, and there are many threads showing the problems people have with this manufacturer and Blue Iris. It is these same games that make the camera look great as a still image or video but turn to crap once motion is introduced.

The Blue Iris developer has indicated that for best reliability the substream frame rate should equal the main stream frame rate; these cameras cannot do that, and there is nothing you can do about it. The iframe rate (something these cameras do not allow you to set) should equal the FPS, and at worst be no more than double it. This example shows the cameras going down to a key rate of 0.25, which means the iframe interval is over 4 times the FPS, and that is why motion detection is a disaster with these cameras and Blue Iris. A value of 0.5 or less is considered insufficient for reliable motion triggers; try to use the DeepStack integration and it will be useless in a lot of situations.

Compounding the matter: motion detection is based on the substream, and look at the substream FPS - it dropped below 6 FPS (even though it was set to 15 FPS in the camera) with an iframe/key rate of 0.25. You will miss motion most of the time with that issue.

Blue Iris is great and works with probably more camera brands than most VMS programs, but there are brands that don't work well or at all - Ring, Arlo, Nest, and some Zmodo cams use proprietary systems and cannot be used with Blue Iris, and for a lot of people Reolink doesn't work well either.

Now compare the above to mine, with cameras that follow industry standards, actually let you set parameters, and don't manipulate them. You will see that my FPS matches what I set in the camera, and the 1.00 key means the iframe rate matches the FPS:

[Screenshot: the poster's Blue Iris camera status page showing FPS matching the camera settings and a key rate of 1.00]


The same can be said for a variety of cameras that do not let you set certain parameters or that manipulate them. If you are trying this new DeepStack integration and AI is being missed, check the key rate in this screen and see if your camera is at 1.00; if it isn't, that is a leading indicator of why the AI probably isn't working.
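
If you want to double-check a camera outside of Blue Iris, here is a hypothetical sketch that estimates keyframes per second with ffprobe, assuming the key column shown in BI is keyframes per second. ffprobe must be on your PATH, and the RTSP URL is a placeholder for your camera's stream:

```python
# Hypothetical helper: estimate a stream's keyframe rate with ffprobe.
import subprocess

def key_rate(url, seconds=10):
    """Return keyframes per second over the first `seconds` of the stream."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "frame=key_frame", "-of", "csv=p=0",
         "-read_intervals", f"%+{seconds}", url],
        capture_output=True, text=True, check=True).stdout
    # ffprobe prints one line per frame: "1" for a keyframe, "0" otherwise.
    keyframes = sum(1 for line in out.splitlines() if line.strip() == "1")
    return keyframes / seconds

# A camera showing key 1.00 in Blue Iris should come back around 1.0 here;
# a stream with a 4-second iframe interval would come back around 0.25.
print(key_rate("rtsp://user:pass@192.168.1.108:554/stream"))
```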
 
When I try to start DeepStack from Blue Iris it just keeps saying "Could not start, check path". The path is a network share. I've tried the server-side path and the client-side path. I also tried the default path just for kicks. I'm running DeepStack on Linux in a Docker container and BI in a VM on the same machine.

Is this because I'm not running DeepStack on Windows?
 
Here is why Blue Iris and Reolinks (and many other cheapo cameras) do not work well together and are a disaster for something like DeepStack. ... If you are trying this new DeepStack integration and AI is being missed, check the key rate in this screen and see if your camera is at 1.00.

I agree about the Reolinks. I have 4 RLC-410Ws (2 were free, long story), and I also recently got an Amcrest with adjustable keyframe intervals. The best you can do with the Reolinks is NOT use substreams, because those are far worse with the 0.25 ratio compared to the 0.5 of the main stream. I compensate by having a longer pre-trigger video buffer (like 5-6 seconds) so I don't miss anything.
 
I compensate by having a longer pre-trigger video buffer (like 5-6 seconds) so I don't miss anything.

The problem with that, though, is that some vantage points may completely miss the DeepStack analysis. I have a cheapo overview camera that I tried this on, and with a pre-buffer of 5 seconds I captured the motion but not the AI - the 5 images taken 1 second apart missed it. Maybe I can dial in the motion settings more, but it was a quick experiment with a 0.5 key.
 
Let's not turn this into another Reolink thread. Heaps of them already around. Sorry I brought it up....
We all know Reolink are not the best cameras :).

I was just reading that DeepStack is oriented toward daylight.
This is from the DeepStack FAQ:

DeepStack misses objects a lot in night/dark images?
The detection API is tailored towards detection objects in images with day light or some lighting contained. If you want to detect objects or persons in night/dark images, you can create a new DeepStack API for this by visiting the Custom Models page in this documentation.


Is anyone pretty good with APIs who might be able to create a custom model that works at night?
DeepStack is pretty powerful with custom models, but I am too green to work out how to use it.
Not sure if BI can use custom models in its current integration. Still early days :)
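
For whoever picks this up: the DeepStack docs describe custom models being served under a /v1/vision/custom/<model-name> endpoint once the trained model is deployed. A minimal sketch of calling one from Python; the model name "dark", the host/port, and the image file are all placeholders:

```python
# Hypothetical call to a DeepStack custom model endpoint.
import requests

with open("night_capture.jpg", "rb") as image:
    response = requests.post(
        "http://192.168.1.20:5000/v1/vision/custom/dark",
        files={"image": image})

# DeepStack returns a JSON body with a "predictions" list of
# label / confidence / bounding-box entries.
for detection in response.json().get("predictions", []):
    print(detection["label"], detection["confidence"])
```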
 
I was just reading that DeepStack is oriented toward daylight. This is from the DeepStack FAQ. ... Is anyone pretty good with APIs who might be able to create a custom model that works at night?
That must be a very old FAQ. The developer has a post here where he shows it's clearly not the case.
Also note there are posts where folks claim differences in accuracy between the CPU and GPU versions, with the CPU being more accurate. Don't know if that has been resolved yet.
 
That must be a very old FAQ. The developer has a post here where he shows it's clearly not the case.
Also note there are posts where folks claim differences in accuracy between the CPU and GPU versions, with the CPU being more accurate.
Doesn't surprise me. Some download pages on the DeepStack site link to an ancient Windows version and some to the current one.....
Glad to see this is not an issue now.
I don't use an NVIDIA GPU, so I only use the CPU version of DeepStack, and I have found it is not too bad at night. The main issue I have is that AITool will actually trigger on people and cars when the BI integration misses them, with the exact same DeepStack settings, so I don't think it is a DeepStack issue. It doesn't use much CPU overall, so I would rather the GPU be left for BI HA and use the CPU version of DeepStack anyway :)
This is the first release of DeepStack AI in BI, so I expect it to get better as time goes on and a few updates come through :)
 
Yes, here's what DeepStack can identify:

person, bicycle, car, motorcycle, airplane, bus, train, truck, boat, traffic light, fire hydrant, stop sign, parking meter, bench, bird, cat, dog, horse, sheep, cow, elephant, bear, zebra, giraffe, backpack, umbrella, handbag, tie, suitcase, frisbee, skis, snowboard, sports ball, kite, baseball bat, baseball glove, skateboard, surfboard, tennis racket, bottle, wine glass, cup, fork, knife, spoon, bowl, banana, apple, sandwich, orange, broccoli, carrot, hot dog, pizza, donut, cake, chair, couch, potted plant, bed, dining table, toilet, tv, laptop, mouse, remote, keyboard, cell phone, microwave, oven, toaster, sink, refrigerator, book, clock, vase, scissors, teddy bear, hair dryer, toothbrush

I have to wonder if maybe some of those items should be removed and replaced with more useful ones, or save the CPU cycles of looking for objects that aren't going to be there.

Unless you live in a zoo, it's highly unlikely you're going to be getting an alert about someone stealing your zebra!

I'd also say my toothbrush and toilet are very unlikely candidates for removal, and if they want to take my mice, they're welcome to them!

One other message for Ken: I'm concerned about the reports elsewhere on the forum that BI is actually quite unstable and prone to crashing. It would be really good to write some code to log every event verbatim and then send the log as a crash report when BI crashes, to help identify the unstable events / areas / code. A friend of mine wrote similar self-diagnosing code for the company he works for, and it was highly successful in diagnosing and removing bugs, although they also employed a large number of community testers who fed back manual information on the events that led to a crash. It's slightly different here in that their users would have been actively using the app when it crashed, whereas BI, being CCTV, is often running on its own when a crash occurs, making the events that led up to it less obvious.
 
I have to wonder if maybe some of those items should be removed and replaced with more useful ones... I'm concerned about the reports elsewhere on the forum that BI is actually quite unstable and prone to crashing.
You can remove items from the BI list to restrict what you detect.
You have to remember DeepStack is not dedicated to Blue Iris. It is now an open-source AI that is used in many applications, not just CCTV. It really doesn't use many CPU cycles doing detection. There is even a version that runs on a Raspberry Pi (locked to Medium speed mode) and works as reliably as the PC-based version. Also note you can run DeepStack on a different machine (or a Raspberry Pi) and enter the IP address and port to point BI to it, so it doesn't affect your BI machine at all.
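
If you are calling DeepStack directly rather than through BI, the same filtering idea is easy to sketch against the documented /v1/vision/detection endpoint. The whitelist, host/port, and image path here are just placeholders:

```python
# Hypothetical post-filter: keep only the labels you care about.
import requests

WANTED = {"person", "car", "truck", "dog"}

with open("snapshot.jpg", "rb") as image:
    predictions = requests.post(
        "http://192.168.1.20:5000/v1/vision/detection",
        files={"image": image}).json().get("predictions", [])

# Ignore everything DeepStack found that is not on the whitelist.
for p in predictions:
    if p["label"] in WANTED:
        print(p["label"], round(p["confidence"], 2))
```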

A note on BI stability: I am using the latest version, not even the version labelled as "stable", and I have NEVER had Blue Iris crash in over 4 years of using BI, and I always update to the absolute latest version. If BI is crashing, I would suggest checking your Windows setup and not blaming BI. There are many COMMERCIAL installs of Blue Iris running 24/7 that also never crash. BI has quite a good logging system already built in; I am pretty sure you can find the cause of virtually all crashes in the existing logging system and the Windows Event Logs.
That said, I have never had to do that. Maybe the people with Blue Iris crashing are using the PC for more than just Blue Iris?
For something as critical as CCTV, I only run one other application on the BI PC, which is my home automation system. As both are critical services, I don't do anything else with the machine.

Edit: As a note, if you find bugs in Blue Iris, Ken is VERY responsive and usually will fix them in the next version he releases. I recently had an issue with an MQTT message not sending correctly; he had it fixed within a week. He is always fixing bugs that are found and releases new versions very regularly.
 
When I try to start DeepStack from Blue Iris it just keeps saying "Could not start, check path". ... Is this because I'm not running DeepStack on Windows?
Don't use the Start DeepStack option in BI; it is really only designed for a local copy of DeepStack. If you have DeepStack running on a remote machine, just have Docker (or whatever) start it on boot, and point BI to the IP address and port. The start option only exists to start the DeepStack server if it is not already running.
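
Before pointing BI at the remote instance, a quick reachability check can rule out networking issues. This is just a sketch; the host and port are whatever your Docker container exposes:

```python
# Hypothetical reachability check for a remote DeepStack server.
import socket

def deepstack_reachable(host="192.168.1.20", port=5000, timeout=3):
    """True if something is listening on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(deepstack_reachable())  # expect True if the container is up
```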
 