[tool] [tutorial] Free AI Person Detection for Blue Iris

ACSmith

n3wb
Joined
Jun 2, 2017
Messages
6
Reaction score
0
Given that this is AI, will it become more accurate over time? Also what controls the speed for triggering a camera? By the time the camera is triggered the car or person is halfway across the camera's view.
 

IAmATeaf

Known around here
Joined
Jan 13, 2019
Messages
3,306
Reaction score
3,292
Location
United Kingdom
Given that this is AI, will it become more accurate over time? Also what controls the speed for triggering a camera? By the time the camera is triggered the car or person is halfway across the camera's view.
The trigger speed really depends on the processing time it takes for AI Tool to get the image, pass it to DeepStack, and get a reply back. For me that is around 3.5 seconds, so in theory there would be a minimum of 3.5 seconds before the cam gets triggered. I therefore have my pre-trigger recording time in BI set to 7 seconds, so when the cam is triggered it will also record the previous 7 seconds of video.
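The arithmetic above can be sketched as a tiny helper; the 2x safety factor mirrors the 3.5 s -> 7 s choice in this post and is a rule of thumb, not a BI setting:

```python
def pre_record_seconds(ai_round_trip_s, safety_factor=2.0):
    """Suggested BI pre-trigger record buffer, in seconds.

    ai_round_trip_s: measured time for AI Tool to fetch the image,
    send it to DeepStack, and get a reply (about 3.5 s above).
    The 2x safety factor is an assumption, not a BI requirement.
    """
    return ai_round_trip_s * safety_factor

print(pre_record_seconds(3.5))  # 7.0
```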
 

gbdesai

n3wb
Joined
May 21, 2020
Messages
19
Reaction score
3
Location
California
Thanks for building this product and tutorial. I got it all set up tonight, split between a Windows VM running BI and AI Tool and a Docker instance of the alternate version of DeepStack. I can reach the web server on port 90 from the BI machine or anywhere on my network, but sending the image to the server always results in a "can't reach the server" message in the log. I made sure to turn off firewalls and ensured routing was good between the machines... Can't figure out what it could be.

Will keep trying tomorrow.

UPDATE: SUCCESS! 4 AM, but I got it working. The problem was most likely that I had the environment variable for the Docker container set to VISION-SCENE=True instead of VISION-DETECTION=True.

In case it helps anyone, here is my docker-compose section for deepstack:

Code:
  deepstack:
    container_name: deepstack
    image: deepquestai/deepstack:noavx
    volumes:
      - [yourlocalpathhere]:/datastore
      - /etc/localtime:/etc/localtime:ro
    environment:
      - TZ=America/Los_Angeles
      - VISION-DETECTION=True
    ports:
      - 5000:5000
    restart: "unless-stopped"
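Since the fix above came down to a single environment variable, it can be worth sanity-checking the compose file before debugging the network. A minimal sketch using only the standard library (the flag names are the ones shown in the snippet above):

```python
def detection_enabled(compose_text):
    """Return True if a compose 'environment:' entry sets
    VISION-DETECTION=True. VISION-SCENE=True only enables scene
    recognition, so object detection requests will fail."""
    for line in compose_text.splitlines():
        # Strip list-item dashes and indentation: "  - KEY=VALUE"
        if line.strip().lstrip("-").strip() == "VISION-DETECTION=True":
            return True
    return False

broken = "environment:\n  - TZ=America/Los_Angeles\n  - VISION-SCENE=True\n"
fixed = broken.replace("VISION-SCENE", "VISION-DETECTION")
print(detection_enabled(broken), detection_enabled(fixed))  # False True
```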
 

pmcross

Pulling my weight
Joined
Jan 16, 2017
Messages
371
Reaction score
185
Location
Pennsylvania
The trigger speed really depends on the processing time it takes for AI Tool to get the image, pass it to DeepStack, and get a reply back. For me that is around 3.5 seconds, so in theory there would be a minimum of 3.5 seconds before the cam gets triggered. I therefore have my pre-trigger recording time in BI set to 7 seconds, so when the cam is triggered it will also record the previous 7 seconds of video.
@IAmATeaf Just wondering if you've managed to move all of your cameras to the new Docker/DeepStack instance running on Windows, and how your CPU utilization has been? The reason I ask is that I am considering running DeepStack in Docker on my BI machine, but I am running Server 2012 R2, which isn't supported by Docker for Windows. If the CPU savings are substantial I will reload my system with Server 2016, but before doing that I wanted to follow up on your results. I have 15 cameras, and when running DeepStack directly on Windows my CPU was being maxed out and alerts were being missed because of it. I know that you had success after switching from DeepStack on Windows to DeepStack in Docker on Windows. Just wondering how things are going for you, how many cameras you have, and what your CPU savings is?
 

IAmATeaf

Known around here
Joined
Jan 13, 2019
Messages
3,306
Reaction score
3,292
Location
United Kingdom
@pmcross Yes, I have moved/configured all of my cams to make use of AI Tool. Initially I did have DQ for Windows installed, but that was maxing out my CPU and pegging it at 100%.

Since then I’ve moved over to Docker Desktop and have DQ running within it.

This makes the CPU spike less, but when multiple cams trigger I can see spikes of around 84%; the system will then quite quickly settle back down to around 25-30%.

When I was using substreams my system idled at around 8%, but after cloning the cams I found that the images BI was saving were from the substream, and this caused DQ to miss some alerts. I therefore had to remove substreams from the AI-cloned cams, which results in BI pulling individual streams, so the overall CPU usage goes up. I’m hoping that a future update will fix this and allow images to be saved at the main-stream resolution while still allowing motion to be detected on the substream for the AI-cloned cams. If and when that comes it should help bring the CPU usage down further. This has been reported to Ken, and I urge others to also report it and ask whether an option could be added to choose which stream is used for the image.

Apart from the above, I think I’ve already stated that I have had to enable motion detection on my main cams as some events were being missed, so that might also be adding to the overall CPU usage.
 

pmcross

Pulling my weight
Joined
Jan 16, 2017
Messages
371
Reaction score
185
Location
Pennsylvania
@pmcross Yes, I have moved/configured all of my cams to make use of AI Tool. Initially I did have DQ for Windows installed, but that was maxing out my CPU and pegging it at 100%.

Since then I’ve moved over to Docker Desktop and have DQ running within it.

This makes the CPU spike less, but when multiple cams trigger I can see spikes of around 84%; the system will then quite quickly settle back down to around 25-30%.

When I was using substreams my system idled at around 8%, but after cloning the cams I found that the images BI was saving were from the substream, and this caused DQ to miss some alerts. I therefore had to remove substreams from the AI-cloned cams, which results in BI pulling individual streams, so the overall CPU usage goes up. I’m hoping that a future update will fix this and allow images to be saved at the main-stream resolution while still allowing motion to be detected on the substream for the AI-cloned cams. If and when that comes it should help bring the CPU usage down further. This has been reported to Ken, and I urge others to also report it and ask whether an option could be added to choose which stream is used for the image.

Apart from the above, I think I’ve already stated that I have had to enable motion detection on my main cams as some events were being missed, so that might also be adding to the overall CPU usage.
Thanks for this. Would you mind sharing how many cameras you are running, their MP, and the make/model of your CPU? I am just trying to compare your setup to mine before deciding to reload the OS on my BI machine (which will be a weekend project for sure).
 
Joined
Jun 1, 2018
Messages
2
Reaction score
0
Location
New Zealand
So I started writing this new feature up yesterday using Blue Iris detection rectangles, and I'm stuffed if I know how DeepStack works: I can see a rectangle on the screen, but getting the program to recognize it as a rectangle is another story. Anyway, the plan is to get the logic working, optimize it, and produce a small side application, then work on getting it into AI Tool if it resolves the issue. I don't think this will be quick as I stopped programming about 10 years ago :)

I'm in the same boat as IAmATeaf, in that if I use a mask I won't trigger on the action I'm looking for.
 

IAmATeaf

Known around here
Joined
Jan 13, 2019
Messages
3,306
Reaction score
3,292
Location
United Kingdom
My BI computer is an i5-6500 with 12 GB of RAM. I use an SSD for the boot/OS drive.

I currently have six 2MP cams, all with substreams enabled. Three of them are cloned so that I can do targeted motion detection, and all six are also cloned without substreams for use with AI Tool.
 

gbdesai

n3wb
Joined
May 21, 2020
Messages
19
Reaction score
3
Location
California
I think I’ve finally gotten AI Tool to process and trigger consistently and within a reasonable time. However I’m still having issues with MQTT.

So my MQTT is on a different server (Synology NAS) than Blue Iris and AI Tool (Windows Server). My MQTT topics are received by NodeRed and Home Assistant.

  • MQTT test button from setup works
  • AI Tool triggers the cameras to record on motion
  • MQTT test button from individual camera works
  • An AI Tool trigger from motion DOES NOT trigger the MQTT payload, or push notifications for that matter
I don’t know what I’m missing or where the disconnect is coming from. Anyone have any ideas?

UPDATE: I set up MQTT for some other cameras I have which are not running through AI Tool. The MQTT triggered as it should. So since it is not a connection issue between Blue Iris and MQTT, it must be related to AI Tool. I am currently using AI Tool 1.64 because 1.65 was causing errors writing to history.csv.

@GentlePumpkin do you have any ideas why AI Tool would trigger a recording in BI but somehow prevent alerts using MQTT or push notification?
Doh! I have been struggling with this for the past 4 hours! Wish I'd found out sooner that I wasn't the only one with the issue. I'm doing the same thing: DeepStack informs BI via URL on motion, and for the camera that it informs I set up Alert on Trigger, but none of the conditions (motion, external, DIO, etc.) fire the actions I put in on alert. Very frustrating. Seems like there is no way to trigger "On Alert" based on an external URL trigger. When I used HomeSeer in the past I recall there was a way to pass things like MOTION_A; maybe I'll modify the URL in AI Tool to send whatever that URL addition was. Will report back.

UPDATE: Nevermind, that was what BI sent Homeseer, not the other way around.

UPDATE2: Nevermind, I'm an idiot, was adding the trigger clauses on the main camera not the cloned one which was getting the trigger.
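The gotcha in UPDATE2 is easy to hit because the AI Tool trigger URL names a specific camera, so the On Alert actions have to live on that (cloned) camera. A sketch of how such a trigger URL is put together; the host, port, credentials, and short name are placeholders, and the `/admin?trigger&camera=...` layout follows the form used in this tutorial:

```python
from urllib.parse import urlencode

def bi_trigger_url(host, camera_short_name, user, password, port=81):
    """Build a Blue Iris admin trigger URL. Whatever short name goes
    in 'camera' is the camera that fires, so On Alert actions set on
    a different camera (e.g. the main, un-cloned one) will never run."""
    params = urlencode({"camera": camera_short_name, "user": user, "pw": password})
    return f"http://{host}:{port}/admin?trigger&{params}"

print(bi_trigger_url("192.168.1.10", "FDClone", "aiuser", "secret"))
# http://192.168.1.10:81/admin?trigger&camera=FDClone&user=aiuser&pw=secret
```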
 

pbc

Getting comfortable
Joined
Jul 11, 2014
Messages
1,024
Reaction score
156
I just thought this was funny... 1) that it recognized it as a bear, and 2) the "irrelevant - just a bear" part.

1594209498485.png

I also started using the mask feature on my driveway since I kept getting a "person" notification for an electrical box across the road and car notifications every time something drove by. Works well so far.

Strangely I have those potted plants in the above photo masked but it still picked them up. Hmm..

1594209658821.png
 


shannondalebreaux

Getting the hang of it
Joined
Jun 6, 2018
Messages
82
Reaction score
29
Location
louisana
I just thought this was funny... 1) that it recognized it as a bear, and 2) the "irrelevant - just a bear" part.

View attachment 65870

I also started using the mask feature on my driveway since I kept getting a "person" notification for an electrical box across the road and car notifications every time something drove by. Works well so far.

Strangely I have those potted plants in the above photo masked but it still picked them up. Hmm..

View attachment 65872
What kind of camera are you using for your driveway?
 

TechNight

n3wb
Joined
May 6, 2020
Messages
12
Reaction score
0
Location
Germany
Today I got the following error on AI Tool startup. AI Tool shows only one camera and never tried to access DeepStack. (Translated from German:)

Information on invoking JIT debugging instead of this dialog box can be found at the end of this message.

Exception text
System.InvalidOperationException: Invoke or BeginInvoke cannot be called on a control until the window handle has been created.
at System.Windows.Forms.Control.MarshaledInvoke(Control caller, Delegate method, Object[] args, Boolean synchronous)
at System.Windows.Forms.Control.Invoke(Delegate method, Object[] args)
at WindowsFormsApp2.Shell.IncrementErrorCounter()
at WindowsFormsApp2.Shell.<Log>d__22.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.AsyncMethodBuilderCore.<>c.<ThrowAsync>b__6_0(Object state)

Edit:
Found a fix: I replaced the large stats counters in the camera configs with 0.
Example:
STATS: alerts,irrelevant alerts,false alerts: "7887, 20366, 124898" > STATS: alerts,irrelevant alerts,false alerts: "0, 0, 0"
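The counter reset above can also be scripted rather than done by hand. A sketch that zeroes the STATS line in a config file's text (the line format is copied from the example; treating anything made of digits, commas, and spaces inside the quotes as the counters is an assumption):

```python
import re

def reset_stats(config_text):
    """Zero the counters in lines of the form
    STATS: alerts,irrelevant alerts,false alerts: "7887, 20366, 124898"
    leaving the rest of the config text untouched."""
    pattern = r'(STATS: alerts,irrelevant alerts,false alerts: )"[\d, ]*"'
    return re.sub(pattern, r'\1"0, 0, 0"', config_text)

line = 'STATS: alerts,irrelevant alerts,false alerts: "7887, 20366, 124898"'
print(reset_stats(line))
# STATS: alerts,irrelevant alerts,false alerts: "0, 0, 0"
```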
 

pbc

Getting comfortable
Joined
Jul 11, 2014
Messages
1,024
Reaction score
156
How well does the LPR work?
Oh, the LPR is a Dahua Z12E, works very well from ~130 feet or so which is where I have it zoomed. Have a thread on it in the LPR section (though I'm having challenges with OpenALPR right now).
 

shannondalebreaux

Getting the hang of it
Joined
Jun 6, 2018
Messages
82
Reaction score
29
Location
louisana
Oh, the LPR is a Dahua Z12E, works very well from ~130 feet or so which is where I have it zoomed. Have a thread on it in the LPR section (though I'm having challenges with OpenALPR right now).
Nice. I'm kind of starting from scratch; I've been testing Blue Iris on my Windows Surface before I spend on building my system. My dad has some Mobotix cameras, very expensive cameras. I'm kinda not impressed with their activity sensor: it's supposed to eliminate false alarms by 90%, but shadows the sun casts on the ground trigger it every time. So I'm back on the market trying to build a system with Blue Iris, and this AI Tool seems legit and more cost effective. How are your false alarms with it?
 

pbc

Getting comfortable
Joined
Jul 11, 2014
Messages
1,024
Reaction score
156
Still dialing it in, but I'm getting the false alarms down to almost nil. It's a great tool. The only thing I really liked about Sentry (the paid plan that integrates with BI) is that it would mark the timeline with their "S" symbol, which was really nice.

I find that with this tool, while it gets the alert right, the snapshots and/or video can be off a bit. E.g., I get a notification that a trigger was flipped, but the 3 snaps in the email to me are seconds too late, so they don't show anything. Opening the 10-second video then shows me what caused it.

It would be easier if the snapshots showed what caused the trigger. Haven't figured out yet why this is happening, as it only happens on my front door, not my driveway, even though they are set up identically.
 

pbc

Getting comfortable
Joined
Jul 11, 2014
Messages
1,024
Reaction score
156
Huge learning curve, but it's worth it and the forum members here have been great as long as you're willing to do some legwork yourself!
 