[tool] [tutorial] Free AI Person Detection for Blue Iris

mayop

n3wb
Joined
Jul 20, 2020
Messages
26
Reaction score
21
Location
Canada
I'd rather not have to put the Ubiquiti cameras in stand-alone mode, but I've heard of others using Ubiquiti cameras in Ubiquiti Protect mode while also using BI. So if anybody has a similar configuration (Protect mode, cameras NOT in stand-alone mode), are you seeing the same sort of problem I am?

Thanks in advance for any help.
The problem is Unifi Protect. It causes the keyframe rate to be too low. Ideally it should be 1.00 as shown below:



There is a post on the BI forum with possible fixes for the cameras themselves.
As for Protect, there is a variable that causes the issue, but it's not something you can edit through the UI.
However, you can change it by modifying the Protect system files, switching the default from 5 to 1, which will fix the issue if you are comfortable editing JavaScript and using SSH.

I use a cloudkey gen 2 plus with protect 1.17.0 beta 6 currently.

WARNING - THE FOLLOWING MAY CAUSE ISSUES OR BREAK YOUR UNIFI PROTECT INSTALL

Using SSH, go to /usr/share/unifi-protect/app (this is for the Gen2+ with 2.0 firmware; it may differ for the Dream Machine and UNVR). Look for the file called service.js and save a copy of it in case you break something.

Open service.js (It's minified so it will be harder to read) and search for:

JavaScript:
;a.DEFAULTS=[{idrInterval:5,minClientAdaptiveBitRate:0},{idrInterval:5,minClientAdaptiveBitRate:15e4},{idrInterval:5,minClientAdaptiveBitRate:0}]
Then change the three instances of idrInterval:5 to idrInterval:1, as shown below.

JavaScript:
;a.DEFAULTS=[{idrInterval:1,minClientAdaptiveBitRate:0},{idrInterval:1,minClientAdaptiveBitRate:15e4},{idrInterval:1,minClientAdaptiveBitRate:0}]
Save the file, then use SSH to restart Protect:

Code:
systemctl restart unifi-protect
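The backup-edit-restart sequence above can also be done with a single sed substitution instead of hand-editing the minified file. This is a sketch, demonstrated on a scratch copy first so nothing is touched until you're confident; the real path comes from this post (CloudKey Gen2+ on 2.0 firmware) and should be verified on your own install.

```shell
# Make a scratch copy containing the stock DEFAULTS line to test the substitution.
printf '%s\n' ';a.DEFAULTS=[{idrInterval:5,minClientAdaptiveBitRate:0},{idrInterval:5,minClientAdaptiveBitRate:15e4},{idrInterval:5,minClientAdaptiveBitRate:0}]' > service.js.demo

# The same one-liner works on the real file over SSH -- back it up first:
#   cp /usr/share/unifi-protect/app/service.js /usr/share/unifi-protect/app/service.js.bak
#   sed -i 's/idrInterval:5/idrInterval:1/g' /usr/share/unifi-protect/app/service.js
#   systemctl restart unifi-protect
sed -i 's/idrInterval:5/idrInterval:1/g' service.js.demo

cat service.js.demo   # all three idrInterval values should now read 1
```

Remember that, as noted above, an update to Protect will overwrite the file, so the sed line has to be re-run after every upgrade.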
I only have two cameras in protect so I don't know what would happen with many cameras.

Since doing this, my keyframe rate is 1.00 and BI works the same as if my G3 cameras were in stand-alone mode. Note that you will have to edit the file every time you update your Protect install and/or controller (in my case).

I made a post about it on ui.com but no one replied.
 

JL-F1

n3wb
Joined
Jun 12, 2020
Messages
14
Reaction score
2
Location
USA
Running the beta GPU DeepStack.

Works nicely: about 90 ms per image.

But now when I look at the history in AI Tool, the trigger happens 3-7 seconds after the snapshot photo's timestamp. It used to be about 1 second every time.

Anyone else notice this? Anything I can check to decrease that time?

The snapshots go on an SSD, so the drive isn't slow.

Everything else is exactly the same; only the GPU DeepStack beta changed.
 

AskNoOne

n3wb
Joined
Dec 20, 2020
Messages
7
Reaction score
5
Location
UK
I'm trying to get DS running on a Jetson Nano and I'm having a bit of trouble. Here's what I've done so far. I installed the latest Jetpack on an SD card and then set the nano up using headless mode with a USB cable. I assigned a static IP address and then accessed the Nano with Putty. I then ran the following commands.

sudo apt-get update
sudo apt-get upgrade

sudo docker pull deepquestai/deepstack:jetpack-2020.12

sudo docker run --runtime nvidia --restart=unless-stopped -e VISION-DETECTION=True -p 80:5000 deepquestai/deepstack:jetpack-2020.12

sudo docker volume create portainer_data

sudo docker run -d -p 8000:8000 -p 9000:9000 --name=portainer --restart=always -v /var/run/docker.sock:/var/run/docker.sock -v portainer_data:/data portainer/portainer-ce

sudo systemctl enable docker.service

sudo reboot now

After the reboot both containers started and I could access Portainer as well as the DS web interface on port 80, but AI Tool cannot communicate with DS.

12/19/2020 6:35:46 PM DetectObjects Unable to connect to the remote server [WebException] Mod: <DetectObjects>d__31 Line:999:48 Error AITOOLS.EXE 10.1.31.30:5000 GarageMotion GarageMotion.20201219_183544288.jpg 60 1 9 False aitool.[2020-12-19].log

The DS URL in AI Tool is set to 10.1.31.30:5000 which is the IP address of the Nano and I believe the port number is correct based on the docker run line.

This is my first time working with a Nano and my first experience with Docker so I'm not familiar with troubleshooting techniques on these platforms. From what I see in Portainer and the output of a few Docker commands everything looks fine to me, but then I don't have an experienced eye.

Now, to complicate things: I have installed Jetpack on the Nano twice, once using the GUI and once headless. I was able to get everything working with the GUI install, but I wanted to try out the headless install, so I wiped the card and started over, and now I can't get DS to work again.

Any help will be appreciated.
As far as I can tell from your explanation, you have done everything correctly with the exception of the URL in AITool. You need to talk to port 80, not 5000. 5000 is the internal Docker container port, which is mapped to host port 80.

I tend to define a non-standard port rather than 80, just to avoid potential conflicts with web servers you may have or plan to set up. (But 80 should be fine too.)

Hope this solves your issue!
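To make the mapping explicit: with Docker's -p HOST:CONTAINER syntax, DeepStack always listens on 5000 inside the container, and the host-side number is yours to choose. A sketch using 5050 as an arbitrary host port (image tag and flags copied from the post above; adjust to your setup):

```shell
# DeepStack listens on 5000 inside the container; expose it on host port 5050.
sudo docker run --runtime nvidia --restart=unless-stopped \
  -e VISION-DETECTION=True \
  -p 5050:5000 \
  deepquestai/deepstack:jetpack-2020.12

# Verify the mapping -- the PORTS column should show 0.0.0.0:5050->5000/tcp.
sudo docker ps

# AI Tool's DeepStack URL then points at the HOST port, e.g. http://<nano-ip>:5050
```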
 

balucanb

Getting the hang of it
Joined
Sep 19, 2020
Messages
139
Reaction score
23
Location
TX
In reference to the custom models, I need some help. I'm trying to do the following:
1 - Clone the DeepStack trainer: git clone git@github.com:johnolafenwa/deepstack-trainer.git
2 - CD to the repo root: cd deepstack-trainer
3 - Put the images you want to test in a folder
4 - From the repo root, run: python detect.py --weights "C:/path-to-your-model.pt" --source "C:/path-to-your-test-images-folder"

#1-3 No problem.
When I run the command I get this:

PS C:\Users\user\Documents\GitHub\deepstack-trainer> python detect.py --weights "C:\Users\user\Documents\my-models" --source " \testimages"
** On entry to DGEBAL parameter number 3 had an illegal value
** On entry to DGEHRD parameter number 2 had an illegal value
** On entry to DORGHR DORGQR parameter number 2 had an illegal value
** On entry to DHSEQR parameter number 4 had an illegal value
Traceback (most recent call last):
File "C:\Users\user\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\numpy\__init__.py", line 305, in <module>
_win_os_check()
File "C:\Users\user\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\numpy\__init__.py", line 302, in _win_os_check
raise RuntimeError(msg.format(__file__)) from None
RuntimeError: The current Numpy installation ('C:\\Users\\user\\AppData\\Local\\Packages\\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\\LocalCache\\local-packages\\Python39\\site-packages\\numpy\\__init__.py') fails to pass a sanity check due to a bug in the windows runtime. See this issue for more information: Traceback (most recent call last):
File "C:\Users\user\Documents\GitHub\deepstack-trainer\detect.py", line 5, in <module>
import cv2
File "C:\Users\user\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\cv2\__init__.py", line 5, in <module>
from .cv2 import *
ImportError: numpy.core.multiarray failed to import

Any help is appreciated- Please Barney style the answer...
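The RuntimeError in the traceback above is NumPy's well-known sanity-check failure on Windows builds 2004+ (caused by an fmod bug in the Windows runtime); the usual workaround at the time was pinning NumPy to 1.19.3. Separately, per step 4 of the instructions, --weights should point at the .pt model file itself, not at the folder containing it. A hedged sketch (the paths below are examples, not the poster's actual files):

```shell
# Pin NumPy to the last version unaffected by the Windows 2004 fmod bug.
pip install numpy==1.19.3

# Then point --weights at the model file, not its folder (example paths):
python detect.py --weights "C:/Users/user/Documents/my-models/model.pt" --source "C:/testimages"
```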
 

robpur

Young grasshopper
Joined
Jul 31, 2014
Messages
47
Reaction score
16
Location
Washington State
As far as I can tell from your explanation, you have done everything correctly with the exception of the URL in AITool. You need to talk to port 80, not 5000. 5000 is the internal Docker container port, which is mapped to host port 80.
Well Duh, that was it. I've been running the native Windows version of DS with the web server running on port 80 and pic submission port on 5050. I didn't understand how the docker version works and assumed that it used two different ports like the Windows version. After setting -p 5050:5000 it works as desired.

I'm currently running MODE=High, using full-size images of 1280x720, 1920x1080 and 2048x1536, and the average DS time is around half a second. That is fine for me, since I'm running DS on three different machines on the network and my current camera setup cannot saturate the collective DS instances. But I saw a post on the DeepStack forum from sickidolderivative saying that he processes images on the Nano in around 130 ms, submitting images between 300x300 and 500x500. I wonder if accuracy suffers with such small images. I prefer accuracy over speed, but it's probably not necessary to submit full-resolution images. I'll have to hunt for the sweet spot.
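One quick way to hunt for that sweet spot is to time a single detection per image size from the command line, using DeepStack's standard /v1/vision/detection endpoint. The IP, port, and snapshot.jpg below are placeholders (5050 matches the -p 5050:5000 mapping mentioned above):

```shell
# POST one image to DeepStack and time the round trip.
time curl -s -X POST \
  -F "image=@snapshot.jpg" \
  http://192.168.1.50:5050/v1/vision/detection
# The JSON response lists each detected object with its label and confidence,
# so you can compare both speed and accuracy across image sizes.
```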

Thanks for helping me get up and running with the Nano!
 

balucanb

Getting the hang of it
Joined
Sep 19, 2020
Messages
139
Reaction score
23
Location
TX
For those of you using PTZ cameras in your setup with BI / AI Tool / DeepStack, I am curious whether detection works and how you have things set up.
 

Village Guy

Getting the hang of it
Joined
May 6, 2020
Messages
194
Reaction score
85
Location
UK
For those of you using PTZ cameras in your setup with BI / AI Tool / DeepStack, I am curious whether detection works and how you have things set up.
All my cameras are PTZ and they work just the same as non-PTZ ones.
PTZ simply means you have more control over them one way or another.
It stands for Pan, Tilt, Zoom, but it does not mean that your cameras necessarily support all three functions.
 
Last edited:

balucanb

Getting the hang of it
Joined
Sep 19, 2020
Messages
139
Reaction score
23
Location
TX
All my cameras are PTZ and they work just the same as non-PTZ ones.

PTZ simply means you have more control over them one way or another.
Yes, that is correct; I should have posed my question better. Assuming you have them set up to auto-scan, or to move between presets only at night, etc.: do you have any issues with detection? How do you handle missed or false detections caused by the movement? Also, did you (or anyone) do anything different with your triggers or BI settings to adjust for issues caused by movement versus a static camera?
 

Ripper99

n3wb
Joined
Dec 12, 2020
Messages
10
Reaction score
2
Location
Canada
@Ripper99
Sorry, I found your post confusing. You keep referring to the RTSP address. What is the IP address for each of your cameras? Needless to say, you cannot use the same IP address for the cameras unless they are cloned.
I've shown this in my post?

Camera Office = 192.168.1.120:7447/JJnG64KrxTHEzSCP

Camera Theatre= 192.168.1.120:7447/MMtG64LryuKKzTTY

I'm very aware cameras cannot use the same IP address; that's why I mentioned this is a Unifi Protect system, where that is exactly the case. The RTSP URL, however, is unique, as shown in my example. A camera may not be able to share an IP, but if you look at the addresses I provided you can see the Ubiquiti system does just this: it gives a unique RTSP URL for each camera connected to its CloudKey/Protect NVR.
 

Village Guy

Getting the hang of it
Joined
May 6, 2020
Messages
194
Reaction score
85
Location
UK
Yes, that is correct; I should have posed my question better. Assuming you have them set up to auto-scan, or to move between presets only at night, etc.: do you have any issues with detection? How do you handle missed or false detections caused by the movement? Also, did you (or anyone) do anything different with your triggers or BI settings to adjust for issues caused by movement versus a static camera?
I suspect that you are referring to what is sometimes called patrol mode.
BI has no way to know when the camera will move and will trigger as the lens moves. I guess AI Tool will simply do its best to capture the events you are looking for.
 

Village Guy

Getting the hang of it
Joined
May 6, 2020
Messages
194
Reaction score
85
Location
UK
I've shown this in my post?

Camera Office = 192.168.1.120:7447/JJnG64KrxTHEzSCP

Camera Theatre= 192.168.1.120:7447/MMtG64LryuKKzTTY

I'm very aware cameras cannot use the same IP address; that's why I mentioned this is a Unifi Protect system, where that is exactly the case. The RTSP URL, however, is unique, as shown in my example. A camera may not be able to share an IP, but if you look at the addresses I provided you can see the Ubiquiti system does just this: it gives a unique RTSP URL for each camera connected to its CloudKey/Protect NVR.
Your question appears to be related specifically to Ubiquiti, so I clearly misunderstood what you were asking and, to be honest, I still don't understand.
 
Last edited:

Ripper99

n3wb
Joined
Dec 12, 2020
Messages
10
Reaction score
2
Location
Canada
The problem is Unifi Protect. It causes the keyframe rate to be too low. Ideally it should be 1.00 as shown below:.....
I'll check this out; I have no problem editing via SSH, and I have the same CloudKey. Thanks for the info and help, appreciated! It sucks that the file needs to be updated each time, but at least I have something to start with now.
 

Ripper99

n3wb
Joined
Dec 12, 2020
Messages
10
Reaction score
2
Location
Canada
Your cameras show identical IP addresses.
Yes, that's correct; however, as mentioned, they are using unique RTSP URLs, and THAT is where the stream is sourced. Many NVRs work the same way. I think you're missing that this actually works: the problem has to do with keyframes, as another member mentioned, and nothing at all to do with the cameras sharing an IP. As I've already shown, it's the RTSP URL that makes them unique, and they still work in Blue Iris using this method.

If both cameras had the EXACT same IP and nothing after the port number, then of course you would have a clone of a camera. What I am posting, however, is actually a URL. Forgive me for not adding rtsp:// to the beginning, but the forward slash further to the right in both of my examples makes it clear this is the URL that identifies the camera.

Here's further clarification, in case you are not sure of the difference between an IP and an RTSP URL. After port 7447 you can see different characters, which make these unique RTSP URLs from which each camera's stream is fed. The IP has nothing to do with this: you could have 50 different RTSP URLs and still use them all in BI, even if the IP address and port were the same for every camera. It's the characters after the forward slash that make each RTSP stream unique and NOT a clone.

Camera Office = rtsp://192.168.1.120:7447/JJnG64KrxTHEzSCP

Camera Theatre = rtsp://192.168.1.120:7447/MMtG64LryuKKzTTY
 

Village Guy

Getting the hang of it
Joined
May 6, 2020
Messages
194
Reaction score
85
Location
UK
Yes, that's correct; however, as mentioned, they are using unique RTSP URLs, and THAT is where the stream is sourced. Many NVRs work the same way. I think you're missing that this actually works: the problem has to do with keyframes, as another member mentioned, and nothing at all to do with the cameras sharing an IP. As I've already shown, it's the RTSP URL that makes them unique, and they still work in Blue Iris using this method.

If both cameras had the EXACT same IP and nothing after the port number, then of course you would have a clone of a camera. What I am posting, however, is actually a URL. Forgive me for not adding rtsp:// to the beginning, but the forward slash further to the right in both of my examples makes it clear this is the URL that identifies the camera.

Here's further clarification, in case you are not sure of the difference between an IP and an RTSP URL. After port 7447 you can see different characters, which make these unique RTSP URLs from which each camera's stream is fed. The IP has nothing to do with this: you could have 50 different RTSP URLs and still use them all in BI, even if the IP address and port were the same for every camera. It's the characters after the forward slash that make each RTSP stream unique and NOT a clone.

Camera Office = rtsp://192.168.1.120:7447/JJnG64KrxTHEzSCP

Camera Theatre = rtsp://192.168.1.120:7447/MMtG64LryuKKzTTY
Now I understand :cool: :thumb:
Thanks for the education!
 

Ripper99

n3wb
Joined
Dec 12, 2020
Messages
10
Reaction score
2
Location
Canada
Now I understand :cool: :thumb:
Thanks for the education!
No problem, I know it seems confusing, but basically the NVR has multiple RTSP streams: when you go into each camera's options, it offers you three RTSP URLs (High, Medium, Low). In my case I just needed the low-quality streams, and they can all be added in BI when you choose RTSP from the drop-down.

All these streams work fine in BI; however, the keyframe rate from Ubiquiti Protect, which is basically an NVR, is too low, and that is why I'm having the problems I mentioned in my post. I thought I was clear in mentioning they were Ubiquiti cameras and that this was specific to these cameras with DeepStack. I'll see if I can edit things via SSH as the other poster mentioned and get things working ;-)
 

Attachments

JL-F1

n3wb
Joined
Jun 12, 2020
Messages
14
Reaction score
2
Location
USA
1608571648323.png




Seems like the GPU goes to sleep: after a few minutes with no new images, the next image takes much longer, ~400 ms. I can't see any low-power/sleep setting for the GPU in Win10?
 

Village Guy

Getting the hang of it
Joined
May 6, 2020
Messages
194
Reaction score
85
Location
UK
No problem, I know it seems confusing, but basically the NVR has multiple RTSP streams: when you go into each camera's options, it offers you three RTSP URLs (High, Medium, Low). In my case I just needed the low-quality streams, and they can all be added in BI when you choose RTSP from the drop-down.

All these streams work fine in BI; however, the keyframe rate from Ubiquiti Protect, which is basically an NVR, is too low, and that is why I'm having the problems I mentioned in my post. I thought I was clear in mentioning they were Ubiquiti cameras and that this was specific to these cameras with DeepStack. I'll see if I can edit things via SSH as the other poster mentioned and get things working ;-)
A picture is worth a thousand words!
I will stick to using Ubiquiti for my access points and Hikvision for the cameras ;)
 