5.6.8 - January 8, 2023

fenderman

Staff member
On the Alerts tab in camera settings, the option for re-triggers has been expanded to allow for
re-triggers only when a new zone or source is added to the trigger.

On the AI configuration page (from the Trigger tab in camera settings), an option for New alert/analysis
on re-trigger after cancellation has been added. Previously, you may have combined a large
quantity of images (using significant system resources) with the “re-triggers” option
on the Alerts tab (creating many alert images even after confirmation) in an attempt to
achieve similar results; this new option may now better address the issue of
inadvertently missing critical events.

On this same page, the option to replace Email/SMS subjects and bodies has been removed.
Instead, use the &MEMO macro in your action sets for this purpose.

You may now limit the age of clips to which users have access with a new setting on
Settings/Users. 0 may be used for unlimited access.

Clip maintenance behavior has been modified. AI-confirmed alerts that belong to existing
clips are now essentially treated as protected; that is, they will not be deleted or moved out of the
Alerts folder as long as their parent clip exists. Users may have felt the need to mark these
items as protected in the database (which means they are never automatically deleted),
thereby inadvertently accumulating an inordinate number of them. Cancelled and unconfirmed
(no AI involved) alerts will still be maintained on schedule.
 
I upgraded to test the new "New alert/analysis on re-trigger after cancellation" feature. Once you upgrade, it is checked by default. I have each of my cameras set to analyze 30 images at 250ms intervals. With this new setting, I'm going to try reducing that to 5 images and see how detection works.
 
Can anyone add any detail on how this works? I'm not clear on when the cancellation occurs, and just what constitutes a re-trigger.

For example, I have some LPR cams set up to analyze 15 images, one every 200ms. I use a 1-second break time and a 3-second maximum trigger duration. I keep the times short in case two cars go by in quick succession, since I need a new alert to send the second image to Plate Recognizer. For my situation, a 3-second maximum alert is about right to force it to capture the second car separately after the first has passed. So these cameras will analyze the entirety of each 3-second alert unless a vehicle is detected; I believe it stops analyzing once the alert is AI confirmed, but I'm not 100% sure of that.

I don't think the new setting has any effect on the way I have these cameras set up, since if it goes 3 seconds without confirmation and motion continues past that point, I get a new alert anyway because the max alert time equals the duration of the analysis (15 x 200ms = 3 seconds).

I have another camera not used for LPR where I want to detect people. It has a 5 second break time and a 60 second max trigger/alert duration. Currently I have it set to analyze 10 images, one every 500ms. So it will analyze 10 images over the first 5 seconds and then mark the alert as 'nothing found' if the AI doesn't detect a person. With the new setting enabled, does that mean if nothing is found in the first 5 seconds and there is still motion, that it will start a new round of analyzing 5 images? If so, that seems like it will use more resources than my current settings, not less, because it will just continue to analyze as long as there is motion.

I'm either misunderstanding how this feature works, or I'm just not imagining the use case that it is intended for.
 
Similar to Alan_F, I'm also a little confused as to how this works. One of my cams will AI-process the normal images and then 10 images @ 250ms (5 seconds). As you can see, my re-trigger is set to 10 seconds. Does this mean that if the first cycle of AI processing finds nothing, movement continues, and there is a re-trigger, the AI processing starts all over again, pretty much making AI processing infinite while there is motion/re-triggers until either a person or car is detected?


[attached screenshot of the camera's AI settings]
 
This is a good question. I did some testing last night on a camera that looks down my driveway to the street. I used to use 30 images @ 250ms without the new feature and AI detection was very good at night.

When this new version came out, I thought it would allow me to lower the number of images to reduce overhead on my BI PC. The problem I ran into last night with the settings below, is that the camera would see motion and trigger with headlights illuminating the road several feet in front of the vehicle, so the 5 images didn't allow AI to detect a vehicle. This setting works just fine during the day. For now, I'm reverting back to 30 images and de-selecting the "New alert/analysis" box for night profiles that could alert/trigger with shining lights etc.

[attached settings screenshot, 2023-01-19]
 
If it is problematic for your field of view and this new addition isn't working, all you need to do is add an item in the TO CANCEL box that you wouldn't expect to see.

That will force it to use EVERY additional real-time image. If you don't put something in the to cancel box, then as soon as AI thinks it found or didn't find something, it cancels out.

Here is an example. At night the camera AI would struggle with this tight view. It has a straight on angle of the street to get a side profile of a car and it would miss it a lot of times because the vehicle just isn't in the field of view long enough, so this is a great candidate for DS.

Now the issue I had with DS is that it would either find a car but pick an alert image that was just the light shine on the street or only part of the vehicle, or it would come up with nothing found due to headlight bounce off the street.


[attached camera image]



DS has a "to cancel" option, which means it will analyze EVERY image to determine if that item is in it. Once I added Giraffe as a cancel object in that field, it now goes through all the images and selects the best one, which gives me the whole vehicle in the frame, and it eliminated the nothing-found results as well. It makes scrubbing video much quicker, as I can skip looking at video of known vehicles.



[attached camera image]
 
Exactly my situation. I might try putting a cancel object in that it can't find and see if it works. What do you think is the use case for the "New alert/analysis" box?
 
I noticed something weird that I've never seen before. I'm not sure if it is related to the "new alert/analysis" option or not. For my LPR cameras, at night I use AI to tag the images (vehicle, dayplate, nightplate), but I send all motion alerts to Plate Recognizer even when nothing is found. This morning I had two alerts where the tag was displayed on the clip list and was embedded in the Exif data of the image, but the alert image (of a school bus) on the clip list did not show the tag; it showed the side of the bus. I've never noticed this before, although I may just be paying more attention to the alert images recently.* Checking the Plate Recognizer site, the image that was sent to them (obviously) was not the alert image in the clip list; it was an earlier image that showed the tag.

My BI updated to 5.6.8.4 yesterday. I had unchecked the 'new alert/analysis' button on my day profile, but I forgot to uncheck it on my night profile. I've now unchecked it on both, so I'll watch and see if I still get alert images in the clip list that are different from the alert image sent to Plate Recognizer.

* I'm paying attention to the alert images because I just set up a process in Node Red to watch the alert folders and check each file for a tag in the Exif data. If there is a tag, it writes the tag and filename to a database and copies the alert image to a folder where I can retain it after BI deletes the alert. I set up a UI page in Node Red dashboard where I can search the tags and view the images. I could share the process if anyone is interested in doing the same type of thing.
 
@Vettester

I've attached the JSON for the Node Red flows. One is to process the alert images and the other is for the UI/dashboard view. That second part is still a work in progress and could be made much better. These flows are from two different Node Red instances, but I don't think that will cause any issues. My UI dashboard runs on my Raspberry Pi, but I'm processing the files on my BI Windows PC.

Feel free to PM me with any questions. I think I covered all the steps below, but depending on how familiar you are with the various components, I might not have explained everything clearly.

Just to explain why I did this: I was using the MQTT alert function in Blue Iris to send the tags via Node Red to a database, but considering that many tags are partially misread, without the alert images the tag data is much less useful. I was reserving a LOT of space to store alert images to try to hold onto them as long as possible. That also meant Blue Iris's database was tracking a lot of alert jpegs. By doing this process, I can retain only the alert images from my LPR cameras that have a detected tag, which means I need to store only a fraction of the images. As a result I can keep them a lot longer. I can now set the Blue Iris alerts folder to delete after a reasonable time.

Components needed: Blue Iris set up to recognize tags, Node Red, a MySQL database, Internet Information Services or other web server to serve images.
  • Configure Blue Iris to write alert images to a folder
  • Create a separate folder to store the images where ALPR detected a tag (outside Blue Iris)
  • Set up Windows Internet Information Services (IIS) or another web server to serve the files in that folder on an available port (I'm using 8093 in the example)
  • Set up a MySQL database to store records (anywhere this computer can connect to on the network). I'm running MariaDB in Docker on my Raspberry Pi.
  • Use the attached flow in Node Red to check each alert image for tag info, store the data in database, and copy the image file to your storage folder
    • Configure the nodes as needed: folder paths, mysql server config, etc.
  • Use the attached flow to create a UI dashboard to view tags and images
    • Configure the mysql node. Update line 13 of the 'LPR SQL' function node to point to your web server IP and port for the images. Configure the UI nodes to refer to valid dashboard UI groups.

My database setup:
Database name: LPR
Table name: lpr_images
Table fields:
Code:
+-------------------+--------------+------+-----+---------------------+----------------+
| Field             | Type         | Null | Key | Default             | Extra          |
+-------------------+--------------+------+-----+---------------------+----------------+
| idlpr_images      | int(11)      | NO   | PRI | NULL                | auto_increment |
| tag               | varchar(45)  | YES  |     | NULL                |                |
| file              | varchar(255) | YES  | UNI | NULL                |                |
| datetime          | datetime     | YES  |     | current_timestamp() |                |
| image_description | varchar(255) | YES  |     | NULL                |                |
| filedatetime      | datetime     | YES  |     | NULL                |                |
+-------------------+--------------+------+-----+---------------------+----------------+
idlpr_images: auto-incremented primary key
tag: the tag text
file: the filename (unique)
datetime: the datetime the record was imported
image_description: should contain the detected objects from the Exif info
filedatetime: the date/time stamp extracted from the image filename

There are two triggers set up on the database. These triggers extract the date/time stamp from the filename and store it in the filedatetime column when a row is inserted or updated.
Code:
lpr_images_BEFORE_INSERT
set new.filedatetime = CONCAT(
    DATE_FORMAT(SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(new.file, '_', 2), '_', -1), '.', -1), '%Y-%m-%d'),
    ' ',
    TIME_FORMAT(SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(new.file, '_', 3), '_', -1), '.', 1), '%H:%i:%s')
)

lpr_images_BEFORE_UPDATE
set new.filedatetime = CONCAT(
    DATE_FORMAT(SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(new.file, '_', 2), '_', -1), '.', -1), '%Y-%m-%d'),
    ' ',
    TIME_FORMAT(SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING_INDEX(new.file, '_', 3), '_', -1), '.', 1), '%H:%i:%s')
)
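For illustration only, here is roughly what those triggers compute, sketched as a JavaScript helper. It assumes Blue Iris filenames of the form Camera.YYYYMMDD_HHMMSS.ms.N-N.jpg; the function name is mine, not part of the flow.

```javascript
// Hypothetical helper mirroring the triggers above: pull the date and time
// stamps out of a Blue Iris alert filename and format them as a MySQL
// DATETIME string. Returns null if the filename doesn't match the pattern.
function fileToDatetime(file) {
    const m = file.match(/\.(\d{8})_(\d{6})\./); // ".YYYYMMDD_HHMMSS."
    if (!m) return null;
    const [, d, t] = m;
    return `${d.slice(0, 4)}-${d.slice(4, 6)}-${d.slice(6, 8)} ` +
           `${t.slice(0, 2)}:${t.slice(2, 4)}:${t.slice(4, 6)}`;
}
```

With a filename like Patio.20230120_120000.260821.3-1.jpg this yields 2023-01-20 12:00:00.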

Note: the flow has debug nodes all over the place to help in setting things up. These shouldn't be needed once things are running smoothly.

  • The flow starts with watch nodes, one for each folder that will have LPR files. Since I use the camera name in my folder path, I have two folders that I'm watching
  • The next node is a switch node. It checks the msg.event and passes 'update' events to output 1. Output 2 gets everything else just for debug purposes.
  • The next node is a function. It checks that the filename (1) includes the prefix for either camera AND (2) does not contain '.tmp'. Matching files are passed to output 1, others to output 2 for debug purposes. This needs to be configured with your camera names.
  • The next node is a trigger with a 5 second delay. I put this here because it seemed the same file was getting multiple updates as it was being written. This node will only send one message for each distinct msg.file every 5 seconds.
  • The next node sets the msg.topic to a SQL query to count the number of records in the database for that filename.
  • The next node is a mysql node that performs that query
  • The next node is a switch that looks at the query result. If the count == 0, that means this file is not already in the database and it passes the message to output 1. Otherwise the message is passed to output 2 for debugging.
  • The next node is a 'read file' node. It reads the file and returns it as a buffer object
  • The next node is an Exif node. It pulls the Exif data from the file and puts it in msg.exif
  • The next node is a switch node. It passes the message to output 1 if there is Exif data. Otherwise it passes the message to output 2 for debugging.
  • The next node is a change node. It sets the msg.tag to msg.exif.image.Model and msg.image_description to msg.exif.image.ImageDescription. This isn't really needed as I could just refer to the original msg parts, but it makes the later nodes easier to read
  • The next node is a switch node. It checks if there is a tag. If there is a tag it passes the message to output 1. If no tag, then output 2 for debugging.
  • Output 1 sends the message to two nodes
  • The first path goes to a SQLstringFormat node. It sets up the insert query for the database. It is set up as a "REPLACE INTO" so that duplicates will not cause an error. I used this rather than 'INSERT INTO' because I didn't originally have the count(*) query earlier in the flow.
  • The next node is a mysql node. It inserts the record into the database
  • The second path goes to a change node. It sets up the full path\filename for the copy of the image.
    • The first step needs to have the path of the folder where you want to store the images.
    • The next node writes the file to the disk.
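As a rough sketch, the filtering function node described in the steps above might contain something like the following (the camera prefixes are placeholders; substitute your own):

```javascript
// Node-RED function-node sketch: pass alert images from the watched folders
// to output 1, everything else to output 2 for debugging. A function node
// returns an array with one entry per output.
const cameras = ["Left_LPR", "Right_LPR"]; // placeholder camera prefixes

function filterAlertFile(msg) {
    const file = msg.file || "";
    const wanted = cameras.some(c => file.startsWith(c)) && !file.includes(".tmp");
    return wanted ? [msg, null] : [null, msg];
}
```

Inside an actual function node the body would end with the return statement rather than defining a named function.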
 

Attachments

Thanks for sharing this and for the detailed explanations! I’ll check it out and get back to you if I have any questions.
 
I’m using NodeRed and MariaDB to process alert images too, but for the purpose of advanced notifications. However, I’m not watching any folders/filesystems; instead I’m forwarding alerts from some cameras using MQTT with a base64 alert image. Is there any reason you chose not to do this as the start of your NodeRed trigger/flow?

My BI MQTT alert string - {"Cam":"&CAM","Date":"%c","AlertDB":"&ALERT_DB","AlertImgPath":"&ALERT_PATH","AlertAI":"&MEMO","ImageB64":"&ALERT_JPEG"}

The output from BI to MQTT - {"Cam":"Patio","Date":"20/01/2023 12:04:24","AlertDB":"@283149192974226","AlertImgPath":"Patio.20230120_120000.260821.3-1.jpg","AlertAI":"person:81%","ImageB64":"LONG BASE64 IMG STRING HERE"}
 

I don't know if one is better than the other. The only advantage I see to pulling from the files is that I was able to get the data for the ~14,000 alerts I already had images of.

And, probably a very minor point, it means one less moving part. If MQTT is restarting or there is a network issue between Blue Iris and the broker, I assume the message from Blue Iris could be lost. If the broker is running on the machine with BI and Node Red, that's a non-issue.

But to specifically answer your question, I didn't choose that method because it never occurred to me.
 
Although MQTT would be a less complex NodeRed flow, I'd agree that I don't know if one is better than the other. But I think you have a valid point about the broker restarting: any alerts while the broker was down (albeit briefly) would never get moved or added to the external DB.
 
Definitely. I'll have to try to see what I can cook up with MQTT. I shouldn't even need to send the image over the network. If I can pass the plate and filename (and detected objects if possible), I could write those values to the database, wait a couple seconds, then do my file copy operation using the filename.

I'm doing a bit of tweaking of the flow I posted anyway. I just added some nodes so when the tag is null it checks if "DayPlate" or "NightPlate" was detected. If so, I make a db entry with the tag "(not read)" and copy the image. I was seeing that sometimes Plate Recognizer isn't returning anything when the tag isn't clear, so this way I'll have a copy of all the images where the AI saw a tag, even if I can't search those by tag number. Could still be useful if I need to see all tags that passed by during a given time period.

I also wired something up this morning to send images to the Code Project AI LPR module (which I finally got working). The first few tests indicate that the Plate Recognizer cloud service is more accurate than CPAI, so I'm going to have to balance the cost savings vs. accuracy when the local LPR solution gets Blue Iris support.
 
If you could, PM me some of the images you are getting poor results with. I am going to work on improving the accuracy. I already have a version 2.0 that is not released yet.
 
Well, we have a winner for which one is better, and a big thank you for pointing me towards this easier method.

I had noticed that there were some clips in the clip list where there was a tag displayed, but the tag was not in the Exif information of the image file. I emailed Blue Iris support about this a few days ago, but they've probably been busy on the CPAI integration and haven't gotten back to me.

However while testing using MQTT to simplify the node-red flow, I saw a file come through where the Exif information was missing, but all the information was in the MQTT message. So it appears that by using MQTT I avoid whatever issue was causing me to sometimes not be able to get the tag.

So using this Blue Iris alert: { "plate":"&PLATE", "AlertImgPath":"&ALERT_PATH", "Alert_AI":"&MEMO", "Date":"%Y-%m-%d %H:%M:%S","Camera":"&CAM" }

Which produces this: { "plate":"FH6131", "AlertImgPath":"Left_LPR.20230124_205010.0.3-1.jpg", "Alert_AI":"NightPlate:89%", "Date":"2023-01-24 20:50:10","Camera":"Left_LPR" }

And a much simpler Node-Red flow:
Code:
MQTT In (feeds two paths) -> SQLstring-format node to set up the query -> MySQL node to insert into database
                          -> function node to set source folder based on camera name -> copy file node to copy the file

[{"id":"4e4757364b6290fb","type":"group","z":"9796107b389bdb9e","style":{"stroke":"#999999","stroke-opacity":"1","fill":"none","fill-opacity":"1","label":true,"label-position":"nw","color":"#a4a4a4"},"nodes":["c76f238773caf170","87d44d61fefcfd9a","902c1a6918c0db03","5a78a14f6887f171","1ca1a7c8197a712f","4b7b582067b02d1b","2edbc803fde92f3e"],"x":194,"y":1379,"w":572,"h":242},{"id":"c76f238773caf170","type":"mqtt in","z":"9796107b389bdb9e","g":"4e4757364b6290fb","name":"","topic":"BlueIris/LPRdata","qos":"2","datatype":"auto-detect","broker":"a44c57ac297015e5","nl":false,"rap":true,"rh":0,"inputs":0,"x":300,"y":1480,"wires":[["87d44d61fefcfd9a","902c1a6918c0db03","9e119b47c49a583b"]]},{"id":"87d44d61fefcfd9a","type":"debug","z":"9796107b389bdb9e","g":"4e4757364b6290fb","name":"debug 53","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":320,"y":1540,"wires":[]},{"id":"902c1a6918c0db03","type":"sqlstring-format","z":"9796107b389bdb9e","g":"4e4757364b6290fb","name":"","query":"REPLACE INTO lpr_images (tag,file,image_description,filedatetime,camera) VALUES \n(?,?,?,?,?) 
","vars":"payload.plate,payload.AlertImgPath,payload.Alert_AI,payload.Date,payload.Camera","outField":"topic","x":490,"y":1460,"wires":[["5a78a14f6887f171"]]},{"id":"5a78a14f6887f171","type":"mysql","z":"9796107b389bdb9e","g":"4e4757364b6290fb","mydb":"17c59737.09ace9","name":"LPR","x":630,"y":1460,"wires":[["1ca1a7c8197a712f"]]},{"id":"1ca1a7c8197a712f","type":"debug","z":"9796107b389bdb9e","g":"4e4757364b6290fb","name":"debug 54","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":640,"y":1420,"wires":[]},{"id":"4b7b582067b02d1b","type":"debug","z":"9796107b389bdb9e","g":"4e4757364b6290fb","name":"debug 55","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":660,"y":1580,"wires":[]},{"id":"2edbc803fde92f3e","type":"comment","z":"9796107b389bdb9e","g":"4e4757364b6290fb","name":"BlueIris/LPRdata to database","info":"","x":340,"y":1420,"wires":[]},{"id":"a44c57ac297015e5","type":"mqtt-broker","name":"Pi MQTT","broker":"192.168.0.12","port":"1883","clientid":"","autoConnect":true,"usetls":false,"protocolVersion":"5","keepalive":"60","cleansession":true,"birthTopic":"","birthQos":"0","birthPayload":"","birthMsg":{},"closeTopic":"","closeQos":"0","closePayload":"","closeMsg":{},"willTopic":"","willQos":"0","willPayload":"","willMsg":{},"userProps":"","sessionExpiry":""},{"id":"17c59737.09ace9","type":"MySQLdatabase","name":"","host":"192.168.0.12","port":"3306","db":"LPR","tz":"America/New_York","charset":"UTF8"}]

I also was able to remove the triggers from the database, since I can generate the date/time in the MQTT message instead of trying to parse it out of the file name. While I was at it, I added a column for the camera name to the database.
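The "set source folder based on camera name" function node in the flow above could be sketched like this (the folder paths and camera names are placeholders for your own setup):

```javascript
// Node-RED function-node sketch: build the full source path of the alert
// image from the camera name in the MQTT payload, for the copy-file node
// that follows. The paths below are placeholders.
const folders = {
    "Left_LPR":  "D:/BlueIris/Alerts/Left_LPR",
    "Right_LPR": "D:/BlueIris/Alerts/Right_LPR",
};

function setSourcePath(msg) {
    const cam = msg.payload.Camera;
    if (!(cam in folders)) return null;          // unknown camera: drop message
    msg.source = `${folders[cam]}/${msg.payload.AlertImgPath}`;
    return msg;
}
```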