Blue Iris and CodeProject.AI ALPR

@brcarls You can configure BI5 to run a local program or script under "On alert...". Your script would encapsulate the if-then logic you require.
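As a rough illustration (just a sketch; the script name, watched plates, and action are placeholders, and it assumes BI passes &PLATE as a parameter to the program), that script could be something as small as:

Code:
// check_plate.js - hypothetical script Blue Iris could run on alert,
// invoked with the plate passed as a parameter, e.g.:  node check_plate.js &PLATE
const watched = ['ABC1234', 'DEF5678'];               // plates you care about (placeholders)
const plate = (process.argv[2] || '').toUpperCase();

if (watched.includes(plate)) {
    // replace with whatever if-then action you need: call a webhook, toggle lights, etc.
    console.log('Watched plate detected: ' + plate);
} else {
    console.log('No action for plate: ' + plate);
}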

If you have Home Assistant (or a similar home automation platform), BI5 can send that event with the &PLATE value via HTTP or MQTT. Then in Home Assistant you can configure an automation to do whatever you want, including controlling lights, sirens, etc.

A Home Assistant automation would be more configurable, since you can add additional conditions and/or other triggers, like time of day or alarm system status, as part of the logic for taking action.

I have something set up in Node Red to do the same thing, although I haven't had cause to watch for any tags yet. I don't use Home Assistant - I'm using Hubitat for home automation - but I think standing up Node Red is probably quicker and easier than Home Assistant, so it might be a good option unless you want Home Assistant for other things.

I send LPR alert information to MQTT so that I can log the tags in a database via Node Red. I have a separate flow in Node Red that uses a function node to check the tags. Unless you have really easy-to-read tags in your locale and a really good view of each tag, you probably want to account for partial matches as well as exact matches. If it's important enough to have real-time alerts set up, then it's probably important enough to be notified when a tag is read that is very similar to the one you are watching for.

I'll 'spoiler' the rest of the details to avoid cluttering up the thread:

My flow is fairly simple: MQTT -> Function to compare the read to a list of tags -> Switch for whether the hit was exact or partial -> Change node to set up my notification -> Send to my notification server.

My function is below. This is still a work in progress and it could be made better with a little effort.

The MQTT message is JSON format and the tag that was detected is in msg.payload.plate. Near the top of my function I list the tags I want to watch for, including all the partial cases I think might result from misreading. So for ABC1234 I might enter the full tag, but also BC123, C1234, ABC123, etc. The more partial tags, the more false hits I'll get, but the less chance of missing a hit.
Code:
var tag = msg.payload.plate;
// tags to watch for
var check_for = ['ABC1234', 'BC123', 'C1234', 'ABC123'];
// watch entries contained within the tag that was read
const partial = check_for.filter(el => tag.includes(el));
// exact match
if ( check_for.includes(tag) )
{
    msg.notify_text = 'Exact Match - ' + tag + ' ';
    msg.hit_type = 'exact';
    return msg;
}
else if ( partial.length > 0 )
{
    // partial match
    msg.notify_text = 'Partial matches: Tag ' + tag + ' matched ' + partial;
    msg.hit_type = 'partial';
    return msg;
}
else
{
    // no match
    return null;
}

The result of the function is that I get either a message saying there was an exact match, or a message saying what plate was read and what it matched from the list.

If I'm watching for ABC1234 and the LPR reads 8C1234 (missing the first character and mis-reading the B as an 8), it would generate a msg.notify_text of: "Partial matches: Tag 8C1234 matched C1234"
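If hand-listing the partials gets tedious, one option (just a sketch, not something in my flow) is to generate them from the full tag in the function node, e.g. every substring of at least five characters:

Code:
// Sketch only: build the partial watch list for one full tag automatically,
// using every substring of at least minLen characters (full tag excluded).
function partialsFor(fullTag, minLen) {
    const parts = new Set();
    for (let len = minLen; len < fullTag.length; len++) {
        for (let start = 0; start + len <= fullTag.length; start++) {
            parts.add(fullTag.slice(start, start + len));
        }
    }
    return [...parts];
}
// partialsFor('ABC1234', 5) -> ['ABC12', 'BC123', 'C1234', 'ABC123', 'BC1234']

The tradeoff is the same as above: the more partials in the list, the more false hits.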

If I get time to try to enhance this, I think it would be useful to be able to list the full tag I'm looking for, a text description/reason, and the partial elements for just that tag, so the message could be something like:
"Tag 8C1234 is a partial match for ABC1234 - Blue Honda Accord - car break-ins"

Looks like I'm going to have to dig a bit deeper into multi-dimensional arrays.


Edit: enhanced version is down-thread - Blue Iris and CodeProject.AI ALPR
 
@MikeLud1 I just noticed that I receive this error in Event Viewer after installing 2.0.6. I don't get this error after installing 2.0.2. Does this offer any clues as to what is going wrong? Thanks!

[Screenshot of the Event Viewer error attached]
 
@MikeLud1 I was looking at my log this evening and noticed I have "Plate:" entries from a camera other than my LPR camera.

Below is that other camera's AI tab. It should only be using ipcam-general and ipcam-animal. Any idea why it would be recognizing plates?

[Screenshot of that camera's AI tab attached]
 


The plates should be logged somewhere better than the system logs, like in a spreadsheet in a system folder, a new tab in BI, or a program like the LPR Viewer program someone built that worked pretty well with OpenAlpr. The historical, searchable, and easily accessible data is critical here. Alerts for plate detection seem like another important feature to add as this develops. Lots of potential here, and thanks to everyone who has been working on it.
 
The plates should be logged somewhere better than the system logs, like in a spreadsheet in a system folder, a new tab in BI, or a program like the LPR Viewer program someone built that worked pretty well with OpenAlpr. The historical, searchable, and easily accessible data is critical here. Alerts for plate detection seem like another important feature to add as this develops. Lots of potential here, and thanks to everyone who has been working on it.

I haven't tried it, but according to the docs, alerts for plate detection are already available by using "myplates" on the AI setup page and then using "myplates" or "notmyplates" as required AI objects in the alert settings.
 
The plates should be logged somewhere better than the system logs, like in a spreadsheet in a system folder, a new tab in BI, or a program like the LPR Viewer program someone built that worked pretty well with OpenAlpr. The historical, searchable, and easily accessible data is critical here. Alerts for plate detection seem like another important feature to add as this develops. Lots of potential here, and thanks to everyone who has been working on it.

Yes, it would be nice to have some search and filtering features that are convenient to use within BI. It would be a lot of work/resources to integrate it in a way that is useful to everyone, though. Not sure what that would look like... guessing another tab in the BI app that shows potential casing patterns? As is, just the fact that the plates are being logged with date/time is still very useful "after the fact".

For now, if a single easy-to-use file is all you are after... I already slapped together a batch script that does exactly that. Place it anywhere you want on your BI PC, run it, and it will go through all of the BI logs, read just the plate lines, and copy them to a CSV file (saved in the same directory where you run the batch). It removes duplicate lines, so you can run it multiple times over old and new log files without issues. It also removes unneeded columns to reduce size, and CSV is very compatible with spreadsheets and other data visualization apps. It does not modify the original log files. Instead of zipping up a .bat, it's only a few lines, so I'm just posting the source:

Code:
rem Pull the plate lines out of every Blue Iris log file
setlocal enabledelayedexpansion
FIND "    Plate:" C:\BlueIris\log\*.txt >> platelog.csv
rem Remove duplicate lines so the script can be re-run over old and new logs
sort /C /UNIQUE "platelog.csv" /O "platelog.csv"
rem Keep only matched log lines (first token "0") and output date, time, AM/PM, plate, confidence
for /F "tokens=1-8" %%i in (platelog.csv) do (
    if "0"=="%%i" (
        echo %%j,%%k,%%l,%%o,%%p >> "platelog.csv.new"
    )
)
move /y "platelog.csv.new" "platelog.csv" >nul

The CSV output rows are formatted like this:
1/14/2023,5:00:43.031,PM,w000zle,[89.5%]

It works well, but I'm sure there are more efficient ways to do this, and the percentages are still wrapped in "[]". It's the first Windows batch file I've written in a decade, and I hate win cmd lol. Anyway, if you have a bunch of huge log files it may take some time to finish (~5 sec on a 4 MB log with my 12700K CPU, to give an idea).
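If anyone wants the confidence column without the brackets, a quick post-processing pass could handle it. This is just a sketch (Node.js, assuming the platelog.csv produced above):

Code:
// strip_brackets.js - hypothetical cleanup pass over platelog.csv:
// rewrites a confidence like [89.5%] as 89.5%
const fs = require('fs');

const rows = fs.readFileSync('platelog.csv', 'utf8')
    .split(/\r?\n/)
    .filter(line => line.trim().length > 0)
    .map(line => line.replace(/\[([\d.]+%)\]/, '$1'));

fs.writeFileSync('platelog_clean.csv', rows.join('\n') + '\n');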
 
FWIW, it appears as if the LPR will stop processing images once it has a plate with sufficient confidence, regardless of whether it's a whole plate, etc. Since there will be some work on this end... I was just reading up on how OpenALPR processes images, using state plate patterns to help with filtering... it would be awesome if at some point this could be done in CPAI. Of course, it could be a double-edged sword with all the personalized and out-of-state plates my LPR sees... but it probably won't be a problem as long as it doesn't adhere to the patterns too strictly. ;)
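To make the idea concrete, pattern filtering could be as simple as a regex check in a Node Red function node. This is only a sketch, and the pattern (3 letters + 4 digits) is a made-up example; you'd substitute your own locale's formats:

Code:
// Sketch only: drop reads that don't fit the local plate pattern(s).
const patterns = [/^[A-Z]{3}\d{4}$/];   // hypothetical format: ABC1234
const tag = msg.payload.plate;

if (!patterns.some(re => re.test(tag))) {
    // doesn't match any known format - likely a misread (or a personalized/out-of-state plate)
    return null;
}
return msg;

Routing non-matching reads to a second output instead of returning null would avoid silently losing the personalized and out-of-state plates mentioned above.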
 
For now, if a single easy-to-use file is all you are after... I already slapped together a batch script that does exactly that. Place it anywhere you want on your BI PC, run it, and it will go through all of the BI logs, read just the plate lines, and copy them to a CSV file (saved in the same directory where you run the batch).
That is really nice. Thanks. Could it be modified to list the camera name? Some of us have more than one LPR cam and it would be nice to know which cam the capture came from.
 
Yes, it would be nice to have some search and filtering features that are convenient to use within BI.
I've been playing with the Node-Red/SQL stuff that @Alan_F has been working on and I think it has the potential to replace the free version of Plate Recognizer I've been using.

Thanks for the help with this Alan!

 
I've been playing with the Node-Red/SQL stuff that @Alan_F has been working on and I think it has the potential to replace the free version of Plate Recognizer I've been using.

Thanks for the help with this Alan!

Where can we find more info on this? I am waiting for the dust to settle before I jump into the new version and have been using Plate Recognizer for a while now. Would love to get this set up once I do finally upgrade.
 
Where can we find more info on this? I am waiting for the dust to settle before I jump into the new version and have been using Plate Recognizer for a while now. Would love to get this set up once I do finally upgrade.

 
@woolfman72 - If you have any questions after looking through the above posts, let me know. Ignore the node-red flow in the first post. The method in the second post is much better. Also, the database was changed a bit, so here is the new database setup:

Code:
+-------------------+--------------+------+-----+---------------------+----------------+
| Field             | Type         | Null | Key | Default             | Extra          |
+-------------------+--------------+------+-----+---------------------+----------------+
| idlpr_images      | int(11)      | NO   | PRI | NULL                | auto_increment |
| tag               | varchar(45)  | YES  |     | NULL                |                |
| file              | varchar(255) | YES  | UNI | NULL                |                |
| datetime          | datetime     | YES  |     | current_timestamp() |                |
| image_description | varchar(255) | YES  |     | NULL                |                |
| filedatetime      | datetime     | YES  |     | NULL                |                |
| camera            | varchar(45)  | YES  |     | NULL                |                |
+-------------------+--------------+------+-----+---------------------+----------------+
and the CREATE statement for that is:
Code:
CREATE TABLE `lpr_images` (
  `idlpr_images` int(11) NOT NULL AUTO_INCREMENT,
  `tag` varchar(45) DEFAULT NULL,
  `file` varchar(255) DEFAULT NULL,
  `datetime` datetime DEFAULT current_timestamp(),
  `image_description` varchar(255) DEFAULT NULL,
  `filedatetime` datetime DEFAULT NULL,
  `camera` varchar(45) DEFAULT NULL,
  PRIMARY KEY (`idlpr_images`),
  UNIQUE KEY `file_UNIQUE` (`file`)
) ENGINE=InnoDB AUTO_INCREMENT=48999 DEFAULT CHARSET=utf8mb4
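For anyone wiring this up from scratch, the insert into that table boils down to something like the following. This is a standalone Node.js/mysql2 sketch rather than the actual function node from the flow, and the connection details and the fields on the incoming read are placeholders:

Code:
// Hypothetical sketch: log one plate read into the lpr_images table above.
// INSERT IGNORE relies on the UNIQUE key on `file` to skip duplicate inserts.
const mysql = require('mysql2/promise');

async function logPlate(read) {
    const db = await mysql.createConnection({
        host: 'localhost', user: 'lpr', password: 'secret', database: 'lpr'
    });
    await db.execute(
        'INSERT IGNORE INTO lpr_images (tag, file, camera, filedatetime) VALUES (?, ?, ?, ?)',
        [read.plate, read.file, read.camera, read.filedatetime]
    );
    await db.end();
}

// logPlate({ plate: 'ABC1234', file: 'D:/BlueIris/Alerts/example.jpg',
//            camera: 'LPR1', filedatetime: new Date() });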


On another topic: I'm using 'DayPlate' and 'NightPlate' as my required 'to confirm' objects. I see that BI generally takes the image that has the highest percent confidence for one of those objects and sends it off for ALPR. In other words, if one image has "DayPlate:75" and one has "DayPlate:90", then the second one would be sent to Plate Recognizer for analysis.

While that seems to generally produce a good result, it doesn't always. Sometimes I have several excellent views of the plate, but the highest confidence from the AI is one where the plate was at the side of the image (and thus further away from the camera / smaller in the frame).

Would it be worthwhile to ask if Ken can modify the algorithm so that it (maybe optionally) selects the image where the confirmed object was closest to the center of the image? Since each AI return gives the X,Y of the object, that should be something he could code. I'd rather try to analyze an 85% DayPlate at the center of the image than a 90% DayPlate at the far edge. If others would find this useful I'll send it in to support to see if it is something he could do.
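To illustrate what that selection rule might look like (sketch only; this isn't anything BI does today, and the detection fields below are assumptions based on a typical CPAI object detection response):

Code:
// Sketch of the proposed rule: among candidate detections, pick the one whose
// box centre is closest to the image centre instead of the highest confidence.
// Each detection is assumed to look like { label, confidence, x_min, y_min, x_max, y_max }.
function pickMostCentred(detections, imgW, imgH) {
    const cx = imgW / 2, cy = imgH / 2;
    let best = null;
    for (const d of detections) {
        const dx = (d.x_min + d.x_max) / 2 - cx;
        const dy = (d.y_min + d.y_max) / 2 - cy;
        const dist = Math.hypot(dx, dy);
        if (!best || dist < best.dist) best = { ...d, dist };
    }
    return best;
}

A combined score (confidence discounted by distance from centre) might be a reasonable middle ground between the two approaches.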
 
On another topic: I'm using 'DayPlate' and 'NightPlate' as my required 'to confirm' objects. I see that BI generally takes the image that has the highest percent confidence for one of those objects and sends it off for ALPR. In other words, if one image has "DayPlate:75" and one has "DayPlate:90", then the second one would be sent to Plate Recognizer for analysis.

While that seems to generally produce a good result, it doesn't always. Sometimes I have several excellent views of the plate, but the highest confidence from the AI is one where the plate was at the side of the image (and thus further away from the camera / smaller in the frame).

Would it be worthwhile to ask if Ken can modify the algorithm so that it (maybe optionally) selects the image where the confirmed object was closest to the center of the image? Since each AI return gives the X,Y of the object, that should be something he could code. I'd rather try to analyze an 85% DayPlate at the center of the image than a 90% DayPlate at the far edge. If others would find this useful I'll send it in to support to see if it is something he could do.

Not sure that your scenario of the plate closest to the centre being best would work for everyone else, though. For example, it wouldn't work for my current camera setup, where there are multiple uses for the stream.
 
On another topic: I'm using 'DayPlate' and 'NightPlate' as my required 'to confirm' objects. I see that BI generally takes the image that has the highest percent confidence for one of those objects and sends it off for ALPR. In other words, if one image has "DayPlate:75" and one has "DayPlate:90", then the second one would be sent to Plate Recognizer for analysis.
I asked Ken to make the change below; he has not done it yet. Maybe if more users ask he will do it sooner.

Can you make one change: instead of using the license-plate highest confidence as the alert, can you use the Plates highest confidence as the alert?
 
I asked Ken to make the change below; he has not done it yet. Maybe if more users ask he will do it sooner.

Can you make one change: instead of using the license-plate highest confidence as the alert, can you use the Plates highest confidence as the alert?

I assume license-plates is the AI object and Plates is the OCR/ALPR text?

That seems like a good idea if CPAI LPR is checking the text on each image. I'm using Plate Recognizer, so only a single image per alert is analyzed for plate reading, and that enhancement wouldn't be relevant in my scenario.

If CPAI gets the ability to handle the vertically stacked letters better, maybe I can drop Plate Recognizer and switch over to CPAI for the LPR analysis. Are there any more updates in the queue that might address that? Is it just a coding issue, or does it need to be trained on images? I don't have the GPU horsepower to train models, but I could supply images of tags with stacked letters to anyone who wants to use them to train the AI.
 
@Alan_F below are answers to your questions:
I assume license-plates is the AI object and Plates is the OCR/ALPR text?
Correct
If CPAI gets the ability to handle the vertically stacked letters better, maybe I can drop Plate Recognizer and switch over to CPAI for the LPR analysis. Are there any more updates in the queue that might address that?
There will be more updates; one of them will be to handle the vertically stacked letters.
Is it just a coding issue, or does it need to be trained on images?
Most likely just coding; when I start working on it I will find out.
 
Earlier in the thread I posted a Node Red flow to check plate reads for matches and send alerts (Blue Iris and CodeProject.AI ALPR). I noted:

If I get time to try to enhance this, I think it would be useful to be able to list the full tag I'm looking for, a text description/reason, and the partial elements for just that tag, so the message could be something like:
"Tag 8C1234 is a partial match for ABC1234 - Blue Honda Accord - car break-ins"

I found a little time to play with it, and the results are below. It could probably be more elegant, but it should get the job done. The plate information has to be entered into the function node in a specific format, but there are examples in the code and you can cut & paste those to add plates of interest. The Node Red flow JSON is attached.

The format is: "exact plate", "description/reason text", ["array", "of", "partial", "plates" ]
The examples below are in the function node:
Code:
alert_array.push(['ABC1234', 'description text 1', ['BC123','1234','ABC123']])
alert_array.push(['DEF1234', 'description text 2', ['DEF', '1234']])
alert_array.push(['GHI5678', 'description text 3'])
So you would replace those lines in the function with the information for the plates you want to alert on, keeping the format exactly the same. The first two examples include partial plates, the third example does not.

You can list as many plates as you want. The plate needs to be sent from Blue Iris to Node Red via MQTT in JSON format with, at a minimum, { "plate":"&PLATE" } in the body of the MQTT message. In the example flow the MQTT topic is "test2", but adjust that to whatever topic you are having Blue Iris publish to.

If the LPR reads "BC123" it will partial match on only the first entry. If it reads "ABC1234" it will exact match on the first entry but also partial match on the second (because 1234 was entered as a partial tag for the second entry and that matches as part of ABC1234). The third plate will only match if read exactly, because no partial plates were entered. Deciding what partial plates to enter will depend on your local plate formats and the common mistakes you see the LPR making. For example, I often see it confuse W/M/H, and J/I.

The msg.result coming out of the function node starts with the plate that was sent by MQTT and then lists the hits.
So with the above example entries in the function, if it reads "1234", the result is:
Plate 1234 | Partial match for ABC1234 - description text 1, Partial match for DEF1234 - description text 2
and if it reads "ABC1234"
Plate ABC1234 | Exact Match - description text 1, Partial match for DEF1234 - description text 2
and if it reads "GHI5678"
Plate GHI5678 | Exact Match - description text 3
and if it reads "GH15678" (there is a number 1 after the H) there will be no message sent by the function node, and there were no matches.

In my example flow, I take the output from the function and send it through a switch that checks whether the result contains the word 'Exact'. I then format my notification message differently depending on that switch (with a higher-priority message for exact hits) and send it to my notification server. You could take the msg.result from the function node directly to whatever notification method you are using.
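For reference, the matching logic described above boils down to something like this. It's only a sketch using the example entries; the real function is in the attached flow:

Code:
// Sketch of the matching logic described above, using the example entries.
var alert_array = [];
alert_array.push(['ABC1234', 'description text 1', ['BC123', '1234', 'ABC123']]);
alert_array.push(['DEF1234', 'description text 2', ['DEF', '1234']]);
alert_array.push(['GHI5678', 'description text 3']);

var tag = msg.payload.plate;
var hits = [];

for (const [full, description, partials] of alert_array) {
    if (tag === full) {
        hits.push('Exact Match - ' + description);
    } else if ((partials || []).some(p => tag.includes(p))) {
        hits.push('Partial match for ' + full + ' - ' + description);
    }
}

if (hits.length === 0) return null;   // no matches: send nothing
msg.result = 'Plate ' + tag + ' | ' + hits.join(', ');
return msg;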
 
