Full ALPR Database System for Blue Iris!

algertc

Hello everyone

I created this project as an alternative to the super expensive options from PlateMinder and Rekor. This still depends on your own CodeProject.AI or DeepStack instance, but it offers a nice all-in-one solution to actually use and make sense of the data, which is half the point of having the AI read the plates to begin with. It has been working great for me so far and has been a really huge upgrade, so I wanted to share it.

I know there was a Node-RED app created a while ago that had some of this functionality. I took some inspiration from that and tried to take it to the next level.

Would love to hear if anyone tries it out.

Project link
 
Great work, I am up and running. Two issues I see: the Time Distribution is off by about +4 hours (the timestamps are correct), and if there are two plates in the image it only shows one plate.

Also, when building the Docker container I get the following for the app service: "The requested image's platform (linux/arm64/v8) does not match the detected host platform (linux/amd64/v3) and no specific platform was requested".


 
looks amazing!

Attempting to get this running in a Debian container in Proxmox, but I get the following:

Code:
The requested image's platform (linux/arm64/v8) does not match the detected host platform (linux/amd64/v3) and no specific platform was requested

EDIT: as above!
 
I had built the image on my ARM mac originally. I just pushed a cross-platform image that will work with amd64 also.

I believe I fixed the time distribution, but I'm gonna have to think about the multiple plates in one image situation. That hardly ever happens where I am. @MikeLud1 Could you send what your memo looks like with multiple plates in it? Or, more importantly, what the plate macro is going to send. I can try to work up a solution with that.
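
For anyone building the image themselves rather than pulling mine, the cross-platform build is roughly this (a sketch with docker buildx; the registry tag is a placeholder, use your own):

Code:
# one-time setup: create and select a builder that can target multiple platforms
docker buildx create --use
# from the repo root, build for amd64 and arm64 and push to your own registry tag
docker buildx build --platform linux/amd64,linux/arm64 -t your-registry/alprdb:latest --push .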
 
I had built the image on my ARM mac originally. I just pushed a cross-platform image that will work with amd64 also.

I believe I fixed the time distribution, but I'm gonna have to think about the multiple plates in one image situation. That hardly ever happens where I am. @MikeLud1 Could you send what your memo looks like with multiple plates in it? Or, more importantly, what the plate macro is going to send. I can try to work up a solution with that.
@algertc Thanks for updating the image; the amd64 issue is fixed. The time distribution is still not fixed. I notice the &ALERT_TIME macro is sending the current time in ISO 8601 format, which is +5 hours off.

The &PLATE macro only sends one plate even if there are two or more plates.
{ "plate_number":"KVJ5990", "Image":"&ALERT_JPEG", "timestamp":"2024-11-17T00:20:32.654Z" }

The &MEMO macro does send all the detected plates with their confidence levels.
{ "plate_number":"LAN6616:97%,KVJ5990:95%", "Image":"&ALERT_JPEG", "timestamp":"2024-11-17T00:28:43.396Z" }
 
This is going to make me install CPAI again. I’ve been cold turkey with it for years as I just didn’t find it all that reliable.
 
@algertc I fixed the time distribution by using the macro below for the timestamp, but it breaks the Live ALPR Feed; the images are not being logged.

{ "plate_number":"&PLATE", "Image":"&ALERT_JPEG", "timestamp":"%Y-%m-%dT%H:%M:%S.%tZ" }
 
Is it too hard for a caveman like me?
Yeah, I could probably freeze up the whole system just looking at the project link.
 
This is going to make me install CPAI again. I’ve been cold turkey with it for years as I just didn’t find it all that reliable.
I just switched it to run on a separate Linux machine last week and it has been 100x more reliable. The new YOLOv8 has also been released and is working really well.
 
@algertc I fixed the time distribution by using the macro below for the timestamp, but it breaks the Live ALPR Feed; the images are not being logged.

{ "plate_number":"&PLATE", "Image":"&ALERT_JPEG", "timestamp":"%Y-%m-%dT%H:%M:%S.%tZ" }
Yeah, it was a timezone issue with the database. I could not get it to query and build the hour mappings with a custom timezone. Went for a workaround and am now creating the hour mapping on the server instead. It might be a tiny bit slower, but it's unlikely anyone will notice unless they have really high traffic. In that case, just make sure the docker machine has enough compute and it will be fine.
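
For anyone curious, the server-side hour mapping is conceptually nothing more than bucketing the UTC timestamps into local hours. A rough shell sketch of the idea (timestamps.txt and the timezone are placeholders; the app does this internally):

Code:
# count plate reads per local hour from a file of ISO-8601 UTC timestamps
# (one per line, e.g. 2024-11-17T00:20:32Z); requires GNU date
while read -r ts; do
  TZ="America/Los_Angeles" date -d "$ts" +%H
done < timestamps.txt | sort | uniq -c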

Pushed the fix. Seems to be working with my data now. You will want to stick with using the original time macro in the payload.

There's also a time histogram for individual plates if you click the plate number in the plate database page.

Switching to use the memo in order to handle multiple plates in one image is going to require some significant changes. I'll switch it over soon.
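
The parsing side of the memo is simple enough; it's basically splitting on commas and colons. A minimal bash sketch using the format Mike posted above:

Code:
# example memo value in the PLATE:NN% comma-separated format shown earlier in the thread
memo="LAN6616:97%,KVJ5990:95%"
IFS=',' read -ra entries <<< "$memo"
for entry in "${entries[@]}"; do
  plate="${entry%%:*}"        # text before the first colon
  confidence="${entry#*:}"    # text after the first colon, e.g. 97%
  echo "plate=$plate confidence=$confidence"
done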
 
Is it too hard for a caveman like me?
Yeah, I could probably freeze up the whole system just looking at the project link.
Not at all! Actually super easy. Just run the commands in the quick start and it will create everything for you. Only takes a few minutes - much easier than it looks :)

I don't see any way you could freeze or break your system with it either. Definitely give it a go if you have a license plate camera.
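
If it helps, the quick start is really just this (a sketch; use the actual repo URL from the project link above and check the readme for any settings):

Code:
# Docker and docker compose need to be installed first
git clone <project repo URL from the first post>
cd <repo folder>
docker compose up -d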
 
@algertc here are some enhancement ideas
  • From the Dashboard, if you click on a plate's number of reads in Top 5 Plates (24h), it opens a new Live ALPR Feed tab filtered to that plate.
  • From the Dashboard, if you click on a plate in Top 5 Plates (24h), it opens the time histogram for that plate in a new tab.
  • Add the camera name to the payload and have the ability to filter by camera name
    • { "camera_name":"&NAME", "plate_number":"&PLATE", "Image":"&ALERT_JPEG", "timestamp":"&ALERT_TIME" }
  • From the Time Distribution, if you click on the bar for an hour, it displays a histogram of the days of the week at that time.
 
Just pushed quite a few changes. I also added some extra volumes to the docker-compose file, which will keep your settings and password configuration persistent through updates. If you download or pull the latest changes in the repository and then run docker-compose up -d, you should have all the new additions and keep your settings from now on.

Changes:
  • You can now optionally use memo instead of plate_number. This will allow for multiple plates in one alert and will individually add them all to the live feed and your database if applicable. The readme shows how the new payload should look if you would like to use the memo.
  • Fuzzy search. This is a huge one. The live feed tab now has a toggle for fuzzy search, which will also match to records that the OCR may have gotten slightly wrong.
  • Fixed settings and password change.
  • More configuration for push notifications + test notification button.
  • Clicking the number of reads in the "top plates" card of the dashboard will bring you to the live feed filtered for that plate. Added the same in the plate insights "recent reads" section.
  • Other bug fixes.
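
For clarity, updating an existing install amounts to something like this (a sketch, assuming you set the stack up from the repo's docker-compose.yml):

Code:
# run from the directory containing docker-compose.yml
git pull              # grab the latest compose file, schema, and other changes
docker compose pull   # pull the updated image
docker compose up -d  # recreate the containers; the new volumes keep your settings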
 
Just pushed quite a few changes. I also added some extra volumes to the docker-compose file, which will keep your settings and password configuration persistent through updates. If you download or pull the latest changes in the repository and then run docker-compose up -d, you should have all the new additions and keep your settings from now on.

Changes:
  • You can now optionally use memo instead of plate_number. This will allow for multiple plates in one alert and will individually add them all to the live feed and your database if applicable. The readme shows how the new payload should look if you would like to use the memo.
  • Fuzzy search. This is a huge one. The live feed tab now has a toggle for fuzzy search, which will also match to records that the OCR may have gotten slightly wrong.
  • Fixed settings and password change.
  • More configuration for push notifications + test notification button.
  • Clicking the number of reads in the "top plates" card of the dashboard will bring you to the live feed filtered for that plate. Added the same in the plate insights "recent reads" section.
  • Other bug fixes.
Just pulled the new update, and when running docker compose I am getting the following errors:

Code:
[+] Running 2/3
 ✔ Network alprdb_default  Created                                                                                                           0.1s 
 ✔ Container alprdb-db-1   Started                                                                                                           0.7s 
 ⠹ Container alprdb-app-1  Starting                                                                                                          0.4s 
Error response from daemon: error while mounting volume '/var/lib/docker/volumes/alprdb_app-auth/_data': failed to mount local volume: mount /root/alprdb/auth:/var/lib/docker/volumes/alprdb_app-auth/_data, flags: 0x1000: no such file or directory
 
Just pulled the new update, and when running docker compose I am getting the following errors:

Code:
[+] Running 2/3
 ✔ Network alprdb_default  Created                                                                                                           0.1s
 ✔ Container alprdb-db-1   Started                                                                                                           0.7s
 ⠹ Container alprdb-app-1  Starting                                                                                                          0.4s
Error response from daemon: error while mounting volume '/var/lib/docker/volumes/alprdb_app-auth/_data': failed to mount local volume: mount /root/alprdb/auth:/var/lib/docker/volumes/alprdb_app-auth/_data, flags: 0x1000: no such file or directory
Did you update the docker-compose.yml and schema.sql files? There were changes made to both.
 
Did you update the docker-compose.yml and schema.sql files? There were changes made to both.
Yes, I've pulled the latest schema and updated my docker-compose. I have just tried in a fresh directory to see if there was some Docker caching going on, but I still get the same error as above.

It does create the volumes in docker:

Code:
root@alprdb:~/alprdb2# docker volume ls
DRIVER    VOLUME NAME
local     alprdb2_app-auth
local     alprdb2_app-config
local     alprdb2_db-data
root@alprdb:~/alprdb2#
 
Yes, I've pulled the latest schema and updated my docker-compose. I have just tried in a fresh directory to see if there was some Docker caching going on, but I still get the same error as above.
When I updated, I deleted all containers, images, and volumes, and it worked fine.
 
Just pulled the new update, and when running docker compose I am getting the following errors:

Code:
[+] Running 2/3
 ✔ Network alprdb_default  Created                                                                                                           0.1s
 ✔ Container alprdb-db-1   Started                                                                                                           0.7s
 ⠹ Container alprdb-app-1  Starting                                                                                                          0.4s
Error response from daemon: error while mounting volume '/var/lib/docker/volumes/alprdb_app-auth/_data': failed to mount local volume: mount /root/alprdb/auth:/var/lib/docker/volumes/alprdb_app-auth/_data, flags: 0x1000: no such file or directory
This is saying that the folders auth and config don't exist in your directory. I'm guessing Mike cloned the repo which is why they exist for him. Just create them and it should work. I'll update the readme to include that step.
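
In other words, from the directory containing your docker-compose.yml (a sketch, assuming the compose file bind-mounts ./auth and ./config as the error suggests):

Code:
# create the bind-mount source folders the compose file expects, then start the stack again
mkdir -p auth config
docker compose up -d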