OpenALPR Webhook Processor for IP Cameras

I can't find that model number on Dahua's website. If the camera was made within the past few years it will probably work though.
 
v3.7.2 is released. The main plate list now uses lazy loading instead of pulling all of the plates at once, which improves performance when looking at a long list. I added a small spinner to the paginator so users can tell when plates are being loaded instead of guessing.

1623645754949.png
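
The gist of the lazy loading: the UI only asks the server for the page it is about to show and flips a loading flag that the spinner watches. A rough TypeScript sketch of the idea (the /api/plates endpoint and field names here are placeholders, not the actual code):

```typescript
// Hypothetical plate record and paged-response shapes; field names are assumptions.
interface PlateCapture {
  plateNumber: string;
  capturedAtUtc: string;
}

interface PagedResult {
  plates: PlateCapture[];
  totalCount: number;
}

let isLoading = false; // drives the spinner next to the paginator

// Fetch only the requested page instead of the whole plate list.
async function loadPlatePage(pageNumber: number, pageSize = 25): Promise<PagedResult> {
  isLoading = true; // show the spinner
  try {
    const response = await fetch(
      `/api/plates?pageNumber=${pageNumber}&pageSize=${pageSize}`
    );
    if (!response.ok) {
      throw new Error(`Plate request failed: ${response.status}`);
    }
    return (await response.json()) as PagedResult;
  } finally {
    isLoading = false; // hide the spinner whether the request succeeded or failed
  }
}
```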

The settings screen works better on mobile now
1623645961417.png
 
Hey, that’s great! I noticed there was a delay when loading 100 at a time. I’ll download this ASAP! And I see possible plate matches! Very nice.

Edit: Can confirm everything works great. Still on 2.8.101 since Rekor hasn't pushed the Ubuntu update yet.
 
Is the service still dependent on the webhook or are you scraping/parsing the debug output from the local agent/server?
 
It’s good to know the debug output has the data, though, in case they ever shut down the webhook feature.
 
I am working on the agent scraping logic; there were a lot of edge cases I hadn't thought of. I think I have them all figured out now, but I'll give it a few more days of testing.

I am also playing around with parsing the region field out of the webhook to show it in the UI, and eventually to search by it.
1623862615648.png
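
If this pans out, the handling would be something like the sketch below. I'm assuming the payload carries the region as a lowercase country-state string (e.g. "us-ca"); the field names are placeholders rather than the exact payload shape:

```typescript
// Assumed subset of the webhook payload; the real payload has many more fields.
interface WebhookGroup {
  best_plate_number: string;
  best_region?: string; // e.g. "us-ca"; name and format assumed here
}

interface ParsedRegion {
  countryCode: string; // "US"
  stateCode: string;   // "CA"
}

// Split a "country-state" region string into its two parts.
function parseRegion(group: WebhookGroup): ParsedRegion | null {
  if (!group.best_region || !group.best_region.includes('-')) {
    return null; // region identification disabled or value malformed
  }
  const [countryCode, stateCode] = group.best_region.split('-', 2);
  return {
    countryCode: countryCode.toUpperCase(),
    stateCode: stateCode.toUpperCase(),
  };
}

// Example: parseRegion({ best_plate_number: "ABC1234", best_region: "us-ca" })
// -> { countryCode: "US", stateCode: "CA" }
```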
 
Sweet! It will be awesome not to be dependent on the webhook. That will make installs much simpler for those with limited network experience, and we won't have to worry about Rekor deciding to abandon the webhook functionality.

I haven't checked the documentation on "region". Is that the state the license plate is registered in? If so, that's a big deal, since searching by state would be great!
 
I only have US plates to play with, but it sends a country code and a state code. Accuracy seems to depend on how well defined the plate is.
 
Just noticed you added some logging information and a webhook forward option to the settings page.

Adding a second user doesn't seem to work. After I try to add one, I get an "OK" in a red bar at the top, but I can't log in with the secondary user. I have to log in with the first user, and the user page doesn't even show the second user at all. So it appears the service is currently limited to one user.
 
v3.8.0 is released.
  • Fixed the issue with trying to create multiple users.
  • I have also laid the groundwork for agent scraping on the agent settings page. I highly recommend creating a copy of your processor.db before experimenting with this. The first time it runs, it will start from the earliest record the agent has and work forward, checking for missing captures. If you notice irregularities, let me know. The first scrape will take a while; you can watch the log page for progress, and you do not need to keep your browser open while the scrape happens. This doesn't run automatically, you need to press the scrape button manually. I'll make it a schedulable setting in the future.
    • 1624420215050.png
  • I added searching by make/model/color, etc. It's a work in progress right now; for instance, you can select Mazda as the make and Outback as the model and no results will return. The dropdowns will not show available options from past captures until the agent scrape is run for the first time.
    • 1624420917035.png
  • I added a small pulsing light near the paginator to show the connection status to the server. If it's flashing green, you should be getting live updates as plates are captured.
  • I made the captured make/model only record if the agent's confidence is over 50%. This keeps incorrect vehicle types from showing up at night, when there is no way the classification will be accurate (see the sketch at the end of this post).

I am currently working on v3.9.0, which will show some metrics on the plate details page:
  • 1624420721654.png
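
For anyone curious, the make/model gate is conceptually just this. A simplified sketch, not the actual implementation; the property names are made up:

```typescript
// Assumes the agent reports a make/model guess plus a 0-100 confidence score.
interface VehicleAttributes {
  makeModel?: string;
  color?: string;
  makeModelConfidence: number; // percent, as reported by the agent
}

const MIN_VEHICLE_CONFIDENCE = 50;

// Drop the make/model guess when the agent isn't confident enough
// (typically night captures, where classification is basically a coin flip).
function filterVehicleAttributes(attrs: VehicleAttributes): VehicleAttributes {
  if (attrs.makeModelConfidence > MIN_VEHICLE_CONFIDENCE) {
    return attrs;
  }
  return { makeModelConfidence: attrs.makeModelConfidence };
}
```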
 
Wow! Huge update!

So once one hits Start Scrape and it builds, does it stop listening for the webhook at that point?
 
A couple of thoughts:
  • Scraping is currently in progress. Looking at the logs, it is scraping about 4 plates a second, and I have 72k plates to scrape, so I have about 5 hours to go at this pace. If I ever have to do this in bulk again, I'll increase the core count on the VM to speed it up. No issues that I'm aware of with the process, and I backed up the .db before performing the scrape.
  • Is the 50% confidence threshold set too low? I'm getting lots of empty vehicle descriptions now during the day. What confidence does the service normally report at night?
  • Is it my imagination or is the plate crop larger?
  • If the service still relies on the webhook, is the scraping just for advanced metrics at this point? Or is it more of a proof of concept so that you can move off the webhook in future builds?
My logs look like this during the scrape:
Screenshot from 2021-06-23 07-50-23.png

I assume that means everything is running smoothly?
 
The webhook is still required for real-time updates for now; scraping is manual, triggered by pressing that button.
Scraping a second time will only pull in new plates captured since the last time you scraped.
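
Conceptually the scrape just picks a starting epoch and walks forward from there. Something like this sketch, where the types are illustrative only, not the real storage layer:

```typescript
// Illustrative type; the real capture record has more fields.
interface StoredCapture {
  plateNumber: string;
  epochTimeMs: number; // timestamp reported by the agent
}

// First run: nothing stored locally, so start from 0, i.e. the agent's earliest record.
// Later runs: resume just after the newest capture already in the local database,
// so only plates captured since the last scrape are pulled in.
function getScrapeStartEpoch(storedCaptures: StoredCapture[]): number {
  if (storedCaptures.length === 0) {
    return 0;
  }
  return Math.max(...storedCaptures.map(c => c.epochTimeMs)) + 1;
}
```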
 
Well, I'm 6 hours in and it's still scraping. No errors, though, and everything is still running well.

I am missing about 25% of the vehicle descriptions now since the move to the 50% confidence level. These are all daytime captures in the sun. Here is one example of a white Toyota it missed:

Screenshot from 2021-06-23 13-32-51.png

Would it be possible to let the user set that confidence level in the settings menu?

Adding another user works wonderfully now! How hard is it to hide the "Settings" menu from any additional users created? That way a user couldn't mess with any of the configuration; only the first (admin) user would have access to that area.
 
Ya, I can make it configurable. Not sure why scraping is running so slowly on your machine...
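
Making it configurable would basically mean reading the cutoff from settings instead of hard-coding 50. A rough sketch; the settings field name here is made up:

```typescript
// Sketch of a settings-driven cutoff for recording make/model.
interface ProcessorSettings {
  vehicleConfidenceThreshold?: number; // percent, editable from the settings page
}

const DEFAULT_VEHICLE_CONFIDENCE = 50; // matches the current hard-coded behavior

function getVehicleConfidenceThreshold(settings: ProcessorSettings): number {
  const value = settings.vehicleConfidenceThreshold;
  // Fall back to the default when the setting is missing or out of range.
  if (value === undefined || value < 0 || value > 100) {
    return DEFAULT_VEHICLE_CONFIDENCE;
  }
  return value;
}
```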

Need some ideas for what information to show on the details page:

1624476620214.png