OpenALPR Webhook Processor for IP Cameras

It's for debugging plates that fail to process; it doesn't do anything with scraping.
Alright thanks for that.

An issue, I think: when you click a car picture in the webhook processor and it enlarges, right-clicking and choosing "Save Image As" now saves it with a .jfif extension instead of a .jpg. To attach a picture on this forum you have to manually change the extension back to .jpg, since the forum won't accept a .jfif.
 
In the meantime, you can always right-click the image and click Copy, then right-click in the forum post box and click Paste.
 
Heck I've always attached it as a file. I didn't even know you could do it your way!

Man you've been missing out!
 
For the next version, would it be possible to give us an option to purge plate images from the local DB, or an option to either pull or not pull the images from the agent when doing a scrape? There may be instances where we don't necessarily want to keep images locally in the webhook processor, or need to remove them to free up space.
 
I was wondering if there is a way to have a separate DB, or a separate folder, to store the images?

I could mount an SMB share into the VM I'm running Docker on and pass it to the container. Then my NAS could store the images while the rest of the DB stays on fast local storage. I don't mind storing 1 TB of plates, but I'm not sure I want them taking up 1 TB of NVMe SSD.
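If the container's data directory can be bind-mounted, splitting storage like that might look roughly like this. The `/app/config` path and the `images` subfolder are assumptions for illustration, not the container's documented layout; check the project's README for the real data paths first:

```shell
# Sketch only -- paths inside the container are assumed, not confirmed.
# /opt/webhookprocessor: fast local NVMe on the VM, holds the SQLite DB.
# /mnt/nas/plates: the SMB share from the NAS, already mounted on the VM,
# overlaid on just the (hypothetical) image subfolder.
docker run -d --name WebhookProcessor \
  -v /opt/webhookprocessor:/app/config \
  -v /mnt/nas/plates:/app/config/images \
  mlapaglia/openalprwebhookprocessor
```

The nice part of overlaying a mount on a subfolder is that the database writes stay on low-latency local storage while the bulky image files land on the NAS.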
 
Hello, I'm having trouble pulling the image from the Unraid command line UI.


Code:
root@localhost:# /usr/local/emhttp/plugins/dynamix.docker.manager/scripts/docker run -d --name='WebhookProcessor' --net='br0' --ip='192.168.1.122' -e TZ="America/Los_Angeles" -e HOST_OS="Unraid" 'mlapaglia/openalprwebhookprocessor'

Unable to find image 'mlapaglia/openalprwebhookprocessor:latest' locally
docker: Error response from daemon: Get https://registry-1.docker.io/v2/: dial tcp: lookup registry-1.docker.io on [::1]:53: read udp [::1]:59665->[::1]:53: read: connection refused.
See 'docker run --help'.
 
Does that machine have internet access? Can you ping yahoo.com for example? Have you tried to restart/reboot the machine to see if that fixes it?
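The error message shows Docker trying to resolve `registry-1.docker.io` against `[::1]:53`, i.e. a DNS server on localhost that isn't running, which usually means no resolver is configured on the host. A few generic checks you could run on the Unraid console (standard tools, nothing specific to this container):

```shell
# What resolver is the host actually configured with?
# On Unraid this comes from Settings > Network Settings.
cat /etc/resolv.conf

# Can the host resolve the Docker registry at all?
nslookup registry-1.docker.io

# Quick sanity check: try a known public resolver explicitly.
# If this works but the plain lookup fails, the host's DNS setting is the problem.
nslookup registry-1.docker.io 1.1.1.1
```

If the host's DNS is broken, fixing it in the Unraid network settings (or setting a static DNS server there) should let the `docker pull` go through.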
 
I'm new to Unraid and I'm not clear on how to add network profiles to it; it sticks to the default. Any ideas on how to run this properly? I'm also having trouble registering the OpenALPR agent.

I've been following this thread since almost the beginning, and started working on my own platform with a local DB. Now I want to integrate the Webhook Processor. It's been a while!

Thanks for all your hard work!
 
I have another suggestion for future versions. I'm using the Pushover integration and it works great, much better than the built-in Rekor Scout alerts, but the notifications don't include the description entered for the alert.

I have 25 plates set to alert, each with a different reason. This guy came around while I was mowing the lawn, and I had no idea whether it was just someone who had dumped some trash or an actual criminal.

 
Fixed in: woops! use correct description by mlapaglia · Pull Request #99 · mlapaglia/OpenAlprWebhookProcessor
 
Awesome! Thanks!!!!
 
I have another question, should the notes populate on every instance of the plate?

I added notes for a few cars, e.g. that they had been around before, but each note only seems to show on that one instance of the car going past.
 
Alright, I've finally found some free time and deployed a new VM with much more space, just for this container.

Is there anything special I need to do to get the images to start saving locally?
 
I've been checking the size of the processor.db and it is rising slowly, so it must be working

Is there a size limit? I think my Rekor Scout install is set to 250GB
 
There is no limit in the service; if you keep scraping, it will keep growing. It would be nice to be able to set an upper limit, or at least to purge images more than X days old.
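Until a built-in retention option exists, a purge like that could in principle be done directly against the SQLite database. The table and column names below (`PlateImages`, `ReceivedOn`) are made up for the demo, so the sketch builds its own throwaway database; inspect the real schema with `sqlite3 processor.db ".schema"` before pointing any `DELETE` at the actual file:

```shell
# Demo: age-based purge of rows from an SQLite DB, using a throwaway file
# and a hypothetical schema (NOT the webhook processor's real one).
db=demo.db
sqlite3 "$db" "CREATE TABLE PlateImages (Id INTEGER PRIMARY KEY, ReceivedOn TEXT);"
sqlite3 "$db" "INSERT INTO PlateImages (ReceivedOn) VALUES
  (datetime('now','-40 days')),
  (datetime('now','-1 day'));"
# The purge itself: drop anything older than 30 days, then reclaim disk space.
sqlite3 "$db" "DELETE FROM PlateImages WHERE ReceivedOn < datetime('now','-30 days');"
sqlite3 "$db" "VACUUM;"
sqlite3 "$db" "SELECT COUNT(*) FROM PlateImages;"   # only the recent row remains
```

Note that SQLite doesn't return freed pages to the filesystem on `DELETE` alone, which is why the `VACUUM` step matters if the goal is actually shrinking processor.db. Stop the container (or at least the writer) before vacuuming a live database.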
 
Okay cool, for now that's fine I think. Two years is around 150 GB, so it will be a good few years before I get annoyed at how much storage it's taking.

Is there any way to ramp up the speed? So far processor.db has grown by around 100 MB in the last hour or so, and I'm not sure where the bottleneck is.
 
I'll just let it do its thing; hopefully it gets there one day!