An issue, I think: when you click a car picture in the webhook processor and it enlarges, right-clicking and choosing "Save Image As" now saves it with a .jfif extension instead of a .jpg. To attach a picture in this forum you have to manually change the extension back to .jpg, as it won't accept a .jfif.
For the next version, would it be possible to give us an option to purge plate images from the local DB, or an option to pull/not pull the images from the agent when doing a scrape? There may be instances where we don't necessarily want to keep images locally in the webhook processor, or need to remove them to free up space.
I was wondering if there was a way to have a separate DB or a separate folder to store the images?
I could mount an SMB share into the VM I'm running Docker on and pass it to the container. Then my NAS could store the images, while the rest of the DB stays on fast local storage. I don't mind storing 1TB of plates, but I'm not sure I want it taking up 1TB of NVMe SSD.
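For anyone wanting to try this, a compose snippet along these lines should work, assuming the SMB share is already mounted on the VM (e.g. via a cifs entry in /etc/fstab). The service name, image, and in-container paths below are placeholders, not the processor's actual values; check its docs for where it expects the DB and images to live.

```yaml
# docker-compose.yml (illustrative; names and paths are placeholders)
services:
  webhook-processor:
    image: example/webhook-processor:latest  # placeholder image name
    volumes:
      - ./data:/app/db                # DB stays on fast local NVMe
      - /mnt/nas-plates:/app/images   # SMB share mounted on the VM host
```

This only works if the processor lets you point its image directory somewhere separate from the DB; if images are stored as blobs inside the DB file itself, a bind mount can't split them out.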
I'm new to Unraid and I'm not clear on how to add custom networks to it; it sticks to the default... Any ideas on how to run this properly? I'm also having trouble registering the OpenALPR agent...
I've been following this thread since almost the beginning of it, and started working on my own platform which has a local DB... Now I want to integrate the Webhook Processor... It's been a while.
I have another suggestion for future ideas. I'm using the Pushover integration, and it works great, much better than the built-in Rekor Scout alerts. But the notifications don't include the description entered for the alert.
I have 25 plates set to alert, each with a different reason. This guy came around while I was mowing the lawn, and I had no idea if it was just some guy who threw some trash or an actual criminal.
There is no limit for the service. If you keep scraping, it will keep growing. It would be nice to set an upper limit, or at least be able to purge images older than X days.
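Until a built-in option exists, a retention purge could be sketched against the processor's SQLite DB like this. Note that the table and column names (`plate_images`, `captured_at`) are assumptions for illustration; the webhook processor's actual schema may differ, so inspect the DB first.

```python
import sqlite3


def purge_old_images(db_path: str, max_age_days: int) -> int:
    """Delete stored plate images older than max_age_days.

    NOTE: the table/column names (plate_images, captured_at) are
    hypothetical; adjust them to the processor's real schema.
    Returns the number of rows deleted.
    """
    conn = sqlite3.connect(db_path)
    try:
        cur = conn.execute(
            "DELETE FROM plate_images "
            "WHERE captured_at < datetime('now', ?)",
            (f"-{int(max_age_days)} days",),
        )
        conn.commit()
        deleted = cur.rowcount
    finally:
        conn.close()
    return deleted
```

Keep in mind SQLite does not shrink the file after deletes; running `VACUUM` afterwards (with the processor stopped) is what actually reclaims the disk space.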
Okay cool, for now that's fine I think. 2 years is around 150GB, so it will be a good few years before I get annoyed at how much storage it's taking.
Is there any way to ramp up the speed? So far my processor.db has grown by around 100MB in the last hour or so. Not sure where the limitation is.