Blue Iris and CodeProject.AI ALPR

@wpiman Yeah, with OCR an S may look like the number 5, or vice versa, and there are other confusable pairs as well.

I do home automations using Home Assistant. When I get a plate # from BI5 via MQTT, I use regular expressions to match it against several known plates.

Example code:
Code:
variables:
  vehicle: >-
    {% if trigger.payload_json['plate'] | regex_search('(12S 765|12S 76S|125 765)', ignorecase=True) %}wpiman_car
    {% elif trigger.payload_json['plate'] | regex_search('(9ERR27)', ignorecase=True) %}sarah_car
    {% elif trigger.payload_json['plate'] | regex_search('(9HFY69)', ignorecase=True) %}justin_car
    {% else %}unknown_car{% endif %}

Then I have automations trigger actions based on the vehicle value (like wpiman_car) rather than on the raw plate #s.

Actually, I just did something similar to what you did, but I think I made it easier...

12[5S] ?76[5S]

where the sets like [5S] are characters it can confuse (S's and 5's), and the " ?" means zero or one space. I did the same for [B8], [71], etc.

I'll let you know how it goes...
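The character-class trick above can be generalized into a small helper that turns any known plate into a fuzzy pattern. A minimal Python sketch; the confusable groupings below are my own illustrative picks, not an exhaustive list:

```python
import re

# Character sets OCR commonly confuses; each collapses to one regex class.
# These groupings are illustrative assumptions, extend them to taste.
CONFUSABLE_SETS = ["5S", "B8", "71", "0OQD"]

def fuzzy_plate_pattern(plate: str) -> str:
    """Turn a known plate into a regex that tolerates OCR confusions
    and an optional space between character groups."""
    parts = []
    for ch in plate:
        if ch == " ":
            parts.append(" ?")  # the space may or may not be read
            continue
        for group in CONFUSABLE_SETS:
            if ch.upper() in group:
                parts.append(f"[{group}]")
                break
        else:
            parts.append(re.escape(ch))
    return "".join(parts)

print(fuzzy_plate_pattern("12S 765"))  # [71]2[5S] ?[71]6[5S]
```

One pattern per known plate then replaces a hand-maintained list of misread variants.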
 
@MikeLud1 any idea why the newer versions of CPAI & the ALPR module might no longer allow LPR to use GPU? Last few versions on my machine no longer sees the GPU for LPR, which has killed the processing/alert speed. The settings gear wheel also no longer appears for this module.

Module 'License Plate Reader' 2.9.0 (ID: ALPR)
Valid: True
Module Path: <root>\modules\ALPR
AutoStart: True
Queue: alpr_queue
Runtime: python3.7
Runtime Loc: Local
FilePath: ALPR_adapter.py
Pre installed: False
Start pause: 3 sec
LogVerbosity:
Platforms: all
GPU Libraries: installed if available
GPU Enabled: enabled
Parallelism: 0
Accelerator:
Half Precis.: enable
Environment Variables
AUTO_PLATE_ROTATE = True
MIN_COMPUTE_CAPABILITY = 7
MIN_CUDNN_VERSION = 7
OCR_OPTIMAL_CHARACTER_HEIGHT = 60
OCR_OPTIMAL_CHARACTER_WIDTH = 36
OCR_OPTIMIZATION = True
PLATE_CONFIDENCE = 0.7
PLATE_RESCALE_FACTOR = 2
PLATE_ROTATE_DEG = 0
Status Data:
Started: 03 Feb 2024 8:42:05 PM Eastern Standard Time
LastSeen: 03 Feb 2024 8:45:19 PM Eastern Standard Time
Status: Started
Requests: 0 (includes status calls)
Provider:
CanUseGPU: False
HardwareType: CPU
 
What GPU do you have? Chris from CodeProject.AI added a minimum compute capability requirement because some older GPUs had issues using CUDA, so if your GPU is not in the below list, that is why it is not working. Do you know which ALPR version worked, so I can check with Chris whether we can add support for some of the older GPUs?

(images: tables of supported GPUs and their minimum compute capability)
 

Well, that makes sense. I have a small, older GPU, which was all I could fit in the SFF PC: an NVIDIA GeForce GT 1030. It does the job and runs sub-50 ms inference times on ~10 cameras, so I can't really complain.

I'm not sure exactly which ALPR module version worked, but it's been a while. I think we were back around 1.5.
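For reference, the GT 1030 is a Pascal (GP108) card with compute capability 6.1, which is why it falls under the MIN_COMPUTE_CAPABILITY = 7 shown in the module status earlier in the thread. A minimal sketch of that gate in Python (my own wording of the check, not the module's actual code):

```python
def meets_minimum(major: int, minor: int, min_cc: float) -> bool:
    """Compare a CUDA compute capability (major, minor) against a minimum.
    E.g. capability 6.1 is expressed as major=6, minor=1."""
    return major + minor / 10.0 >= min_cc

# GT 1030 (6.1) under the two thresholds seen in this thread:
print(meets_minimum(6, 1, 7.0))  # False: blocked when the minimum is 7
print(meets_minimum(6, 1, 6.0))  # True: allowed when the minimum is 6
```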
 
What CUDA version do you have installed? You can check by opening a command prompt and running nvcc --version

(image: sample nvcc --version output)
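Both checks can also be scripted. A rough sketch; note that the `compute_cap` query field requires a reasonably recent NVIDIA driver, and the nvcc output parsing assumes the usual "release X.Y" line:

```python
import shutil
import subprocess
from typing import Optional

def cuda_toolkit_version() -> Optional[str]:
    """Parse the release number out of `nvcc --version`, if nvcc is on PATH."""
    if shutil.which("nvcc") is None:
        return None
    out = subprocess.run(["nvcc", "--version"],
                         capture_output=True, text=True).stdout
    # nvcc prints a line like: "Cuda compilation tools, release 11.8, V11.8.89"
    for line in out.splitlines():
        if "release" in line:
            return line.split("release", 1)[1].split(",")[0].strip()
    return None

def gpu_compute_capability() -> Optional[str]:
    """Ask nvidia-smi for the card's compute capability (newer drivers only)."""
    if shutil.which("nvidia-smi") is None:
        return None
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=compute_cap", "--format=csv,noheader"],
        capture_output=True, text=True)
    out = result.stdout.strip()
    return out.splitlines()[0] if out else None
```

Either function returns None when the tool is missing, so it degrades cleanly on CPU-only boxes.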
 
Send BI support an email about the issue you are having; most likely BI is not pulling the custom models list. Stopping and restarting CP.AI might fix it.
View attachment 152087
Still cannot get USE CUSTOM MODELS to work. It's greyed out despite numerous reboots and trying some of the temporary setting changes I have read about here. There was one comment that said to "run repair using the CPAI 2.0.7 installer..." among other things, but I don't understand how to get to this "repair" in order to run it. Without the CUSTOM MODELS available I'm stuck. I'm hoping to get AI working on my new LPR cam. Oh, and I'm on the very latest BI version as of today, and the newest CodeProject as well.
 

Mine is greyed out too, and every screenshot @MikeLud1 posts, like the one in the first post of this thread, shows it greyed out.

What it should show you, though, are the models you have in the models folder. I removed all the models I don't use, so mine only shows ipcam-general and the plate model.

(image: custom models list showing ipcam-general and the plate model)
 
Blue Iris uses the below API to pull the custom model list
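The call can be reproduced by hand to check what the server is returning. A sketch, assuming CodeProject.AI Server's default port 32168 and its custom model list route `/v1/vision/custom/list`; whether this exact route and response shape are what Blue Iris uses is my assumption:

```python
import json
import urllib.request

# Assumed default endpoint for CodeProject.AI Server's custom model list.
LIST_URL = "http://localhost:32168/v1/vision/custom/list"

def parse_model_list(data: dict) -> list:
    """Pull model names out of a response assumed to be shaped like
    {"success": true, "models": ["ipcam-general", ...]}."""
    if not data.get("success"):
        return []
    return data.get("models", [])

def fetch_model_list(url: str = LIST_URL) -> list:
    """Fetch and parse the custom model list from a running server."""
    with urllib.request.urlopen(url, timeout=5) as resp:
        return parse_model_list(json.load(resp))
```

If this returns an empty list while models sit in the folder, the problem is on the server side rather than in Blue Iris.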
 
New Release License Plate Reader 3.0.1, Requires CodeProject.AI v2.5.2 or greater.

Changes:
  • Updated text recognition model to PP-OCRv4 (was PP-OCRv3)
  • Added AI Auto Rotation (the image below shows the plate on the right skewed; the plate on the left was rotated using AI Auto Rotation, and that rotated image was used when performing OCR)
    • (image: skewed plate vs. auto-rotated plate)
 

Should the License Plate Reader work with non-CUDA GPU, like the YOLOV5 .NET version works with DirectML? Or would that require a separate build of the License Plate Reader in order to support DirectML?

I ask because I am able to run on the GPU using the YOLOV5 .NET model, but when I try to enable the GPU with the ALPR module, it shuts down and then restarts with the CPU. I will include the short bit of CodeProject logs from the failed GPU attempt. The log shows the lines "** Module ALPR has shutdown" and "ALPR_adapter.py: has exited", but it doesn't give a reason or error code. At a minimum, does it seem like a reasonable request to CodeProject to include the reason in the log so we know WHY it's not working?

Note that further down in the log my MIN_COMPUTE_CAPABILITY=6 is shown, which was mentioned above as a criterion. Having said that, is there a magic number for MIN_COMPUTE_CAPABILITY, and is it considered for all modules (e.g. YOLOv5 and ALPR) or only some?

Since the bulk of my processing is going through the GPU, and I have plenty of CPU to spare, this doesn't particularly bother me, so I'm mostly just interested in having a better understanding.

(image: module status comparison, CPU vs. GPU)
Code:
2024-02-19 12:14:05: Update ALPR. Setting EnableGPU=true
2024-02-19 12:14:05: *** Restarting License Plate Reader to apply settings change
2024-02-19 12:14:05: Sending shutdown request to python/ALPR
2024-02-19 12:14:05: Client request 'Quit' in queue 'alpr_queue' (#reqid 5cc6168f-8ab5-4ed7-a403-436ccb548fc0)
2024-02-19 12:14:05: Request 'Quit' dequeued from 'alpr_queue' (#reqid 5cc6168f-8ab5-4ed7-a403-436ccb548fc0)
2024-02-19 12:14:05: License Plate Reader: Retrieved alpr_queue command 'Quit' in License Plate Reader
2024-02-19 12:14:17: ALPR_adapter.py: License Plate Reader started.
2024-02-19 12:14:18: ** Module ALPR has shutdown
2024-02-19 12:14:18: ALPR_adapter.py: has exited
2024-02-19 12:14:38: ALPR went quietly
2024-02-19 12:14:38: Running module using: C:\Program Files\CodeProject\AI\modules\ALPR\bin\windows\python39\venv\Scripts\python
2024-02-19 12:14:38:
2024-02-19 12:14:38: Attempting to start ALPR with C:\Program Files\CodeProject\AI\modules\ALPR\bin\windows\python39\venv\Scripts\python "C:\Program Files\CodeProject\AI\modules\ALPR\ALPR_adapter.py"
2024-02-19 12:14:38: Starting C:\Program Files...ws\python39\venv\Scripts\python "C:\Program Files...\modules\ALPR\ALPR_adapter.py"
2024-02-19 12:14:38:
2024-02-19 12:14:38: ** Module 'License Plate Reader' 3.0.1 (ID: ALPR)
2024-02-19 12:14:38: ** Valid:         True
2024-02-19 12:14:38: ** Module Path:   <root>\modules\ALPR
2024-02-19 12:14:38: ** AutoStart:     True
2024-02-19 12:14:38: ** Queue:         alpr_queue
2024-02-19 12:14:38: ** Runtime:       python3.9
2024-02-19 12:14:38: ** Runtime Loc:   Local
2024-02-19 12:14:38: ** FilePath:      ALPR_adapter.py
2024-02-19 12:14:38: ** Pre installed: False
2024-02-19 12:14:38: ** Start pause:   3 sec
2024-02-19 12:14:38: ** Parallelism:   0
2024-02-19 12:14:38: ** LogVerbosity:
2024-02-19 12:14:38: ** Platforms:     all
2024-02-19 12:14:38: ** GPU Libraries: installed if available
2024-02-19 12:14:38: ** GPU Enabled:   enabled
2024-02-19 12:14:38: ** Accelerator:
2024-02-19 12:14:38: ** Half Precis.:  enable
2024-02-19 12:14:38: ** Environment Variables
2024-02-19 12:14:38: ** AUTO_PLATE_ROTATE            = True
2024-02-19 12:14:38: ** MIN_COMPUTE_CAPABILITY       = 6
2024-02-19 12:14:38: ** MIN_CUDNN_VERSION            = 7
2024-02-19 12:14:38: ** OCR_OPTIMAL_CHARACTER_HEIGHT = 60
2024-02-19 12:14:38: ** OCR_OPTIMAL_CHARACTER_WIDTH  = 30
2024-02-19 12:14:38: ** OCR_OPTIMIZATION             = True
2024-02-19 12:14:38: ** PLATE_CONFIDENCE             = 0.7
2024-02-19 12:14:38: ** PLATE_RESCALE_FACTOR         = 2
2024-02-19 12:14:38: ** PLATE_ROTATE_DEG             = 0
2024-02-19 12:14:38:
2024-02-19 12:14:38: Started License Plate Reader module
2024-02-19 12:14:42: ALPR_adapter.py: Running init for License Plate Reader
 
Last edited:
The License Plate Reader only works with Nvidia GPUs or CPUs. It would need a totally new module to support GPUs other than Nvidia.
 
Is it normal that Blue Iris will show the red circle with the "x", indicating a failure, for the "Plates"/ALPR detection even when it succeeds? See my screenshot below. I have blacked out the first characters of the license plate itself, but you can see that ALPR "found" a result with success:true.

Perhaps Blue Iris just doesn't have an icon for an identified plate (like the small orange icon for a vehicle), so it defaults to the failure icon? Or, since the JSON response for ALPR does not return a "count" attribute, perhaps Blue Iris is looking for the count and not just success:true? Is anyone seeing something other than the failure icon?

I upgraded BlueIris to 5.7.8.6 and CodeProject.AI to 2.5.4 and still see the same failure icons for "Plates".

(image: ALPR success:true JSON alongside the red X failure icon)
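Whether Blue Iris actually keys on a count field is only a guess, but the two possible readings of the same response can be sketched. The field names below mirror the response described above (a success flag plus a predictions array, with no "count" attribute); the plate and coordinates are made up:

```python
import json

# Hypothetical ALPR-style response, shaped like the one described above.
raw = """{
  "success": true,
  "predictions": [
    {"label": "PLATE: XX1234", "confidence": 0.82,
     "x_min": 100, "y_min": 220, "x_max": 260, "y_max": 270}
  ]
}"""
resp = json.loads(raw)

# A client keying only on a "count" field would score this as a miss:
found_by_count = resp.get("count", 0) > 0
# A client keying on success plus a non-empty predictions list scores a hit:
found_by_predictions = bool(resp.get("success")) and len(resp.get("predictions", [])) > 0

print(found_by_count, found_by_predictions)  # False True
```

So a missing count field alone is enough to explain a red X over an otherwise successful detection.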
 
You should create a clone, use that clone just for ALPR, and use the below AI settings. Also shown below is how the plate will display.

(images: recommended clone AI settings and an example of the plate display)
 