CodeProject.AI Version 2.0

Makes sense. Yes, LPR was working with my 1030 GPU on previous versions. For the moment I'm okay splitting, with YOLO using the GPU and LPR using the CPU, given that my LPR use is only on one camera.

Separate question: what version of CUDA should I be using? I recall CPAI supporting 12.2 but LPR needing an older version.

Thanks for the help.
If you want to try something that might make your GPU work again, here are the steps:

1. Unzip the attached modulesettings.zip file.
2. Replace the existing modulesettings.json with the file you just unzipped.
3. Restart the CodeProject.AI service, or just reboot the PC.
[Attachment: 1698898274043.png]
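If it's easier to script step 2, below is a minimal Python sketch. The install path is an assumption (the default Windows location) and the backup filename is just my convention, so adjust both to your setup, then restart the service as in step 3.

Python:
# Minimal sketch of step 2. Assumptions: CodeProject.AI is in its default Windows
# location and the attached modulesettings.json belongs in the ALPR module folder.
# Adjust MODULE_DIR and NEW_SETTINGS before running.
import shutil
from pathlib import Path

MODULE_DIR   = Path(r"C:\Program Files\CodeProject\AI\modules\ALPR")  # assumed path
NEW_SETTINGS = Path(r"C:\Temp\modulesettings.json")                   # the unzipped file

target = MODULE_DIR / "modulesettings.json"
shutil.copy2(target, target.with_name("modulesettings.json.bak"))  # keep a backup
shutil.copy2(NEW_SETTINGS, target)
print(f"Replaced {target} - now restart CodeProject.AI or reboot.")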
 

PaddlePaddle was updated to v2.5.2 (up from 2.5.1) and we've not been able to get GPU working on Windows in the latest. We'll keep trying. I'm sure it's a trivial change. It's just which trivial change...
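If anyone wants to see for themselves whether Paddle can find the GPU inside the ALPR module's Python environment, a quick diagnostic like the sketch below (run with the module's own interpreter - that part is an assumption about your setup) shows whether the installed wheel is CUDA-enabled and whether a device is visible. It's only a check, not a fix.

Python:
# Quick diagnostic - run with the ALPR module's Python interpreter so the same
# paddlepaddle install is imported. It only reports what Paddle can see.
import paddle

print("Paddle version: ", paddle.__version__)                      # e.g. 2.5.2
print("Built with CUDA:", paddle.device.is_compiled_with_cuda())   # False = CPU-only wheel
print("CUDA devices:   ", paddle.device.cuda.device_count())       # 0 = CUDA/cuDNN not found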
 


This looks to be the issue. I tried using the JSON @MikeLud1 posted without luck, and I did a fresh install on a new box with a 1660 card and LPR is still using the CPU. I'll keep an eye out for a new version of ALPR. Thanks for the updates.
 
Interesting-- I see 2.3.4 in Docker Hub, but that didn't seem to actually BE a newer version... I think that is actually an old one.


I installed 2.3.2 on an OPi5. The RKNN license plate reader isn't installed yet?

The RKNN object detection flies.....
 
With the release of CodeProject.AI v2.3.x, I need to make some changes to the RKNN license plate reader for it to work.
 
I have the ALPR RKNN module working now with v2.3.4

[Attachments: 1699075857074.png, 1699075933961.png, 1699075757921.png]
 
Is it possible to run instances of CPAI on different servers and specify which IP Blue Iris uses per module?

For example, face detection on my Windows PC with a GPU, and everything else on my Linux box with the Coral TPU.
 
What you can do is clone a camera and use this clone just for face detection. Then, in the cloned camera's AI settings, set the Override server IP and port to the CodeProject.AI server running the face module.

[Attachment: 1699110659256.png]
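Not required, but if you want to confirm each box answers on the right module before pointing the cloned camera at it, a quick request to each server does the trick. The sketch below is only an illustration: the IPs are placeholders for your two machines, and it assumes the usual CodeProject.AI routes (/v1/vision/face for face detection, /v1/vision/detection for objects) on the default port 32168.

Python:
# Placeholder IPs - substitute your own. Sends the same test image to the face
# route on one server and the object-detection route on the other.
import requests

servers = {
    "face (Windows GPU box)":    ("http://192.168.1.10:32168", "/v1/vision/face"),
    "objects (Linux Coral box)": ("http://192.168.1.20:32168", "/v1/vision/detection"),
}

with open("test.jpg", "rb") as f:
    image = f.read()

for name, (base, route) in servers.items():
    resp = requests.post(base + route, files={"image": image}, timeout=10)
    print(f"{name}: HTTP {resp.status_code}, success={resp.json().get('success')}")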
 
Try uninstalling the ALPR module, then reinstall the module using "Do not use download cache".

[Attachment 176672]
Interesting. This screenshot shows Coral version 1.6.2; I'm running 1.5.1 and CPAI 2.2.4.
I don't really want to upgrade and break anything unless there are improvements to Coral (since that's what I'm using), but there's nothing in the revision log that I can find.
If I'm looking in the wrong spot, or someone just knows from upgrading, please let me/us know.
 
Below are some more details from the module's version release notes:

JSON:
      // Which server version is compatible with each version of this module.
      "ModuleReleases": [
        { "ModuleVersion": "1.0",   "ServerVersionRange": [ "2.1",   "2.1.12" ], "ReleaseDate": "2023-07-11" },
        { "ModuleVersion": "1.1",   "ServerVersionRange": [ "2.1",   "2.1.12" ], "ReleaseDate": "2023-07-12" },
        { "ModuleVersion": "1.2",   "ServerVersionRange": [ "2.1",   "2.1.12" ], "ReleaseDate": "2023-07-12" },
        { "ModuleVersion": "1.3",   "ServerVersionRange": [ "2.1",   "2.1.12" ], "ReleaseDate": "2023-08-11", "ReleaseNotes": "installer corrections, macOS/Ubuntu support improved" },
        { "ModuleVersion": "1.4",   "ServerVersionRange": [ "2.2",   "2.2.4"  ], "ReleaseDate": "2023-09-09", "ReleaseNotes": "Updated installer, updated TF-lite runtime" },
        { "ModuleVersion": "1.5",   "ServerVersionRange": [ "2.2",   "2.2.4"  ], "ReleaseDate": "2023-09-16", "ReleaseNotes": "Updates to help Blue Iris users" },
        { "ModuleVersion": "1.5.1", "ServerVersionRange": [ "2.2",   "2.2.4"  ], "ReleaseDate": "2023-09-17", "ReleaseNotes": "Better checks for admin rights when installing" },
        { "ModuleVersion": "1.6",   "ServerVersionRange": [ "2.3.0", "2.3.0"  ], "ReleaseDate": "2023-10-01", "ReleaseNotes": "Updated to match new installer SDK." },
        { "ModuleVersion": "1.6.1", "ServerVersionRange": [ "2.3.1", ""       ], "ReleaseDate": "2023-10-10", "ReleaseNotes": "Updated to match new installer SDK." },
        { "ModuleVersion": "1.6.2", "ServerVersionRange": [ "2.3.1", ""       ], "ReleaseDate": "2023-10-28", "ReleaseNotes": "Improvements for situations where installer does not have admin rights." },
        { "ModuleVersion": "1.6.3", "ServerVersionRange": [ "2.3.1", ""       ], "ReleaseDate": "2023-10-28", "ReleaseNotes": "Corrections to requirements.txt for Raspberry Pi / Orange Pi." }
      ]
 
Is there any more performance increase on Coral? Atm I've reverted to CPU because, despite running a dual TPU, there's absolutely no real performance gain, and there are issues: sometimes processing times spike, and if more than one alert comes in, whereas with the CPU they're just queued and handled, with the Coral they just get discarded as the processor is busy.
 
Yeah, that was my question as well. For now, I'm sticking with Coral because I have to leave this project (my parents' house) and head out of here, but if there are no improvements to speak of, when I come back in Feb I'll likely buy a GTX 1650 and test it out that way for comparison. My performance gains are there (~120 ms on the Medium model vs. over double that on CPU only), but the real issue for me was system instability with a flood of requests, causing it to nearly freeze a few times. That's solved with Coral, but I also tweaked BI Motion & IVS before it can make it to the AI stage. For an A/B comparison without the MD tweaks, Coral handled it without locking the system up, but as you said, legitimate requests got dumped while it was trying to figure out if a human was walking in a windy zone. Next go around, I want to play with this some more. One of my other factors is delayed push notifications. Although AI might be a minor issue, the root cause is elsewhere, since sending alerts right away is still slow and the system triggers fairly quickly.

I also do not have both TPU's of the Coral dual functional, since I did not have time to order the proper PCI board. That might help out, not sure.

It's been a fun month. My parents had 8 Cat5e cables to an analogue NVR system. 3 cables were bad and needed to be replaced, added 4 more, Empire all around (mostly 4MP variants), BI and AI done for now. Just re-reading and finding areas of interest on this forum where I can tweak a little more before I go.

I could not have left here this quick with a working system without the help of this forum.
I don't want to break it before I have to split...

Edit -- Should have been more specific: Although speed improvements would be more than welcome, what I am really looking for is improved accuracy, especially at night. Reasoning for wanting to play with the GPU side is custom models (ipcam-general which includes dark). I do not need the rest of the objects in Coral, just people and cars with better results at night.
 
Just found this on the CodeProject boards.
I'd have to deduce there is nothing in this update for Coral.
[Attachment: cp6.png]
 
Yeah, maybe they need to contact Coral for support. It doesn't reflect well on Coral if the low ms times sometimes reported for other applications around the internet are unreproducible, and the processing times are around 140 ms instead, with issues with requests getting queued and discarded. I'm not sure what is going on, but it does sound very much like an issue with the coding. Atm for me, there's very little difference between processing via my 11700 CPU and a dual TPU. That shouldn't be the case. Nor can I afford to have alerts discarded as timeouts. For me, the CPU is more stable, and given that CPAI can be a nightmare when a build has bugs (rollbacks etc. often don't work and leave an entire Windows reinstallation necessary), it's easier for me to just leave the existing working build in place, at least until I can afford some system backup software that will let me clone the system back quickly to the way it was if something goes wrong.
 
Success here: BI on 5.8.0.14 and CPAI on 2.3.4.0 with the Object Detection YOLOv5 6.2 module. Currently I have no Face Processing or license plate recognition modules installed. From a walk test things are looking good; in BI it's showing CPU 7%, GPU 8%.

About two days ago I installed the latest NVIDIA driver from their website for the T600 GPU here. Today's install of CPAI 2.3.4.0 wasn't without a hitch. I uninstalled ver 2.0.8.0, then used the downloaded script to install 2.3.4.0; no problems there.
Initially I was getting an error message about modules not being installed. I uninstalled Face Processing, Object Detection YOLOv5.6.2.NET, and YOLOv5.NET (I'm not sure why that was there). I think on the third attempt YOLOv5 6.2 was installed without any error messages in the CPAI Dashboard's log. I did select "Do not use the download cache"; I am not sure how important this was.

Many thanks Mike aka MikeLud1
 
Thanks for this post; I set mine up the same way. I wasn't using 2.3.4 and was waiting for exactly this before I updated. So far it truly does seem to be working nicely. :p ;)
 
PaddlePaddle was updated to v2.5.2 (up from 2.5.1) and we've not been able to get GPU working on Windows in the latest. We'll keep trying. I'm sure it's a trivial change. It's just which trivial change...

@ChrisMaunder - Any updates on when LPR will be GPU enabled again? I'm seeing this in the logs if it helps:

08:16:01:Object Detection (YOLOv5 6.2): Rec'd request for Object Detection (YOLOv5 6.2) command 'list-custom' (...7ec8f1) took 2ms
08:16:02:ALPR_adapter.py: W1110 08:16:02.295920 12708 dynamic_loader.cc:274] Note: [Recommend] copy cudnn into CUDA installation directory.
08:16:02:ALPR_adapter.py: For instance, download cudnn-10.0-windows10-x64-v7.6.5.32.zip from NVIDIA's official website,
08:16:02:ALPR_adapter.py: then, unzip it and copy it into C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.0
08:16:02:ALPR_adapter.py: You should do this according to your CUDA installation directory and CUDNN version.
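That warning means Paddle couldn't find cuDNN alongside the CUDA toolkit. A quick way to check whether the cuDNN DLLs are where Paddle expects them is something like the sketch below; the CUDA path/version is just an example, so point it at whichever toolkit version you actually have installed.

Python:
# Diagnostic sketch only. The CUDA path below is an example - change the version
# folder to match your installation.
import glob, os

cuda_bin = r"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8\bin"  # example
dlls = glob.glob(os.path.join(cuda_bin, "cudnn*.dll"))

if dlls:
    print("Found cuDNN DLLs in", cuda_bin)
    for d in dlls:
        print("  ", d)
else:
    print("No cuDNN DLLs in", cuda_bin, "- copy the cuDNN bin contents there.")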
 
Can you post a screenshot of your system info? Version 2.3.4 is working fine with Nvidia GPUs.

[Attachment: 1699640336247.png]
 