CodeProject.AI Version 2.0

Today, just out of curiosity, I installed Docker Desktop and then installed the Docker CPAI version. Wow, everything just worked without much reading. In case someone wants to give it a shot, here was my process.
FWIW: I am running via Docker in an Ubuntu VM, with an Nvidia GPU via PCIe passthrough. One (minor) advantage is that I don't have CPAI installed on the BI systems. I have 2 test systems, one here in my SOHO and one at a client's site. The client's site has 5 (of 11) cloned cameras facing the street and is plagued with false triggers from headlights at night. CPAI is doing a great job.
 

1. Go to: Get Docker | Docker Documentation and install Docker Desktop for Windows.
2. You will need to create an account ... no big deal.
3. Install Docker. Once it is done, you will need to reboot. (Note: I used the WSL 2 method.)
4. Skip the tutorial, open a command prompt (admin), and paste this:

docker pull codeproject/ai-server (this is the CPU-only version)

You will see it start the download process; let it do its thing.

5. Start the image within Docker Desktop and an options window will pop up. All I did was enter port 32168 and hit Start. That's it; I tested it and it works just like the Windows version.
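For anyone who prefers the command line over the Desktop UI, step 5 can be sketched as a single `docker run`. The container name `cpai` is my own choice (not from the thread); 32168 is the port entered above. The command is printed rather than executed here:

```shell
# Sketch of the CLI equivalent of the step 5 options dialog.
# -d = run detached, --name = easy to stop/remove later,
# -p = publish host port 32168 to the container's port 32168.
RUN_CMD="docker run -d --name cpai -p 32168:32168 codeproject/ai-server"
echo "$RUN_CMD"   # printed for reference; run it on a machine with Docker installed
```

Once the container is up, pointing BI (or a browser) at port 32168 should behave just like the Windows install.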

Here is some more info as well:

Running CodeProject.AI Server in Docker - CodeProject.AI Server v2.0.6.

But that page tells you to do more than you need. I did not mess with anything other than telling it what port to use.

FWIW, after I tried and tested Docker, I uninstalled all of it. I just don't see the need for it in my case, as the Windows version works great for me.
Thanks for sharing; someone here also suggested Docker. It would be great if we could open another topic about Docker and people's experience with it. I will try it this weekend. Is it easy to try different versions of CodeProject with Docker?

Sent from my SM-S906U using Tapatalk
 

Yes, it just creates a new container for each version. You can delete the old container or stop it. That part is much easier than Windows. Many CPAI users use Docker, which is why I tried it. My main BI system runs on an i7-6700, and with just BI and CPAI it runs just fine in CPU mode. So far, with 7 cams running AI, there is no need for a GPU ... yet :)
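As a sketch of what "a new container for each version" looks like from the command line, using two of the tags mentioned later in the thread (the container names are my own invention). Only one container can hold port 32168 at a time, hence the stop before the start:

```shell
# Each pinned tag gets its own container; rolling back is just stopping
# one container and starting the other. Stopped containers keep their state.
RUN_OLD="docker run -d --name cpai-206 -p 32168:32168 codeproject/ai-server:cpu-2.0.6"
RUN_NEW="docker run -d --name cpai-207 -p 32168:32168 codeproject/ai-server:cpu-2.0.7"
ROLL_BACK="docker stop cpai-207 && docker start cpai-206"
echo "$RUN_OLD"; echo "$RUN_NEW"; echo "$ROLL_BACK"   # printed for reference
```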
 
Update on my crashing situation...

I kept BI at v5.6.9.7 and rolled back to CPAI v2.0.2. I chose v2.0.2 because it was the most stable of the v2.0.x releases I have tested. I left every BI setting exactly the same, completely uninstalled CPAI v2.0.7, and removed the two CPAI folders in Program Files and ProgramData before installing v2.0.2.

Once done, BI and CPAI have performed perfectly, successfully processing around 300 triggers since I rolled back to CPAI v2.0.2. With v2.0.7, I would have had multiple crashes in that short period of time. I'm not sure why my setup has issues with v2.0.7 when other people's don't, but I plan to stick with v2.0.2 and test new releases beyond v2.0.7 as they come out.

I'll post this over at the CPAI discussions and see if the devs have any thoughts.
 
Does 2.0.2 have the capability of reading license plates? I can't get 2.0.7 to load the LPR module; it has issues for me for some reason. I have done everything I can think of to fix it. Can you tell me where to find 2.0.2, or attach it here so I can try it?
 
I haven't tried to use LPR with v2.0.2, sorry.
 
By the same token, I was able to upgrade to 2.6 last night and it's working; unfortunately, my CPU is always at 100% and the GPU at 25%, compared to 1.6.x, which was running smoothly. Is the CPU increase due to other modules being processed?

With your 2.2, are you also getting plates? What is your BI at?

Thanks for sharing your setup.

Sent from my SM-S906U using Tapatalk
 
Similar question: when you run nvidia-smi, does it tell you which process is running on your GPU card? Although both BI and CPAI are using the GPU card, I do not see them under nvidia-smi.

Sent from my SM-S906U using Tapatalk
 

Indeed, it is optional to use containers on a Windows computer to run software that can run natively, but I would say one of the main reasons to use the container version of CPAI on a Windows machine is to avoid the potential problems that seem to come up when folks upgrade CPAI. CPAI has many components, and apparently problems can arise. Using Docker, you can very easily test a new version or go back to an older version without any concerns.

All of the CPAI changes are in the container and not the host OS, so you can run it, test it, break it, and delete it with reckless abandon :). Download and run it again as if nothing ever happened, in just minutes. To me, that is the key and why I use it.

It also allows CPAI to be run on a Linux box, which is a free OS that can be installed in minutes. This versatility of deployment is something the CPAI folks appear to be striving for, and I'd say they have achieved it. You can even run CPAI on an RPi 4 (although it still needs Coral TPU integration to be useful).

You can run various versions by appending the version tag you want to the docker pull command, as in these examples.

For CPU versions:
docker pull codeproject/ai-server (no version info pulls the most recent version pushed to docker hub by CPAI, also referred to as "latest")
docker pull codeproject/ai-server:cpu-2.0.7
docker pull codeproject/ai-server:cpu-2.0.6
docker pull codeproject/ai-server:cpu-1.6.8

For the GPU-enabled versions:
docker pull codeproject/ai-server:gpu (no version info pulls the most recent version pushed to docker hub by CPAI)
docker pull codeproject/ai-server:gpu-2.0.7
docker pull codeproject/ai-server:gpu-2.0.6
docker pull codeproject/ai-server:gpu-1.6.8

You can find all the available releases on the CPAI Docker Hub "tags" page. Change the tag at the end and you can test any version you want.
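To make the tag scheme above concrete, here is a small, entirely hypothetical helper (my own naming, not part of CPAI or Docker) that assembles the pull command for a given variant and version, following the `cpu-X.Y.Z` / `gpu-X.Y.Z` pattern shown above:

```shell
# Hypothetical helper: build a docker pull command for the CPAI image.
pull_cmd() {
    variant=$1   # "cpu" or "gpu"
    version=$2   # e.g. "2.0.7"; empty means "latest for that variant"
    if [ -z "$version" ]; then
        if [ "$variant" = "gpu" ]; then
            echo "docker pull codeproject/ai-server:gpu"
        else
            echo "docker pull codeproject/ai-server"   # bare image name = latest CPU
        fi
    else
        echo "docker pull codeproject/ai-server:${variant}-${version}"
    fi
}

pull_cmd gpu 2.0.6   # -> docker pull codeproject/ai-server:gpu-2.0.6
pull_cmd cpu ""      # -> docker pull codeproject/ai-server
```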
 

@gwithers What do you "pay" for using docker instead of running CPAI natively?
Detection performance? Requires more CPU and RAM?

Can I assume you still need to install Nvidia/CUDA related stuff natively even when running CPAI docker?
 
And further, is there a CPU generation that it wouldn't work well on? For example, would it bring a 4th gen to its knees?
 
@gwithers What do you "pay" for using docker instead of running CPAI natively?
Detection performance? Requires more CPU and RAM?
Nothing, as far as I can tell. I even do things in the "worst" possible way :). I run both BI and two instances of CPAI in virtual machines on the same physical hardware. To make it even harder, I store my BI live recordings and high-res alerts on a NAS device on the network, not even locally. This should be the worst possible setup for performance, but it all works great.

I use Proxmox as my hypervisor to run the VMs (that is free as well). BI, running in a Windows VM and primarily accessed through UI3, works exactly as it did when I ran it natively on its own hardware and Windows. The CPAI instances I run use two Nvidia Quadro P600 GPUs for acceleration (which is why I run two instances). Those process images from the large model set in ~55 ms per image. Of course, those processing times will be longer for the CPU version in the same setup: if I disable GPU acceleration on the same hardware, the processing times go up to around ~125-150 ms. As you might have guessed, I do have a modern CPU running all of this (i5-11400), but it is also running 4-5 additional VMs, and the total CPU usage is only ~30%.

What you need to install on the Docker host OS depends on what that OS is. For my Linux VMs running the CPAI containers in Docker, yes, I did have to install the Nvidia video driver and CUDA. There are guides out there, and it isn't "hard" to do, but it will take some work, as it does on Windows.
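For what it's worth, a sketch of the GPU run command on such a Linux host. Note my added assumption that the NVIDIA Container Toolkit is also installed on the host, since that is what provides Docker's `--gpus` flag (the container name is my own choice):

```shell
# --gpus all exposes every host GPU to the container (requires the
# NVIDIA driver plus the NVIDIA Container Toolkit on the host).
GPU_RUN="docker run -d --name cpai-gpu --gpus all -p 32168:32168 codeproject/ai-server:gpu"
echo "$GPU_RUN"   # printed for reference; execute on a GPU-equipped Docker host
```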
 
And further, is there a CPU generation that it wouldn't work well on? For example, would it bring a 4th gen to its knees?
I don't know of specific cutoffs, but containers are very lightweight (far more so than a VM, for example). I would expect the overhead of running a Docker container on Windows to be negligible, given that a container uses the host OS for most things instead of bringing its own (like a VM does). That said, some testing to confirm a specific setup is OK is probably needed.
 