Running CPAI on OrangePi/RK3588--RKNN modules are "not available"

PeteJ
Jan 14, 2025
Newbie here; I've googled for hours but haven't had much success.

I currently run BI and CPAI on the same server, and it works OK. The server has no GPU, and AI detection takes 300-700ms even with only a few cameras. The server is an 11th gen Intel i5 with 16 GB of RAM. For comparison, I run Frigate on Linux with a Coral TPU on the same server and inference times are sub-10ms, so I'm looking for a way to speed things up without a GPU.

I want to split the CPAI workload off to an Orange Pi 5 Max to see if performance improves, but I can't figure out how to get the RKNN modules to install, since they are currently listed as "Not Available" on the CPAI server dashboard.

The Orange Pi runs the official Ubuntu distribution from Orange Pi (22.04.5 LTS, 6.1.43 kernel), and I am running CPAI 2.9.7 in Docker using the arm64 image. The container runs in privileged mode with /dev/bus/usb passed through as devices. Within the container I can see those devices, and Docker confirms the container is privileged.

My docker compose file:

services:
  CodeProjectAI:
    image: codeproject/ai-server:arm64
    container_name: codeproject-ai-server-arm64
    privileged: true
    hostname: codeproject-ai-server
    restart: unless-stopped
    ports:
      - "5000:32168"
    environment:
      - TZ=America/Los_Angeles
    volumes:
      - /dev/bus/usb:/dev/bus/usb
      - codeproject_ai_data:/etc/codeproject/ai
      - codeproject_ai_modules:/app/modules
    devices:
      - /dev/bus/usb:/dev/bus/usb

volumes:
  codeproject_ai_data:
  codeproject_ai_modules:
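A quick sanity check (using the container name from the compose file above) is to confirm the container actually sees the USB bus nodes:

```shell
# List the passed-through USB device nodes from inside the container
docker exec codeproject-ai-server-arm64 ls -l /dev/bus/usb
```

Worth noting: the RKNN modules run on the RK3588's on-chip NPU, not over USB, so the USB passthrough only matters if you are also attaching a USB accelerator like a Coral.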


If you have CPAI running on an RK3588, I would love some advice on how to get this going.

Thanks!
 
Before you go to all that effort, unless you like to tinker, what is the benefit?

Does improving sub-1-second processing times actually improve your response to a situation? 300ms is faster than most people's reaction time.

If it were 15,000ms, then yes, giving a perp a 15-second head start is bad. But anywhere from 300ms down to 5ms is within the range of human reaction-time differences, so you might be chasing something that doesn't add any value.
 
Yes, sort of.

I do not currently plan to act on the data; just collecting it is sufficient. I only have one camera doing LPR at the moment, and it's pointed at a very quiet street. I'd be surprised if I got more than 30 reads per day. However, where I plan to install this next is toward a main street, where I'd guess I'd be in the 1,000-reads-per-day range in a single direction (it's a busy street).

I don't think BI and CPAI running on the same server can keep up, which is why I'm interested in splitting them up. But I'm not really clear on how BI pipelines the LPR requests to CPAI: does it queue them FIFO and process each one as CPAI returns results, or does it overflow or start blocking if requests aren't returned quickly enough?

From what I have seen, after about 8-10 seconds it looks like BI stops checking? The *.dat data is all grayed out.
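To make sure I'm asking the right question, the two behaviors I can imagine look roughly like this. This is just a sketch of the concepts (a bounded FIFO that either blocks or drops when full), not BI's actual implementation:

```python
import queue

def submit(q: queue.Queue, frame, block: bool) -> bool:
    """Try to enqueue a frame for AI analysis.

    block=True  -> wait (briefly) for room in the queue: back-pressure
    block=False -> drop the frame immediately if the queue is full: overflow
    """
    try:
        q.put(frame, block=block, timeout=1.0 if block else None)
        return True
    except queue.Full:
        return False  # frame dropped

q = queue.Queue(maxsize=2)            # small FIFO, like a bounded request pipeline
assert submit(q, "frame1", block=False)
assert submit(q, "frame2", block=False)
assert not submit(q, "frame3", block=False)  # queue full -> dropped
assert q.get() == "frame1"                   # FIFO: oldest frame processed first
```

In the drop-on-overflow case a burst of plates would simply lose reads; in the blocking case the camera-side pipeline stalls instead.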

And yes, I do like to tinker.
 
Okay, it looks like the problem was the official Ubuntu 22.04.5 LTS image from Orange Pi's site. Under platform, it comes up as Linux/Arm64, which is accurate, but it appears CPAI actually looks for OrangePi as the platform:

Screenshot from 2025-01-17 23-52-39.png

I was able to get this to work by installing Joshua Riek's Ubuntu image: GitHub - Joshua-Riek/ubuntu-rockchip: Ubuntu for Rockchip RK35XX Devices

Once installed, the platform was detected properly and the option to install the RKNN modules was available.

Thanks very much to @MikeLud1 for pointing me in the right direction. Hopefully this info will be useful to someone else in the future.
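If anyone else hits this, a quick way to compare what the two images report for board identity is to read the device-tree model string. I'm assuming this is roughly what CPAI's platform detection keys off; I haven't confirmed that in the CPAI source:

```shell
# Print the board model string exposed by the device tree (common on ARM SBCs)
cat /proc/device-tree/model
```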
 
In case anyone is considering running CPAI on the RK3588, this is the performance I am seeing:

Screenshot from 2025-01-18 17-02-50.png

Below is on CPU.

Screenshot from 2025-01-18 17-01-15.png

CPAI is averaging slightly under 40ms.

Screenshot from 2025-01-19 17-27-31.png

I only have a 15W power supply on it, so it's probably running at about half of its potential right now.
 