CodeProject.AI Version 2.0

I have Static Objects ticked and changed the analyze time to 900 seconds.
What does the Auto refresh option at the bottom of the options page do?

Or is it best to untick the Static Objects option?

 
My Coral USB arrived today - downloaded the latest Edge TPU runtime.
Started the module on CPAI - it was detected and seemed to be running on INFO.
Walked in front of the cams and it just crashed CPAI! It also crashed one of my Hyper-V VMs (maybe just a coincidence, that one).

Hmm, not a good start - tried that twice and CPAI kept going offline
 
Has anyone mentioned that the Coral USB and CPAI may not "play well" with the hardware abstraction going on when running this inside a VM?
Just a possibility...
 
Mine is not inside a VM - it is on a Windows 10 PC.
Anyway, I've taken it out and will try the Coral on Frigate first
 
Well, that is rather frustrating. All the hype about the Coral, and I can't get it to work on either Windows with CPAI or Frigate on Proxmox.
I can see the USB device in Proxmox.
I modify the LXC config to pass it through, and when I hit detect it just fails and restarts Frigate.
It shows up as the "Global Unichip Corp" USB type - some say that means it isn't getting enough power.
The Lenovo has 4 USB 3.0 ports. Surely they're not all underpowered.
Grhhh
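For anyone else fighting LXC passthrough, the lines usually suggested for the container config on the Proxmox host (e.g. /etc/pve/lxc/<container id>.conf) look roughly like this - this is a sketch based on the commonly documented approach, not something specific to my setup, so verify the device path with lsusb on your host first:

```
# Allow the container to open USB character devices (189 is the USB major number)
lxc.cgroup2.devices.allow: c 189:* rwm
# Bind-mount the host's USB bus into the container so the Coral is visible
lxc.mount.entry: /dev/bus/usb dev/bus/usb none bind,optional,create=dir
```

Note the Coral re-enumerates with a different USB ID after the Edge TPU runtime loads its firmware, which is why passing the whole bus tends to be more reliable than pinning a single device ID.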

On my Windows machine I can see it under USB devices as Coral.
It just crashes CodeProject when I walk in front of a camera
 
Finally got the Coral working on Frigate under Proxmox. 8 ms inference speed.
Just don't know why it doesn't work on my Windows box with CPAI yet, but I would like to get it solved
 
I also received my Coral USB Accelerator about a week ago. I had it working for a little while on my Windows 10 Blue Iris machine running on bare metal, no VM. I was seeing about 150 to 180 ms responses in the CPAI dashboard.
I can't really recall what I did to get it to work.
My initial impression is that this was a waste of money. The CPU and the Intel GPU in the processor give me similar response times in the CPAI dashboard.
I did try to follow the initial setup instructions from the coral.ai website on both my Windows 10 machine and 2 different Linux machines, without any luck.
It seems the drivers and the pycoral package are only supported by earlier versions of Python.
I gave up on that in frustration after several days of trying.
Currently, I have tried to use the Coral module in the CPAI server, and I cannot get responses from CPAI into Blue Iris - either AI: timeout or Nothing found.
After many combinations of settings changes in Blue Iris, uninstalling/reinstalling the Coral module in CPAI, stopping/starting the Blue Iris and CPAI server services, and rebooting the server machine, I am back to the YOLOv5 .NET GPU DirectML module, which is working.
It is disconcerting that every time I make a change in either the Blue Iris AI tab or CPAI, I have to pat my head, rub my tummy and hop on one foot to get CPAI and Blue Iris to work together.
As far as trying the Coral USB Accelerator in Frigate, I have read through the installation and configuration instructions on the Frigate NVR website, and I'm sorry to say, I'm lost.
I would be happy to see a version of CPAI that does not have "beta" appended to the version number.
I did find out that if one applies the version update64_56804 to Blue Iris and restarts the BI service, I can go back to where there is not very tight integration with CPAI, and I can point the AI to the Nvidia Jetson Nano running DeepStack.
I had been using the Jetson/DeepStack for about seven or eight months (maybe more, I don't remember) without any issues at all.
I have been told that BI dropped DeepStack because it is no longer under development. IMHO, if a program is working, you don't really need to keep developing it to add features etc. That's where you get in trouble.
In my career of about 20 years in IT, there were times we used software that was years old, and we didn't need to change it until we actually changed the hardware it was talking to.
Bottom line, so far, it seems the Coral USB Accelerator was a waste of money for me, until I can find a real use case for it.
End of rant. :)
 
If you don't want or need additional development then you could stick with the version of BI that supports DeepStack. DS was nice but had issues as well for some users; there was no development to make it faster or to let it use the Intel GPU and other accelerators. It seems like you didn't take your own advice and stick with an old version of BI and DeepStack.
For what it's worth, I am using both DS and CPAI mixed on over 20 systems. CPAI 2.08 runs great. They will keep improving and making it better. Remember that the folks at CPAI are doing this for FREE, so everything is on their terms and their timelines.
 
I'm curious - did you just follow the installation and configuration instructions on the Frigate NVR website? Using Docker Compose? I'm not familiar with that.
I'm currently using Ubuntu 22.04 on an Intel i5 machine, not a VM. I have some experience in *nix, though I still need to keep looking things up as I work.
I would like to try Frigate with this Coral device, but I'm afraid I got lost in the Frigate NVR documentation.
Thanks in advance.
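In case it helps, a minimal docker-compose sketch for Frigate with a USB Coral, based on my reading of the Frigate docs - the paths and port mapping are assumptions to adapt to your own layout, and the Frigate config itself goes in config.yml under the mounted config folder:

```yaml
version: "3.9"
services:
  frigate:
    container_name: frigate
    image: ghcr.io/blakeblackshear/frigate:stable
    privileged: true                    # simplest way to expose the Coral; can be narrowed later
    restart: unless-stopped
    shm_size: "128mb"                   # increase for many or high-resolution cameras
    devices:
      - /dev/bus/usb:/dev/bus/usb       # USB Coral passthrough
    volumes:
      - ./config:/config                # assumed host path holding config.yml
      - ./media:/media/frigate          # assumed host path for recordings
    ports:
      - "5000:5000"                     # Frigate web UI
```

Then `docker compose up -d` from that folder starts it; the Coral should appear in Frigate's logs as a detected Edge TPU if the passthrough worked.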
 
I understand. FYI, I was just trying to "push the envelope" as it were. I'm happy now that I can go back to the old system if I have to.
I actually have a working version of AITools that I can use if need be.
 
Just a bit more anecdotal evidence --

After about two weeks or so of constant Coral TPU (small) detection, I have not picked up a single cat or dog.

With the YOLO models on CUDA, I used to pick up several per day -- usually accurately.

I'm also rarely picking up people walking at night. I had more misses at night than day with YOLO, but now with the Coral it's like a complete dead zone for hours sometimes. On average about 20 people walk by per hour from 1am-5am; the Coral picks up maybe one person per hour if I'm lucky.

I've even changed the number of images submitted per trigger from 6 (YOLO) to 14 (TPU).

It's very fast for me, but a very poor performer. I might just switch back to CUDA because speed doesn't matter if it's unreliable.
 
Even though DeepStack support in Blue Iris is deprecated, it still works even in the latest version.
I haven't tried this recently.
I have been using Mike Lud's custom models for quite a while.
At one time recently, I think I remember that I was able to get Blue Iris to read the model list, then stop the CodeProject.AI server, and Blue Iris would still function using the custom models list it had pulled from the running server.
I will admit that I have at times changed more than one thing at a time, which can really confuse the issue.
Also, I seem to remember a post somewhere here that said you could even point Blue Iris to zero-length files with the names of the custom models used by DeepStack. I've never had a reason to try that.
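I haven't tried it either, but as described the trick would just be creating empty files with the expected model names so Blue Iris has something to list - a minimal sketch, where the folder name and model names are purely examples (substitute whatever your BI custom models setting actually points at):

```python
from pathlib import Path

# Hypothetical folder and model names -- on Windows this would be whatever
# path the Blue Iris AI settings point at (e.g. something under C:\...).
model_dir = Path("custom-models")
model_names = ["ipcam-combined.pt", "ipcam-dark.pt"]

model_dir.mkdir(parents=True, exist_ok=True)
for name in model_names:
    # touch() creates a zero-length file if it does not already exist
    (model_dir / name).touch()
```

The idea being that BI only reads the filenames to build its model list, while the AI server does the actual loading.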
I have never fired up Wireshark to see what the communication between Blue Iris and the ai program looks like.
My latest issue has been trying to use the Coral module in CPAI. As I said originally, I had it working after the first try, but now no joy.
I'm thinking maybe I need to uninstall the module, delete the module folder and reinstall the module.
Also, I have had trouble getting Blue Iris to populate the custom models list/combo box ("Restart AI server to load the models"). Stopping and starting the service in services.msc, plus the Stop/Start buttons on the Blue Iris AI tab, gave me the most luck so far; at times even rebooting the machine will not load the model list.
Oh well. Just to be clear, I have looked at several different surveillance software products, Blue Iris beats them all. There is just a learning curve involved.
And I now know how to go back, to give me a functional system if I need to.
At this time, I'm just trying to get the Coral Accelerator to work. I'm just a little OCD, I think.


 
After about two weeks or so of constant Coral TPU (small) detection, I have not picked up a single cat or dog.

My guess is that the Coral TPU detector is using a smaller model size.


You can't change the model size for the Coral at the moment, but my guess is that it's using either a tiny or small model. Frigate uses a tiny model and thus is much less accurate with its default config.

I have never fired up Wireshark to see what the communication between Blue Iris and the ai program looks like.

The API is documented. DeepStack and CodeProject AI mostly use the same API, which is how both work with Blue Iris.

Object detection API:
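For anyone curious without firing up Wireshark, a sketch of the request/response shape as I understand it from the DeepStack docs (which CPAI mirrors) - the port, default threshold, and response fields here are my reading of that documentation, so treat them as assumptions:

```python
import json

# DeepStack and CodeProject.AI expose the same endpoint shape:
#   POST http://<host>:<port>/v1/vision/detection
# with the image sent as multipart form field "image"
# (DeepStack commonly on port 5000, CPAI on 32168 by default).

def parse_predictions(payload, min_confidence=0.4):
    """Pull (label, confidence) pairs out of a detection response."""
    if not payload.get("success"):
        return []
    return [(p["label"], p["confidence"])
            for p in payload.get("predictions", [])
            if p["confidence"] >= min_confidence]

def detect(image_path, url="http://localhost:32168/v1/vision/detection"):
    """POST an image to the server and parse the result."""
    import requests  # third-party; pip install requests
    with open(image_path, "rb") as f:
        resp = requests.post(url, files={"image": f})
    resp.raise_for_status()
    return parse_predictions(resp.json())

# Typical response body, per the DeepStack object-detection docs:
sample = json.loads("""{
  "success": true,
  "predictions": [
    {"label": "person", "confidence": 0.92,
     "x_min": 10, "y_min": 20, "x_max": 110, "y_max": 220},
    {"label": "dog", "confidence": 0.31,
     "x_min": 300, "y_min": 40, "x_max": 420, "y_max": 180}
  ]
}""")

print(parse_predictions(sample))  # [('person', 0.92)]
```

Blue Iris is essentially just that POST plus threshold filtering, which is why it can talk to either backend interchangeably.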
 

I tried Frigate out and it can't match Blue Iris's ease of use. Not sure why there isn't an easy GUI to add cameras - adding everything through a YAML config file is quite annoying. Still, it's a new program and it works pretty well. Can't complain for free! I'm sure the developer will make it more user friendly as it progresses.

I have the Coral Edge TPU. I'm really liking it for now. Still a few bugs to be worked out. It allows me to run a full Blue Iris system on a 10-year-old i5 chip. I'm only at 29% CPU usage with 7 cameras and AI going. The Coral is doing all the hard work.
 
I still use my big GPU for Blue Iris, as I have not got the Coral TPU USB to work yet. The PC picks it up under USB devices, but it just crashes the CP server. Maybe try a TF Lite model? Will try again just out of curiosity.
Anyway, I got the Coral onto my Frigate instance, as I have that on Proxmox - a nice backup to BI in case that ever fails. It's super fast now with the Coral, but there's no comparison - BI is still awesome and I keep my maintenance plan going.
Yep, I think we'll see Frigate evolve - maybe one day there will be a GUI to add cameras and the like.

I also just shut down my 80/443 ports to the nginx proxy, as I set up Twingate today and it seems to work very well and easily.
Check out the NetworkChuck video if you want fast, secure access to 5 ports on your LAN.
 

I couldn't get USB passthrough to work on my ESXi server. I bought a USB PCI Express card and passed that through. That is how I got it working.
 
With Frigate I found that using a higher resolution improved the accuracy greatly. The developer confirmed it would help, even though Frigate's detector runs on 320x320 crops. With the config set at 1280x720 it picked up objects so much better than at the standard substream resolution.
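The setting being described is the per-camera detect resolution in Frigate's config.yml; a hedged sketch of the relevant section, where the camera name and stream URL are placeholders to replace with your own:

```yaml
cameras:
  front_door:                     # placeholder camera name
    ffmpeg:
      inputs:
        - path: rtsp://user:pass@192.168.1.10:554/stream2   # placeholder URL
          roles:
            - detect
    detect:
      width: 1280    # raising this from the low substream default is what helped
      height: 720
      fps: 5
```

The detector model still sees 320x320 regions, but a higher detect resolution gives Frigate's motion/tracking pipeline more pixels to work with before it crops.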