Has anyone mentioned that the Coral USB and CPAI may not "play well" with the hardware abstraction involved in running this inside a VM?

My Coral USB arrived today - downloaded the latest Edge TPU runtime. Started the module in CPAI - it was detected and seemed to be running, per the INFO log. Walked in front of the cams and it just crashed CPAI! It also crashed one of my Hyper-V VMs (maybe that one was just a coincidence). Hmm, not a good start - tried that twice and CPAI kept going offline.
Mine is not inside a VM - it is on a Windows 10 PC.
But anyway, I've taken it out and will try the Coral on Frigate first.
If you don't want or need additional development then you could stick with the version of BI that supports DeepStack. DS was nice but had issues as well for some users; there was no development to make it faster or to let it use an Intel GPU and other accelerators. It seems like you didn't take your own advice and stick with an old version of BI and DeepStack.

I also received my Coral USB Accelerator about a week ago. I had it working for a little while on my Windows 10 Blue Iris machine running on bare metal, no VM. I was seeing about 150 to 180 ms responses in the CPAI dashboard.
I can't really recall what I did to get it to work.
My initial impression is that this was a waste of money. The CPU and the Intel GPU in the processor give me similar response times in the CPAI dashboard.
I did try to follow the initial setup instructions from the coral.ai website on both my Windows 10 machine and two different Linux machines, without any luck. It seems the drivers and the pycoral package are only supported on earlier versions of Python. I gave up on that in frustration after several days of trying.
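For reference, the pycoral wheels are only published for a narrow band of Python versions, which matches the experience above. A quick guard - the exact 3.6-3.9 bounds are my assumption from the wheel tags on Google's package index:

```python
import sys

# pycoral 2.x only publishes wheels for CPython 3.6-3.9 (assumption based on
# the wheel tags on Google's package index), so installs fail on newer
# interpreters with "no matching distribution found".
def pycoral_wheel_available(version=None):
    """Return True if a pycoral wheel is likely to exist for this Python."""
    major, minor = (version or sys.version_info)[:2]
    return (3, 6) <= (major, minor) <= (3, 9)
```

Running this before attempting an install saves chasing confusing pip errors on, say, Python 3.11.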
Currently, I have tried to use the Coral module in the CPAI server, and I cannot get responses from CPAI into Blue Iris - either AI: timeout or Nothing found. After many combinations of settings changes in Blue Iris, uninstalling/reinstalling the Coral module in CPAI, stopping/starting the Blue Iris and CPAI server services, and rebooting the server machine, I am back to the YOLOv5 .NET GPU (DirectML) module, which is working. It is disconcerting that every time I make a change in either the Blue Iris AI tab or CPAI, I have to pat my head, rub my tummy and hop on one foot to get CPAI and Blue Iris to work together.
As far as trying the Coral USB Accelerator in Frigate, I have read through the installation and configuration instructions in the FrigateNVR website, and I'm sorry to say, I'm lost.
I would be happy to see a version of CPAI that does not have "beta" appended to the version number.
I did find out that if one applies the version update64_56804 to Blue Iris and restarts the BI service, I can go back to where there is not very tight integration with CPAI, and I can point the AI at the Nvidia Jetson Nano running DeepStack.
I had been using the Jetson/Deepstack for about seven or eight months (maybe more, I don't remember) without any issues at all.
I have been told that BI dropped DeepStack because it is no longer under development. IMHO, if a program is working, you don't really need to keep developing it to add features etc. That's where you get in trouble. In my roughly 20-year IT career, there were times we used software that was years old, and we didn't need to change it until we actually changed the hardware it was talking to.
Bottom line: so far, it seems the Coral USB Accelerator was a waste of money for me, until I can find a real use case for it.
End of rant.
I'm curious - did you just follow the installation and configuration instructions on the Frigate NVR website? Using Docker Compose? I'm not familiar with that.

Finally got the Coral working in Frigate on Proxmox - 8 ms inference speed. Just don't know why it doesn't work on my Windows box with CPAI yet, but I would like to get it solved.
I understand. FYI, I was just trying to "push the envelope", as it were. I'm happy now that I can go back to the old system if I have to.
For what it's worth, I am using both DS and CPAI mixed across over 20 systems. CPAI 2.08 runs great. They will keep improving it and making it better. Remember that the folks at CPAI are doing this for FREE, so everything is on their terms and their timelines.
The Coral USB should work in a VM. The M.2 Coral definitely works - I'm running Blue Iris in a Windows Server 2022 VM under KVM via Proxmox. Works fine.
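One Coral-in-a-VM gotcha worth noting (a possible explanation for the "doesn't play well" reports): the USB stick enumerates with one device ID before the Edge TPU runtime initializes it (1a6e:089a) and re-enumerates with another (18d1:9302, Google Inc.) afterwards, so an ID-based passthrough rule can lose the device the moment inference starts. A sketch of the relevant Proxmox VM config lines, assuming VM ID 100:

```ini
# /etc/pve/qemu-server/100.conf  (VM ID 100 is an assumption)
# The Coral USB shows as 1a6e:089a before init and 18d1:9302 after,
# so pass both IDs - or pass the physical port (host=BUS-PORT) instead.
usb0: host=1a6e:089a,usb3=1
usb1: host=18d1:9302,usb3=1
```

Verify the IDs on your own host with `lsusb` before and after a first inference.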
I haven't tried this recently.

Even though DeepStack support in Blue Iris is deprecated, it still works even in the latest version.

After about two weeks or so of constant Coral TPU (small) detection, I have not picked up a single cat or dog. I have never fired up Wireshark to see what the communication between Blue Iris and the AI program looks like.
I tried Frigate out and it can't match Blue Iris's ease of use. Not sure why there isn't an easy GUI to add cameras - adding everything through the YAML config is quite annoying. Still, it's a new program and it works pretty well. Can't complain for free! I'm sure the developer will make it more user friendly as it progresses.

I have the Coral Edge TPU and I'm really liking it for now. Still a few bugs to be worked out. It allows me to run a full Blue Iris system on a 10-year-old i5 chip. I'm only at 29% CPU usage with 7 cameras and AI going. The Coral is doing all the hard work.
Still using my big GPU for Blue Iris, as I have not got the Coral TPU USB to work yet. The PC picks it up under USB devices, but it just crashes the CP server. Maybe try a TF-Lite model? Will try again just out of curiosity.

Anyway, I got the Coral going in Frigate, since I have that on Proxmox - a nice backup to BI in case that ever fails. Super fast now with the Coral, but there's no comparison: BI is still awesome and I keep my maintenance plan going.

Yep, I think we'll see Frigate evolve - maybe one day there will be a GUI to add cameras and the like.

Just shut down my 80/443 ports to the nginx proxy, as I set up Twingate today and it seems to work very well and easily. Check out the NetworkChuck video if you want fast, secure access to 5 ports on your LAN.
With Frigate I found using a higher resolution improved the accuracy greatly. The developer confirmed it would, even though Frigate captures 320x320 shots. If your config is set at 1280x720, it picks up objects much better than at the standard substream resolution.

My guess is that the Coral TPU detector is using a smaller model size. You can't change the model size for the Coral at the moment, but my guess is that it's using either a tiny or small model. Frigate uses a tiny model and thus is much less accurate with its default config.
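For context, the resolution Frigate runs detection at is set per camera in its YAML config. A minimal sketch - the camera name is a placeholder and surrounding keys may differ by Frigate version:

```yaml
# Hypothetical camera entry: raising detect width/height above the
# substream default is the change described above.
cameras:
  front_yard:        # placeholder camera name
    detect:
      width: 1280    # resolution frames are processed at for detection
      height: 720
```

The model input is still 320x320, but feeding detection a higher-resolution frame gives the motion/crop stage more pixels to work with.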
The API is documented. DeepStack and CodeProject AI mostly use the same API, which is how both work with Blue Iris.
Object detection API:
API Reference - CodeProject.AI Server v2.7.0 - A Guide to using and developing with CodeProject.AI Server (www.codeproject.com)
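To make the shared API concrete: both servers accept a multipart POST of the image and return JSON predictions on the same route (CPAI's default port is 32168, DeepStack's is 5000 - treat those as the usual defaults, and `summarize` as a hypothetical helper, not part of either API):

```python
def summarize(payload):
    """Flatten a DeepStack/CPAI detection response to (label, confidence) pairs."""
    if not payload.get("success"):
        return []
    return [(p["label"], round(p["confidence"], 2))
            for p in payload.get("predictions", [])]

def detect(image_path, server="http://localhost:32168", min_confidence=0.4):
    """POST an image to the detection endpoint both servers expose."""
    import requests  # third-party: pip install requests
    with open(image_path, "rb") as f:
        resp = requests.post(f"{server}/v1/vision/detection",
                             files={"image": f},
                             data={"min_confidence": str(min_confidence)},
                             timeout=15)
    resp.raise_for_status()
    return resp.json()

# Typical response shape:
# {"success": true, "predictions": [{"label": "person", "confidence": 0.87,
#   "x_min": 10, "y_min": 20, "x_max": 300, "y_max": 400}]}
```

This is also roughly what you would see in a Wireshark capture of Blue Iris talking to either server: one HTTP POST per analyzed frame, one JSON reply.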