Well, I tried a local install of SenseAI, as the Docker version started to frustrate me.
I thought for sure it would perplex you

OK, preliminary results look promising (i.e. I didn't completely break anything... yet). I have this version running on the two cams that previously ran DeepStack, after some minor tweaks to adapt to the newer model names (ipcam-general and ipcam-combined in my case) and the ports used (5000 still works for now; it actually had some trouble with 32168, but that might be solvable with a reboot).
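For anyone else juggling the two ports, this is roughly the sanity check I run from a Python prompt before touching BI. It's only a sketch based on the DeepStack-style custom-model endpoint that CodeProject.AI documents; the image path, ports and model names are just my own setup, so substitute yours.

```python
# Minimal sketch: poke the server on both ports with each custom model.
# Endpoint path and field names follow the DeepStack-style API that
# CodeProject.AI documents; image path, ports and model names are mine.
import requests

IMAGE_PATH = "snapshot.jpg"                  # any still exported from BI
MODELS = ["ipcam-general", "ipcam-combined"]
PORTS = [5000, 32168]

for port in PORTS:
    for model in MODELS:
        url = f"http://localhost:{port}/v1/vision/custom/{model}"
        try:
            with open(IMAGE_PATH, "rb") as f:
                resp = requests.post(url, files={"image": f},
                                     data={"min_confidence": 0.4}, timeout=30)
            result = resp.json()
            labels = [p["label"] for p in result.get("predictions", [])]
            print(f"port {port}, {model}: success={result.get('success')}, labels={labels}")
        except requests.RequestException as exc:
            print(f"port {port}, {model}: request failed ({exc})")
```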
Could anyone please remind me where to find the classifications (person, vehicle, car, animal, etc.) used by the various custom models? I seem to have misplaced my links to those explanations.
Also, I have Default Object Detection turned off in BI, so naturally the CP-AI dashboard also shows the "Object Detection (.NET)" service as "Not Enabled". I presume that turning it on within BI will start it up. Any recommendations on when this SHOULD be used (i.e. only with a high-power NVIDIA GPU card, as I run CPU only)? Also, what object names apply in that case?
Finally, I keep seeing references to the various YOLOv5 models: where are they to be found and enabled? I don't even see references to them in the CP-AI dashboard dropdown boxes when testing and benchmarking, much less within BI...
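Related to the labels question: if I'm reading the API docs right, the server can also be asked which custom models it has installed. The list endpoint below is my assumption from those docs (and I'm not sure whether it wants GET or POST), so treat it as a sketch rather than gospel.

```python
# Sketch: ask the server which custom models it has loaded.
# Assumption: a /v1/vision/custom/list endpoint exists on this build;
# the per-model test above works regardless if it does not.
import requests

url = "http://localhost:5000/v1/vision/custom/list"
for method in (requests.post, requests.get):   # unsure which verb this build expects
    try:
        resp = method(url, timeout=10)
        print(method.__name__.upper(), resp.status_code, resp.text)
    except requests.RequestException as exc:
        print(method.__name__.upper(), "failed:", exc)
```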
CodeProject.AI-Custom-IPcam-Models
Thank you both, very helpful to re-see those and possibly adjust as needed. I'm sure it will help others reviewing this thread, as sometimes things like this tend to get scattered around and somewhat lost in the weeds.
Any suggestions or comments on the other points I brought up? When should Default object detection be used, and is it too memory/CPU intensive for a CPU-only system (an i5-8500 in my case)? What labels apply then?
How do I enable and test some of the other models (YOLOv5l, etc.)?
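In case it's useful, this is the kind of quick CPU-only timing I'm planning to run against the default Object Detection endpoint once I enable it, just to see whether the i5-8500 keeps up. The /v1/vision/detection path is the standard DeepStack-style one; my assumption (please correct me) is that the default model is YOLO trained on COCO, so the usual 80 COCO labels (person, car, truck, dog, etc.) would apply there.

```python
# Rough CPU benchmark sketch against the default object-detection endpoint.
# Assumptions: server on localhost:5000, default module enabled, and a
# local test image; timings will obviously vary with image size and CPU.
import time
import requests

URL = "http://localhost:5000/v1/vision/detection"
IMAGE_PATH = "snapshot.jpg"
RUNS = 10

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

durations = []
labels_seen = set()
for _ in range(RUNS):
    start = time.perf_counter()
    resp = requests.post(URL, files={"image": image_bytes}, timeout=60)
    durations.append(time.perf_counter() - start)
    for pred in resp.json().get("predictions", []):
        labels_seen.add(pred["label"])

print(f"avg {sum(durations) / RUNS * 1000:.0f} ms over {RUNS} runs")
print("labels returned:", sorted(labels_seen))
```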
How much GPU memory is being used? The actionnetv2 model needs a lot of memory; you might be running out of GPU memory. Try disabling the modules you are not using.
Hey, still getting "No predictions returned" with 1.6 and a GTX 1660 Super.
Do you have the latest CUDA 11.7.1 installed, plus cuDNN v8.5.0 and the latest video card driver? The 1660 should have no issue running CodeProject.AI.
It's a 6 GB card. Looks to be around 2.4 GB being used by Python. With some of the first versions of CodeProject.AI I did see prediction stats, granted it never worked with Blue Iris. I did read of a few users in the thread also having trouble getting 1660 cards to work. Not too sure if it's tied to the specific card.
Yeah, the latest CUDA 11.7.1 + cuDNN v8.5.0 and the latest video card driver are installed.
Everything looks as it should. Does it work if you use CPU instead of GPU?
Yeah.
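One more thing that might help narrow down the 1660 issue: as far as I know the Python detection module runs on PyTorch, so it's worth seeing what that environment reports for CUDA, cuDNN and free GPU memory. A rough sketch, assuming you run it with the same Python environment CodeProject.AI uses (or any venv that has torch installed):

```python
# Quick sketch: report what PyTorch sees for the GPU, to rule out a
# CUDA/cuDNN mismatch or an out-of-memory condition on the 6 GB card.
import torch

print("torch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
print("CUDA version torch was built for:", torch.version.cuda)
print("cuDNN version:", torch.backends.cudnn.version())

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    free_bytes, total_bytes = torch.cuda.mem_get_info(0)
    print("device:", props.name)
    print(f"free/total memory: {free_bytes / 1e9:.1f} / {total_bytes / 1e9:.1f} GB")
```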