5.5.8 - June 13, 2022 - Code Project’s SenseAI Version 1 - See V2 here https://ipcamtalk.com/threads/codeproject-ai-version-2-0.68030/

jrbeddow (Getting comfortable) · Joined Oct 26, 2021 · Messages: 370 · Reaction score: 485 · Location: USA
Yes, mine is exactly the same. Here is a sample of the return results on the log page. Not sure why I am seeing a reference to "combined" here, as I wasn't even running that model before under DeepStack. I still get a 404 error on each actual Alert status.
[Attachment: AI-Log.jpg]
 

MikeLud1 (IPCT Contributor) · Joined Apr 5, 2017 · Messages: 2,141 · Reaction score: 4,118 · Location: Brooklyn, NY
jrbeddow said:
Yes, mine is exactly the same. Here is a sample of the return results on the log page. Not sure why I am seeing a reference to "combined" here, as I wasn't even running that model before under DeepStack. I still get a 404 error on each actual Alert status. [View attachment 130636]
Check your camera's AI settings, I had that happen when I first set it up
 

jrbeddow (Getting comfortable) · Joined Oct 26, 2021 · Messages: 370 · Reaction score: 485 · Location: USA
[Screenshot attachment]

Maybe I need to try one more full server reboot? Not sure if that was done before or after the latest changes I made (Port: 5000).
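
For anyone else chasing these 404s, one quick sanity check is to POST a frame straight at the detection route and see whether the server answers at all, independent of Blue Iris. This is just a sketch: it assumes the default SenseAI port of 5000, the DeepStack-compatible /v1/vision/detection endpoint, and a local test image named test.jpg, so adjust those to your own setup.

```python
import requests  # pip install requests

# Assumed defaults: SenseAI listening on port 5000 with the
# DeepStack-compatible detection route. Change if you moved it.
URL = "http://localhost:5000/v1/vision/detection"
IMAGE_PATH = "test.jpg"  # any local JPEG snapshot will do

with open(IMAGE_PATH, "rb") as f:
    response = requests.post(URL, files={"image": f}, timeout=30)

# A 404 here means the port or route is wrong; a 200 means the server
# itself is answering and the problem is on the Blue Iris side.
print("HTTP status:", response.status_code)
if response.ok:
    for obj in response.json().get("predictions", []):
        print(obj["label"], round(obj["confidence"], 2))
```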
 

jrbeddow (Getting comfortable) · Joined Oct 26, 2021 · Messages: 370 · Reaction score: 485 · Location: USA
Well yes, I do. But I assumed it wouldn't attempt to make use of those custom models, as it is now a "Greyed out" option in BI once SenseAI is installed.

I will try removing those for now, and report back.
 

jrbeddow (Getting comfortable) · Joined Oct 26, 2021 · Messages: 370 · Reaction score: 485 · Location: USA
Yes, that did the trick, it is working now. So yeah, it is important to remove the custom models (or move that folder completely out of the originally defined path, as I did). Odd, as it shouldn't be pulling from there if it is essentially an unchangeable, greyed out option.

That being said, I doubt I'll be keeping SenseAI running for now, as the detection times are noticeably longer than they were with DeepStack and your custom models. I'm seeing times around 500-950ms per image analyzed with SenseAI, whereas they generally ran in the 100-300ms range (occasionally longer) under DeepStack, using the same CPU-only setup and Medium image size.

Wish me luck rolling back to DeepStack...fingers crossed it goes smoothly.
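
If anyone wants to reproduce the timing comparison above outside of Blue Iris, a rough sketch like the one below works. It assumes the same port 5000 and /v1/vision/detection endpoint as earlier, plus a sample frame named snapshot.jpg, and just reports round-trip latency per image.

```python
import statistics
import time

import requests  # pip install requests

URL = "http://localhost:5000/v1/vision/detection"  # assumed SenseAI default
IMAGE_PATH = "snapshot.jpg"  # hypothetical sample frame from a camera

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# Time ten detection round trips and summarize.
times_ms = []
for _ in range(10):
    start = time.perf_counter()
    requests.post(URL, files={"image": image_bytes}, timeout=60)
    times_ms.append((time.perf_counter() - start) * 1000)

print(f"median {statistics.median(times_ms):.0f} ms, "
      f"min {min(times_ms):.0f} ms, max {max(times_ms):.0f} ms")
```

Note the numbers include HTTP overhead, so they will read a little higher than the inference times the server logs itself.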
 

jrbeddow (Getting comfortable) · Joined Oct 26, 2021 · Messages: 370 · Reaction score: 485 · Location: USA
They still need to also prioritize GPU support to speed up analysis time. Without that it's not going to be very useful, IMHO.
Maybe, maybe not: I was hoping that it could perhaps equal or even better DeepStack running with @MikeLud1's custom models on CPU-only setups such as mine. Well, for now, it just isn't quite "there" running their default model, but who knows what will happen when we see what it can do with a slimmed-down/faster custom model?
 

sebastiantombs (Known around here) · Joined Dec 28, 2019 · Messages: 11,511 · Reaction score: 27,690 · Location: New Jersey
If it isn't NVidia/CUDA, it just isn't. I bought into NVidia way back when, in Rev 2 or 3 of BI, based on Ken's recommendations. No way I'm changing now; too much invested and too expensive to replace.
 

jrbeddow (Getting comfortable) · Joined Oct 26, 2021 · Messages: 370 · Reaction score: 485 · Location: USA
Well, of course anyone already invested in an add-on Nvidia card would be in favor of the added speed. That's obvious, but for those of us cheapskates who want to run on CPU only, DeepStack does now run surprisingly well with these newer slimmed down custom models.

Just for the record, it's not the cost of the Nvidia card I am opposed to, it's the added electricity cost on an ongoing basis. I'm in CA, so every added kWh that goes on my bill costs somewhere in excess of $0.35, not the ~$0.10 or $0.12 national average.
 

tofu (Getting the hang of it) · Joined May 3, 2019 · Messages: 113 · Reaction score: 72 · Location: NYC
jrbeddow said:
Well, of course anyone already invested in an add-on Nvidia card would be in favor of the added speed. That's obvious, but for those of us cheapskates who want to run on CPU only, DeepStack does now run surprisingly well with these newer slimmed down custom models.

Just for the record, it's not the cost of the Nvidia card I am opposed to, it's the added electricity cost on an ongoing basis. I'm in CA, so every added kWh that goes on my bill costs somewhere in excess of $0.35, not the ~$0.10 or $0.12 national average.

DeepStack without the GPU was spiking CPU usage to 90%+. I have people walking by several times a minute during certain hours (NYC). Electricity ain't cheap here either, but I felt it more or less balanced out. Instead of having a CPU pegged at 100% for maybe half of the day, I now have a 40W GPU spiking to around 14% and a CPU that sits at 10-15% most of the time.
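
For a rough sense of how that trade-off prices out, here is the back-of-the-envelope math, assuming a card drawing about 40 W around the clock (a simplification, since the card idles lower and only spikes during inference) at the two electricity rates mentioned above.

```python
# Rough yearly cost of a ~40 W card running 24/7 at two electricity rates.
WATTS = 40
HOURS_PER_YEAR = 24 * 365

kwh_per_year = WATTS / 1000 * HOURS_PER_YEAR  # ~350 kWh/year
print(f"kWh/year: {kwh_per_year:.0f}")
print(f"at $0.35/kWh (CA rate quoted above): ${kwh_per_year * 0.35:.0f}/year")  # ~$123
print(f"at $0.12/kWh (national average):     ${kwh_per_year * 0.12:.0f}/year")  # ~$42
```

Against that you can set whatever the CPU would otherwise burn running pegged for hours a day, which is why it roughly balances out in a busy scene like the one described.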
 

CrazyAsYou (Getting comfortable) · Joined Mar 28, 2018 · Messages: 246 · Reaction score: 262 · Location: England, Near Sheffield
Just an FYI, the installer will install the .NET components below onto your system if you don't already have them; the screenshot shows versions and sizes. This is on an up-to-date Windows 10 system.

[Screenshot: installed .NET component versions and sizes]
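
If you want to see exactly what the installer added, the .NET CLI can list the runtimes present before and after. A minimal sketch, assuming the dotnet CLI ends up on your PATH (which it does once the runtimes are installed):

```python
import subprocess

# Print the .NET runtimes currently installed, so you can diff the
# list from before and after running the SenseAI installer.
result = subprocess.run(["dotnet", "--list-runtimes"],
                        capture_output=True, text=True)
print(result.stdout)
```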
 

CrazyAsYou (Getting comfortable) · Joined Mar 28, 2018 · Messages: 246 · Reaction score: 262 · Location: England, Near Sheffield
I'm having a little play around and it looks really good, but until there is GPU support I can't even think about replacing DeepStack; the response times and volume processing are far too slow: roughly 3.2 images per second with SenseAI versus about 15 per second under DeepStack, not to mention the CPU getting pegged and the difference in power consumption.
 