Dedicated graphics card for Deepstack or not?

sorka

I have BI5 running with 14 cameras currently, at 2% CPU using substreams for motion detection; the entire i7-8700 is sitting at 6% total CPU. I'm about to install DeepStack. With so much CPU headroom left on my PC, would I still benefit from a dedicated graphics card like a 1030 or P400?
 
Many of us have found the GPU version is faster, but we have been on older machines.

I'd say give it a try without the GPU and see what the CPU does and what the make time for DeepStack is - if it is under 100ms and the CPU doesn't spike to 100%, then the GPU would be a waste.

I would think an 8th gen would be faster than the 1030 or P400 card - you would need a beefier one to see any improvement.
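If you want to put a number on that make time before buying anything, here's a minimal sketch that times DeepStack's object detection endpoint directly, outside of BI. It assumes a default install listening on localhost:80 and a snapshot saved as test.jpg - adjust the URL and image path for your setup:

# Rough benchmark of DeepStack detection time, independent of BI.
# Assumes DeepStack is listening on localhost:80 and test.jpg exists.
import time
import requests

DEEPSTACK_URL = "http://localhost:80/v1/vision/detection"
IMAGE_PATH = "test.jpg"  # a typical snapshot from one of your cameras

times = []
for _ in range(10):
    with open(IMAGE_PATH, "rb") as f:
        start = time.perf_counter()
        resp = requests.post(DEEPSTACK_URL, files={"image": f}, timeout=30)
        elapsed_ms = (time.perf_counter() - start) * 1000
    times.append(elapsed_ms)
    for p in resp.json().get("predictions", []):
        print(f"{p['label']}: {p['confidence']:.2f}")

print(f"avg {sum(times)/len(times):.0f} ms, min {min(times):.0f} ms, max {max(times):.0f} ms")

Run it once against the CPU build and once against the GPU build; if the CPU numbers are already well under 100ms, the card probably won't buy you much.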
 
I've bought an Nvidia T600 to reduce my CPU load - the CPU had started spiking to 100%, and unless I unchecked "Use main stream if available", a large number of alert triggers were being cancelled. Unfortunately, so far there's only a small improvement, but as some of you will know, my IT knowledge is extremely shallow and I'm hoping for your help please:-

Good result when use main stream is unchecked:-
Screenshot 2021-10-02 150444.png

Same truck with use main stream checked (in this example the truck is still confirmed and has processing times similar to above but Deepstack setting has been ignored). All other cameras have use main stream unchecked:-
Screenshot 2021-10-02 150641.png
Camera bit rate settings etc:-
Screenshot 2021-10-02 151029.png

Deepstack settings (when unchecked not shown but no other change):-
Screenshot 2021-10-02 151238.png

I'm recording direct to disk of course on all cameras:-
Screenshot 2021-10-02 151337.png

Task manager, use main stream unchecked (I went back to using a chunk of the SSD for the new BVR files when they are being created, to try and make everything happen a bit quicker):-
Screenshot 2021-10-02 152209.png

Task manager use main stream checked:-
Screenshot 2021-10-02 151940.png

T600 utilization (I installed the driver from the link in the Nvidia quick start guide):-
Screenshot 2021-10-02 152324.png

I'm using 720p for all the substreams to get a reasonable resolution for missed triggers; I have a large enough drive. Am I expecting too much, or did I screw up? Your feedback would be much appreciated.
 
I'd check the box for "on alerts only" in DS. DS won't detect at lower than 40%, so setting it to 30% doesn't do anything. I don't use the main stream for DS on any of the cameras I have DS running on. I'd start with using DS on one or two cameras first, then add more once you get a feel for it and see how the increased load affects things.
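For what it's worth, the confidence DS returns for each object is just a number in its JSON response, and a minimum-confidence setting only filters what DS already sent back - so a floor below DS's own 40% cutoff never comes into play. A rough illustration (illustrative names only, not BI's actual code):

# Illustrative only: how a minimum-confidence filter works on DeepStack output.
# DeepStack returns confidences as floats between 0 and 1.
MIN_CONFIDENCE = 0.40  # DS itself doesn't report detections below roughly this level

def confirmed_objects(predictions, min_confidence=MIN_CONFIDENCE):
    """Keep only predictions at or above the configured confidence."""
    return [p for p in predictions if p["confidence"] >= min_confidence]

# Example DeepStack-style predictions
sample = [
    {"label": "truck", "confidence": 0.83},
    {"label": "person", "confidence": 0.35},  # below DS's own floor, so never reported anyway
]
print(confirmed_objects(sample))  # only the truck survives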
 
OK sebastiantombs, thanks for the quick feedback. I confess I don't really understand "on alerts only". I only use triggers locally and don't send alerts anywhere - does that make a difference? And I want to be able to see the ones that may have been falsely cancelled. In any case, I'll check that box and see what happens. I think it may have been me to whom Ken Pletzer first pointed out that the DS minimum is 40%, when I thought it was his minimum setting; 30% is wishful thinking for a future change by DS.

The thing is, my CPU load was very low when substreams were first introduced, but the percentage seems to be ever increasing. Then again, I have since added the 8MP cameras and now use top bitrates…

So, there’s not a magic special Nvidia driver for BI/DS applications then?
 
The driver is specific to the card so there is no magic bullet there. I'm using a GTX970 and have very good results with the latest driver from NVidia. YMMV.

As the bit rate goes up, so goes CPU utilization. Even if you're using sub streams, each camera added increases the bit rate and the CPU utilization. No way to avoid that. The latest versions of BI also seem to be increasing CPU ever so slightly, but I can't be sure yet.
 
I imagine that the substream is perfectly fine for identifying things like cars and people, but if DeepStack were going to be accurate on faces or license plates you'd need the full-res stream for that.
 
Remember BI does switch to the main stream for alerts. I suspect DS is getting those snapshots anyway. Not checking the "on alerts" box means DS is constantly analyzing, target or not, which adds load to both the GPU and CPU, at least to my limited understanding of how DS is working.
 
I just purchased and installed a T600 also, thinking my Deepstack detection would be better. I don't even see a difference in detection time and things are still missing.
 
Looking at the specs for the T600, it is not very high in the CUDA core count department at 640 cores. Just a guess, based on my own experience, but over 1000 CUDA cores are needed for good performance. The GTX970 I'm using has 1664 CUDA cores and detection times are in the 50-250ms range.
 
Remember BI does switch to the main stream for alerts. I suspect DS is getting those snapshots anyway. Not checking the "on alerts" box means DS is constantly analyzing, target or not, which adds load to both the GPU and CPU, at least to my limited understanding of how DS is working.
sebastiantombs, reading your message, I'm now wondering if I've misunderstood how DS works in BI. I thought DS analyses all of BI's triggers to decide whether to confirm or cancel, so I don't understand why the CPU changes whether or not "on alert" is checked. A bit more explanation would help me, please.
 
"On alert" to my understanding means that DS gets sent snapshots only during alerts/triggers. Leaving it unchecked, in my mind for what it's worth, means DS is constantly analyzing. i can certainly be wrong about that, but if DS analyzing all the cameras it is enabled on that will most certainly slow things down, detection times, for real events. The easy way to check is simply either check it, or uncheck it, as the case may be and watch both CPU and GPU utilization. I can tell you that in my installation the GPU sits idle until there is an alert.
 
It's also looking like, as sebastiantombs has mentioned, the T600 is not a golden bullet and it's not as wonderful as I had imagined. So I'm now wondering if anybody's system with lots of cameras having lots of MP and lots of bitrate works OK with "Use main stream if available". I'm not even using custom models or dark.
 
"On alert" to my understanding means that DS gets sent snapshots only during alerts/triggers. Leaving it unchecked, in my mind for what it's worth, means DS is constantly analyzing. i can certainly be wrong about that, but if DS analyzing all the cameras it is enabled on that will most certainly slow things down, detection times, for real events. The easy way to check is simply either check it, or uncheck it, as the case may be and watch both CPU and GPU utilization. I can tell you that in my installation the GPU sits idle until there is an alert.
I’m checking on alert for the rest of my cameras to try right now. Thanks BTW.
 
Let us know how that does, if both CPU and GPU utilization drop.

As a side note, with the performance you're seeing I wouldn't try any custom models unless/until detection times drop significantly. DS has to scan two different model files when using custom models.
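To put that in concrete terms: the stock object model and a custom model sit behind separate DeepStack endpoints, so each snapshot ends up making two inference passes. A rough sketch of what that looks like at the API level - the "dark" model name here is just a placeholder for whatever custom model you'd register, and the host/port are assumptions:

# Sketch of why custom models roughly double the work per snapshot:
# the same image is posted to the built-in detector AND the custom model.
import requests

BASE = "http://localhost:80"  # adjust to your DeepStack host/port

def detect(image_path, custom_model=None):
    results = {}
    with open(image_path, "rb") as f:
        data = f.read()
    # Pass 1: built-in object detection
    r = requests.post(f"{BASE}/v1/vision/detection", files={"image": data})
    results["default"] = r.json().get("predictions", [])
    # Pass 2: custom model, if one is configured ("dark" below is a placeholder)
    if custom_model:
        r = requests.post(f"{BASE}/v1/vision/custom/{custom_model}",
                          files={"image": data})
        results[custom_model] = r.json().get("predictions", [])
    return results

print(detect("snapshot.jpg", custom_model="dark"))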
 
Now that I have a GPU, should I change the hardware accelerated decode to Nvidia (instead of Intel + VPP)?
 
I'd leave HA alone. Don't overload the NVidia card with video processing and DS.
 
Plus I saw on mine that if you ever restart the computer (BI update, etc.), at least in my case, the GPU wasn't up and running yet, and when I had the default HA set to NVIDIA, all my cameras bugged out and then would fall back to no HA.
 
All cameras now with "on alert" checked and none with "use main stream if available":-
Screenshot 2021-10-02 223238.png

Didn't fix my CPU% unfortunately. Looks like the SSD is about to transfer its files over to the HDD???
I'll try going back to Intel HA.