Minimum GPU for Deep Stack, price point vs performance

More cores mean faster processing and more cameras handled simultaneously. Those cores need memory, which is why more memory can become critical as well; in this case "the merrier" translates directly into better performance and better capability overall. What counts as enough is governed by what's acceptable to you. I didn't hesitate when the RTX series of cards came out because the price was acceptable to me back then. Now, I'd look for a 1030 or 1060 on the used market. Some of the Quadro series also do a reasonable job. As I said, budget is the biggest factor, then cores and memory. There's no single answer for every case because budgets vary.
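To make the cores-and-memory comparison concrete, here's a quick sketch ranking the cards mentioned above by CUDA core count. The core and VRAM figures are quoted from memory and should be verified against NVIDIA's spec pages before buying; used prices vary too much to hard-code.

```python
# Rough spec sheet for the cards discussed above. Figures are from
# memory -- verify against NVIDIA's official spec pages before buying.
CARDS = {
    "GT 1030":      {"cuda_cores": 384,  "vram_gb": 2},
    "GTX 1060":     {"cuda_cores": 1280, "vram_gb": 6},
    "GTX 970":      {"cuda_cores": 1664, "vram_gb": 4},
    "Quadro P2000": {"cuda_cores": 1024, "vram_gb": 5},
}

def rank_by_cores(cards):
    """Sort card names by CUDA core count, a rough proxy for DeepStack speed."""
    return sorted(cards, key=lambda name: cards[name]["cuda_cores"], reverse=True)

print(rank_by_cores(CARDS))  # GTX 970 first by raw core count
```

Core count is only a first-order proxy (architecture generation and VRAM matter too), but it matches the "more cores, more merrier" point above.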
 
Ok, I may pull the GTX 970 from my main rig for BI and buy something more recent for the main rig. If I'm not happy with the performance in BI, I can swap them back. The 970 still works fine for my gaming needs.
 
I use a 970 for DeepStack and it works well.
 
More CUDA cores mean faster processing and more parallel capacity, so you get faster results when multiple cameras trigger at the same time.

As I said, I use a P2000; I still have to install the custom CCTV model. But on high accuracy with the main stream, I get almost no false negatives/positives, thanks to combining ONVIF motion detection in the camera with very sensitive Blue Iris settings. Yes, that's a storm of motion events for the P2000 to handle across all my cameras (10+ at 8MP), but results during the day come back in under 400ms consistently, some in 50ms when it's not too busy.
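Those round-trip times are easy to measure on your own setup against DeepStack's detection endpoint. A minimal sketch, assuming DeepStack is listening on localhost port 5000 (the port depends on how you launched the container); the summary helper is plain Python:

```python
import statistics
import time

DEEPSTACK_URL = "http://localhost:5000/v1/vision/detection"  # assumed host/port

def time_detection(image_path, url=DEEPSTACK_URL):
    """POST one frame to DeepStack and return the round-trip time in ms."""
    import requests  # third-party; pip install requests
    with open(image_path, "rb") as f:
        start = time.perf_counter()
        resp = requests.post(url, files={"image": f}, timeout=10)
    resp.raise_for_status()
    return (time.perf_counter() - start) * 1000.0

def summarize(latencies_ms):
    """Min/median/max of a batch of round-trip times, in milliseconds."""
    return {
        "min_ms": min(latencies_ms),
        "median_ms": statistics.median(latencies_ms),
        "max_ms": max(latencies_ms),
    }
```

Run `time_detection()` on a few saved alert frames and feed the results to `summarize()`; if your median is well under your trigger interval, the card is keeping up.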

Also, an Nvidia card from the Quadro series, P1000 and higher, can do unlimited stream decoding, limited only by processing power.

If you look at the wiki, you'll notice a system with multiple P1000s doing over 2000 MP/s.
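You can relate that MP/s figure to your own camera load: decode demand is roughly resolution times frame rate, summed over cameras. A back-of-the-envelope sketch (the camera counts and frame rates here are hypothetical examples, not from the thread):

```python
def decode_load_mp_s(cameras):
    """Total decode demand in megapixels/second: sum of MP x fps per camera."""
    return sum(mp * fps for mp, fps in cameras)

# Hypothetical example: ten 8 MP cameras running at 15 fps each.
cams = [(8, 15)] * 10
print(decode_load_mp_s(cams))  # 1200 MP/s -- under the ~2000 MP/s wiki figure
```

If your total lands near the card's published decode ceiling, leave headroom: detection inference competes with decoding for the same GPU.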