Minimum GPU for Deep Stack, price point vs performance

mark22

n3wb
Joined
Mar 8, 2020
Messages
9
Reaction score
3
Location
nc
I realize this may be a subjective topic. I want to buy a GPU for my existing BI rig to use for DeepStack. I'm hoping to buy something older from eBay at a lower cost. I've done some digging, and I can't find the answers I need to make a cost-effective decision.
I have 7 cameras currently but will grow to ~20 this year. I see plenty of posts with examples of what works, but what I can't tell is whether it's more than you need to achieve the same results. Most folks I see are using an old card they had lying around, but those cards are still new enough to be a considerable price point.
I also don't know if/how the GPU requirement is impacted by the number of cameras. I found where someone was able to use a GT710, but I don't know how many cameras they are using or if that even matters. Is there anything to gain by going newer? Is there anything to lose by going older? How far back can I go on a GPU, and is there any info to help me determine the sweet spot on performance vs. price? I don't want to pay more than I need to for GPU cycles that won't gain me anything.
TIA.
 

mark22

n3wb
Joined
Mar 8, 2020
Messages
9
Reaction score
3
Location
nc
More digging. It doesn't answer my question, but it looks like DeepStack currently works natively with cards having a compute capability of 3.5 and up. Recently someone was able to manually compile it to support cards with a capability below 3.5, and DeepStack plans to support sub-3.5 natively (very recent post): GPU pytorch compile options for older card - i.e. GT710
I think the result I'm looking for is to understand what compute capability I should be aiming for, and then try to find a match on eBay.
Another question I forgot to ask: does GPU RAM matter? How much is enough?
GPU Compute Capability
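Since the 3.5 floor is per-card, one way to sanity-check a candidate before bidding is to compare its (major, minor) compute capability against that minimum. The table below is a small illustrative subset of cards mentioned in this thread, filled in from Nvidia's published capability lists; double-check any specific card against Nvidia's own CUDA GPUs page before buying.

```python
# Sketch: check whether a card meets DeepStack's native compute-capability
# floor of 3.5. The table is an assumed subset for illustration only.
CAPABILITY = {
    "GT 710 (GK208)": (3, 5),   # Kepler GK208 variants are 3.5
    "Quadro K600": (3, 0),
    "Quadro K620": (5, 0),
    "Quadro P400": (6, 1),
    "Quadro P620": (6, 1),
    "Quadro P2000": (6, 1),
    "GT 1030": (6, 1),
}

def meets_deepstack_min(cap, minimum=(3, 5)):
    """True if a (major, minor) compute capability is at or above the minimum."""
    return cap >= minimum  # tuple comparison: major first, then minor

for card, cap in CAPABILITY.items():
    status = "OK" if meets_deepstack_min(cap) else "below 3.5 (needs manual compile)"
    print(f"{card}: compute {cap[0]}.{cap[1]} -> {status}")
```

By this measure the K600 falls below the native floor, while everything from the K620 up clears it comfortably.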
 

sebastiantombs

Known around here
Joined
Dec 28, 2019
Messages
11,511
Reaction score
27,690
Location
New Jersey
It all comes down to your budget. The more CUDA cores the merrier and the more memory the merrier. Budget is the real problem.
 
Last edited:

iwanttosee

Pulling my weight
Joined
Dec 27, 2020
Messages
203
Reaction score
186
Location
US
I found my P620 Quadro for $60 last year on OfferUp, 512 CUDA cores. Deals are out there; you just have to look and wait until one comes along.
 

aadje93

Getting the hang of it
Joined
Apr 28, 2022
Messages
61
Reaction score
48
Location
Netherlands
Using a P2000, which I got a great deal on, permanently running the HIGH model with main-stream images (8MP), and it's <300ms almost 95+% of the time.

Do note, I'm running it in an Ubuntu VM on my TrueNAS SCALE machine with GPU passthrough, so the BI PC doesn't have to handle it. The Windows drivers seem to be less stable, and I don't know if the iGPU would still work with the P2000 installed. On the Ubuntu machine, driver install was easy with some copy-paste CLI commands; then start Docker, and a startup rule makes Docker launch the DeepStack container on every reboot of the VM.
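For anyone wanting to reproduce this, a minimal sketch of the Docker side (assuming the Nvidia container toolkit is already installed in the VM; the port mapping and detection mode are the stock values from DeepStack's docs):

```shell
# Run the GPU build of DeepStack, restarting automatically when the VM reboots.
# --gpus all passes the Nvidia card through to the container.
docker run -d \
  --name deepstack \
  --gpus all \
  --restart unless-stopped \
  -e VISION-DETECTION=True \
  -p 80:5000 \
  deepquestai/deepstack:gpu
```

The `--restart unless-stopped` policy can stand in for a separate startup rule: Docker itself relaunches the container after a reboot.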
 

Cameraguy

Known around here
Joined
Feb 15, 2017
Messages
1,485
Reaction score
1,122
Would this card work with BI + DeepStack?
Nvidia Quadro K600 1GB DDR3 PCI-E
 

iwanttosee

Pulling my weight
Joined
Dec 27, 2020
Messages
203
Reaction score
186
Location
US
Would this card work with BI + DeepStack?
Nvidia Quadro K600 1GB DDR3 PCI-E
You'd want an Nvidia card with the most CUDA cores you can afford. The K600 only has 192 cores. I honestly wouldn't buy a K600 when the cards below cost about the same:

P400 (256 cores), P600 (384 cores), P620 (512 cores), GT1030 (384 cores)
 
Last edited:

Cameraguy

Known around here
Joined
Feb 15, 2017
Messages
1,485
Reaction score
1,122
You'd want an Nvidia card with the most CUDA cores you can afford. The K600 only has 192 cores. I honestly wouldn't buy a K600 when the cards below cost about the same:

P400 (256 cores), P600 (384 cores), P620 (512 cores), GT1030 (384 cores)
Would the K600 work with BI DeepStack, though?
 

Flintstone61

Known around here
Joined
Feb 4, 2020
Messages
6,587
Reaction score
10,894
Location
Minnesota USA
I'm having trouble linking to eBay... getting a weird error. I'm rebooting the browser. Hopefully I'm not hacked LOL
 
As an eBay Associate IPCamTalk earns from qualifying purchases.

Cameraguy

Known around here
Joined
Feb 15, 2017
Messages
1,485
Reaction score
1,122
I'm having trouble linking to eBay... getting a weird error. I'm rebooting the browser. Hopefully I'm not hacked LOL
Thanks, but I'm looking for a budget card, just enough to relieve a little CPU stress. I was reading, and it looks like the Nvidia K600 won't work.
 

Flintstone61

Known around here
Joined
Feb 4, 2020
Messages
6,587
Reaction score
10,894
Location
Minnesota USA
peeks in from Windows 11 evaluation copy, Build 22581.ni_release 221318-1623
eBay works on this motherfucker. Hmmmm
 

Cameraguy

Known around here
Joined
Feb 15, 2017
Messages
1,485
Reaction score
1,122
I read the Quadro K620 is the lowest card that will work with DeepStack.
 


mark22

n3wb
Joined
Mar 8, 2020
Messages
9
Reaction score
3
Location
nc
The number of quick responses is amazing, but I was really hoping for a bit of data to make a decision. What does "merrier" mean exactly? What do more GPU cores get me? What does more GPU RAM get me?
Does either mean it simply supports more cameras?
Does either mean it returns results faster? (If so, data would be nice to determine the best bang for my buck.)
Does either mean increased accuracy?
Is there a point where more is not "merrier"? If I can spend $10k on a GPU, is it still somehow better than spending $1k? If so, in what way and by how much?

I don't want to drop the cash for the latest and greatest if, for my usage, it doesn't gain anything (or much) over something half the price.
How do CUDA cores translate to performance, and what is impacted by them? Does it matter if I have 1 camera or 100? Similarly, where does GPU RAM fit in here?

I want to offload DeepStack to a GPU at the most reasonable price for performance that is acceptable to me.
Maybe this is still too new and the data I'm looking for doesn't exist yet, or am I being unreasonable in feeling like it should?
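One way to get exactly this kind of data is to benchmark candidate cards directly: DeepStack exposes an HTTP detection endpoint, so a short script can time round trips on whichever card is installed and make the "more cores" question empirical. A rough sketch, where the URL, image path, and run count are placeholders to adjust (`/v1/vision/detection` is DeepStack's documented detection route, and `requests` must be installed):

```python
# Sketch: time DeepStack detections over HTTP so different GPUs can be
# compared on the same image. Placeholders: URL, image path, run count.
import statistics
import time

import requests

def time_detection(url, image_path, runs=20):
    """POST the same image `runs` times; return per-request latencies in ms."""
    with open(image_path, "rb") as f:
        image = f.read()
    latencies = []
    for _ in range(runs):
        start = time.perf_counter()
        resp = requests.post(url, files={"image": image})
        resp.raise_for_status()
        latencies.append((time.perf_counter() - start) * 1000.0)
    return latencies

def summarize(latencies_ms):
    """Return (min, median, max) so runs on different cards compare easily."""
    return (min(latencies_ms),
            statistics.median(latencies_ms),
            max(latencies_ms))

# Usage against a live DeepStack instance, e.g.:
#   lat = time_detection("http://localhost:80/v1/vision/detection",
#                        "snapshot.jpg")
#   print(summarize(lat))
```

Running this once per candidate card (same image, same model setting) would answer the "returns results faster" question directly; accuracy, by contrast, depends on the model, not the card, so a faster GPU should only change latency and how many cameras you can serve per second.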
 