It's only 5GB... might be worth a try.
I would think so. I got a 1030 card, which is about as low as you can go, and it was a significant improvement for DS.
At this time I believe only NVidia cards are compatible with DS. Look at the Tesla series of cards from NVidia.
Thanks.
I think BI might have to seriously consider this, as I understand the Tesla cards are going to be discontinued as well. I believe the AX cards, which are Ampere based, are going to take the place of all existing cards. The great news is the performance hike is massive. The bad news is the starting price jumps from around £110 currently for a P400 to around £699, from what info I can find, for a bottom-of-the-range AX2000. Probably good value for a CAD user. Not so much for CCTV use.
Or consider using the Google Coral like Frigate does: $70 for the USB version and a tenth of the power consumption.
Thanks, but that requires a 2-way connection through my firewall 24/7 and I see that as a vulnerability. In addition, the bandwidth usage could compromise other activities such as gaming, as BI would be sending pictures to the cloud for analysis 24/7 and receiving the results back.
That's my point though: the tools are available to integrate into Blue Iris. DeepStack is using the same libraries that the Coral's machine learning is built on. I am actively using it to do object recognition right now on my cameras.
OK, my bad. However, does BI support this, and is it plug and play? I'm guessing no and no, in which case, whilst it looks like an excellent product, it's somewhat academic.
I am seeing inference speeds around 12ms with 4 cameras ranging from 1080p to 4K using Frigate.
Maybe you can provide some comparison to a Quadro in performance and, if it's better, talk to the BI devs about plug-and-play support for one of the modules.
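For anyone curious what that ~12ms figure corresponds to, here's a rough standalone sketch of a single Edge TPU detection pass using the pycoral library. The model filename is just a placeholder, and Frigate wraps all of this internally, so this isn't Frigate's actual code, just the general shape of one inference call:

```python
# Rough sketch of one Edge TPU detection pass with pycoral.
# The model path is a placeholder; Frigate handles this internally.
import time
from PIL import Image
from pycoral.utils.edgetpu import make_interpreter
from pycoral.adapters import common, detect

interpreter = make_interpreter("ssd_mobilenet_v2_coco_quant_edgetpu.tflite")
interpreter.allocate_tensors()

# Resize the frame to the model's expected input size (typically 300x300)
image = Image.open("frame.jpg").convert("RGB").resize(common.input_size(interpreter))
common.set_input(interpreter, image)

start = time.perf_counter()
interpreter.invoke()
elapsed_ms = (time.perf_counter() - start) * 1000

for obj in detect.get_objects(interpreter, score_threshold=0.4):
    print(obj.id, round(obj.score, 2), obj.bbox)
print(f"inference time: {elapsed_ms:.1f} ms")
```

The detector only ever sees one small resized image per call, which is why the per-inference latency stays in the low milliseconds regardless of the camera's native resolution.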
As many can't code, I'm guessing that unless it can be made plug and play, it's still academic.
A Coral is capable of 100fps. I am running each camera at 5fps as any more FPS does not improve the recognition. It's also built differently than DeepStack: Frigate searches motion zones and only sends the areas with motion for processing. I will sometimes add a 5th camera, but I haven't bothered to get into my attic yet.
How many are triggered simultaneously? How many images are you analyzing? Four cameras is not much of a test, plus if it's Google it's behind the eight ball already to me. I don't trust Google further than I could throw them.
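To illustrate the "only sends areas with motion" part above, here's a rough sketch of that idea using OpenCV background subtraction. Frigate's actual motion detection is its own implementation; the threshold and area values here are made-up numbers just to show the concept:

```python
# Rough illustration of cropping only the moving regions of a frame before
# handing them to a detector. Not Frigate's real pipeline; values are made up.
import cv2

subtractor = cv2.createBackgroundSubtractorMOG2(history=100, varThreshold=32)

def motion_crops(frame, min_area=2000):
    """Return cropped regions of the frame where motion was detected."""
    mask = subtractor.apply(frame)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    crops = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue  # ignore tiny changes (noise, leaves, etc.)
        x, y, w, h = cv2.boundingRect(c)
        crops.append(frame[y:y + h, x:x + w])
    return crops
```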
OK, that's all well and good, but in a normal surveillance situation you're looking at, typically, 10-20 FPS and anything from ten cameras on up. I've currently got a dozen using DS and get excellent ID typically in under 100ms, examining a total of 15 images per trigger.
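For comparison, as far as I understand it, each of those 15 images per trigger is basically one HTTP round trip to DeepStack like the sketch below. This assumes a default local install listening on port 80; adjust the URL and confidence to whatever your instance uses:

```python
# Rough sketch of a single DeepStack detection request, the kind BI fires
# per analyzed image on a trigger. URL assumes a default local install.
import requests

DEEPSTACK_URL = "http://localhost:80/v1/vision/detection"

with open("trigger_frame.jpg", "rb") as f:
    response = requests.post(
        DEEPSTACK_URL,
        files={"image": f},
        data={"min_confidence": 0.4},  # DeepStack-side confidence cutoff
        timeout=5,
    )

for pred in response.json().get("predictions", []):
    print(pred["label"], pred["confidence"],
          pred["x_min"], pred["y_min"], pred["x_max"], pred["y_max"])
```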
Thanks, I'll stick with an integrated package since I already have enough NVidia cards to handle a few hundred cameras with no problem. Zero investment, a smaller equipment footprint and zero Google footprint.
Just curious, how does 5fps work if you set the trigger to analyze, let's say, every 150ms for 5 seconds? Maybe I'm being dumb, but doesn't the fps need to be higher than the interval rate? I've got my cameras set at 8-10fps and trigger at 250ms, for example.
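The way I read it, at 5fps a new frame only arrives every 200ms, so a 150ms analysis interval asks for images faster than fresh frames show up; you'd just end up re-analyzing (or skipping) duplicate frames rather than breaking anything. Quick back-of-the-envelope check, using the numbers from the posts above:

```python
# Compare how many distinct frames exist during a trigger window vs. how many
# images the analysis interval asks for. Numbers are from the posts above.
def frames_vs_requests(fps, interval_ms, window_s):
    frame_period_ms = 1000 / fps
    frames_available = int(window_s * fps)
    images_requested = int(window_s * 1000 / interval_ms)
    return frame_period_ms, frames_available, images_requested

# 5 fps camera, 150 ms analysis interval, 5 second window
print(frames_vs_requests(5, 150, 5))    # (200.0, 25, 33) -> requests outpace new frames
# 10 fps camera, 250 ms interval, 5 second window
print(frames_vs_requests(10, 250, 5))   # (100.0, 50, 20) -> plenty of fresh frames
```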