Seeing 'AI: timeout' events when 2+ cameras are triggered at almost the same time

jaydeel · BIT Beta Team · Joined Nov 9, 2016 · Messages: 1,132 · Reaction score: 1,240 · Location: SF Bay Area

While reviewing my current and past logfiles, I've noticed a significant number of AI 'timeout' events when my "camera pairs" are triggered at almost the same time.

Specifically, my pair of driveway cameras (DW1/DW2) and my pair of front entry and front door cameras (FE/FD).

This goes back for months; I don't know how I never noticed it until now (!?#$@).

I'm beginning to wonder if my PNY NVIDIA Quadro P400 V2 card is sometimes balking when receiving near-simultaneous events.

I'm not sure yet how to test/investigate this. Any ideas?
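One idea I may try: take Blue Iris out of the loop and hit CodeProject.AI directly with two detection requests at the same instant, then compare against a lone request. If the paired requests balloon or error out, that would point at the card/CPAI rather than BI. A rough Python sketch of what I mean, assuming the default CPAI port (32168) and the standard /v1/vision/detection endpoint; snapshot.jpg is a placeholder test image, and the inferenceMs response field may vary by CPAI version:

```python
# Sketch: compare one detection request vs. two fired near-simultaneously.
# Assumes a local CodeProject.AI server on its default port and a test image.
import threading
import time

import requests

CPAI_URL = "http://localhost:32168/v1/vision/detection"
IMAGE = "snapshot.jpg"  # placeholder; point at any still from a camera

def detect(label, results):
    start = time.perf_counter()
    with open(IMAGE, "rb") as f:
        r = requests.post(CPAI_URL, files={"image": f}, timeout=30)
    elapsed_ms = (time.perf_counter() - start) * 1000
    # "inferenceMs" is reported by recent CPAI builds; may differ by version.
    results[label] = (r.status_code, round(elapsed_ms), r.json().get("inferenceMs"))

results = {}

# Baseline: a single request on its own.
detect("solo", results)

# Near-simultaneous pair, mimicking two cameras triggering together.
t1 = threading.Thread(target=detect, args=("pair-1", results))
t2 = threading.Thread(target=detect, args=("pair-2", results))
t1.start(); t2.start(); t1.join(); t2.join()

for label, (status, ms, inf) in results.items():
    print(f"{label}: HTTP {status}, round-trip {ms} ms, inferenceMs={inf}")
```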

jrbeddow · Getting comfortable · Joined Oct 26, 2021 · Messages: 373 · Reaction score: 489 · Location: USA

I have just recently been struggling with a similar issue on the "pair" of cameras that I run CodeProject.AI on, but I run CPU-only, so it's a slightly different situation. I get brief CPU spikes to near (or at) 100% when both cameras send images to CPAI, and as a result I have started to see some short-lived "No signal" drops on the Status > Cameras tab (oddly enough, I never did under DeepStack, though my settings were different in general).

I have been testing a fix that seems to be helping: increasing the buffer size on the stream from each of those two problem cameras. I was at 15 MB, then 25 MB, and am now testing 35 MB today, with promising results, though not fully conclusive until I let it run a few more days like this. The increased buffer size seems to reduce the number of dropouts... fingers crossed it will be the final fix.
It couldn't hurt to try this for your situation, right?
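For sizing the buffer, the back-of-envelope math is just megabytes versus stream bitrate; a quick sketch with assumed numbers (substitute your camera's actual bitrate):

```python
# Rough headroom a receive buffer gives you while decoding is stalled.
# Both numbers below are assumptions; plug in your own.
bitrate_mbps = 16   # camera stream bitrate, megabits per second
buffer_mb = 35      # Blue Iris receive buffer, megabytes

headroom_s = buffer_mb / (bitrate_mbps / 8)   # Mbps / 8 = megabytes per second
print(f"{buffer_mb} MB absorbs ~{headroom_s:.1f} s of stall at {bitrate_mbps} Mbps")
# -> 35 MB absorbs ~17.5 s of stall at 16 Mbps
```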

Edit: What are your typical individual inference times? I am averaging 130 ms on Medium (the default Medium, not "mainstream" or high resolution), CPU-only on an i5-8500 system. I saw your other thread with some very lengthy times; hopefully those are not typical.
 

jaydeel · BIT Beta Team · Joined Nov 9, 2016 · Messages: 1,132 · Reaction score: 1,240 · Location: SF Bay Area

jrbeddow said:
What are your typical individual inference times? I am averaging 130 ms on Medium (the default Medium, not "mainstream" or high resolution), CPU-only on an i5-8500 system. I saw your other thread with some very lengthy times; hopefully those are not typical.
For 'general' model in 'High' mode = 200-250 msec
For 'general' model in 'Medium' mode = 100-150 msec
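Which is partly why this puzzles me: back-of-envelope, even the 'High' numbers should leave plenty of room for two near-simultaneous requests if they simply queue (rough math below, assuming serial processing on the GPU):

```python
# Throughput implied by the single-image inference times above,
# assuming requests are processed one at a time (a serial queue).
for label, ms in (("High", 225), ("Medium", 125)):   # midpoints of the ranges
    print(f"{label}: ~{1000 / ms:.1f} images/s")
# High: ~4.4 images/s, Medium: ~8.0 images/s -- two triggers arriving
# together should finish well inside a multi-second AI timeout unless
# something besides raw inference is stalling.
```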

jrbeddow · Getting comfortable · Joined Oct 26, 2021 · Messages: 373 · Reaction score: 489 · Location: USA

jaydeel said:
For 'general' model in 'High' mode = 200-250 msec
For 'general' model in 'Medium' mode = 100-150 msec
OK, good to know, but I'm surprised you aren't seeing faster times than that. I had considered getting the same Nvidia card at one point, but I will hold out for other options if and when they become available (possibly a Coral USB).