I agree completely. You can save yourself a lot of headaches by simply upgrading your BI PC for a couple hundred bucks and forgetting the GPU. If you are running 10 cams or fewer on AI, I really see no need for a GPU. If you really want to go the GPU route, then do some research and ask which card is working the best. I still run an i7-6700K in CPU mode and have had zero errors; the only time AI restarts is when I update BI (which is always the latest, sometimes not the greatest).
I agree the CPU way is a lot more stable. However, the GPU is a hair faster, and with the amount of detections that I am getting at both locations the GPU makes more sense. Since the last update the other day I have more than 4 million analyzed images just on one of my machines, and have only popped 3 Error:500 codes. The other machine has analyzed just north of 3 million images and shows no errors using the GPU.
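For context on how rare those errors actually are, here is a quick back-of-the-envelope sketch in Python (the counts are just what my logs reported, not an official benchmark):

```python
# Rough error-rate check using the counts quoted above
# (from my own logs, so treat these as ballpark figures).
analyzed = 4_000_000   # images analyzed on the first machine
errors = 3             # Error:500 responses seen in that span

rate = errors / analyzed
print(f"Error rate: {rate * 100:.6f}% of analyzed images")
# Works out to roughly 0.000075% -- a handful of failures in millions of calls.
```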
However, depending on the resolutions of the cameras being used, it is fairly easy to tax even a higher-end CPU with just the transcoding and decoding of the video streams. I use dual streams on my 9-camera system (5820K @ 4.4 GHz all-core, 32 GB RAM, RTX 2060 12GB), and it is not uncommon to see the CPU spike up to 100% usage for short durations if there is a lot of motion going on. I believe that this is mainly due to the decoding of the video for analysis purposes.
The thing that remains puzzling to me is that a GTX 1650 with 4 GB of VRAM shows no errors whatsoever. So I don't think that it is strictly a video card problem, or a VRAM problem anyway.
It does, however, come down to your use case and what you need the machine to do as far as analysis goes. If you are trying to capture fast-moving objects such as cars, then you need to send more frames to be analyzed at a faster pace. I have not broken into ALPR, but I am sure that requires even more images to be sent to get a correct detection. With more images being analyzed in a short duration, the GPU makes more sense to me. Analysis times for the CPU on the above system average 150-175ms; on the GPU in the same system they are averaging 60-75ms.
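To see why those analysis times matter for fast movers, here is a rough throughput sketch in Python using the averages I quoted (single worker, ignoring queuing and decode overhead, so these are illustrative ceilings, not guarantees):

```python
# Back-of-the-envelope throughput from the average analysis times above.
# Assumes one image analyzed at a time; real pipelines may overlap work.
cpu_ms = (150 + 175) / 2   # my observed avg CPU analysis time, ms
gpu_ms = (60 + 75) / 2     # my observed avg GPU analysis time, ms

cpu_ips = 1000 / cpu_ms    # images per second, CPU
gpu_ips = 1000 / gpu_ms    # images per second, GPU

print(f"CPU: ~{cpu_ips:.1f} images/sec  GPU: ~{gpu_ips:.1f} images/sec")
# The CPU tops out around 6 images/sec per worker, while the GPU handles
# roughly 14-15 -- which is why the GPU pulls ahead when a car crosses
# the frame and you need several analyzed frames in under a second.
```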
In my case the GPU made sense, as I was able to pick them up used at a good price before GPU prices hit the fan. For a new setup, I believe that for now the CPU-only approach would be cheaper.
I am going to drop the main rig back to CPU analysis for now and see how the CPU usage compares to the GPU usage, just to get away from the errors. (BI sends me a notification when I get errors.)
**Edit:** Well, that did not take long. Multiple Error:200 (no Error:500, though). Detection times are averaging 550-650ms on the same machine, and the CPU pins 100% usage for the duration of the analysis. Back to GPU lol.