AI code project using cpu

Breazle

n3wb
Nov 16, 2024
13
0
UK
Hi,

I've attached three images of my setup. I have 24 cameras: two 4K, the others ranging down to HD/5MP. My aim was to have the cameras send ONVIF motion detection to Blue Iris and have an RTX 3070 handle the AI detection. The two 4K cameras I want to do motion detection on are triggering via ONVIF, but each time one does my CPU spikes to 100% and some cameras get bottleneck/FPS errors, which I guess is simply the CPU not being able to keep up. When no cameras are triggered the CPU runs at around 30%. Continuous record is on for all cameras, and live view uses sub streams for most. The only reason I can see for the CPU spike is the switch to main stream on triggering, but if I run the camera permanently on main stream the CPU only goes up 5-10%, so I can't see that being the issue either.

Everything runs through gigabit switches, mostly down a single Cat5e cable. Continuous recording is on, so I assume the main streams are always being recorded and it's just the live view where I'm seeing the sub stream.

Any support would be appreciated
 

Attachments

  • alerts.PNG (69.2 KB)
  • code project.PNG (94.9 KB)
  • Capture.PNG (71.3 KB)
Make sure you have all your cameras set up to use sub-streams.


"Use main stream if available" does not improve AI accuracy; it just slows down detection, so make sure you disable it. The only time you should enable "Use main stream if available" is on cameras you are using with the ALPR module.

Lastly, try using Object Detection (YOLOv5 .NET); it is faster than Object Detection (YOLOv5 6.2). I have an RTX 3060, and below are my detection times, so your RTX 3070 should be about the same or faster.

[attached screenshot: detection times]
 
Thanks for the prompt reply! I'll try that now. Am I right in thinking that the main stream is permanently being recorded, though?
 
Made the changes. Sadly, the two 4K cameras I have don't have HD sub streams, only 640x480, so we will see how well the AI detection works now. I remember why I switched from YOLOv5 .NET: I'm not sure of the difference, but it uses GPU (DirectML), not GPU (CUDA). What's the difference there?
 
CodeProject downrezzes the video to lower than that (which is why they say don't use the main stream, since it will just get downrezzed), so you should be fine unless you're trying to do too much within one field of view.
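As a rough illustration of why the main stream buys nothing for detection: a minimal pixel-count sketch, assuming the detector resizes frames to a typical YOLOv5 input of 640x640 (the exact size depends on the module and its settings):

```python
# Rough pixel-count comparison: frames are resized down to the model's input
# size before detection, so a 4K main-stream frame adds no detail for the
# detector -- it only costs more to decode and downscale.

def megapixels(width: int, height: int) -> float:
    """Return the frame size in megapixels."""
    return width * height / 1e6

main_4k  = megapixels(3840, 2160)  # 4K main stream
sub_sd   = megapixels(640, 480)    # the 640x480 sub stream mentioned above
model_in = megapixels(640, 640)    # assumed YOLOv5 input size

print(f"4K frame:    {main_4k:.2f} MP")
print(f"Sub stream:  {sub_sd:.2f} MP")
print(f"Model input: {model_in:.2f} MP")
print(f"The 4K frame has ~{main_4k / model_in:.0f}x more pixels than the model keeps")
```

The sub stream is already close to the model's working resolution, so sending it the 4K frame mostly wastes decode time.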
 
Still getting the issue: the CPU spikes to 100% and lags the other cameras when one camera is triggered. I have unticked "use main stream if available". I have now lowered both cameras' main streams from 4K to 2560, and that seems to help, but I'm not sure why it should or what else to try. If I were to look at buying a new CPU, what part of the CPU is being used: L2 cache, L3 cache, or just lots of threads?
 
Have you performed every single tip HERE?
Pay special attention to the section about excluding BlueIris from any antivirus as per the Blue Iris help file.
What exact CPU does your computer have?
Thanks for the reply.
Yes, I am doing all of those steps. I'm using an i7-7700K with an RTX 3070 for CodeProject.

CPU runs at 30% normally. I have 24 cameras, most either HD or 5 MP; the two I do AI detection on are 4K, and when either of those triggers, the CPU spikes. I have lowered them to 2560 and it is better, but still not great. I'm not sure why the triggering is even impacting the CPU; the AI is utilizing the GPU.
 
Something isn't right. People run 3rd gens with 50 cams at a lower CPU than that.

Are you sure substreams are being used and the computer isn't used for anything else?

Post a screenshot of this BI camera status and slide the slider over to show the sub FPS and key info. The Totals at the bottom are important as well:

[attached screenshot: BI camera status]
 
Nothing explains why lowering the resolution to 2560 from 3840x2160 makes such an impact when triggering, though. I have unticked "use main stream when available" for AI detection, so nothing should be happening on the main stream during triggering.
 
Yeah that is nothing - this thing should be low single digits.

Post a screenshot from Task Manager sorted by CPU%
 
I suspect that you have NOT properly excluded Blue Iris from antivirus; this includes MS Defender.
See the proper section in the Blue Iris help file for more info.

I just installed a system for my sister with 24 4MP cams, an i7-10700, and 32 GB of RAM.
It hums along at 5-8%.
No CodeProject used, only the AI in the cameras.
 
One thing I notice is only one of your cameras has a frame rate ratio of 1.0. All the rest are 0.25, 0.12, 0.50, 0.33, etc., which indicates your cameras' FPS and iframe interval aren't set the same. It's recommended they be the same to get a ratio of 1.0, meaning you store one iframe every second. Log on to each camera's settings page and configure these so they are the same. Most people prefer 15 fps.
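A minimal sketch of the arithmetic, assuming the ratio in that column is simply iframes per second (camera FPS divided by the iframe interval in frames):

```python
# Blue Iris's "key" ratio read as key frames (iframes) per second:
# key_rate = FPS / iframe interval (in frames).
# A ratio of 1.0 means one iframe stored every second.

def key_rate(fps: int, iframe_interval: int) -> float:
    """Key frames per second for a given FPS and iframe interval (frames)."""
    return fps / iframe_interval

print(key_rate(15, 15))   # interval equal to FPS: 1.0 (recommended)
print(key_rate(15, 30))   # interval double the FPS: 0.5
print(key_rate(15, 60))   # 0.25
print(key_rate(15, 120))  # 0.125
```

Matching the interval to the FPS is what brings the ratio up to 1.0.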
 
Thanks, I did get a bit confused with these. Different cameras seem to handle iframe intervals differently, and from what I read, setting them to double the FPS was better for CPU. You're saying it's better for performance to set them the same, then?

I did a full reinstall last night and am now sitting at 6% CPU, so I'm really not sure what changed; everything looks the same. However, the main issue still persists, and I'm not sure if this is normal: if a camera triggers and sends to the AI, the CPU now spikes to about 70% (it was 100% and lagging cameras). What is causing the spike, since the GPU is handling the AI?

Is there also a way to back up all Blue Iris settings and camera info in case I ever need to reinstall?

Thanks for all the continued support!
 

Putting iframes at double the FPS saves on the CPU resources of the camera, not the BI computer.

If you are going to use BI for motion and AI, then you have to match the FPS and iframe.

The BI developer has indicated that for best reliability, sub stream frame rate should be equal to the main stream frame rate, but at worse case be no more than double.

So a camera with a key rate of 0.1 means that if an object can be in and out of your field of view in under 10 seconds, the camera may miss it, since BI motion only activates on an iframe. A 0.25 means an object in and out of the field of view in under 4 seconds could be missed. A value of 0.5 or less is considered insufficient to trust for reliable motion triggers, and below 0.40 BI will throw caution triangles (which your cameras should be showing). Try to run CodeProject.AI on that and it will be useless.
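The miss windows quoted above follow directly from the key rate; a minimal sketch, assuming motion is only evaluated on iframes:

```python
# Worst-case miss window implied by a key-frame rate: if motion detection only
# fires on iframes, an object can pass through the frame unseen for up to
# 1/key_rate seconds (0.1 -> 10 s, 0.25 -> 4 s, as described above).

def miss_window_seconds(key_rate: float) -> float:
    """Longest possible gap (seconds) between iframes at a given key rate."""
    return 1.0 / key_rate

for key in (1.0, 0.5, 0.25, 0.1):
    window = miss_window_seconds(key)
    print(f"key {key:>4}: an object gone within {window:.0f} s may be missed")
```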

Now it sounds like you are using the camera motion detection instead, so it shouldn't be as critical.

Yes you can export out the BI settings - on the main page is an export and then each camera has an export.

Keep in mind, though, that importing will often bring in whatever problem existed that you are trying to eliminate with a reinstall.

Best practice is to rebuild from scratch as you did.

Now, why the CPU is so high on AI is interesting. My thought is that you have too many cameras triggering at once and the GPU can't handle it all.

@MikeLud1 any ideas?
 
Thanks for all the info!
I only have two triggering, but they both trigger a lot together due to rain, especially in the dark with IR. Would the fact that the iframe interval was not correct affect the CPU usage when triggering the AI?
 
Cameras triggering all night due to rain will ramp up the CPU quickly.

Sometimes settings can mitigate it, like a longer make time, changes to zones, etc.

Do none of your cameras have AI in them?
 
Yes, they don't actually trigger. I have two with motion sensing on, both 4K, with AI detection; AI detection has "use main stream if available" unticked. Apologies, I suppose my terminology has been wrong: it's the alert that is spiking the CPU. So a motion-found alert is sent to the AI, the AI says nothing found, so no trigger happens. The AI is set to use the GPU (CUDA) with YOLOv5 6.2, because YOLOv5 .NET used DirectML, whatever that is. The CPU spikes each time motion is seen and the alert is sent to the AI. There is currently a spider in front of one of the two cameras, and each alert spikes the CPU from 7% to around 50%, but again, to clarify, it doesn't trigger, as the AI finds nothing (which is what I would expect).