100% CPU usage when viewing live video with 6 cameras at 30 FPS. I have a Dell T3600 with a Xeon E5-1650, 16 GB DDR3, and a GTX 460 video card (no onboard graphics)

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,666
Reaction score
14,007
Location
USA
Hello.

1) Are you actually using Milestone software?
2) Do you really mean Xeon E5-1650?
3) How much video is being viewed when the CPU is at 100%? Multiple cameras? For each camera, calculate megapixels per second (MP/s) by multiplying its resolution in megapixels by its frame rate. Add together the MP/s values for each camera being viewed. If it is more than several hundred, that would likely be too much for your CPU.
4) Is the video being viewed on the server machine or on some other remote device?
 

besharim

n3wb
Joined
Jan 13, 2020
Messages
19
Reaction score
2
Location
Port Saint Lucie, FL
I am using Milestone XProtect Essential+. The processor is an E5-1650. The cameras are as follows: two 2 MP cameras and four 4 MP cameras, all at 30 FPS at max resolution. The video is being viewed directly on the machine running the software.
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,666
Reaction score
14,007
Location
USA
Ok then.
One 4 MP camera at 30 FPS is 120 MP/s
One 2 MP camera at 30 FPS is 60 MP/s

(120 * 4) + (60 * 2) = 600 MP/s

If Milestone's efficiency is anything like Blue Iris's when rendering video, then this is probably too much. Try reducing frame rates on all cameras to 15 FPS, and if that allows it to run properly below 100% CPU at all times, you can start increasing frame rates again as you see fit.
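If you want to redo this arithmetic for other camera mixes, here is a minimal Python sketch of the same MP/s rule of thumb. The CPU_BUDGET_MPS value is only a placeholder for the "several hundred MP/s" guideline mentioned above, not a measured spec.

```python
# Minimal sketch of the MP/s estimate above. CPU_BUDGET_MPS is a placeholder
# for the "several hundred MP/s" rule of thumb, not a measured figure.

CPU_BUDGET_MPS = 400  # assumed comfortable software-rendering budget; adjust for your CPU

# (megapixels, frames per second, number of cameras)
cameras = [
    (2, 30, 2),  # two 2 MP cameras at 30 FPS
    (4, 30, 4),  # four 4 MP cameras at 30 FPS
]

total_mps = 0
for mp, fps, count in cameras:
    mps = mp * fps * count
    print(f"{count} x {mp} MP @ {fps} FPS = {mps} MP/s")
    total_mps += mps

print(f"Total: {total_mps} MP/s")
if total_mps > CPU_BUDGET_MPS:
    print("Probably too much for CPU-only rendering; try lower frame rates.")
```

For the six cameras in this thread it prints a total of 600 MP/s, matching the calculation above.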
 

besharim

n3wb
Joined
Jan 13, 2020
Messages
19
Reaction score
2
Location
Port Saint Lucie, FL
Do you recommend using a processor that supports Intel Quick Sync?
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,897
Reaction score
21,250
What is the resolution of the monitor you are using?
 

sebastiantombs

Known around here
Joined
Dec 28, 2019
Messages
11,511
Reaction score
27,690
Location
New Jersey
In terms of bang for your buck, I'd say pick up a used PC with a Win10 license. While an Nvidia card is the quick way, they do eat a lot of power and they're limited in how many cameras they can handle, from what I've seen. I say this as an Nvidia user.
 

besharim

n3wb
Joined
Jan 13, 2020
Messages
19
Reaction score
2
Location
Port Saint Lucie, FL
What kind of camera limits do you see with the Nvidia cards?
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,666
Reaction score
14,007
Location
USA
It depends on the card, the video codecs, resolutions, frame rates, and maybe even bit rates. There's not a lot of real data on this. In Blue Iris my RTX 2080 Ti maxed out at about 1900 MP/s of H.264 video; however, turning off hardware acceleration did not change that 1900 MP/s limit, so I think I was limited by memory bandwidth. A low-end card like a GT 1030 will max out much lower, perhaps in the 400-500 MP/s range. I don't know whether the performance scaling favors cheaper cards or more expensive ones. Similarly, I can't say how Intel's hardware acceleration scales between CPUs, but you can get to about 1500 MP/s on an i7-8700K, and that is more than I'd recommend loading it with.
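To put those ballpark numbers next to the six-camera load from earlier in the thread, a rough comparison could look like this. A sketch only: the MP/s ceilings below are the anecdotal figures from this post, not vendor specs or measurements.

```python
# Compare the camera load against rough decode ceilings. The ceilings are the
# anecdotal MP/s figures quoted in this post, not official specs.

decode_budgets_mps = {
    "RTX 2080 Ti (NVDEC)": 1900,
    "GT 1030 (NVDEC, est.)": 450,
    "i7-8700K (Quick Sync)": 1500,
}

# Six cameras from this thread: two 2 MP + four 4 MP, all at 30 FPS
total_mps = (2 * 30 * 2) + (4 * 30 * 4)  # = 600 MP/s

print(f"Camera load: {total_mps} MP/s")
for device, budget in decode_budgets_mps.items():
    headroom = budget - total_mps
    verdict = "fits" if headroom > 0 else "over budget"
    print(f"{device}: ~{budget} MP/s -> {verdict} ({headroom:+d} MP/s headroom)")
```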
 

thendawg

Getting the hang of it
Joined
Dec 17, 2018
Messages
37
Reaction score
29
Location
OKC, OK
I think your issue is viewing the video on the machine it's running on, which has a fairly old GPU that likely can't hardware-decode the streams. Using the CPU to render video is very inefficient. So it's not actually the NVR that's loaded, but the client rendering the streams. If you're running the Windows Smart Client, the NVR shouldn't have to transcode anything. I monitor my cameras from two Windows PCs, both with 10-series GPUs, and my NVR idles at a nice 5-10% CPU utilization. Now if I fire up the mobile app, it'll easily hit 100% with more than one stream going, since it has to transcode. If you really want to watch the feeds on the NVR itself, just buy a more modern GPU, at least a 9- or 10-series; then you also get the advantage of using it for hardware transcoding for other (mobile) streams :)

*I base all of this on my experience with H.264; no idea what happens with H.265
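If you want to confirm whether the client is actually decoding on the GPU rather than the CPU, watching the NVDEC utilization while live view is open is one way to tell. A hedged sketch, assuming nvidia-smi is on the PATH and that your driver exposes the decoder/encoder utilization query fields (very old cards such as a GTX 460 may just report "Not Supported"):

```python
# Print per-GPU decoder/encoder utilization while live view is running.
# Assumes nvidia-smi is installed; older cards/drivers may report "[Not Supported]".
import subprocess

result = subprocess.run(
    [
        "nvidia-smi",
        "--query-gpu=name,utilization.gpu,utilization.decoder,utilization.encoder",
        "--format=csv,noheader",
    ],
    capture_output=True,
    text=True,
    check=True,
)

for line in result.stdout.strip().splitlines():
    name, gpu, dec, enc = (field.strip() for field in line.split(","))
    print(f"{name}: GPU {gpu}, decoder {dec}, encoder {enc}")
```

If the decoder utilization stays at 0% while you're viewing the cameras, the streams are being decoded in software on the CPU.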
 