XProtect Smart Client + GPU acceleration

adamg

Pulling my weight
Joined
Sep 19, 2017
Messages
250
Reaction score
129
outlawzz, have you considered Genetec Security Center? It supports GPU acceleration very well.
 

outlawzz

n3wb
Joined
Apr 1, 2017
Messages
10
Reaction score
0
outlawzz, have you considered Genetec Security Center? It supports GPU acceleration very well.
Hey adamg, no, not really; we've already made a significant investment in Milestone. I'm keeping my hopes up that they start supporting GPU acceleration real soon. What kind of performance are you seeing with Genetec?

Sent from my SM-N920V using Tapatalk
 

adamg

Pulling my weight
Joined
Sep 19, 2017
Messages
250
Reaction score
129
Unfortunately I've not had the opportunity to build a powerful 'gaming'-style computer for use as a video workstation / wall, so I don't have good data for you.
 

LittleScoobyMaster

Getting the hang of it
Joined
Jul 24, 2015
Messages
229
Reaction score
24
Hey adamg, no, not really; we've already made a significant investment in Milestone. I'm keeping my hopes up that they start supporting GPU acceleration real soon.
Me too. I just installed 2017 R3 and it doesn't seem to support it yet either. Maybe in 2018... I'd say this is probably one of the biggest shortcomings of XProtect at the moment.
 

adamg

Pulling my weight
Joined
Sep 19, 2017
Messages
250
Reaction score
129
Are you sure Milestone XProtect Client doesn't use GPU acceleration? I have run a GPU monitoring program while running the Client, and there is definitely load put on the GPU. And on the Client's Settings page, under Advanced, there is a setting called Hardware Acceleration that can be enabled/disabled.
 

LittleScoobyMaster

Getting the hang of it
Joined
Jul 24, 2015
Messages
229
Reaction score
24
Are you sure Milestone XProtect Client doesn't use GPU acceleration? I have run a GPU monitoring program while running the Client, and there is definitely load put on the GPU. And on the Client's Settings page, under Advanced, there is a setting called Hardware Acceleration that can be enabled/disabled.
Milestone XProtect Smart Client does support Intel GPU acceleration. The problem is that it only supports Intel GPUs, not Nvidia's, and Nvidia GPUs are much more powerful than Intel's.

On the Advanced menu, if you select Video Diagnostics Overlay and set it to Level 2 or higher, it will show the current status of hardware acceleration, but it will say 'off' if you don't have an Intel GPU on your motherboard (no matter which Nvidia card you have installed in your system).

When I monitor my Nvidia GPU while running the XProtect Smart Client, I still see a small amount of GPU utilization, but not much. For example, I currently show about 8% GPU utilization and 92% CPU utilization (8 cameras, 30fps, older Intel i7-2600K CPU, Nvidia GTX 1080Ti).
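
For anyone who wants to capture the same numbers, here's a rough Python sketch of how I'd log them (nothing Milestone-specific; it just assumes nvidia-smi is on your PATH and psutil is installed):

# Utilization logger: samples overall CPU (psutil) and Nvidia GPU
# (nvidia-smi) once a second while the Smart Client is running.
# Ctrl+C to stop.
import subprocess

import psutil

def gpu_utilization() -> int:
    # Query Nvidia GPU utilization (%) via the nvidia-smi CLI.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])

while True:
    cpu = psutil.cpu_percent(interval=1)  # averaged over a 1-second window
    print(f"CPU {cpu:5.1f}%   GPU {gpu_utilization():3d}%")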

The reason it uses such a small amount of my Nvidia GPU is that it is only using it for rendering, not decoding (top of page 5): https://www.milestonesys.com/files/General/MilestoneGuides/MilestoneXProtectSmartClient_HardwareAccelerationGuide_en-US.pdf

They are supposedly working on the issue, and they have a demo of it working, but I don't think we can download it yet:

20170403-NVIDIA

"COPENHAGEN – April 3, 2017. Milestone Systems, the global number one* open platform company in networked video management software (VMS), is collaborating with NVIDIA to provide the next level of hardware acceleration and video processing services in monitoring management. Milestone XProtect will leverage NVIDIA GPUs and the CUDA parallel computing platform and programming model to provide parallel processing capabilities of recording servers, mobile servers and other video processing services."

It doesn't specifically mention the Smart Client, but many users have noticed that when the XProtect servers are less burdened, the connecting clients perform better, so any offloading onto the GPU should be noticeable on the client machine. Still, it would be best if the client machine could utilize Nvidia GPUs directly, I would imagine.
 

adamg

Pulling my weight
Joined
Sep 19, 2017
Messages
250
Reaction score
129
I was determined to go prove you wrong with screenshots, but it turns out you are correct. My XProtect Client PC with just an Intel CPU shows YES for hardware acceleration on the overlay. My XProtect Client PC with an NVIDIA K620 GPU shows NO for hardware acceleration on the overlay.
 

LittleScoobyMaster

Getting the hang of it
Joined
Jul 24, 2015
Messages
229
Reaction score
24
I was determined to go prove you wrong with screenshots, but it turns out you are correct. My XProtect Client PC with just an Intel CPU shows YES for hardware acceleration on the overlay. My XProtect Client PC with an NVIDIA K620 GPU shows NO for hardware acceleration on the overlay.
What CPU and GPU utilization do you get on the Nvidia workstation when you view all your cams?
 

LittleScoobyMaster

Getting the hang of it
Joined
Jul 24, 2015
Messages
229
Reaction score
24
Here are the performance numbers with XProtect Client displaying 6x 1080p cameras.
That is very similar to what I get for 6 cams. Once you add 2 more 1080p or higher-resolution cams, you'll probably be at about 90%+ CPU utilization as well. I usually sustain around 95% or so when I view all 8 cams at the same time (currently 1 is a 4K cam, the rest are 1080p). It drops down to about 10 to 15% for a single-cam view, but I rarely view only 1 cam. Are you doing 30fps on those cams?
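
For what it's worth, the load looks close to linear in camera count on my box. Here's a quick sanity check in Python, just my own observed numbers plugged into a linear model (nothing Milestone publishes):

# Fit overhead + per_cam * n to two observations from my i7-2600K:
# ~95% CPU at 8 cams and ~12% CPU at 1 cam (1080p/30fps except one 4K).
cams_8, load_8 = 8, 95.0   # observed: 8 cams -> ~95% CPU
cams_1, load_1 = 1, 12.0   # observed: 1 cam  -> ~10-15% CPU

per_cam = (load_8 - load_1) / (cams_8 - cams_1)  # ~11.9% CPU per camera
overhead = load_1 - per_cam                      # ~0.1% fixed cost

for n in (1, 6, 8):
    print(f"{n} cams -> ~{overhead + per_cam * n:.0f}% CPU (estimated)")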
 

adamg

Pulling my weight
Joined
Sep 19, 2017
Messages
250
Reaction score
129
There is some variation in frame rate. Some of the cameras max out at 20fps, some at 60fps.
 

adamg

Pulling my weight
Joined
Sep 19, 2017
Messages
250
Reaction score
129
Here you go. NVIDIA GPU acceleration is coming, about 45 days from now:
Get a first peek at our first release of 2018


Milestone XProtect Smart Client does support Intel GPU acceleration. The problem is that it only supports Intel GPUs, not Nvidia's, and Nvidia GPUs are much more powerful than Intel's.
 

MR2

Getting the hang of it
Joined
Jan 25, 2016
Messages
91
Reaction score
33
Miracles do happen... eventually

Hopefully this means a lot of guys can take a good portion of the decoding load off their servers.
 

LittleScoobyMaster

Getting the hang of it
Joined
Jul 24, 2015
Messages
229
Reaction score
24
I was determined to go prove you wrong with screenshots, but it turns out you are correct. My XProtect Client PC with just an Intel CPU shows YES for hardware acceleration on the overlay. My XProtect Client PC with an NVIDIA K620 GPU shows NO for hardware acceleration on the overlay.
This appears to have just been resolved as of today on my machine with the new XProtect 2018 R1 Smart Client. If you upgrade to 2018 R1, which just came out, you should now see 'Hardware Acceleration' listed as Nvidia instead of NO, and it appears to be working so far for me.

So far I have noticed a pretty big drop in CPU usage since installing the update. Previously my 8 cameras were taking up 90% CPU; now, after today's update, only about 30% CPU is being used. This is on an i7-2600K, so the 2018 R1 update really seems to be helping. I'm curious whether other Nvidia users are noticing similar improvements as well.

Mileage will vary. The test system I'm using is monitoring one 4K and seven 1080p 30fps cams.
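
If anyone wants to put a number on the before/after rather than eyeballing Task Manager, a one-minute CPU average is less noisy. A minimal Python sketch, assuming psutil is installed (run it once on the old client and once on 2018 R1 with the same camera view open):

# Average overall CPU over 60 one-second samples.
import psutil

SAMPLES = 60
readings = [psutil.cpu_percent(interval=1) for _ in range(SAMPLES)]
print(f"mean CPU over {SAMPLES}s: {sum(readings) / len(readings):.1f}%")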
 

richtj99

Getting the hang of it
Joined
May 11, 2016
Messages
163
Reaction score
17
Would a 6th-gen i7 with 32 GB of RAM running XProtect be able to handle a 50-camera server with roughly 40x 2MP, 5x 3MP, and 5x 4MP cameras? Or would that overwhelm a single server?
 

adamg

Pulling my weight
Joined
Sep 19, 2017
Messages
250
Reaction score
129
It should be able to handle it, as long as you keep the camera bitrates under control. If your cams are all putting out more than 5 Mbps, you might start having performance issues. If your cams are well tuned and averaging 400 kbps, then you'll easily cover it.
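
A quick back-of-envelope on the aggregate bitrate, in Python, using the rough per-camera figures above (assumed numbers, not measurements from your cams):

# 50-camera build: 40x 2MP + 5x 3MP + 5x 4MP.
cameras = 40 + 5 + 5

for label, mbps_per_cam in [("untuned (~5 Mbps/cam)", 5.0),
                            ("well tuned (~400 kbps/cam)", 0.4)]:
    total_mbps = cameras * mbps_per_cam
    print(f"{label}: {cameras} cams -> {total_mbps:.0f} Mbps aggregate")
# Untuned: ~250 Mbps to ingest and write; well tuned: ~20 Mbps.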
 