XProtect Smart Client + GPU acceleration

Andreas_H

n3wb
Joined
Feb 25, 2016
Messages
3
Reaction score
0
Hello,

I wonder if someone here has more details about XProtect Smart Client 2016 and GPU acceleration.

I have a new setup with 5x Hikvision DS-2CD2642FWD (4 MP) cameras and a client PC with a Core i5-6400 and 8GB RAM. Two monitors are connected to the onboard video. When I try to view all 5 cameras at once at full resolution, the CPU load is maxed out and the video stutters, even though, according to the diagnostics overlay, hardware acceleration is on. I had to reduce the resolution to FHD in order to get fluid frame rates at ~75% CPU.
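
For a rough sense of the decode load, here is a back-of-the-envelope sketch (Python; the 20 fps figure is an assumption, I have not pinned down my exact frame rate):

cams = 5
width, height, fps = 2688, 1520, 20   # DS-2CD2642FWD 4 MP main stream; fps assumed
mpix = cams * width * height * fps / 1e6
print(f"aggregate decode load: {mpix:.0f} Mpixels/s")
# ~409 Mpixels/s, roughly equivalent to 6-7 simultaneous 1080p30 streams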

The reseller where I purchased XProtect was not much help; he claimed a dedicated video card would help. AFAIK, XProtect does not support any type of GPU acceleration apart from Intel's QuickSync, right? If so, what could/should I expect from my CPU? What would be necessary to increase frame rates?

Does the CPU clock make any difference, or only the GPU clock? If only the GPU clock matters, an i5-6600 would be much cheaper than an i7. Would Iris Pro graphics make any difference? It is currently not available in consumer socket 1151 CPUs, so I would probably have to wait.

Thank you very much,

Andreas
 

bart

n3wb
Joined
Mar 11, 2014
Messages
11
Reaction score
9
I'm puzzled why they would limit the hardware acceleration to just Intel. I just updated my XProtect Go from 2014 to 2016, and the choppy video went from terrible to horrendous. Live video is barely watchable, and playback used to be smooth; now even playback is choppy. XProtect 2016 took a huge step backwards in quality. My recording server is an Intel Core i7-3770K with 32GB of RAM and a battery-backed write cache for performance. My viewing station is an i7-5960X Extreme Edition, 32GB of RAM, and a FirePro W8100. My resources are barely used, yet Milestone runs like complete crap. This is absolute insanity, and this is exactly why I would never pay for this software. It's junk.
 

Andreas_H

n3wb
Joined
Feb 25, 2016
Messages
3
Reaction score
0
After doing some more research on the topic, I understand a little better why they chose to support Intel QuickSync in the first place. While Intel GPUs are generally still slower than dedicated GPUs from Nvidia or AMD, they are apparently much faster at H.264 decoding. If the figures I found are correct, an Intel HD 4600 is more than twice as fast as a GTX 960 when decoding Full HD H.264.

On the other hand, hardware acceleration using a slower card would probably still be better than no hardware acceleration at all. I am not a programmer, so I cannot tell how difficult it is to support different GPU architectures. But as it seems, XProtect currently offers only Intel GPU or software decoding, and there are no plans to change this. Looks like your FirePro is more or less useless in this case. :-(

Why even software decoding has gotten worse, I have no idea. I never used XProtect 2014, so I cannot tell the difference.
 

LittleScoobyMaster

Getting the hang of it
Joined
Jul 24, 2015
Messages
229
Reaction score
24
But as it seems, XProtect currently offers only Intel GPU or software decoding, and there are no plans to change this.
Holy Sh** Balls!

I never thought I'd see the day when Geovision would pull one over on XProtect, but apparently they have. In the GV-VMS update for February, they added Nvidia graphics cards for GPU decoding. That means if you have GV-VMS and happen to be able to pick up a brand-new Nvidia GTX 1070 or 1080 this launch weekend, your camera streams will be screaming fast. At least, one would hope. It is Geovision, after all, so who really knows...

I've been wondering why Geovision has been doing the same thing as XProtect by not supporting GPU decoding/acceleration for Nvidia graphics cards, and then, lo and behold, Geo does it, and before XProtect. Go figure.

Does anyone know when Xprotect will finally support GPU decoding for Nvidia cards?

It must be in the works by now, one would hope.

I've been considering switching from Geo to XProtect for some time now, but you guys are making me not want to switch. It sounds like XProtect is even more of a CPU hog than Geo. I haven't tried XProtect since 2014, but I was hoping to try 2016 this weekend. I may give it a go just to see if my results are equally dismal.

I'm running an i7-6700K with 16GB of memory, Intel HD Graphics 530, and a GTX 650 Ti, using this machine for both recording and live view. 5 cams: 2 @ 1920x1080, 2 @ 1280x960, and 1 @ 1280x720, all at 30 fps. I usually get about 35% GPU load and 40% CPU load while viewing on one 4K TV and one 1280x1024 monitor. Only running GV-NVR though; I haven't installed GV-VMS yet.

With this hardware I still get the herky-jerkies quite often. I think it's because of bad Geo programming, but who knows, maybe I'm bottlenecked somewhere and don't know it. I've been considering upgrading the GPU, but I wanted to try XProtect 2016 first for comparison.

Geovision GV-VMS Version 15.10.1.0
Release date: 02/22/2016
New Features:
Support for Windows 10
Support for GV-ASManager / GV-LPR Plugin V4.3.5.0
Support for GV-Web Report V2.2.6.0
Support for GPU decoding with external NVIDIA graphics cards
Support for dual streaming of GV-Fisheye Camera
Support for H.265 codec

Funny. I was just checking out Passmark scores, and the GTX 1080 just popped in there within the last half hour. Top of the chart.

(Attached image: gpu-specs.JPG, the Passmark GPU chart)
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,896
Reaction score
21,247
Putting a 1080 card in the system will cause the power consumption to go through the roof. The efficiency of Intel HD is what makes it desirable.
 

LittleScoobyMaster

Getting the hang of it
Joined
Jul 24, 2015
Messages
229
Reaction score
24
Putting a 1080 card in the system will cause the power consumption to go through the roof. The efficiency of Intel HD is what makes it desirable.
Indeed, on-board video draws very little power. Sometimes it just doesn't have the performance, though. Always a trade-off. I don't know how many 4K displays and 4K streams you would need to bog down a 1070 or 1080; it would be fun to test, especially for a video wall.

I know that my Intel HD 530 can't handle much, though. When I monitor it with GPU-Z, just 4 streams on my 4K TV put it at over 30% GPU load, and that's with 2 streams that are less than 1080p. I think if I added a few 4K streams and extra 4K monitors, my 530 could become a bottleneck, but I'm also using Geovision, which has some of the worst programming I've ever seen. It does support GPU decoding, though, so I'm at least getting that benefit.

GV-NVR had a helpful setting where higher-resolution streams could be displayed at a lower resolution in live-view mode, but they haven't kept up with the higher resolutions, so if you have a bunch of 4K streams, it still takes the processing power of a full 4K stream to display them in a smaller 1080p tile.
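
Just to put numbers on why displaying a 4K stream in a smaller tile doesn't help: the decoder still has to chew through the full 4K pixel rate, and any scaling happens after decode. A quick sketch (Python, assuming 30 fps):

# Decode cost depends on the stream resolution, not on the tile it is drawn in.
decoded   = 3840 * 2160 * 30   # pixels/s the decoder must process for one 4K30 stream
displayed = 1920 * 1080 * 30   # pixels/s actually shown in a 1080p tile
print(decoded / 1e6, "Mpixels/s decoded vs", displayed / 1e6, "Mpixels/s displayed")
# ~249 Mpixels/s decoded either way; only a real substream reduces the decode cost.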

Also, it sucks how much more power 4K TVs can draw just due to the pixel density, but once you try 4K, you usually don't want to go back. With surveillance cameras you really notice the detail improvement.

http://money.cnn.com/2015/11/17/technology/4k-tv-energy-cost/
 

CamFan

Getting the hang of it
Joined
May 25, 2014
Messages
143
Reaction score
65
Location
California
I set up Milestone to record the full-resolution main stream, and monitoring is done on the second stream at lower resolution. This allows me to view all cameras smoothly.
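
In case it helps: on Hikvision cameras like the DS-2CD2642FWD mentioned above, the substream is usually exposed as RTSP channel 102 (101 is the main stream). A quick way to sanity-check it outside the VMS (Python with OpenCV; the address and credentials are placeholders, and the URL path can vary by firmware):

import cv2  # pip install opencv-python

# Placeholder IP/credentials; channel 102 = substream on most Hikvision firmwares.
url = "rtsp://admin:password@192.168.1.64:554/Streaming/Channels/102"
cap = cv2.VideoCapture(url)
ok, frame = cap.read()
if ok:
    print("substream resolution:", frame.shape[1], "x", frame.shape[0])
else:
    print("could not read from substream")
cap.release()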
 

LittleScoobyMaster

Getting the hang of it
Joined
Jul 24, 2015
Messages
229
Reaction score
24
I set up Milestone to record the full-resolution main stream, and monitoring is done on the second stream at lower resolution. This allows me to view all cameras smoothly.
Yeah, it really depends on what you are trying to accomplish. I'm not sure what most people are doing, but presently I'm trying to get the best possible live-view video on a 4K set, so I have the main streams set for live view. I'm hoping to get a 4K cam soon to have a native 4K stream; when that cam isn't full screen, the layout would be close to quad 1080p streams.

I've been doing a few benchmarks. I've found that I usually cannot get by with just on-board Intel QuickSync video, so I have to bite the bullet and pay more for electricity to power discrete graphics cards. Presently, in this config, I see about 30% to 35% GPU usage with 2 1080p, 2 960p, and one 720p stream on a GTX 650 Ti spanned across 2 screens at 1080p (30 fps). In 4K mode the GPU bumps up to 40-45% load (and that's without a 4K stream; once I add that, I know I will go over this card's capability). Currently I still get some chop here and there, but I'm also only running 4K at 30 Hz and not 60 Hz (one of the older 4K sets that can't do 60 Hz). I'm not sure if the refresh rate is coming into play or not, though. I mean, with 5 streams each running at 30 fps, I don't know if they are all synced to the 30 Hz refresh or if it even matters. I don't care about screen tearing so much, but if I were dropping frames, I would care about that more.

It's hard for me to understand what happens when you have multiple streams that, combined, deliver more than 30 fps, because a 30 Hz display can only show 30 frames per second. In a typical case like a game, you would only see 30 fps at 30 Hz, but with multiple streams on a 30 Hz display, I would think frames would have to get dropped unless every camera were synced to the refresh rate. Or maybe when camera 1 gets to frame 30, camera 2 is really on frame 28, and so on. That's all unclear to me.
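
I tried to convince myself with a toy simulation (pure Python; the phase offsets are made up): each tile simply shows whatever its newest decoded frame is at every refresh, so nothing has to be synced, and at 30 fps on a 30 Hz display each stream loses at most the odd frame to phase drift.

import math, random

refresh_hz, fps, seconds = 30.0, 30.0, 10
ticks = int(seconds * refresh_hz)

for cam in range(5):
    phase = random.uniform(0.0, 1.0 / fps)   # cameras are not synced to the display
    shown = set()
    for i in range(ticks):
        t = i / refresh_hz
        latest = math.floor((t - phase) * fps)   # newest frame decoded by time t
        if latest >= 0:
            shown.add(latest)
    print(f"cam {cam}: produced {int(seconds * fps)}, showed {len(shown)} distinct frames")

Only when a stream's fps exceeds the refresh rate are frames guaranteed to be skipped; at matching rates it's a repeat or skip here and there, not wholesale dropping.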
 

MR2

Getting the hang of it
Joined
Jan 25, 2016
Messages
91
Reaction score
33
I'm puzzled why they would limit the hardware acceleration to just Intel.
Because Intel is in just about everything. Intel, as crap as they are, is likely to be present, unlike having to start supporting all the AMD and Nvidia cards out there. I don't know what you're doing with your system there; my system doesn't have half your specs and it runs just fine :|
 

LittleScoobyMaster

Getting the hang of it
Joined
Jul 24, 2015
Messages
229
Reaction score
24
Because Intel is in just about everything. Intel, as crap as they are, is likely to be present, unlike having to start supporting all the AMD and Nvidia cards out there. I don't know what you're doing with your system there; my system doesn't have half your specs and it runs just fine :|
If you don't mind, can you list the resolutions of your cameras that you are using with Xprotect Smart Client?
 

kurdi

n3wb
Joined
May 12, 2017
Messages
1
Reaction score
1
A tip to anyone trying to build XProtect client machines: we've managed to get great performance out of Intel's Skull Canyon NUC. With its Iris Pro 580 on an i7-6700HQ, we can run 25 Full HD H.264 streams without a hiccup.
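
For scale, that works out to roughly the following aggregate decode rate (back-of-the-envelope; the 25 fps PAL figure is an assumption):

# ~1296 Mpixels/s if all 25 Full HD streams run at an assumed 25 fps
streams, w, h, fps = 25, 1920, 1080, 25
print(streams * w * h * fps / 1e6, "Mpixels/s aggregate")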
 

LittleScoobyMaster

Getting the hang of it
Joined
Jul 24, 2015
Messages
229
Reaction score
24
A tip to anyone trying to build XProtect client machines: we've managed to get great performance out of Intel's Skull Canyon NUC. With its Iris Pro 580 on an i7-6700HQ, we can run 25 Full HD H.264 streams without a hiccup.
How about using Nvidia GPUs?

Milestone Delivers Highest Performance in VMS Using NVIDIA GPUs - BCDVideo

https://www.milestonesys.com/newsletters/reseller_/emea/2017/June/milestone-nvidia-dell-at-ifsec-2017/

I have an older i7-2600K / GTX 1080 Ti running the XProtect Essential Smart Client, but I'm unable to activate hardware acceleration with the card. I thought the 2017 R2 version was supposed to support it, but so far no luck. (And the 2600K's integrated Intel GPU is too old to be supported.)

Curious whether anyone has been able to activate hardware acceleration in XProtect with an Nvidia GPU yet. Currently I am at 90%+ CPU with hardware acceleration off.
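
For anyone else debugging this: the first thing worth checking is whether Windows even exposes an Intel GPU to applications (with a discrete card installed, the iGPU is often disabled in the BIOS). A quick check that just shells out to the standard wmic tool (Python sketch):

import subprocess

# List every video adapter Windows knows about; if no "Intel" entry appears,
# QuickSync decoding cannot work no matter what the VMS does.
out = subprocess.run(
    ["wmic", "path", "win32_VideoController", "get", "name"],
    capture_output=True, text=True,
).stdout
print(out)
print("Intel GPU visible:", "Intel" in out)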
 

outlawzz

n3wb
Joined
Apr 1, 2017
Messages
10
Reaction score
0
What I understood from the guy doing that demo at the ISC show was that this feature was going to be available in the 2018 versions of the VMS; I don't know if anyone else heard that. Currently the only way I see to get hardware acceleration is through a QuickSync-enabled processor. It actually works pretty well. At work I need to replace a few multi-monitor workstations, so I'm kind of on the fence about which way to go. Buy new PCs with high-end video cards now and hope Milestone comes through on direct Nvidia GPU processing? Or continue with limited video output via QuickSync?

Sent from my SM-N920V using Tapatalk
 

Andreas_H

n3wb
Joined
Feb 25, 2016
Messages
3
Reaction score
0
The press release linked above is full of marketing speak and doesn't give any substantial hints about the performance of NVIDIA GPU decoding vs. Intel QuickSync decoding.
Note that video decoding is completely separate from the GPU's 3D acceleration; it isn't run on the shader units/clusters, of which the high-end NVIDIA cards undoubtedly have more. AFAIK, all consumer NVIDIA cards, including the 1080, have a single unit for video decoding/encoding. It can decode multiple streams, as long as the resolution/bit rate allows it, but it's still a single unit. That's why, in all the figures I've seen, Intel QuickSync is even faster at video decoding than NVIDIA or AMD.

Take a look here: Video Encode and Decode GPU Support Matrix

Look at the columns "# of chips" and "# of NVENC/Chip". Unfortunately the list doesn't include "normal" GeForce cards, but you can still see that you would probably need a decent Quadro or Tesla card to get multiple decoders, and I doubt Milestone will support those.

I have started to solve my problem by using cameras that offer a secondary stream at lower resolution for live viewing. After all, if you scale six camera streams down to fit on an FHD screen, each image gets only about 640x480, which is the resolution of the secondary stream. Now I need to find a solution for easily switching to the primary stream when "zooming in" on a single camera on a second screen.
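
The switching logic itself seems simple enough; something along these lines (a hypothetical sketch, not anything from Milestone's API; the resolutions are from my setup):

# Hypothetical stream selection: use the low-res substream while a camera sits
# in a grid tile, and switch to the main stream when it is blown up.
def pick_stream(tile_w, tile_h, sub=(640, 480), slack=1.3):
    # The substream suffices as long as it covers the tile within some slack.
    if sub[0] * slack >= tile_w and sub[1] * slack >= tile_h:
        return "secondary"
    return "primary"

for cols, rows in [(3, 2), (1, 1)]:        # six-up grid vs. a single full view
    w, h = 1920 // cols, 1080 // rows      # tile size on a Full HD screen
    print(f"{cols}x{rows} grid, tile {w}x{h}: {pick_stream(w, h)}")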
 

LittleScoobyMaster

Getting the hang of it
Joined
Jul 24, 2015
Messages
229
Reaction score
24
I have started to solve my problem by using cameras that offer a secondary stream at lower resolution for live viewing. After all, if you scale six camera streams down to fit on an FHD screen, each image gets only about 640x480, which is the resolution of the secondary stream. Now I need to find a solution for easily switching to the primary stream when "zooming in" on a single camera on a second screen.
Geovision used to do this very well. I haven't had much luck doing it with the free Milestone Essential version yet, however. I thought I saw one license item mentioning that this feature may only be available in the more expensive versions, but I don't know if that's true without tracking that info down again. I should look for it, though.

That would help my situation quite a bit. Geovision had it set up so that if you were using a higher-resolution 4K TV, it would know when to switch to the 4K stream, and it would switch back when you activated panels. Does XProtect do that as well? Otherwise a 640x480 stream blown up to 4K would be stretched quite a bit and not look the greatest.
 

LittleScoobyMaster

Getting the hang of it
Joined
Jul 24, 2015
Messages
229
Reaction score
24
The more I use Milestone Xprotect, the more I start to hate it.

I don't understand why they can't just do Nvidia GPU decoding like other vendors do. Even Geovision:

USAVision / GeoVision Support Forum • View topic - Supported GPU for Geovision IP Cameras on Gv-VMS

I have seen on so many other vendors' websites that they are implementing GPU decoding with Nvidia cards. Even Milestone says they are going to, but it seems like they aren't delivering.

Many of Intel's newer CPUs don't even include Quick Sync anymore.

For example, today I want to upgrade my workstation to an i9-7900X, but it doesn't include Quick Sync, and because of that, hardware acceleration will fail in Milestone XProtect Essential+. Sad.

I know my preferences are different from most people's, however. I prefer amazing-looking live streams. But also, that XProtect view where it can display live streams at a lower res: I swear that was limited to the higher-priced XProtect products. Even certain multi-monitor configs are limited to the higher-license products. Multi-monitor support should be built into every piece of software released these days, but XProtect limits it to their premium customers (the matrices on the other monitors, not just basic stretching across additional monitors).

Another example is the stupid camera packs.

They give you 8 cameras free in Essential+, but then if you need to add 1 or 2 more cameras, instead of just charging the regular price of $50 per cam, they charge you something like $400+ for an 8-pack camera license, even though you may never need those additional 5 to 6 licenses for home use. I like a lot of XProtect features, but those things really make me angry, almost David Banner, '70s-Hulk-style angry.

 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,896
Reaction score
21,247
The more I use Milestone Xprotect, the more I start to hate it.

There is no need for Nvidia acceleration for 99.999 percent of setups...
As far as price, would you feel better if they didn't offer a free version at all?
 

LittleScoobyMaster

Getting the hang of it
Joined
Jul 24, 2015
Messages
229
Reaction score
24
There is no need for Nvidia acceleration for 99.999 percent of setups...
As far as price, would you feel better if they didn't offer a free version at all?
I don't know that I would agree with that 99.999 percent number, given the number of users I've seen post about occasionally sluggish response when using the XProtect Smart Client (especially on systems without Quick Sync).

I wouldn't feel better if they didn't offer a free version at all. I would feel better if they offered 2-, 4-, or 6-pack camera license options, but I realize that's asking too much. I'm used to being able to purchase single-camera licenses from when I used Geovision in the past; I wish Milestone offered the same. Maybe some day. For the most part I like XProtect, but the few issues I've had with it are a little frustrating. The frustrating part about the license issue is that here you have a product I consider inferior to XProtect in many ways (GV-VMS), and yet even that product at least has the convenient option of single-cam licenses.
 

LittleScoobyMaster

Getting the hang of it
Joined
Jul 24, 2015
Messages
229
Reaction score
24
Hmm, I've got 10 camera licences on mine...if I want to add another I can. They work out the price based on how long I have left on my care plus and give me a quote to add the one camera. Not sure where you are getting your information, or if your vendor is being naughty. Similarly, if I wanted to upgrade to another version the same process would apply. I wouldn't have to forgo my current investment. They would just work out the difference in price and do it that way.
I think it might work differently when you are using XProtect Essential+ and want to add 1 or 2 cams. Essential+ is free, but I see no way online to add just a 9th or 10th cam to Essential+. Guess I'll have to check with vendors again; it has been quite a while since I checked.
 

outlawzz

n3wb
Joined
Apr 1, 2017
Messages
10
Reaction score
0
The more I use Milestone Xprotect, the more I start to hate it.

Kind of in the same boat. Waiting for Milestone's full commitment to Nvidia or any type of true GPU acceleration. I just can't see dropping more money on workstations with Quick Sync. I'm getting ready to design a video wall here at work, and I want to make sure we are set for a while.

Sent from my SM-N920V using Tapatalk
 