XProtect Smart Client + GPU acceleration

Discussion in 'Milestone Systems' started by Andreas_H, May 6, 2016.

  1. Andreas_H

    Andreas_H n3wb

    Joined:
    Feb 25, 2016
    Messages:
    3
    Likes Received:
    0
    Hello,

    I wonder if someone here has more details about XProtect SmartClient 2016 and GPU acceleration.

    I have a new setup with 5x Hikvision DS-2CD2642FWD (4 MP) cameras and a client PC with a Core i5-6400 and 8 GB RAM. I have two monitors connected to the onboard video. When I try to view all five cameras at once at full resolution, CPU load is maxed out and the video stutters, even though, according to the diagnostics overlay, hardware acceleration is on. I had to reduce the resolution to FHD in order to get fluid frame rates, at ~75% CPU.

    The reseller where I purchased XProtect was not much help; he claimed a dedicated video card would help. AFAIK, XProtect does not support any type of GPU acceleration apart from Intel's Quick Sync, right? If so, what could/should I expect from my CPU? What would be necessary to increase frame rates?

    Does the CPU clock make any difference, or only the GPU clock? If only the GPU clock matters, an i5-6600 would be much cheaper than an i7. Would Iris Pro graphics make any difference? It is currently not available in consumer socket 1151 CPUs, so I would probably have to wait.

    Thank you very much,

    Andreas
     
  2. bart

    bart n3wb

    Joined:
    Mar 11, 2014
    Messages:
    11
    Likes Received:
    9
    I'm puzzled why they would limit the hardware acceleration to just Intel. I just updated my XProtect Go from 2014 to 2016. The choppy video went from terrible to horrendous. Live video is barely watchable, and playback used to be smooth; now even playback is choppy. XProtect 2016 took a huge step backwards in quality. My recording server is an Intel Core i7-3770K with 32 GB of RAM and a battery-backed write cache for performance. My viewing station is an i7-5960X Extreme with 32 GB of RAM and a FirePro W8100. My resources are barely used, yet Milestone runs like complete crap. This is absolute insanity, and this is exactly why I would never pay for this software. It's junk.
     
  3. Andreas_H

    Andreas_H n3wb

    Joined:
    Feb 25, 2016
    Messages:
    3
    Likes Received:
    0
    After doing some more research on the topic, I have some understanding of why they chose to support Intel Quick Sync in the first place. While Intel GPUs are generally still slower than dedicated GPUs from Nvidia or AMD, they are apparently much faster at H.264 decoding. If the figures I found are correct, an Intel HD 4600 is more than twice as fast as a GTX 960 when decoding Full HD H.264.

    On the other hand, hardware acceleration using a slower card would probably still be better than no hardware acceleration at all... I am not a programmer, so I cannot tell how difficult it is to support different GPU architectures. But as it stands, there is currently only Intel GPU decoding or software decoding in XProtect, and there seem to be no plans to change this. Looks like your FirePro is more or less useless in this case. :-(

    Why even software decoding has gotten worse, I have no idea. I never used XProtect 2014, so I cannot tell the difference.
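    If anyone wants to check that kind of decode comparison on their own hardware, raw H.264 decode speed can be measured outside of any VMS. This is only a rough sketch: it assumes an ffmpeg build with both Quick Sync (qsv) and NVIDIA (cuda) hwaccel support, and "sample.mp4" is a placeholder for a local H.264 clip.

        # Rough decode-throughput comparison outside of XProtect.
        import subprocess
        import time

        def decode_time(hwaccel, infile="sample.mp4"):
            cmd = ["ffmpeg", "-v", "error"]
            if hwaccel:
                cmd += ["-hwaccel", hwaccel]                      # input option: decode on the chosen engine
            cmd += ["-i", infile, "-an", "-f", "null", "-"]       # decode only, discard output
            start = time.monotonic()
            subprocess.run(cmd, check=True)
            return time.monotonic() - start

        for accel in (None, "qsv", "cuda"):
            print(accel or "software", round(decode_time(accel), 2), "seconds")

    The shorter the wall-clock time for the same clip, the higher the effective decode throughput of that path.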
     
  4. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    140
    Likes Received:
    12
    Holy Sh** Balls!

    I never thought I'd see the day when Geovision would pull one over on XProtect, but apparently they have. In the GV-VMS update for February, they added support for GPU decoding on Nvidia graphics cards. That means if you have GV-VMS and happen to be able to pick up a brand new Nvidia GTX 1070 or 1080 this launch weekend, your camera streams will be screaming fast. At least, one would hope. It is Geovision after all, so who really knows...

    I'd been wondering why Geovision was doing the same thing as XProtect and not supporting GPU decoding/acceleration on Nvidia graphics cards, and then, lo and behold, Geo does it, and before XProtect. Go figure.

    Does anyone know when Xprotect will finally support GPU decoding for Nvidia cards?

    It must be in the works by now, one would hope.

    I've been considering switching from Geo to XProtect for some time now, but you guys are making me not want to switch. It sounds like XProtect is even more of a CPU hog than Geo. I haven't tried XProtect since 2014 but was hoping to try 2016 this weekend. I may give it a go just to see if my results are equally dismal.

    I'm running an i7-6700K with 16 GB of memory, Intel HD Graphics 530, and a GTX 650 Ti, using this machine as both a recording and live-view machine. Five cams: 2 @ 1920x1080, 2 @ 1280x960, and 1 @ 1280x720, all at 30 fps. I usually get about 35% GPU load and 40% CPU load while viewing on one 4K TV and one 1280x1024 monitor, for comparison. Only running GV-NVR though; I haven't installed GV-VMS yet.

    With this hardware I still get the herky-jerkies quite often. I think it's because of bad Geo programming, but who knows, maybe I'm bottlenecked somewhere and don't know it. I've been considering upgrading the GPU, but I wanted to try XProtect 2016 first for comparison.

    Geovision GV-VMS Version 15.10.1.0
    Released date: 02/22/2016
    New Features:
    Support for Windows 10
    Support for GV-ASManager / GV-LPR Plugin V4.3.5.0
    Support for GV-Web Report V2.2.6.0
    Support for GPU decoding with external NVIDIA graphics cards
    Support for dual streaming of GV-Fisheye Camera
    Support for H.265 codec

    Funny. I was just checking out PassMark scores and the GTX 1080 just popped up there in the last half hour. Top of the chart.

    [attached screenshot: gpu-specs.JPG]
     
    Last edited by a moderator: May 28, 2016
  5. fenderman

    fenderman Staff Member

    Joined:
    Mar 9, 2014
    Messages:
    20,237
    Likes Received:
    4,267
    Putting a 1080 card in the system will cause the power consumption to go through the roof. The efficiency of Intel HD is what makes it desirable.
     
  6. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    140
    Likes Received:
    12
    Indeed, on-board video is very light on power consumption. Sometimes it just doesn't have the performance, though. Always a trade-off. I don't know how many 4K displays and 4K streams you would need to bog down a 1070 or 1080; it would be fun to test, especially for a video wall.

    I know that my Intel HD 530 can't handle much, though. When I monitor it with GPU-Z, just 4 streams on my 4K TV put it over 30% GPU load, and two of those streams are less than 1080p. I think if I added a few 4K streams and extra 4K monitors to my 530, it could become a bottleneck. But I'm also using Geovision, which has some of the worst programming I've ever seen. It is supposed to support GPU decoding, though, so I'm at least getting that benefit.

    GV-NVR had a helpful setting where higher-resolution streams could be displayed at lower resolutions in live-view mode, but they haven't kept up with the newer resolutions: if you have a bunch of 4K streams, it still uses the processing power of a full 4K decode just to display them in a smaller 1080p tile.
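    Just to put rough numbers on that (back-of-the-envelope only, using the five streams from my earlier post; the 4K line is hypothetical): decode work scales roughly with source pixels x fps, no matter how small the tile is drawn on screen.

        # Rough decode workload estimate: source pixels * fps per stream.
        streams = {                      # name: (width, height, fps)
            "cam1": (1920, 1080, 30),
            "cam2": (1920, 1080, 30),
            "cam3": (1280,  960, 30),
            "cam4": (1280,  960, 30),
            "cam5": (1280,  720, 30),
        }
        total = sum(w * h * fps for w, h, fps in streams.values())
        print(f"current load: {total / 1e6:.0f} Mpixels/s")         # ~226 Mpixels/s

        uhd = 3840 * 2160 * 30
        print(f"one 4K/30 stream adds: {uhd / 1e6:.0f} Mpixels/s")  # ~249 Mpixels/s

    So a single 4K/30 stream would add more decode work than all five of my current streams combined, which is why I suspect the HD 530 would start to struggle.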

    Also, it sucks how much more power 4K TVs can draw just due to the pixel density, but once you try 4K, you usually don't want to go back. For surveillance cameras you can really notice the detail improvements.

    http://money.cnn.com/2015/11/17/technology/4k-tv-energy-cost/
     
    Last edited by a moderator: May 29, 2016
  7. CamFan

    CamFan Getting the hang of it

    Joined:
    May 25, 2014
    Messages:
    106
    Likes Received:
    34
    Location:
    California
    I set up Milestone to record the full-resolution main stream, and monitoring is done on the second stream at a lower resolution. That allows me to view all cameras smoothly.
     
  8. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    140
    Likes Received:
    12
    Yeah, it really depends on what you are trying to accomplish. I'm not sure what most people are doing, but presently I'm trying to get the best possible live-view video on a 4K set, so I have the main streams set for live view. I'm hoping to get a 4K cam soon to have a native 4K stream; when that cam isn't full screen, the layout would be close to four 1080p streams.

    I've been doing a few benchmarks. I've found that I usually cannot get by with just on-board Intel Quick Sync video, so I have to bite the bullet and pay more for electricity to power discrete graphics cards. Presently, in this config, I sit at about 30-35% GPU usage with two 1080p, two 960p, and one 720p stream on a GTX 650 Ti spanned across two screens at 1080p (30 fps). In 4K mode the GPU bumps to 40-45% load (and that's without a 4K stream; once I add that, I know I will go over this card's capability). Currently I still get some chop here and there, but I'm also only running 4K at 30 Hz, not 60 Hz (one of the older 4K sets that can't do 60 Hz). I'm not sure if the refresh rate is coming into play or not, though. I mean, with 5 streams each running at 30 fps, I don't know if they are all synced to the 30 Hz refresh or if it even matters. I don't care so much about screen tearing, but if I were dropping frames, I would care about that more.

    It's hard for me to understand what happens when you have multiple streams that, combined, are more than 30 fps, because a 30 Hz display can only show 30 fps. In a game you would only see 30 fps at 30 Hz, but with multiple streams on a 30 Hz display, I would think frames would have to get dropped unless every camera was synced to the refresh rate. Or maybe when camera 1 gets to frame 30, camera 2 is really on frame 28, etc. That's all a bit unclear to me.
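    One toy way I've tried to reason about it (purely my own model; I don't know how the Smart Client actually schedules frames): if each camera delivers 30 fps with its own arbitrary phase, and the display simply shows the newest decoded frame per tile at every 30 Hz refresh, then nothing has to be dropped. Each tile still gets its own 30 frames per second; they just land up to one refresh interval late. Frames only start getting dropped once a camera's fps exceeds the refresh rate. A quick simulation of that assumption:

        # Toy model: cameras at cam_fps with random phase, display at display_hz.
        # At each refresh only the newest pending frame per camera is shown;
        # older pending frames count as dropped.
        import random

        def dropped_share(cam_fps=30, display_hz=30, cams=5, seconds=60):
            shown = dropped = 0
            for _ in range(cams):
                phase = random.random() / cam_fps
                arrivals = [phase + i / cam_fps for i in range(int(cam_fps * seconds))]
                refreshes = [k / display_hz for k in range(int(display_hz * seconds) + 1)]
                j = 0
                for k in range(1, len(refreshes)):
                    pending = 0
                    while j < len(arrivals) and arrivals[j] <= refreshes[k]:
                        pending += 1
                        j += 1
                    if pending:
                        shown += 1
                        dropped += pending - 1
            return dropped / (shown + dropped)

        print(f"30 fps cams on 30 Hz: {dropped_share() * 100:.0f}% dropped")           # ~0%
        print(f"60 fps cams on 30 Hz: {dropped_share(cam_fps=60) * 100:.0f}% dropped")  # ~50%

    If that model is anywhere near right, the chop I'm seeing probably comes from decode load or stream/network timing rather than from the 30 Hz panel itself.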
     
    Last edited by a moderator: Jun 7, 2016
  9. MR2

    MR2 Young grasshopper

    Joined:
    Jan 25, 2016
    Messages:
    49
    Likes Received:
    9
    Because Intel is in just about everything; as crappy as Intel may be, their iGPU is likely to be present, unlike having to support every AMD and Nvidia card out there. I don't know what you're doing with your system there; my system doesn't have half your specs and it runs just fine :|
     
  10. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    140
    Likes Received:
    12
    If you don't mind, can you list the resolutions of your cameras that you are using with Xprotect Smart Client?
     
  11. kurdi

    kurdi n3wb

    Joined:
    May 12, 2017
    Messages:
    1
    Likes Received:
    1
    A tip for anyone trying to build XProtect client machines: we've managed to get great performance out of Intel's Skull Canyon NUC. With its Iris Pro 580 on an i7-6770HQ, we can run 25 Full HD H.264 streams without a hiccup.
     
    colmcille likes this.
  12. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    140
    Likes Received:
    12
    How about using Nvidia GPUs?

    Milestone Delivers Highest Performance in VMS Using NVIDIA GPUs - BCDVideo

    https://www.milestonesys.com/newsle...017/June/milestone-nvidia-dell-at-ifsec-2017/

    I have an older i7-2600K / GTX 1080 Ti running the XProtect Essential Smart Client, but I'm unable to activate hardware acceleration with the card. I thought the 2017 R2 version was supposed to support it, but so far no luck. (My 2600K system has no usable Intel GPU.)

    Curious if anyone has been able to activate hardware acceleration on Xprotect with an Nvidia GPU yet. Currently I am at 90%+ CPU with hardware acceleration off.
     
  13. outlawzz

    outlawzz n3wb

    Joined:
    Apr 1, 2017
    Messages:
    10
    Likes Received:
    0
    What I understood from the guy doing that demo at the ISC show was that this feature was going to be available in the 2018 versions of the VMS; I don't know if anyone else heard that. Currently the only way I see to get hardware acceleration is through a Quick Sync-enabled processor, and it actually works pretty well. At work I need to replace a few multi-monitor workstations, so I'm kind of on the fence about which way to go: buy new PCs with high-end video cards now and hope Milestone comes through on direct Nvidia GPU processing, or continue with limited video output via Quick Sync?

    Sent from my SM-N920V using Tapatalk
     
  14. Andreas_H

    Andreas_H n3wb

    Joined:
    Feb 25, 2016
    Messages:
    3
    Likes Received:
    0
    The press release linked above is full of marketing speak and doesn't give any substantial hints about the performance of NVIDIA GPU decoding vs. Intel Quick Sync decoding.
    Note that video decoding is completely separate from the GPU's 3D acceleration and doesn't run on the shader units/clusters, of which the high-end NVIDIA cards undoubtedly have more. AFAIK, all consumer NVIDIA cards, including the 1080, have a single dedicated unit for video decoding/encoding. It can decode multiple streams, as long as the resolution/bit rate allows it, but it is still a single unit. That's why, in all the figures I've seen, Intel Quick Sync is actually faster at video decoding than NVIDIA or AMD.

    Take a look here: Video Encode and Decode GPU Support Matrix

    Look at the columns "# of chips" and "# of NVENC/Chip". Unfortunately the list doesn't include "normal" GeForce cards, but you can still see that you would probably need a decent Quadro or Tesla card to get multiple engines, and I doubt Milestone will support those.

    I have started to solve my problem by using cameras that offer a secondary, lower-resolution stream for live viewing. After all, if you scale six camera streams down to fit on an FHD screen, each tile is only about 640x540, which is roughly the resolution of the secondary stream (640x480). Now I need to find a solution for easily switching to the primary stream when "zooming in" on a single camera on a second screen.
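    For anyone sizing this, the per-tile arithmetic is simple enough to script. This is illustrative only; the grid logic below is my own assumption of a roughly square layout, not how the Smart Client actually arranges views.

        # Per-tile size for an N-camera grid on a 1920x1080 monitor,
        # to judge whether a 640x480 sub stream is already enough.
        import math

        def tile_size(cameras, screen_w=1920, screen_h=1080):
            cols = math.ceil(math.sqrt(cameras))   # assume a near-square grid
            rows = math.ceil(cameras / cols)
            return screen_w // cols, screen_h // rows

        for n in (4, 6, 9):
            print(n, "cameras ->", tile_size(n), "per tile")
        # 6 cameras -> (640, 540): decoding the 4 MP main stream for a tile
        # that small buys nothing over the 640x480 sub stream.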
     
  15. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    140
    Likes Received:
    12
    Geovision used to do this very well. I haven't had much luck doing it with the free Milestone Essentials yet, however. I thought I saw one license item mentioning that this feature may only be available in the more expensive versions, but I don't know if that's true without tracking that info down again. I should look for it, though.

    That would help my situation quite a bit. Geovision had it set up so that if you were using a higher-resolution 4K TV, the client would know when to switch to the 4K stream, and it would switch back when you went to a multi-panel view. Does XProtect do that as well? Because otherwise a 640x480 stream blown up to 4K would be stretched quite a bit and not look the greatest.
     
  16. colmcille

    colmcille Getting the hang of it

    Joined:
    Nov 30, 2014
    Messages:
    246
    Likes Received:
    61
    Provided you have a camera that is recording 24/7, or you have a pre/post-event buffer set, there is nothing stopping you from going to playback and getting it to play the high-resolution recording essentially live. My buffer on the cameras I do this with is 3 seconds.
     
  17. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    140
    Likes Received:
    12
    The more I use Milestone Xprotect, the more I start to hate it.

    I don't understand why they can't just do Nvidia GPU decoding like other vendors do. Even Geovision:

    USAVision / GeoVision Support Forum • View topic - Supported GPU for Geovision IP Cameras on Gv-VMS

    I have seen on so many other vendors' websites that they are implementing GPU decoding with Nvidia cards. Even Milestone says they are going to, but it seems like they aren't delivering.

    Many of Intel's newer CPUs don't even include Quick Sync anymore.

    For example, today I want to upgrade my workstation to an i9-7900X, but it doesn't include Quick Sync, and because of that it won't get hardware acceleration in Milestone XProtect Essential+. Sad.

    I know my preferences are different from most, however. I prefer amazing-looking live streams. But also, that XProtect feature where it can display live streams at a lower resolution: I swear that was limited to the higher-priced XProtect products. Even certain multi-monitor configs are limited to the higher-tier licenses. Multi-monitor support should be built into every piece of software released these days, but XProtect limits that to their premium customers (the matrix views on the other monitors, not just basic stretching across additional monitors).

    Another example is the stupid camera packs.

    They give you 8 cameras free in Essential+, but then if you need to add 1 or 2 more cameras, instead of just charging the regular price of $50 per cam, they charge you something like $400+ for an 8-pack camera license, when you may never need those additional 5 or 6 licenses for home use. I like a lot of XProtect features, but those items really make me angry, almost David Banner Hulk '70s-style angry.

     
    Last edited: Sep 2, 2017
  18. fenderman

    fenderman Staff Member

    Joined:
    Mar 9, 2014
    Messages:
    20,237
    Likes Received:
    4,267
    There is no need for Nvidia acceleration in 99.999 percent of setups...
    As far as price goes, would you feel better if they didn't offer a free version at all?
     
  19. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    140
    Likes Received:
    12
    I don't know that I would agree with that 99.999 percent number, given the number of users I've seen post about sluggish response in the XProtect Smart Client (especially on systems without Quick Sync).

    I wouldn't feel better if they didn't offer a free version at all. I would feel better if they offered 2-, 4-, or 6-pack camera license options, but I realize that's asking too much. I'm used to being able to purchase single-camera licenses from my time with Geovision, and I wish Milestone offered the same. Maybe some day. For the most part I like XProtect, but the few issues I've had with it are a little frustrating. The frustrating part about the licensing is that here you have a product I consider inferior to XProtect in many ways (GV-VMS), and yet even that product at least has the convenient option of single-cam licenses.
     
    Last edited: Sep 2, 2017
  20. colmcille

    colmcille Getting the hang of it

    Joined:
    Nov 30, 2014
    Messages:
    246
    Likes Received:
    61
    Hmm, I've got 10 camera licences on mine...if I want to add another I can. They work out the price based on how long I have left on my care plus and give me a quote to add the one camera. Not sure where you are getting your information, or if your vendor is being naughty. Similarly, if I wanted to upgrade to another version the same process would apply. I wouldn't have to forgo my current investment. They would just work out the difference in price and do it that way.
     
  21. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    140
    Likes Received:
    12
    I think it might work differently when you are using XProtect Essential+ and want to add 1 or 2 cams. Essential+ is free, but I see no way online to add just the 9th or 10th cam to it. Guess I'll have to check with vendors again; it has been quite a while since I last checked.
     
  22. outlawzz

    outlawzz n3wb

    Joined:
    Apr 1, 2017
    Messages:
    10
    Likes Received:
    0
    Kinda in the same boat. Waiting for Milestone's full commitment to Nvidia or any type of true GPU acceleration. I just can't see dropping more money on workstations with Quick Sync. I'm getting ready to design a video wall here for work, and I want to make sure we are set for a while.

    Sent from my SM-N920V using Tapatalk
     
  23. adamg

    adamg n3wb

    Joined:
    Sep 19, 2017
    Messages:
    15
    Likes Received:
    1
    outlawzz, have you considered Genetec Security Center? It supports GPU acceleration very well.
     
  24. outlawzz

    outlawzz n3wb

    Joined:
    Apr 1, 2017
    Messages:
    10
    Likes Received:
    0
    Hey adamg, no, not really; we already made a significant investment in Milestone. I'm keeping my hopes up that they start supporting GPU acceleration real soon. What kind of performance are you seeing with Genetec?

    Sent from my SM-N920V using Tapatalk
     
  25. adamg

    adamg n3wb

    Joined:
    Sep 19, 2017
    Messages:
    15
    Likes Received:
    1
    Unfortunately I've not had the opportunity to build up a powerful 'gaming'-style computer for use in a video workstation/wall, so I don't have good data for you.
     
  26. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    140
    Likes Received:
    12
    Me too. I just installed 2017 R3, and it doesn't seem to support it yet either. Maybe in 2018... I'd say this is probably one of the biggest shortcomings of XProtect at the moment.
     
  27. adamg

    adamg n3wb

    Joined:
    Sep 19, 2017
    Messages:
    15
    Likes Received:
    1
    Are you sure the Milestone XProtect Smart Client doesn't use GPU acceleration? I have run a GPU monitoring program while running the client, and there is definitely load put on the GPU. And on the client's Settings page, under Advanced, there is a Hardware acceleration setting that can be enabled/disabled.
     
  28. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    140
    Likes Received:
    12
    The Milestone XProtect Smart Client does support Intel GPU acceleration. The problem is that it only supports Intel GPUs, not Nvidia's, and Nvidia's GPUs are much more powerful than Intel's.

    On the Advanced menu, if you select Video Diagnostics Overlay and set it to Level 2 or higher, it will show the current status of hardware acceleration, but it will say 'off' if you don't have an Intel GPU on your motherboard, no matter which Nvidia card you have installed in your system.

    When I monitor my Nvidia GPU while running the XProtect Smart Client, I still see a tiny amount of GPU utilization, but not much. For example, I currently show about 8% GPU utilization but 92% CPU utilization (8 cameras, 30 fps, older Intel i7-2600K CPU, Nvidia GTX 1080 Ti).

    The reason it uses such a small amount of my Nvidia GPU is that it only uses it for rendering, not decoding (top of page 5): https://www.milestonesys.com/files/...artClient_HardwareAccelerationGuide_en-US.pdf
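    If anyone wants to see that rendering-vs-decoding split for themselves, here's a rough sketch (assuming an NVIDIA driver and the pynvml package are installed; GPU index 0 is my assumption). With the Smart Client open I'd expect some graphics load from rendering but roughly 0% on the dedicated decoder (NVDEC) if decoding stays on the CPU:

        # Poll graphics (3D) utilization vs dedicated video-decoder utilization.
        import time
        import pynvml

        pynvml.nvmlInit()
        gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
        try:
            while True:
                util = pynvml.nvmlDeviceGetUtilizationRates(gpu)            # graphics/3D engine
                dec, _period = pynvml.nvmlDeviceGetDecoderUtilization(gpu)  # NVDEC engine only
                print(f"graphics {util.gpu:3d}%   decoder {dec:3d}%")
                time.sleep(1)
        finally:
            pynvml.nvmlShutdown()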

    They are supposedly working on the issue and they have a demo of it working but I don't think we can download it yet:

    20170403-NVIDIA

    "COPENHAGEN – April 3, 2017. Milestone Systems, the global number one* open platform company in networked video management software (VMS), is collaborating with NVIDIA to provide the next level of hardware acceleration and video processing services in monitoring management. Milestone XProtect will leverage NVIDIA GPUs and the CUDA parallel computing platform and programming model to provide parallel processing capabilities of recording servers, mobile servers and other video processing services."

    It doesn't specifically mention the Smart Client, but many users have noticed that when the XProtect servers are less burdened, the connecting clients perform better, so any offloading onto the GPU should be noticeable on the client machine. Still, I would imagine it would be best if the client machine could utilize Nvidia GPUs directly.
     
    Last edited: Oct 12, 2017 at 9:32 PM
  29. adamg

    adamg n3wb

    Joined:
    Sep 19, 2017
    Messages:
    15
    Likes Received:
    1
    I was determined to go prove you wrong with screenshots, but it turns out you are correct. My XProtect client PC with just an Intel CPU shows YES for hardware acceleration on the overlay. My XProtect client PC with an NVIDIA K620 GPU shows NO for hardware acceleration on the overlay.
     
  30. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    140
    Likes Received:
    12
    What CPU and GPU utilization do you get on the Nvidia workstation when you view all your cams?