XProtect Smart Client + GPU acceleration

Discussion in 'Milestone Systems' started by Andreas_H, May 6, 2016.

  1. Andreas_H

    Andreas_H n3wb

    Joined:
    Feb 25, 2016
    Messages:
    2
    Likes Received:
    0
    Hello,

    I wonder if someone here has more details about XProtect SmartClient 2016 and GPU acceleration.

    I have a new setup with 5x Hikvision DS-2CD2642FWD (4MP) cameras and a client PC with a Core i5-6400 and 8GB RAM. I have two monitors connected to the onboard video. When trying to view all 5 cameras at once at full resolution, the CPU load is maxed out and the video stutters, even though, according to the diagnostics overlay, hardware acceleration is on. I had to reduce the resolution to FHD to get fluid frame rates at ~75% CPU.
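
    To put some rough numbers on that: the decode load scales with the total pixels per second the client has to chew through. A minimal sketch in Python (the per-camera frame rate is an assumption, since I did not mention it above; 2688x1520 is the DS-2CD2642FWD's 4MP mode):

    Code:
    # Rough estimate of the raw decode throughput the client has to sustain.
    # The frame rate is an assumed value; adjust to whatever the cameras actually send.

    def megapixels_per_second(width, height, fps, cameras):
        """Total decoded pixel rate in megapixels per second."""
        return width * height * fps * cameras / 1e6

    FPS = 20  # assumed per-camera frame rate

    full_res = megapixels_per_second(2688, 1520, FPS, 5)  # all 5 cameras at 4MP
    fhd = megapixels_per_second(1920, 1080, FPS, 5)       # all 5 reduced to FHD

    print(f"5 cameras at 4MP: {full_res:.0f} MP/s")
    print(f"5 cameras at FHD: {fhd:.0f} MP/s ({fhd / full_res:.0%} of the 4MP load)")

    Dropping to FHD roughly halves the pixel rate, which is in line with the relief I saw.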

    The reseller where I purchased XProtect was not much help; he claimed a dedicated video card would do the trick. AFAIK, XProtect does not support any type of GPU acceleration apart from Intel's Quick Sync, right? If so, what could/should I expect from my CPU? What would be necessary to increase the frame rates?

    Does the CPU clock make any difference, or only the GPU clock? If only the GPU clock matters, an i5-6600 would be much cheaper than an i7. Would Iris Pro graphics make any difference? It is currently not available in consumer socket 1151 CPUs, so I would probably have to wait.

    Thank you very much,

    Andreas
     
  2. bart

    bart n3wb

    Joined:
    Mar 11, 2014
    Messages:
    11
    Likes Received:
    9
    I'm puzzled why they would limit hardware acceleration to just Intel. I just updated my XProtect Go from 2014 to 2016, and the choppy video went from terrible to horrendous. Live video is barely watchable, and playback, which used to be smooth, is now choppy as well. XProtect 2016 took a huge step backwards in quality.

    My recording server is an Intel Core i7-3770K with 32GB of RAM and a battery-backed write cache for performance. My viewing station is an i7 Extreme 5960X, 32GB of RAM, and a FirePro W8100. My resources are barely used, yet Milestone runs like complete crap. This is absolute insanity, and this is exactly why I would never pay for this software. It's junk.
     
  3. Andreas_H

    Andreas_H n3wb

    Joined:
    Feb 25, 2016
    Messages:
    2
    Likes Received:
    0
    After doing some more research on the topic, I now have some understanding of why they chose to support Intel Quick Sync in the first place. While Intel GPUs are generally still slower than dedicated GPUs from Nvidia or AMD, they are apparently much faster at H.264 decoding. If these figures here are correct, an Intel HD 4600 is more than twice as fast as a GTX 960 when decoding Full HD H.264.

    On the other hand, H/W acceleration using a slower card would probably still be better than no H/W acceleration at all... I am not a programmer, so I cannot tell how difficult it is to support different GPU architectures. But as it seems, there is currently only Intel GPU decoding or software decoding in XProtect, and there are no plans to change this. Looks like your FirePro is more or less useless in this case. :-(

    Why even software decoding has gotten worse, I have no idea. I never used XProtect 2014, so I cannot tell the difference.
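
    Out of curiosity, here is a minimal sketch of how one could put a number on raw software decode speed outside of XProtect, using the PyAV library (pip install av) on a local sample clip. The filename is just a placeholder, and it only times decoding, not rendering or display, so treat the result as a rough upper bound:

    Code:
    # Minimal sketch: measure raw H.264 software-decode throughput of a sample clip.

    import time
    import av  # PyAV

    def decode_fps(path):
        container = av.open(path)
        start = time.perf_counter()
        frames = 0
        for frame in container.decode(video=0):
            frames += 1
        elapsed = time.perf_counter() - start
        container.close()
        return frames / elapsed

    print(f"Decoded {decode_fps('sample_1080p.mp4'):.0f} frames per second")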
     
  4. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    106
    Likes Received:
    4
    Holy Sh** Balls!

    I never thought I'd see the day when Geovision would pull one over on XProtect, but apparently they have. In the GV-VMS update for February, they added support for GPU decoding on Nvidia graphics cards. That means if you have GV-VMS and happen to be able to pick up a brand new Nvidia GTX 1070 or 1080 this launch weekend, your camera streams will be screaming fast. At least, one would hope. It is Geovision, after all, so who really knows...

    I've been wondering why Geovision was doing the same thing as XProtect by not supporting GPU decoding/acceleration on Nvidia graphics cards, and then, lo and behold, Geo does it, and before XProtect. Go figure.

    Does anyone know when XProtect will finally support GPU decoding for Nvidia cards?

    It must be in the works or something by now, one would hope.

    I've been considering switching from Geo to XProtect for some time now, but you guys are making me not want to switch. It sounds like XProtect is even more of a CPU hog than Geo. I haven't tried XProtect since 2014, but I was hoping to try 2016 this weekend. I may give it a go just to see if my results are equally dismal.

    I'm running an i7-6700K with 16GB of memory, Intel HD Graphics 530, and a GTX 650 Ti, using this machine as both a recording and live-view machine. Five cams: 2 @ 1920x1080, 2 @ 1280x960, and 1 @ 1280x720, all at 30 fps. I usually get about 35% GPU load and 40% CPU load while viewing on one 4K TV and one 1280x1024 monitor, for comparison. I'm only running GV-NVR though; I haven't installed GV-VMS yet.

    With this hardware I still get the herky-jerkies quite often. I think it's because of bad Geo programming, but who knows, maybe I'm bottlenecked somewhere and don't know it. I've been considering upgrading the GPU, but I wanted to try XProtect 2016 first for comparison.

    Geovision GV-VMS Version 15.10.1.0
    Released date: 02/22/2016
    New Features:
    Support for Windows 10
    Support for GV-ASManager / GV-LPR Plugin V4.3.5.0
    Support for GV-Web Report V2.2.6.0
    Support for GPU decoding with external NVIDIA graphics cards
    Support for dual streaming of GV-Fisheye Camera
    Support for H.265 codec

    Funny. I was just checking out Passmark scores and the GTX1080 just popped in there in the last half hour. Top of the chart.

    [Attached image: gpu-specs.JPG (Passmark GPU chart)]
     
  5. fenderman

    fenderman Staff Member

    Joined:
    Mar 9, 2014
    Messages:
    15,362
    Likes Received:
    2,662
    Putting a 1080 card in the system will cause the power consumption to go through the roof. The efficiency of Intel HD is what makes it desirable.
     
  6. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    106
    Likes Received:
    4
    Indeed, on-board video draws very little power; sometimes it just doesn't have the performance, though. Always a trade-off. I don't know how many 4K displays and 4K streams you would need to bog down a 1070 or 1080; it would be fun to test, especially for a video wall.

    I know that my Intel HD 530 can't handle much, though. When I monitor it with GPU-Z, just 4 streams on my 4K TV put it over 30% GPU load, and two of those streams are less than 1080p. If I added a few 4K streams and extra 4K monitors to the 530, it could well become a bottleneck. But I'm also using Geovision, which has some of the worst programming I've ever seen; it is supposed to support GPU decoding, though, so I'm at least getting that benefit.

    GV-NVR had a helpful setting where higher-resolution streams could be displayed at a lower resolution in live-view mode, but they haven't kept up with the higher resolutions, so if you have a bunch of 4K streams it still burns the processing power of a full 4K stream just to display them in a smaller 1080p tile.
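
    If the card is an Nvidia one, you can also poll the load from a script instead of eyeballing GPU-Z. A rough Python sketch, assuming nvidia-smi is on the PATH (it won't see the Intel HD 530, obviously; GPU-Z or Task Manager is still the way there):

    Code:
    # Rough sketch: sample overall GPU utilization on an Nvidia card once per second.

    import subprocess
    import time

    def gpu_utilization_percent():
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
            text=True,
        )
        return int(out.strip().splitlines()[0])

    for _ in range(10):  # sample for ten seconds
        print(f"GPU load: {gpu_utilization_percent()}%")
        time.sleep(1)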

    Also, it sucks how much more power 4K TVs can draw just because of the pixel density, but once you try 4K, you usually don't want to go back. For surveillance cameras you really notice the improvement in detail.

    http://money.cnn.com/2015/11/17/technology/4k-tv-energy-cost/
     
  7. CamFan

    CamFan Getting the hang of it

    Joined:
    May 25, 2014
    Messages:
    102
    Likes Received:
    33
    Location:
    California
    I set up Milestone to record the full-resolution main stream, and monitoring is done on the second stream at a lower resolution. That allows me to view all cameras smoothly.
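
    For anyone curious what that looks like outside a VMS, here is a rough Python/OpenCV sketch of the same idea: pull the camera's lower-resolution sub-stream for viewing while the recorder takes the main stream. The RTSP path follows the usual Hikvision convention (Channels/101 = main stream, 102 = sub stream); the address and credentials are placeholders, so check your own camera's documentation:

    Code:
    # Illustration only: show a camera's sub-stream for live view.

    import cv2

    SUB_STREAM = "rtsp://user:password@192.168.1.64:554/Streaming/Channels/102"

    cap = cv2.VideoCapture(SUB_STREAM)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("live view (sub-stream)", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()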
     
  8. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    106
    Likes Received:
    4
    Yeah, it really depends on what you are trying to accomplish. I'm not sure what most people are doing, but presently I'm trying to get the best possible live-view video on a 4K set, so I have the main streams set for live view. I'm hoping to get a 4K cam soon so I have a native 4K stream; when that cam isn't full screen, the layout would be close to quad 1080p streams.

    I've been doing a few benchmarks. I've found that I usually cannot get by with just the on-board Intel Quick Sync video, so I have to bite the bullet and pay more for electricity to power discrete graphics cards. Presently, in this config, I see about 30% to 35% GPU usage with two 1080p, two 960p, and one 720p stream running on a GTX 650 Ti spanned across two screens at 1080p (30 fps). In 4K mode the GPU load bumps to 40-45% (and that's without a 4K stream; once I add one, I know I will go over this card's capability). Currently I still get some chop here and there, but I'm also only running 4K at 30Hz rather than 60Hz (it's one of the older 4K sets that can't do 60Hz). I'm not sure whether the refresh rate is coming into play or not, though. I mean, with 5 streams each running at 30 fps, I don't know if they are all synced to the 30Hz refresh or if it even matters. I don't care about screen tearing so much, but if I were dropping frames, I would care about that.

    It's hard for me to understand what happens when you have multiple streams that, combined, add up to more than 30 fps, because a 30Hz display can only show 30 fps. In a typical case like a game, you would only see 30 fps on a 30Hz screen, but with multiple streams on 30Hz, I would think frames would have to get dropped unless every camera was synced to the refresh rate. Or maybe when camera 1 gets to frame 30, camera 2 is really on frame 28, etc. That's all unclear to me.
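
    To get a feel for it, here is a toy simulation I sketched out; it is not how any real player actually schedules frames, just several 30 fps streams with random phase offsets being sampled by a 30Hz refresh. With matching rates nearly every frame still reaches the screen, just up to one refresh late, with the odd drop when timing jitter pushes two frames into the same refresh interval:

    Code:
    # Toy model: unsynced 30 fps streams sampled at a 30 Hz display refresh.
    # At each refresh tick the most recently arrived frame of each stream is shown;
    # any earlier frame in the same interval is effectively dropped. Numbers are illustrative.

    import random

    REFRESH_HZ = 30
    STREAM_FPS = 30
    SECONDS = 10

    def frames_shown(phase, jitter=0.002):
        """Count how many distinct frames of one stream actually reach the screen."""
        frame_times = [phase + i / STREAM_FPS + random.uniform(-jitter, jitter)
                       for i in range(STREAM_FPS * SECONDS)]
        shown = set()
        for tick in range(REFRESH_HZ * SECONDS):
            now = tick / REFRESH_HZ
            latest = max((i for i, t in enumerate(frame_times) if t <= now), default=None)
            if latest is not None:
                shown.add(latest)
        return len(shown)

    random.seed(1)
    for cam in range(5):
        phase = random.uniform(0, 1 / STREAM_FPS)
        print(f"camera {cam + 1}: {frames_shown(phase)}/{STREAM_FPS * SECONDS} frames reach the screen")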
     
  9. MR2

    MR2 Young grasshopper

    Joined:
    Jan 25, 2016
    Messages:
    44
    Likes Received:
    8
    Because Intel is in just about everything, and Intel, as crap as they are, is likely to be present, rather than having to start supporting all the AMD and Nvidia cards out there. I don't know what you're doing with your system; mine doesn't have half your specs and it runs just fine :|