XProtect 2017 R1

Discussion in 'Milestone Systems' started by LittleScoobyMaster, Jan 3, 2017.


  1. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    140
    Likes Received:
    12
  2. colmcille

    colmcille Getting the hang of it

    Joined:
    Nov 30, 2014
    Messages:
    246
    Likes Received:
    61
    Sounds interesting. I'll be interested to see what it actually means. At the moment I'm using my cameras to detect events, so it isn't using many resources anyway.
     
  3. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    140
    Likes Received:
    12
  4. nayr

    nayr Known around here

    Joined:
    Jul 16, 2014
    Messages:
    9,376
    Likes Received:
    5,000
    Location:
    Denver, CO
    What makes you think a VMS would ever target a gaming graphics card? Installers aren't going to start slapping in $200, 100 W+ discrete graphics cards; that'd be like taking a step back to the analogue days.

    Emphasis added; clearly it's hardware acceleration for the Intel integrated GPU..
     
  5. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    140
    Likes Received:
    12
    Depends on the VMS. I sort of wish Milestone would follow Geovision's lead in that area: GV started using Nvidia GPUs a couple of years back to help with processing for their fisheye cams. I would imagine that as more people start using 4K TVs for monitoring, they might start bottlenecking the built-in motherboard GPU. I know I started seeing some lag when I added 4K cams displayed on 4K TVs. Perhaps if you're using multiple 4K monitors and high-MP cams it becomes more of an issue. Presently Milestone could benefit from offloading some of the processing it does on the CPU to the GPU for remote Smart Client viewing. You also notice it if you want high frame rates on high-MP cams.

    [ VMS ] How to enable the GPU decoding?
     
    Last edited: Feb 11, 2017
  6. TazzieDevil7373

    TazzieDevil7373 n3wb

    Joined:
    Feb 12, 2017
    Messages:
    9
    Likes Received:
    3
    As nayr clearly stated, the technology is called Quick Sync.
     
  7. nayr

    nayr Known around here

    Joined:
    Jul 16, 2014
    Messages:
    9,376
    Likes Received:
    5,000
    Location:
    Denver, CO
    You imagine wrong; a lil Raspberry Pi powered by my USB port can handle decoding a dozen 1080p H.264 streams.. you don't need a big stinking GPU to decode video efficiently, you just need appropriate hardware.. GeoVision says only H.264 is supported on Nvidia; that's a shame, because Intel Quick Sync on the latest generation of GPUs has no problem handling the H.265 codec that is driving 4K video adoption..

    As I said, adding an expensive external GPU is counterproductive.. we're not rendering large VR worlds, just decompressing some highly compressed streams in reasonable time with hardware designed for the task.. Movies and security cameras are not the same thing.. a Blu-ray will encode 1080p movies in excess of 30 Mbps, while your camera will do 6-7 Mbps; that means if you have the hardware to decode a Blu-ray movie, you have the hardware to decode 5-6 security cameras no problemo.. but of course an Intel GPU can handle many times more than that.. we've got users pushing >100 Mbit of video through onboard graphics.
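    The back-of-the-envelope math above can be sketched in a few lines. The 30 Mbps Blu-ray and 6-7 Mbps camera figures are from the post itself; treating decode cost as roughly proportional to bitrate is a simplification, not a hardware spec:

    ```python
    # Rough decode-budget estimate: if the hardware can decode one ~30 Mbps
    # 1080p Blu-ray stream, how many ~6-7 Mbps camera streams fit in the
    # same bitrate budget? (Assumes decode cost scales with bitrate.)

    def streams_per_decoder(decoder_budget_mbps: float, camera_mbps: float) -> int:
        return int(decoder_budget_mbps // camera_mbps)

    print(streams_per_decoder(30, 6))  # 5
    print(streams_per_decoder(30, 7))  # 4
    ```

    Which lines up with the "5-6 cameras per Blu-ray's worth of decode" figure quoted above.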
     
  8. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    140
    Likes Received:
    12
    Like most things, mileage may vary. I have 8 cams, a couple of them 4K and the rest 1080p, and when I view all 8 on a 4K set with most of the cams live-viewing close to 30 fps, my CPU is at 50-60% with all cams displayed. I can do tweaks here and there to bring the CPU down. Quick Sync definitely helps as well, but when I start to see the Intel Quick Sync GPU spike at 100%, it makes me think it could use more power, almost like it's become a bottleneck. Can I ask, do you typically display your system on a 4K set? I notice a pretty large spike in both CPU and GPU when displaying on a 4K set compared to a 1080p set. I often switch back and forth between the two just to measure the CPU/GPU differences. Usually it's quite substantial.

    In case it helps, I'm running a Skylake 6700K with a GTX 1060 and Intel HD Graphics 530.

    I'm pretty sure part of the issue is that I haven't found the setting to run the cams at lower resolutions in live view yet. (I.e., if you are on a 4K set with just 4 1080p cams in live view, you want them all displayed in native 1080p, but when you switch to 8 cams, the cameras should switch from the higher resolutions to smaller ones. Geo had a good way to do this, but I haven't found the setting in XProtect yet.) I'm sure it must exist, though, because it would save on CPU and GPU.
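    The Geovision-style behaviour described above can be sketched as a small selection rule: compute the on-screen tile size for the current layout, then pick the smallest camera stream that still fills the tile. The stream names and resolutions below are illustrative examples, not Milestone or Geovision settings:

    ```python
    # Hypothetical live-view stream selection: match stream resolution
    # to the on-screen tile, so a busy grid falls back to substreams.
    import math

    def tile_size(display_w, display_h, n_cams):
        """Per-tile pixel size in a near-square grid of n_cams views."""
        cols = math.ceil(math.sqrt(n_cams))
        rows = math.ceil(n_cams / cols)
        return display_w // cols, display_h // rows

    def pick_stream(tile_w, tile_h, streams):
        """Smallest stream that still fills the tile, else the largest."""
        for name, (w, h) in sorted(streams.items(), key=lambda s: s[1][0] * s[1][1]):
            if w >= tile_w and h >= tile_h:
                return name
        return max(streams, key=lambda n: streams[n][0] * streams[n][1])

    streams = {"sub": (704, 480), "main_1080p": (1920, 1080), "main_4k": (3840, 2160)}
    # 9 cams on a 4K set: 1280x720 tiles, so the 1080p main stream is needed
    print(pick_stream(*tile_size(3840, 2160, 9), streams))  # main_1080p
    # Same 9 cams on a 1080p set: 640x360 tiles, the substream is enough
    print(pick_stream(*tile_size(1920, 1080, 9), streams))  # sub
    ```

    This also illustrates the 4K-vs-1080p monitor difference mentioned earlier in the thread: the bigger panel forces higher-resolution streams for the same grid, hence the higher CPU/GPU load.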

    It seems that Milestone just isn't quite ready to add dedicated external GPU decoding yet, but it looks like they are headed that way. A few other companies already have it; Milestone has just been a little slow to market. Looking at their recent documentation, though, it does say the following:

    "Hardware accelerated video decoding uses the GPU inside the
    Intel CPU and the GPU on an optionally installed dedicated
    graphic adaptor
    to render each video stream on the display."

    https://www.milestonesys.com/files/..._Accelerated_Video_Decoding_Feature_Brief.pdf

    In their example they use a GTX 970.

    I think with my particular setup, if I can get smart live-view resolution scaling to work, I should be able to drastically reduce my CPU and GPU usage. On Geovision I had it working perfectly, but I haven't figured it out with XProtect just yet. To be fair, I haven't spent much time tracking it down; on Geo it was right in the main settings.
     
    Last edited: Feb 13, 2017
  9. LevelX

    LevelX n3wb

    Joined:
    May 16, 2016
    Messages:
    8
    Likes Received:
    0
    It's for the XProtect Smart Client playback... so they will, if they need it.

    One of the issues today is decoding 8+ cameras on a client. Clients 'love' video walls of cameras, so something like a dual-monitor setup with all those cameras on it doesn't make for a nice playback experience; the CPU is maxed out.
     
  10. nayr

    nayr Known around here

    Joined:
    Jul 16, 2014
    Messages:
    9,376
    Likes Received:
    5,000
    Location:
    Denver, CO
    A Raspberry Pi, powered by USB, will decode 16 cameras no problemo.. but if you want to do it with 80 W of discrete graphics, I guess it's your money.
     
  11. LevelX

    LevelX n3wb

    Joined:
    May 16, 2016
    Messages:
    8
    Likes Received:
    0
    I'm not going to lie, XProtect Smart Client is a CPU hog sitting on Windows. But if you're heading down the Milestone path, money isn't really a problem! An extra graphics card in the client viewing machine will be a rounding error on the overall bill ;)

    Out of curiosity, have you tried to decode 16 x 2 MP+ cameras on a Raspberry Pi outputting to a monitor?
     
  12. nayr

    nayr Known around here

    Joined:
    Jul 16, 2014
    Messages:
    9,376
    Likes Received:
    5,000
    Location:
    Denver, CO
    Yes I have, and it does work.. however, you don't have the resolution for that anyhow; even a 4K display can only show 4 x 1080p streams at full resolution, with everything else getting scaled down. So just pull substreams; you can't tell the difference, and the load is much more reasonable.
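    The pixel arithmetic behind that claim is simple to check (plain resolution math, nothing vendor-specific):

    ```python
    # A 4K panel has exactly the pixel count of four 1080p streams,
    # so anything beyond a 2x2 grid of 1080p feeds is shown downscaled.
    display_px = 3840 * 2160
    stream_px = 1920 * 1080
    print(display_px // stream_px)  # 4
    ```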
     
  13. LevelX

    LevelX n3wb

    Joined:
    May 16, 2016
    Messages:
    8
    Likes Received:
    0
    Oh very nice; but yes, if you're pulling substreams, then it will do it without problems.

    Up until this version of Milestone, substreams had been a bit average. You could request a lower-res version of the stream from the client, but all that did was shift the decoding load to the Milestone server, as it transcoded on the fly for the client. So it didn't really solve the CPU-hog issue, just moved it around.

    I'm yet to test 2017 R1 to see how that works out.
     
  14. nayr

    nayr Known around here

    Joined:
    Jul 16, 2014
    Messages:
    9,376
    Likes Received:
    5,000
    Location:
    Denver, CO
    Most cameras provide a lower-resolution stream directly, no need to re-encode.. any decent VMS should be able to utilize both.
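    Concretely, a camera's substream is usually just a different RTSP path on the same device. The URL patterns below are typical examples for two common vendors; exact paths vary by model and firmware, so treat them as illustrations and check your camera's documentation:

    ```python
    # Typical RTSP URL patterns for pulling a camera's native substream
    # (no server-side transcoding needed). Patterns vary by model/firmware.

    def hikvision_url(host: str, channel: int = 1, substream: bool = True) -> str:
        # Hikvision convention: channel 1 main stream = 101, substream = 102
        stream_id = channel * 100 + (2 if substream else 1)
        return f"rtsp://{host}:554/Streaming/Channels/{stream_id}"

    def dahua_url(host: str, channel: int = 1, substream: bool = True) -> str:
        # Dahua convention: subtype=0 is the main stream, subtype=1 the substream
        subtype = 1 if substream else 0
        return f"rtsp://{host}:554/cam/realmonitor?channel={channel}&subtype={subtype}"

    print(hikvision_url("192.168.1.64"))   # rtsp://192.168.1.64:554/Streaming/Channels/102
    print(dahua_url("192.168.1.108"))      # ...channel=1&subtype=1
    ```

    A VMS that records the main stream while live-viewing these substream URLs gets the best of both, which is exactly the point being made above.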
     
  15. rnatalli

    rnatalli Getting the hang of it

    Joined:
    Aug 7, 2016
    Messages:
    133
    Likes Received:
    21
    Installed the newest Milestone and have been messing with it the past few days. It's largely the same in terms of features, but I did notice some glitches were fixed. For example, in the previous version the mobile server wouldn't start automatically by default, and I often had a hard time getting it to authenticate even using a basic user; that may have been specific to my setup, but it was annoying just the same. They've obviously made improvements to CPU usage too: my setup usually runs at 18-24% in BI with hardware acceleration enabled, but it runs at under 10% in Milestone even with motion detection enabled. BI's mobile app still beats Milestone's IMO, as it's easy to navigate and runs very smoothly. Milestone's app is good too, but not as fluid.
     
  16. randyth

    randyth Young grasshopper

    Joined:
    Aug 26, 2015
    Messages:
    68
    Likes Received:
    6
    Which version did you install, specifically? 2017 R1?
     
  17. rnatalli

    rnatalli Getting the hang of it

    Joined:
    Aug 7, 2016
    Messages:
    133
    Likes Received:
    21
    Yes, 2017 R1.