XProtect 2017 R1

Discussion in 'Milestone Systems' started by LittleScoobyMaster, Jan 3, 2017.


  1. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    111
    Likes Received:
    4
  2. colmcille

    colmcille Getting the hang of it

    Joined:
    Nov 30, 2014
    Messages:
    203
    Likes Received:
    39
    Sounds interesting. I'll be interested to see what it actually means. At the moment I'm using my cameras to detect events, so it isn't using many resources anyway.
     
  3. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    111
    Likes Received:
    4
  4. nayr

    nayr Known around here

    Joined:
    Jul 16, 2014
    Messages:
    8,067
    Likes Received:
    3,409
    Location:
    Denver, CO
    What makes you think a VMS would ever target a gaming graphics card? Installers aren't going to start slapping in $200, 100W+ discrete graphics cards; that'd be like taking a step back to the analogue days.

    Emphasis added, but clearly it's hardware acceleration for the Intel integrated GPU..
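
    Purely as an illustration (this is nothing the Smart Client exposes), a quick Python sketch that asks a locally installed ffmpeg which hardware decode paths it can see; on a Quick Sync capable box, "qsv" should show up in the list:

    ```python
    # Illustrative only: list the hardware decode methods ffmpeg reports, e.g. "qsv"
    # for Intel Quick Sync or "cuvid"/"cuda" for Nvidia. Assumes ffmpeg is on PATH.
    import subprocess

    def available_hwaccels():
        """Return the methods printed by `ffmpeg -hwaccels` (header line skipped)."""
        out = subprocess.run(
            ["ffmpeg", "-hide_banner", "-hwaccels"],
            capture_output=True, text=True, check=True,
        ).stdout
        return [line.strip() for line in out.splitlines()[1:] if line.strip()]

    if __name__ == "__main__":
        accels = available_hwaccels()
        print("Hardware decoders visible to ffmpeg:", ", ".join(accels) or "none")
        if "qsv" in accels:
            print("Intel Quick Sync (qsv) is available for decode offload.")
    ```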
     
  5. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    111
    Likes Received:
    4
    Depends on the VMS. I sort of wish Milestone would follow Geovision's lead in that area. GV started using Nvidia GPUs a couple of years back to help with processing for their fisheye cams. I would imagine that as more people start using 4K TVs for monitoring, they'll start bottlenecking the built-in GPU. I know I started seeing some lag once I added 4K cams displayed on 4K TVs, and if you're using multiple 4K monitors and high-MP cams it presumably becomes even more of an issue. Milestone could benefit from offloading some of the processing it currently does on the CPU to the GPU for remote Smart Client viewing. You also notice it when you want high frame rates from high-MP cams.
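
    One rough way to see what that offload is worth outside of any VMS (a sketch only, assuming an ffmpeg build with Quick Sync support and a test clip at a made-up path): decode the same recording once in software and once through QSV, and compare the CPU time ffmpeg's -benchmark flag reports:

    ```python
    # Hypothetical side-by-side, not a Milestone feature: decode a clip once in
    # software and once via Intel Quick Sync, discarding the output, and compare
    # the "bench:" line ffmpeg prints with -benchmark. CLIP is a placeholder path.
    import subprocess

    CLIP = "test_1080p.mp4"  # substitute a real recording

    def decode_bench(extra_args):
        """Run a decode-only pass and return ffmpeg's -benchmark summary line."""
        result = subprocess.run(
            ["ffmpeg", "-hide_banner", "-benchmark", *extra_args,
             "-i", CLIP, "-f", "null", "-"],
            capture_output=True, text=True,
        )
        # -benchmark prints e.g. "bench: utime=3.1s stime=0.2s rtime=1.5s" on stderr.
        return next((l for l in result.stderr.splitlines() if l.startswith("bench:")), "")

    print("software decode :", decode_bench([]))
    print("quick sync (qsv):", decode_bench(["-hwaccel", "qsv"]))
    ```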

    [ VMS ] How to enable the GPU decoding?
     
    Last edited: Feb 11, 2017
  6. TazzieDevil7373

    TazzieDevil7373 n3wb

    Joined:
    Feb 12, 2017
    Messages:
    9
    Likes Received:
    2
    As nayr clearly stated, the technology is called Quick Sync.
     
  7. nayr

    nayr Known around here

    Joined:
    Jul 16, 2014
    Messages:
    8,067
    Likes Received:
    3,409
    Location:
    Denver, CO
    You imagine wrong; a little Raspberry Pi powered by my USB port can handle decoding a dozen 1080p H.264 streams. You don't need a big stinking GPU to decode video efficiently, you just need appropriate hardware. GeoVision says only H.264 is supported on Nvidia; that's a shame, because Intel Quick Sync on the latest generation of GPUs has no problem handling the H.265 codec that is driving 4K video adoption.

    As I said, adding an expensive external GPU is counterproductive. We're not rendering large VR worlds, just decompressing some highly compressed streams in a reasonable time with hardware designed for the task. Movies and security cameras are not the same thing: a Blu-ray will encode 1080p video in excess of 30 Mbps, while your camera will do 6-7 Mbps. That means if you have the hardware to decode a Blu-ray movie, you have the hardware to decode nearly 5-6 security cameras, no problem. And of course an Intel GPU can handle many times more than that; we've got users pushing >100 Mbit of video through onboard graphics.
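
    For what it's worth, the arithmetic behind that last claim, using the round numbers from the post rather than measurements:

    ```python
    # Back-of-the-envelope only, using the figures quoted above.
    BLURAY_MBPS = 30.0              # high-bitrate 1080p Blu-ray video
    CAMERA_MBPS_RANGE = (6.0, 7.0)  # typical 1080p IP camera main stream

    low = BLURAY_MBPS / CAMERA_MBPS_RANGE[1]
    high = BLURAY_MBPS / CAMERA_MBPS_RANGE[0]
    print(f"One Blu-ray decode budget covers roughly {low:.1f} to {high:.1f} camera streams")
    # -> about 4.3 to 5.0 streams, i.e. the "nearly 5-6 cameras" ballpark above.
    ```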
     
  8. LittleScoobyMaster

    LittleScoobyMaster Getting the hang of it

    Joined:
    Jul 24, 2015
    Messages:
    111
    Likes Received:
    4
    Like most things, mileage may vary. I have 8 cams, a couple of them 4K and the rest 1080p, and when I view all 8 on a 4K set with most of them live-viewing close to 30 fps, my CPU sits at 50-60%. I can do tweaks here and there to bring that down, and Quick Sync definitely helps, but when I see the Intel Quick Sync GPU spike to 100% it makes me think it could use more power, almost like it's become a bottleneck. Can I ask, do you typically display your system on a 4K set? I notice a much larger CPU and GPU load on a 4K set than on a 1080p set. I often switch back and forth between the two just to measure the CPU/GPU difference, and it's usually quite substantial.
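
    In case anyone wants to repeat that switch-and-compare test with the same yardstick, here's a rough sketch. It assumes the third-party psutil package; the Intel Video Decode engine load still has to be read from Task Manager or the GPU vendor's tool, since there's no standard Python hook for it:

    ```python
    # Rough helper for comparing 1080p vs 4K display runs: sample overall CPU load
    # for a fixed window so both runs are measured the same way. GPU decode load is
    # not captured here; read it from Task Manager alongside this.
    import statistics
    import psutil  # third-party: pip install psutil

    def sample_cpu(seconds=30):
        """Return (average, peak) CPU utilisation in percent over the window."""
        samples = [psutil.cpu_percent(interval=1) for _ in range(seconds)]
        return statistics.mean(samples), max(samples)

    if __name__ == "__main__":
        avg, peak = sample_cpu(30)
        print(f"CPU over 30 s: avg {avg:.0f}%, peak {peak:.0f}%")
    ```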

    In case it helps, I'm running a Skylake i7-6700K with a GTX 1060 and Intel HD Graphics 530.

    I'm pretty sure part of the issue is that I haven't found the setting to run the cams at lower resolutions in live view yet. That is, if you're on a 4K set with just four 1080p cams in live view, you want them all displayed in native 1080p, but when you switch to eight cams, they should drop from the higher-resolution streams to smaller ones. Geovision had a good way to do this, but I haven't found the setting in XProtect yet. I'm sure it must exist, though, because it would save on both CPU and GPU.
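
    Just to spell out the behaviour I mean (this is not an XProtect or Geovision API, only a sketch of the selection logic): pick whichever configured stream is closest to the pixels a single tile actually gets once the view is split up:

    ```python
    # Sketch of tile-aware stream selection, not a real VMS API: given the monitor
    # resolution and an N x N grid, choose the smallest configured stream that
    # still covers one tile, falling back to the largest stream if none does.
    from dataclasses import dataclass

    @dataclass
    class Stream:
        name: str
        width: int
        height: int

    def pick_stream(view_w, view_h, grid, streams):
        tile_w, tile_h = view_w // grid, view_h // grid
        big_enough = sorted(
            (s for s in streams if s.width >= tile_w and s.height >= tile_h),
            key=lambda s: s.width * s.height,
        )
        return big_enough[0] if big_enough else max(streams, key=lambda s: s.width * s.height)

    streams = [Stream("substream", 704, 480),
               Stream("main 1080p", 1920, 1080),
               Stream("main 4K", 3840, 2160)]
    print(pick_stream(3840, 2160, 2, streams).name)  # 2x2 on a 4K set    -> main 1080p
    print(pick_stream(1920, 1080, 3, streams).name)  # 3x3 on a 1080p set -> substream
    ```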

    It seems Milestone isn't quite ready to add dedicated external GPU decoding just yet, but they look to be headed that way. A few other companies already have it; Milestone has just been a little slow to market. That said, their recent documentation does say the following:

    "Hardware accelerated video decoding uses the GPU inside the
    Intel CPU and the GPU on an optionally installed dedicated
    graphic adaptor
    to render each video stream on the display."

    https://www.milestonesys.com/files/..._Accelerated_Video_Decoding_Feature_Brief.pdf

    In their example they use a GTX 970.

    I think that with my particular setup, if I can get smart live-view resolution scaling to work, I should be able to drastically reduce my CPU and GPU usage. On Geovision I had it working perfectly, but I haven't figured it out in XProtect just yet. To be fair, I haven't spent much time tracking it down; on Geo it was right in the main settings.
     
    Last edited: Feb 13, 2017