XProtect 2017 R1

LittleScoobyMaster

Getting the hang of it
Joined
Jul 24, 2015
Messages
229
Reaction score
24

nayr

IPCT Contributor
Joined
Jul 16, 2014
Messages
9,329
Reaction score
5,325
Location
Denver, CO
What makes you think a VMS would ever target a gaming graphics card? Installers aren't going to start slapping in $200, 100 W+ discrete graphics cards; that'd be like taking a step back to the analogue days.

  • Improved system performance – Hardware accelerated video motion detection is an industry first for VMS software: It shifts video decoding from the CPU to the Integrated Graphics processor in Intel CPUs, giving mid-to-large scale installations improved performance at a lower hardware cost.
Emphasis added; clearly it's hardware acceleration for the Intel integrated GPU.
 

LittleScoobyMaster

Getting the hang of it
Joined
Jul 24, 2015
Messages
229
Reaction score
24
What makes you think a VMS would ever target a gaming graphics card? Installers aren't going to start slapping in $200, 100 W+ discrete graphics cards; that'd be like taking a step back to the analogue days.



Emphasis added; clearly it's hardware acceleration for the Intel integrated GPU.
Depends on the VMS. I sort of wish Milestone would follow Geovision's lead in that area: GV started using Nvidia GPUs a couple of years back to help with processing for their fisheye cams. I'd imagine that as more people start using 4K TVs for monitoring, they'll start bottlenecking the motherboard's built-in GPU. I know I started seeing some lag once I added 4K cams displayed on 4K TVs, and if you're using multiple 4K monitors and high-MP cams it presumably becomes even more of an issue. Right now Milestone could benefit from offloading some of the processing it does on the CPU to the GPU for remote Smart Client viewing. You also notice it if you want high frame rates from high-MP cams.

[ VMS ] How to enable the GPU decoding?
 

nayr

IPCT Contributor
Joined
Jul 16, 2014
Messages
9,329
Reaction score
5,325
Location
Denver, CO
You imagine wrong; a little Raspberry Pi powered by my USB port can handle decoding a dozen 1080p H.264 streams. You don't need a big stinking GPU to decode video efficiently; you just need appropriate hardware. GeoVision says only H.264 is supported on Nvidia, which is a shame, because Intel Quick Sync on the latest generation of integrated GPUs has no problem handling the H.265 codec that is driving 4K video adoption.

As I said, adding an expensive external GPU is counterproductive. We're not rendering large VR worlds, just decompressing some highly compressed streams in a reasonable time with hardware designed for the task. Movies and security cameras are not the same thing: a Blu-ray will encode 1080p video in excess of 30 Mbps, while your camera will do 6-7 Mbps, which means that if you have the hardware to decode a Blu-ray movie you have the hardware to decode roughly 5-6 security cameras, no problem. Of course an Intel GPU can handle many times more than that; we've got users pushing >100 Mbit of video through onboard graphics.
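To put rough numbers on that comparison, here is a quick back-of-the-envelope sketch in Python using the ballpark bitrates quoted above (30+ Mbps for Blu-ray, 6-7 Mbps per camera); real figures vary by codec and scene:

```python
# Rough bitrate math only; these numbers are the ballpark figures from the
# post above, not measurements.
bluray_mbps = 30.0            # typical 1080p Blu-ray video bitrate
camera_mbps = (6.0, 7.0)      # typical 1080p IP camera main-stream range

low = bluray_mbps / camera_mbps[1]   # ~4.3 cameras at 7 Mbps each
high = bluray_mbps / camera_mbps[0]  # ~5.0 cameras at 6 Mbps each
print(f"One Blu-ray's decode budget covers roughly {low:.1f}-{high:.1f} camera streams")
# With Blu-ray bitrates above 30 Mbps the ratio climbs toward the 5-6 quoted above.
```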
 

LittleScoobyMaster

Getting the hang of it
Joined
Jul 24, 2015
Messages
229
Reaction score
24
You imagine wrong; a little Raspberry Pi powered by my USB port can handle decoding a dozen 1080p H.264 streams. You don't need a big stinking GPU to decode video efficiently; you just need appropriate hardware. GeoVision says only H.264 is supported on Nvidia, which is a shame, because Intel Quick Sync on the latest generation of integrated GPUs has no problem handling the H.265 codec that is driving 4K video adoption.

As I said, adding an expensive external GPU is counterproductive. We're not rendering large VR worlds, just decompressing some highly compressed streams in a reasonable time with hardware designed for the task. Movies and security cameras are not the same thing: a Blu-ray will encode 1080p video in excess of 30 Mbps, while your camera will do 6-7 Mbps, which means that if you have the hardware to decode a Blu-ray movie you have the hardware to decode roughly 5-6 security cameras, no problem. Of course an Intel GPU can handle many times more than that; we've got users pushing >100 Mbit of video through onboard graphics.
Like most things, mileage may vary. I have 8 cams, a couple of them 4K and the rest 1080p, and when I view all 8 on a 4K set with most of the cams live viewing close to 30 fps, my CPU sits at 50-60%. I can do tweaks here and there to bring that down, and Quick Sync definitely helps, but when I see the Quick Sync GPU spike to 100% it makes me think it could use more power, almost like it's become the bottleneck. Can I ask, do you typically display your system on a 4K set? I notice a pretty large jump in both CPU and GPU usage when displaying on a 4K set compared to a 1080p set; I often switch back and forth between the two just to measure the difference, and it's usually quite substantial.

In case it helps, I'm running a Skylake 6700K (Intel HD Graphics 530) with a GTX 1060.

I'm pretty sure part of the issue is that I haven't found the setting to run the cams at lower resolutions in live view yet (i.e., if you're on a 4K set with just 4 1080p cams in live view, you want them all displayed at native 1080p, but when you switch to 8 cams they should drop from the higher resolutions to smaller ones, roughly the logic sketched below). Geo had a good way to do this, but I haven't found the setting in XProtect yet. I'm sure it must exist, though, because it would save on CPU and GPU.
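As an illustration of that kind of adaptive live-view logic, here is a minimal sketch; the function name, thresholds, and the assumption of a 1080p substream are made up for the example and are not an actual Milestone or Geovision setting:

```python
# Illustrative only: pick the camera stream based on how big the tile is
# on screen. The substream resolution is an assumption, not a product default.
def pick_stream(tile_width_px: int, tile_height_px: int,
                sub_res=(1920, 1080)) -> str:
    """Use the main stream only when the on-screen tile can show more
    pixels than the substream provides; otherwise the substream is enough."""
    if tile_width_px > sub_res[0] or tile_height_px > sub_res[1]:
        return "main"
    return "sub"

# A 4K monitor split into a 3x3 grid gives 1280x720 tiles -> substream is plenty.
print(pick_stream(3840 // 3, 2160 // 3))   # -> 'sub'
# A single camera maximised on that same monitor -> main stream.
print(pick_stream(3840, 2160))             # -> 'main'
```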

It seems Milestone just isn't quite ready to add dedicated external GPU decoding yet, but they appear to be headed that way; a few other companies already have it, and Milestone has just been a little slow to market. That said, their recent documentation does say the following:

"Hardware accelerated video decoding uses the GPU inside the
Intel CPU and the GPU on an optionally installed dedicated
graphic adaptor
to render each video stream on the display."

https://www.milestonesys.com/files/XProtectOverview/Current/FeatureBriefs/Hardware_Accelerated_Video_Decoding_Feature_Brief.pdf

In their example they use a GTX 970.

I think with my particular setup, if I can get smart live-view resolution scaling to work, I should be able to drastically reduce my CPU and GPU usage. On Geovision I had it working perfectly, but I haven't figured it out in XProtect just yet; to be fair, I haven't spent much time tracking it down. On Geo it was right in the main settings.
 

LevelX

n3wb
Joined
May 16, 2016
Messages
10
Reaction score
0
Installers aren't going to start slapping in $200, 100 W+ discrete graphics cards,
It's for XProtect Smart Client playback... so they will if they need it.

One of the issues today is decoding 8+ cameras on a client. Clients 'love' video walls of cameras, so something like a dual-monitor setup with all those cameras on it doesn't make for a nice playback experience; the CPU is maxed out.
 

nayr

IPCT Contributor
Joined
Jul 16, 2014
Messages
9,329
Reaction score
5,325
Location
Denver, CO
A Raspberry Pi, powered by USB, will decode 16 cameras, no problem. But if you want to do it with 80 W of discrete graphics, I guess it's your money.
 

LevelX

n3wb
Joined
May 16, 2016
Messages
10
Reaction score
0
I guess it's your money.
I'm not going to lie, the XProtect Smart Client is a CPU hog sitting on Windows. But if you're heading down the Milestone path, money isn't really a problem; an extra graphics card in the client viewing machine will be a rounding error on the overall bill ;)

Out of curiosity, have you tried to decode 16 x 2 MP+ cameras on a Raspberry Pi outputting to a monitor?
 

nayr

IPCT Contributor
Joined
Jul 16, 2014
Messages
9,329
Reaction score
5,325
Location
Denver, CO
Yes I have, and it does work. However, you don't have the resolution for that anyhow: even a 4K display can only show 4 x 1080p streams at full resolution, with everything else getting scaled down. So just pull substreams; you can't tell the difference and the load is much more reasonable.
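The pixel math behind that point, for anyone who wants to check it (resolutions assumed: a 3840x2160 UHD display and 1920x1080 camera streams):

```python
# How many full-resolution 1080p tiles fit on a UHD "4K" display?
display = (3840, 2160)
stream = (1920, 1080)

tiles_full_res = (display[0] // stream[0]) * (display[1] // stream[1])
print(tiles_full_res)   # -> 4: a 2x2 grid is the most you can show pixel-for-pixel

# In a 4x4 grid of 16 cameras each tile is only 960x540, so decoding the
# full 1080p main stream buys nothing you can actually see -- pull substreams.
tile = (display[0] // 4, display[1] // 4)
print(tile)             # -> (960, 540)
```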
 

LevelX

n3wb
Joined
May 16, 2016
Messages
10
Reaction score
0
Even a 4K display can only show 4 x 1080p streams at full resolution, with everything else getting scaled down. So just pull substreams; you can't tell the difference and the load is much more reasonable.
Oh, very nice. But yes, if you're pulling substreams, then it will do it without problems.

Up until this version of Milestone, substream support had been a bit average. You could request a lower-resolution version of the stream from the client, but all that did was shift the decoding load onto the Milestone server, which transcoded on the fly for the client. So it didn't really solve the CPU-hog issue, it just moved it around.

I'm yet to test 2017 R1 to see how that works out.
 

nayr

IPCT Contributor
Joined
Jul 16, 2014
Messages
9,329
Reaction score
5,325
Location
Denver, CO
Most cameras provide a lower-resolution stream directly, so there's no need to re-encode; any decent VMS should be able to utilize both.
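For anyone who wants to see what pulling a camera's substream directly looks like outside a VMS, here is a minimal sketch using OpenCV; the RTSP URL follows the common Dahua-style `subtype=1` pattern and is only an example, since the exact path (and whether a substream exists at all) depends on the camera:

```python
# Minimal sketch: open a camera's secondary (low-resolution) stream directly.
# The URL below is hypothetical; substream paths differ by vendor and model.
import cv2  # pip install opencv-python

URL = "rtsp://user:password@192.168.1.108:554/cam/realmonitor?channel=1&subtype=1"

cap = cv2.VideoCapture(URL)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("substream", frame)          # already low-res, so cheap to decode
    if cv2.waitKey(1) & 0xFF == ord("q"):   # press 'q' to quit
        break
cap.release()
cv2.destroyAllWindows()
```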
 

rnatalli

Getting the hang of it
Joined
Aug 7, 2016
Messages
140
Reaction score
31
Installed the newest Milestone and have been messing with it the past few days. It's largely the same in terms of features, but I did notice some glitches were fixed. For example, in the previous version the mobile server wouldn't start automatically by default, and I often had a hard time getting it to authenticate even using a basic user; that may have been specific to my setup, but it was annoying just the same. They obviously made improvements to CPU usage too, as my setup usually runs at 18-24% in BI with hardware acceleration enabled but runs at under 10% in Milestone even with motion detection enabled. BI's mobile app still beats Milestone's IMO, as it's easy to navigate and runs very smoothly; Milestone's app is good too, just not as fluid.
 

randyth

Young grasshopper
Joined
Aug 26, 2015
Messages
77
Reaction score
8
Installed the newest Milestone and have been messing with it the past few days. It's largely the same in terms of features, but I did notice some glitches were fixed. For example, in the previous version the mobile server wouldn't start automatically by default, and I often had a hard time getting it to authenticate even using a basic user; that may have been specific to my setup, but it was annoying just the same. They obviously made improvements to CPU usage too, as my setup usually runs at 18-24% in BI with hardware acceleration enabled but runs at under 10% in Milestone even with motion detection enabled. BI's mobile app still beats Milestone's IMO, as it's easy to navigate and runs very smoothly; Milestone's app is good too, just not as fluid.
Which version did you install, specifically? 2017 R1?
 