Milestone XProtect Essential+ 2018 R2 hardware decoding H.265 streams

venturis

UPDATE 1: After rebooting the Milestone server I am now seeing the Mobile Server use the GPU for transcoding the H.264 streams, with about 30% GPU usage. I still need to try switching back to H.265 streams to see whether the GPU/CPU loads differ.
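
(Side note in case it helps anyone else: rather than a full reboot, restarting just the Mobile Server service may be enough to get it to pick the GPU up again. I haven't checked the exact service name on my install, so the wildcard match below is an assumption; run it from an elevated PowerShell prompt.)

Code:
# Find the Milestone Mobile Server service (display name assumed to contain "Mobile Server")
Get-Service -DisplayName "*Mobile Server*"

# Restart it so it re-detects the GPU (needs an elevated/admin PowerShell session)
Restart-Service -DisplayName "*Mobile Server*" -Force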


I have recently started using Milestone's XProtect Essential+ 2018 R2 after having used Hikvision's PCNVR for the last few years. I made the jump because I wanted to use H.265 streams from my cams where possible.

It's been a steep learning curve to get the system up and running, but I am now starting to get things working the way I'd like. The one thing I still can't be sure of is whether Milestone is using the graphics card's GPU hardware decoding correctly.

My system is only 4 cams at the moment, running on a virtualised Windows 10 operating system. I have a GT 1030 graphics card installed that is passed through to the virtual machine to give it direct access to the hardware for decoding/encoding the video streams.

I have installed GPU-Z to monitor the graphics card. I can see the GPU load and GPU/memory frequencies increase when I open the XProtect client on the server. I can also see, when I call up the web interface from another machine, that the Mobile Server status window shows hardware decoding ("Nvidia") is being used, but the GPU load is very low, usually in the range of 2%-3%.
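
One thing I've realised while checking this: the main "GPU Load" figure doesn't necessarily reflect the dedicated video decode engine, so a low number there isn't conclusive on its own. If the Nvidia driver is installed you can watch the decoder (NVDEC) and encoder (NVENC) utilisation directly with nvidia-smi, which ships with the driver (if it isn't on the PATH, look in the driver's install folder). This is a generic Nvidia tool rather than anything Milestone-specific, and some fields can report [Not Supported] depending on the card.

Code:
# Poll overall GPU, video decoder and video encoder utilisation every 2 seconds
nvidia-smi --query-gpu=name,utilization.gpu,utilization.decoder,utilization.encoder --format=csv -l 2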

While the remote client app seems to work OK, I noticed today, after switching on the video overlay in the advanced settings, that the two cams configured with H.265 streams were not using GPU hardware decoding on my local PC. However, the other two cams, which are H.264 streams, were using the Nvidia GPU hardware decoding.

I tried forcing hardware decoding to "OnlyNvidia" in the configuration file for the client app. This had no effect.
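
For reference, the change was along these lines; I'm going from memory, so treat the file name and the setting key as assumptions for my version of the Smart Client (the "OnlyNvidia" value is what I actually set):

Code:
<!-- Assumed location: Client.exe.config in the Smart Client install folder.
     Key name is from memory and may differ between versions. -->
<add key="HardwareDecodingMode" value="OnlyNvidia" />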

Only after changing the two video streams from H.265 to H.264 on the server did the client app revert to GPU hardware decoding. So it seems that GPU hardware decoding in the remote client app only works with H.264 streams. By the way, the remote client PC has an Nvidia GTX 980.

The other problem is the mobile app on my iPad and Android phone. I know that all the re-encoding is done by the server (not locally by the mobile device), as I can see the CPU and GPU loads increase when I open the mobile app. However, the mobile app is prone to showing frozen images and stuttering, and I usually have to kill the app and start it again. The server GPU does not seem to be transcoding fast enough.

Again, with the mobile app the GPU load did not exceed 2%-3% while the H.265 streams were enabled, but now that I've changed to H.264 streams, the server's GPU hardware decoding is no longer being used at all; it is defaulting entirely to CPU decoding. So I have the opposite problem on the server, in that it won't use hardware decoding for the H.264 streams.

I'm confused! Does anyone have any idea what is going on here?
 