5.2.2 - March 20, 2020 - Intel hardware H.265 decoding and H.264 encoding

Looking at the above results:

I think I was mistaken about Intel working with H.265 on 5.2.2.4. It was dark here (though I have IR), so the failure didn't show up as increased CPU usage.

H.265 still doesn't seem to be working on 5.2.2.5 with Intel.
 
I got almost the same results, except that H.265 HW decoding did not work for any option (not even for NVIDIA). So I have reverted to H.264 until HW decoding is fixed for H.265. I also tried setting all cameras to Default and then setting the decoding type in the general settings to 'New Intel', but then acceleration did not work either (even for H.264).
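In case it helps anyone narrow this down: a quick way to check whether the GPU and driver can hardware-decode the camera's H.265 stream at all, independent of Blue Iris, is to run the stream through ffmpeg with each hwaccel. A minimal sketch (the RTSP URL and credentials below are placeholders; adjust for your camera):

```python
# Try each ffmpeg hwaccel against the camera's H.265 stream to see
# whether the GPU/driver can decode it at all, independent of Blue Iris.
import subprocess

# Placeholder Dahua-style RTSP URL; substitute your camera's address.
RTSP_URL = "rtsp://user:pass@192.168.1.108:554/cam/realmonitor?channel=1&subtype=0"

# hwaccel names ffmpeg supports: qsv (Intel Quick Sync), cuda (NVIDIA),
# dxva2 and d3d11va (the generic Windows APIs). Some ffmpeg builds need
# the hevc_qsv decoder selected explicitly for Quick Sync.
for hwaccel in ("qsv", "cuda", "dxva2", "d3d11va"):
    result = subprocess.run(
        ["ffmpeg", "-hide_banner", "-loglevel", "warning",
         "-hwaccel", hwaccel,
         "-rtsp_transport", "tcp",
         "-t", "10",              # decode ten seconds of the stream
         "-i", RTSP_URL,
         "-f", "null", "-"],      # decode only; discard the frames
        capture_output=True, text=True,
    )
    # Caveat: ffmpeg can fall back to software decoding with only a
    # warning, so check stderr (and GPU usage while this runs) rather
    # than trusting the exit code alone.
    print(f"{hwaccel}: exit code {result.returncode}")
    if result.stderr.strip():
        print(result.stderr.strip())
```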
 
I've tested my Dahua 5231 against BI 5.2.2.5 just now, using all types of hardware acceleration presented in the dropdown in camera properties > Video. I tested each with both H.264 and H.265.

The good news: whenever Blue Iris failed to decode with HW VA, it fell back to software decoding automatically and did not revert the HW VA setting to "No". The bad news: it only logged some of the failures; for the rest it fell back to software decoding silently (which I determined by watching CPU/GPU usage).

My conclusion is that none of the new HW VA methods are working and we basically have the same working options as a year ago.


HW VA        | H.264 Result                                             | H.265 Result
-------------|----------------------------------------------------------|-------------------
No           | Software decoding                                        | Software decoding
Intel        | Acceleration works                                       | Software decoding
Intel + VPP  | Acceleration works (CPU/GPU usage identical to "Intel")  | Software decoding
Nvidia       | Acceleration works                                       | Acceleration works
DXVA2        | Software decoding                                        | Software decoding
D3D11VA      | Software decoding                                        | Software decoding
It may not be entirely that simple. For example, when decoding H.265 with "Intel" selected, I did see a tiny amount of extra Video Decode usage reported for the Intel GPU. But CPU usage was not measurably reduced, and the increase in GPU Video Decode usage (1-2%) was well below what I see when acceleration genuinely works (6-7%).
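For anyone who wants to reproduce the measurement: the Video Decode numbers above come from the same "GPU Engine" performance counters that Task Manager reads. A rough sketch that samples them directly (the counter path is what Windows 10 exposes; it may differ on older builds):

```python
# Sample the Windows "GPU Engine" performance counters to see whether
# the Video Decode engine is actually in use while Blue Iris decodes.
import csv
import io
import subprocess

# The instance wildcard keeps only Video Decode engine instances.
COUNTER = r"\GPU Engine(*engtype_VideoDecode)\Utilization Percentage"

# -sc 10: take ten one-second samples, then exit. typeperf emits CSV.
out = subprocess.run(
    ["typeperf", COUNTER, "-sc", "10"],
    capture_output=True, text=True,
).stdout

rows = list(csv.reader(io.StringIO(out)))
# rows[0] is the header; each later row is [timestamp, value, value, ...].
for row in rows[1:]:
    if len(row) < 2:
        continue
    try:
        total = sum(float(v) for v in row[1:] if v)
    except ValueError:
        continue  # skip typeperf's status lines
    print(f"{row[0]}  total Video Decode utilization: {total:.1f}%")
```

When acceleration was genuinely working I saw a clear, sustained total here (the 6-7% range mentioned above); the 1-2% blips never amounted to that.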
Thanks for doing this! On my 5231s, I currently use H.264H. Should I be using that or H.264? Is there a difference in quality or something?
 
In theory, H.264H (the High profile) might require a bit more CPU to decode but may give slightly better quality at the same bitrate. Good luck proving it, though.
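For what it's worth, on Dahua cameras H.264H selects the H.264 High profile (H.264B is Baseline, plain H.264 is Main); High profile adds coding tools such as 8x8 transforms that improve compression at some decoding cost. You can confirm what the camera actually sends with ffprobe. A minimal sketch, with a placeholder RTSP URL:

```python
# Ask ffprobe which H.264 profile the camera's stream actually carries.
import subprocess

# Placeholder Dahua-style RTSP URL; substitute your camera's address.
RTSP_URL = "rtsp://user:pass@192.168.1.108:554/cam/realmonitor?channel=1&subtype=0"

out = subprocess.run(
    ["ffprobe", "-v", "error",
     "-rtsp_transport", "tcp",
     "-select_streams", "v:0",   # first video stream
     "-show_entries", "stream=codec_name,profile,width,height",
     "-of", "default=noprint_wrappers=1",
     RTSP_URL],
    capture_output=True, text=True,
).stdout
print(out)
# With H.264H selected you should see "profile=High"; with plain H.264,
# "profile=Main".
```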
 
Good question, bbakels... I wondered the same thing. Thanks, bp!
 
On 5.2.3.0 I still can't get H.265 to work with anything other than Nvidia.

'New Intel' is no longer in the list.
 
I'm not sure what changed, but now I'm getting much lower CPU usage, even with GPU decoding off. I first tried GPU decoding with H.264: very low CPU usage. Then I changed all my cameras to H.265: still very low CPU usage (6 cameras, only about 6-8% CPU).

I still see 'New Intel' in the list, BTW.
 
To be clear: I see 'New Intel' in the same place as you: in the general settings, not in the camera settings.