This afternoon I got a fully-updated emscripten build environment up and running (that's the magic sauce which turns C++ code into JavaScript for the web). I ran some builds of the openH264 decoder and learned some things.
1) I was able to achieve **slightly** improved file sizes and execution speed ... a 5% improvement at best ... which I attribute to the newer compiler.
2) Updating the openH264 source to the latest (2 years newer than we've been running) resulted in a 7-10% performance LOSS, so we'll be sticking with the old version, recompiled of course to deliver that ~5% gain.
3) I also tried compiling to "WebAssembly", but a script error I couldn't find a solution for prevented it from loading in any browser. It probably wouldn't have been faster anyway.
4) I doubled the memory limit that had been preventing 4K video from decoding, so 4K works now.
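To give a sense of why 4K blew past the old memory limit, here is some back-of-the-envelope arithmetic. This is just an illustration of the scale involved; the actual heap sizes and frame counts used by the player are assumptions, not the real configuration.

```python
def yuv420_frame_bytes(width, height):
    """Bytes for one decoded YUV 4:2:0 frame: a full-resolution
    luma plane plus two quarter-resolution chroma planes."""
    luma = width * height
    chroma = 2 * (width * height) // 4
    return luma + chroma

# One decoded 4K frame is ~11.9 MiB...
frame_4k = yuv420_frame_bytes(3840, 2160)

# ...and an H.264 decoder keeps several reference frames alive at
# once, so a hypothetical 16-frame buffer alone needs ~190 MiB,
# before counting the bitstream, decoder state, and JS overhead.
buffer_16 = 16 * frame_4k
print(frame_4k, buffer_16)
```

A 4K frame is exactly four times a 1080p frame, which is why a limit that was comfortable for 1080p falls over at 3840x2160.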
More importantly, the next beta update will be able to decode 4K H.264 video, although the only way for Blue Iris to produce it at this time is to configure a group stream to 3840x2160 resolution. It is a tremendous CPU hog and I don't recommend it. I benchmarked with a short 4K @ 4 Mbps clip: an i7-7700K can barely hit 15 FPS decoding it, and a Samsung Galaxy S7 managed 3.19 FPS. Blue Iris also uses a ridiculous amount of CPU time to encode such a high resolution, since it doesn't use hardware acceleration for encoding.
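Put another way, the benchmark numbers above translate into per-frame time budgets. The 30 FPS target below is an assumption for a typical camera stream, not a figure from the benchmark itself.

```python
def ms_per_frame(fps):
    """Milliseconds available (or spent) per frame at a given rate."""
    return 1000.0 / fps

target = ms_per_frame(30)       # ~33.3 ms available per frame at 30 FPS
i7_7700k = ms_per_frame(15)     # ~66.7 ms actually spent per frame
galaxy_s7 = ms_per_frame(3.19)  # ~313 ms actually spent per frame
```

Even the desktop CPU needs roughly twice the real-time budget per frame, and the phone nearly ten times, which is why I don't recommend pushing a 4K group stream through the software decoder.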
Update: Just for kicks, I ran my 4K video sample through Chrome's built-in video decoder using a Chrome Native Client plugin: 60 FPS. That actually means higher than 60 FPS, because this particular player can't go faster than my monitor's refresh rate. Last year I had planned to use this player for UI3, and it was going to be great. But then I learned that Google is removing Native Client plugin support from Chrome (in just a few months now), so this excellent player never even made it to the beta release.