looney2ns
IPCT Contributor
I'll play devil's advocate: are these problems really lying on Intel's doorstep, or is it possible that it's something Ken needs to change so BI plays nice?
Just curious.
There are probably a few ways to look at it. Since the leak appeared after Intel updated their drivers, it's more likely their problem. BI hasn't been updated to "fix" the leak, and I assume it would be if that were relatively easy (and assuming Intel doesn't have a bug), so my guess is it's an Intel issue and there's no way for BI to address it other than turning off Intel Quick Sync support.
I'd love to see BI support Nvidia CUDA/HEVC encoding/decoding. I assume Intel will address the problem in their chips but if we had support for Nvidia we'd have a lot more flexibility.
For me it's a bit of a selfish reason. My current server (power hog that it is) is a Xeon platform (dual E5-2650 v3) that doesn't support HD Graphics. I'll likely be setting up a dedicated lower-power system for the NVR, though, so it will likely be moot at that point.
I'm all in favor of more hardware acceleration choices, but I'd place more priority on fully utilizing Quick Sync's capabilities. Dedicated graphics cards can add quite a bit to power consumption, and most of the really low-power cards are several generations old.
Hiya Folks...I thought I would chime in here (again)!
After a bit of back and forth with bp2008 and some others on another thread (talking about my i9-7980XE build), I decided to take bp's advice and try my luck with an 8th-gen Core i7 (Core i7-8700K) and 32 GB of RAM to power my Blue Iris setup with 33 cameras. After doing a fresh install of Windows 10 (64-bit) and installing the graphics drivers following all of the threads here to solve the memory leak issues, I am sad to say that this Core i7 cannot handle the load. At first I thought it was the graphics driver issue, so I meticulously reviewed all of the relevant posts on the forum and was able to rule that out as a variable. But in the end, it seems pretty clear that Quick Sync may be perfectly fine for a decent number of cameras, but once you go above a certain number it just falls short.
Now, to be clear, I am forced to use the "limit decoding" feature on 20 cameras on both the i9 and the i7 setups to run all 33 cameras. But the difference in CPU usage is DRAMATIC. On the Core i9, CPU usage is 25%. On the Core i7, it's around 78% (with hardware acceleration/Quick Sync).
So, my conclusion remains the same: I believe I've hit the practical limit to the number of cameras I can have (actually exceeded it, if not for limit decoding), and that the number of cores really does seem to make a difference! I'm no expert, just going by what I see and can readily compare between the two systems.
Anyway, hopefully this is helpful. What I WILL do with this new i7 build is relocate it to my place in Arizona, where I'm only running about a dozen cameras. I'm sure it'll handle that just fine!
Cheers!
Well...that's true, but I had no choice because the system became completely unusable if I didn't use limit decode.
Hey bp, so I enabled direct-to-disc on every camera and refreshed the performance stats online. No change in the CPU usage, though.