- Mar 17, 2017
I was evaluating Blue Iris on an old PC: Core i7 920 (2.66GHz, 4 cores, 8 threads), 12GB RAM, 250GB SSD, 300GB 10,000RPM drive, gigabit Ethernet. It was working great.
I've got 8 IP cameras: 4 HD cams of different models and 4 older non-HD cams, all wired PoE.
The app worked perfectly for me. I could record from all eight cameras simultaneously at full resolution and their respective maximum frame rates (20FPS @ 4.1MP for the HD cams, 15FPS @ 0.3MP for the older cams). CPU usage peaked around 70% and was usually under 30% when only one camera was recording, and Blue Iris used about 2.4GB of RAM.
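For scale, here's a rough back-of-the-envelope estimate of the decode workload those cameras represent, using only the figures above and assuming every camera streams at its maximum rate:

```python
# Rough decode-workload estimate from the camera specs above.
# Assumes all cameras deliver their maximum frame rates simultaneously.
hd_cams, hd_mp, hd_fps = 4, 4.1, 20   # 4 HD cams, 4.1 megapixels/frame, 20 FPS
sd_cams, sd_mp, sd_fps = 4, 0.3, 15   # 4 older cams, 0.3 megapixels/frame, 15 FPS

hd_rate = hd_cams * hd_mp * hd_fps    # megapixels decoded per second (HD)
sd_rate = sd_cams * sd_mp * sd_fps    # megapixels decoded per second (SD)
total = hd_rate + sd_rate

print(f"HD: {hd_rate:.0f} Mpix/s, SD: {sd_rate:.0f} Mpix/s, total: {total:.0f} Mpix/s")
# → HD: 328 Mpix/s, SD: 18 Mpix/s, total: 346 Mpix/s
```

The HD cams dominate the workload by almost 20 to 1, which is why a single HD cam struggling on the server is already telling.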
I am now trying to get it set up on another machine, an old server: dual Xeon E5430 (2.66GHz, 8 cores, 8 threads total), 4GB RAM, 100GB SSD, 2TB 7200RPM drive, gigabit Ethernet.
I would expect this machine to be roughly equivalent to the PC I tested on, but it isn't, and I can't figure out why. With identical configurations, even with just a single camera hooked up and recording, the server never hits 20FPS on the HD cams. It will occasionally reach 18, but it's generally below 10 and sometimes drops below 2FPS.
CPU usage on the server stays under 30%, and even with multiple cameras recording it hasn't broken 50%. Blue Iris is only using around 900MB of RAM, which seems odd to me: there's plenty of free RAM, and the PC used more for the same setup, as noted above. Nothing else is running on the server, only BI. The software configuration is identical to the PC's, and the machine never becomes unresponsive or seems to struggle, but the BI frame rates stay low. Adding more cameras only slightly increases memory and CPU usage, and the frame rates don't really change.
Running BI on the PC and the server side by side, the PC pushes a constant 20FPS while the server varies wildly, swinging between 2 and 12FPS, occasionally going as high as 18 or as low as 1.4FPS.
Does anyone have any idea why the performance would vary so drastically between these machines?
Thanks