BI Memory Leak without Intel Drivers?

runningman

n3wb
Joined
May 7, 2016
Messages
6
Reaction score
0
I am having a definite memory leak issue. Blue Iris memory usage continually climbs and maxes out anywhere between a few hours and a few days. All of my research has indicated this is due to Intel drivers and hardware acceleration, but as far as I know, I do not have any Intel drivers installed. BI is running on my old gaming machine with an Nvidia GTX 470 driving my display and Nvidia drivers installed.

Before my recent Win 10 reinstall, I did try switching to integrated graphics so I could install one of the Intel drivers linked in the memory leak help thread, but I had no luck. The leak still occurred with no Nvidia drivers or GPU connected, using integrated graphics and the appropriate 3rd-gen Ivy Bridge drivers listed there as non-leaking.

Is this still the Intel based hardware acceleration/quick sync leak or am I just missing something completely?
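
In case it helps anyone watch this happen, here is a minimal sketch of how the growth could be logged over time, using Python with the psutil package. The process name "BlueIris.exe", the one-minute interval, and the CSV filename are my assumptions, not anything BI provides:

# log_bi_memory.py - periodically log Blue Iris memory usage to a CSV file.
# The process name "BlueIris.exe" is an assumption; adjust for your install.
import csv
import time
import psutil

PROCESS_NAME = "BlueIris.exe"  # assumed process name
INTERVAL_SECONDS = 60          # one sample per minute

def find_process(name):
    """Return the first running process with the given name, or None."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == name:
            return proc
    return None

with open("bi_memory_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "rss_mb"])
    while True:
        proc = find_process(PROCESS_NAME)
        if proc is not None:
            rss_mb = proc.memory_info().rss / (1024 * 1024)
            writer.writerow([time.strftime("%Y-%m-%d %H:%M:%S"), round(rss_mb, 1)])
            f.flush()
        time.sleep(INTERVAL_SECONDS)

A steadily rising rss_mb column over many hours would back up the leak; a curve that flattens out would not.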

 

SouthernYankee

IPCT Contributor
Joined
Feb 15, 2018
Messages
5,170
Reaction score
5,320
Location
Houston Tx
To see if it is a driver problem, turn off all hardware acceleration on all cameras and reboot the BI PC. If the leak stops, it is a driver problem.
 

runningman

n3wb
Joined
May 7, 2016
Messages
6
Reaction score
0
To see if it is a driver problem, turn off all hardware acceleration on all cameras and reboot the BI PC. If the leak stops, it is a driver problem.
OK, so I disabled hardware acceleration in the BI Options menu. I did not see an option within each camera. I will wait and see if the leak still occurs.
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,676
Reaction score
14,023
Location
USA
The per-camera option is on the camera properties > Video tab. I don't think your Nvidia GPU is new enough to support BI's hardware acceleration, and you definitely shouldn't choose Intel acceleration if the Intel GPU isn't even activated.
 

runningman

n3wb
Joined
May 7, 2016
Messages
6
Reaction score
0
The per-camera option is on the camera properties > Video tab. I don't think your Nvidia GPU is new enough to support BI's hardware acceleration, and you definitely shouldn't choose Intel acceleration if the Intel GPU isn't even activated.
OK. I have now set each camera's "Hardware accelerated decode" to "No" to test. It was set to "Default" on each camera, so I assume it was already inheriting the global "No" setting from the Options menu. Regardless, everything is explicitly set to No now.

I have an option to use "Nvidia CUDA." Should I be using that since I am using my GTX 470?
 

runningman

n3wb
Joined
May 7, 2016
Messages
6
Reaction score
0
So all hardware acceleration was turned off and the leak is still occurring. Anywhere else I should look? What additional information can I provide to assist?
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,676
Reaction score
14,023
Location
USA
If you are sure HW accel is off for all cameras, then I have no idea what else could be causing a leak. My BI install has never leaked memory noticeably except as a result of Intel hardware acceleration.

BI shows HW accel status as a # symbol in the Status window, cameras tab, in the Pixels column. If the # symbol is there, hardware acceleration is enabled.

Another way you can verify (if your Win10 is updated enough) is to look in Task Manager > Performance. Select the Intel GPU and set one of the graphs to show "Video Decode". It should be at 0% if hardware acceleration is truly not being used.
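
If you would rather check from a script than from Task Manager, here is a rough sketch along the same lines, assuming a Win10 build that exposes the "GPU Engine" performance counters (the counter path and the Python wrapper are my assumptions, not part of BI):

# check_video_decode.py - sample GPU "Video Decode" engine utilization (Windows).
# Assumes a Windows 10 build that exposes the "GPU Engine" performance counters.
import subprocess

# Wildcard instance matches every process's video-decode engine.
COUNTER = r"\GPU Engine(*engtype_VideoDecode)\Utilization Percentage"

# typeperf prints one CSV line per sample: 5 samples, 1 second apart.
result = subprocess.run(
    ["typeperf", COUNTER, "-sc", "5", "-si", "1"],
    capture_output=True, text=True,
)
print(result.stdout)  # consistently non-zero values mean HW decode is in use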
 

pov2

Getting the hang of it
Joined
Sep 7, 2018
Messages
229
Reaction score
46
Location
Canada

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,902
Reaction score
21,274
BI broadly calls it "Nvidia hardware acceleration", not NVDEC. It might still use hardware acceleration even without NVDEC, because older chips had Nvidia PureVideo: Nvidia PureVideo - Wikipedia
Some software uses hardware acceleration through DirectX even if it is not explicitly enabled anywhere.
Blue Iris provides detailed info on what works and what doesn't:
4.7.9 - September 1, 2018

  • Support for Nvidia CUDA hardware encoding. Please see the matrix found here Video Encode and Decode GPU Support Matrix. The majority of cards support some level of encoding (NVENC) and the high-end cards support "Unrestricted" concurrent encoding sessions.

  • You may use NVENC in Blue Iris for recording, web casting, or exporting video by editing the encoding properties from the Record tab, Options/Web server, or Trim/Export windows. As the number of encoding sessions may be limited by your hardware, you may choose to target particular cameras.

  • If either NVENC or NVDEC is unavailable or cannot otherwise be initialized, Blue Iris will automatically fall back to software. The condition will be logged to status/messages.

  • A new version of BlueIrisService.exe which should now make it impossible to ever result in a never-ending "stopping" state.

  • Support for frame stepping via client apps
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,676
Reaction score
14,023
Location
USA
Yeah so you've still got the Intel GPU disabled entirely (in the BIOS I guess?). No idea what is wrong.
 

SouthernYankee

IPCT Contributor
Joined
Feb 15, 2018
Messages
5,170
Reaction score
5,320
Location
Houston Tx
Just an off-the-wall thought: could it be two subnets? I have not set up two different subnets before, e.g. 192.168.1.xx and 192.168.2.xx.
 

pov2

Getting the hang of it
Joined
Sep 7, 2018
Messages
229
Reaction score
46
Location
Canada
Blue Iris memory usage continually climbs and maxes out anywhere between a few hours and a few days.
I wonder what you meant by "maxes out". Does it reach some maximum and not increase further? If so, that's normal: a memory leak causes memory usage to increase indefinitely until either Blue Iris or the system crashes. I noticed you have many cameras and not all of them are sending a signal (some show 0 kB/s). When at some moment they all send data, Blue Iris allocates memory for it and perhaps keeps it reserved in case they send data again. That's why it starts with lower memory usage, then climbs and "maxes out".
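
One way to tell the two apart from a memory log (like the one sketched earlier in the thread) is to fit a trend line to the samples after a warm-up period: a plateau flattens out, while a leak keeps a positive slope. A rough sketch; the CSV name, column layout, and one-hour warm-up window are assumptions carried over from that earlier sketch:

# leak_or_plateau.py - decide whether logged memory growth looks like a leak.
import csv

SAMPLES_PER_HOUR = 60  # the logging sketch sampled once per minute

with open("bi_memory_log.csv") as f:
    rows = list(csv.DictReader(f))

# Skip the first hour so startup allocation (buffers filling) is ignored.
values = [float(r["rss_mb"]) for r in rows[SAMPLES_PER_HOUR:]]
if len(values) < 2:
    raise SystemExit("Not enough samples yet; let the logger run longer.")

# Least-squares slope in MB per sample, computed by hand to avoid dependencies.
n = len(values)
mean_x = (n - 1) / 2
mean_y = sum(values) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(values)) / \
        sum((x - mean_x) ** 2 for x in range(n))

mb_per_hour = slope * SAMPLES_PER_HOUR
print(f"Trend after warm-up: {mb_per_hour:+.1f} MB/hour")
print("Looks like a leak" if mb_per_hour > 1.0 else "Looks like a plateau")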
 