High CPU, no or low GPU utilization

Cue

n3wb
Joined
Jun 12, 2018
Messages
8
Reaction score
1
Location
Iceland
I have about 40 cameras; the CPU is at 100% constantly and the computer is very unstable.
Annoyingly, BI does not fully use the dedicated Nvidia GeForce GT 1030.
I have set it to use Nvidia NVDEC, but the GPU stays at 0% for the BI service.
If I set it to use DirectX VA2, the GPU goes to about 20%.
If I set it to use Direct3D11 VA, the GPU goes to about 30%.
Throughout, the CPU stays at 100%.

BI v5.4.4.2
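One way to check whether NVDEC is genuinely idle is to query the decoder engine directly with nvidia-smi, since Windows Task Manager hides the "Video Decode" engine unless you select it and so can show 0% misleadingly. A minimal sketch; the helper function is my own, but the nvidia-smi flags are real:

```python
# Query the NVDEC engine directly:
#   nvidia-smi --query-gpu=utilization.decoder --format=csv,noheader
# prints one line per GPU, e.g. "23 %".

def parse_decoder_utilization(output: str) -> list[int]:
    """Parse nvidia-smi decoder-utilization output into percentages, one per GPU."""
    values = []
    for line in output.strip().splitlines():
        values.append(int(line.replace("%", "").strip()))
    return values

# Example against a captured output string:
sample = "23 %\n"
print(parse_decoder_utilization(sample))  # [23]
```

If this reads a nonzero value while BI is running, NVDEC really is decoding, whatever Task Manager's default view says.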
 

wittaj

IPCT Contributor
Joined
Apr 28, 2019
Messages
24,983
Reaction score
48,722
Location
USA
What processor do you have? Unless it is crap, I suspect you haven't done everything in the optimization wiki.

A member here is running 50 cameras with an i7 4790 and was at 100% CPU and was looking for recommendations for a new machine. After strong persuasion by many of us, he reluctantly agreed to do every optimization and is now at 30% CPU. And that is without a graphics card...Direct-to-disc and substreams are a must....

 

Cue

n3wb
Joined
Jun 12, 2018
Messages
8
Reaction score
1
Location
Iceland
Thank you for the suggestion, and you are right, I have not done it, but I most certainly will work through the optimization wiki.

However, my original question remains: why is Nvidia NVDEC not working for me in BI?

The CPU is an i7-8700k
 

wittaj

IPCT Contributor
Joined
Apr 28, 2019
Messages
24,983
Reaction score
48,722
Location
USA
Yeah, you can run that many cameras on that CPU very well with the optimization and without using the GPU if you wanted.

From that same wiki:

Notes about Nvidia® NVDEC

Nvidia® NVDEC hardware acceleration reduces CPU usage similar to Intel® Quick Sync Video, and is available if you have an Nvidia GPU capable of NVDEC (see Video Encode and Decode GPU Support Matrix). Newer and faster GPUs can handle more video than older or slower GPUs.

WARNING: Nvidia hardware acceleration is actually very inefficient, resulting in much higher power consumption (increases long-term cost and heat output) and higher memory usage. Considering the relatively high purchase price and ongoing expense of using Nvidia hardware, it is a poor choice for most Blue Iris systems.
 

sebastiantombs

Known around here
Joined
Dec 28, 2019
Messages
11,511
Reaction score
27,695
Location
New Jersey
I use NVidia acceleration with no problems at all, other than the heat and power increase. I don't set it globally, though; I set it in each camera, and there is a limit to how many megapixels per second a card can handle. What's really funny with NVidia is that Ken used to recommend it for video processing back in version 3.x.x.x of Blue Iris, which is how I ended up with NVidia cards.
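The per-card limit mentioned above is usually budgeted in megapixels per second: each camera contributes resolution × frame rate to the decoder load. A rough sketch; the camera mix below is a made-up illustration, not anyone's actual system:

```python
# Total decode load in megapixels/second (MP/s) for a list of
# (width, height, fps) tuples. Compare the result against the MP/s
# rating of the GPU you are considering.

def decode_load_mps(cameras):
    """Sum width * height * fps over all cameras, in MP/s."""
    return sum(w * h * fps / 1_000_000 for w, h, fps in cameras)

# Hypothetical mix: 10 4MP mains + 30 720p substreams, all at 15 fps.
cams = [(2688, 1520, 15)] * 10 + [(1280, 720, 15)] * 30
print(round(decode_load_mps(cams)))  # total MP/s the decoder must sustain
```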
 

Cue

n3wb
Joined
Jun 12, 2018
Messages
8
Reaction score
1
Location
Iceland
Odd that a dedicated video card is less efficient at video encoding/decoding than a CPU. But turning over that rock is not why I'm here this time.

Can you imagine any reason why BI would not use Nvidia hardware acceleration?
I upgraded from BI v4>v5, perhaps there is something wrong with the update?
 

sebastiantombs

Known around here
Joined
Dec 28, 2019
Messages
11,511
Reaction score
27,695
Location
New Jersey
Did you set acceleration in each camera? What version is the NVidia driver? Post a screenshot, using the snipping tool, of the "camera" page from the graph with the lightning bolt icon in the upper left of the BI console, and one of the "video" tab for one of your cameras.

I was running a 1060 with no problems, but it died on me and a replacement is way too expensive right now. So I went back to an older 970 that I replaced with the 1060. BI just kept right on chugging along and used it without skipping a beat.
 

SouthernYankee

IPCT Contributor
Joined
Feb 15, 2018
Messages
5,170
Reaction score
5,320
Location
Houston Tx
Screenshots:

1) Windows system information (may need two shots to get all the info).
2) Windows task manager, processes tab, sorted by memory (most at the top). Must contain the memory, disk, network, GPU, and GPU engine columns.
3) Windows task manager, performance, GPU (if you have multiple GPUs, then multiple screenshots).
4) Blue Iris status (lightning bolt graph, upper left corner), clip storage tab.
5) Blue Iris status, cameras tab.
 

Cue

n3wb
Joined
Jun 12, 2018
Messages
8
Reaction score
1
Location
Iceland
I set acceleration in some cameras, but I mostly just change the global setting.
Nvidia driver v27.21.14.6611, dated 12.04.2021.
I currently have BI set to Direct3D11 VA; if I use Nvidia, the GPU has 0% utilization.
Hope these will do.
(Screenshots attached.)
 

Flintstone61

Known around here
Joined
Feb 4, 2020
Messages
6,613
Reaction score
10,950
Location
Minnesota USA
The 1030 might not support what you're trying to do. My GTX 1060 6GB barely meets the criteria to encode/decode video on some chart I read somewhere. I'm running 14 cams on an i5-8500 at 9% to 18% CPU and 928MB to 1.12GB of memory with no video card installed.
 

Flintstone61

Known around here
Joined
Feb 4, 2020
Messages
6,613
Reaction score
10,950
Location
Minnesota USA
Quoting derEisele in a Reddit post from 3 years ago:

"I bought a GT 1030 for the same purpose and as it turned out, the GT 1030 is only able to decode/playback videos. (Tested with ffmpeg/nvenc under Ubuntu 16.04.) So you have to buy a GTX 1050."
 

Flintstone61

Known around here
Joined
Feb 4, 2020
Messages
6,613
Reaction score
10,950
Location
Minnesota USA
I bought a used crypto-mining card 3 years ago on eBay (an Nvidia GeForce GTX 1060) for $100. It's running my home PC. Not willing to pay top dollar for a graphics card right now (not really a gamer). That one will encode and decode, where the 1030 may not do both... allegedly.
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,676
Reaction score
14,024
Location
USA
I have a GT 1030 in my BI box (for the 4K 60Hz display outputs) and it does work with NVDEC acceleration. Weird thing about that: on one of my cameras it never gets past the colored-bars test pattern if the sub stream is set to H.265, yet the same configuration works on a different camera. Try to explain that. Anyway, on my system I've never seen the DirectX VA2 or Direct3D11 VA options make any difference at all versus regular software decoding. Only NVDEC works on my Nvidia card, and I don't use it because I don't need it.
 

Cue

n3wb
Joined
Jun 12, 2018
Messages
8
Reaction score
1
Location
Iceland
It seems the 1030 does not have hardware encoding capability. It does have hardware decoding ability, but I guess that alone is not what BI needs.

So I have to get a new card; I would rather not lower the video FPS or quality.
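The encode/decode split for these cards can be captured in a small lookup. The entries below are my recollection of NVIDIA's Video Encode and Decode GPU Support Matrix (the GT 1030's GP108 chip has NVDEC but no NVENC) and should be verified against the matrix itself; the helper is a hypothetical sketch, not anything built into BI:

```python
# card: (has NVENC encode, has NVDEC decode) -- transcribed from memory
# of NVIDIA's support matrix; verify before purchasing.
CAPS = {
    "GT 1030":  (False, True),   # GP108: decode only, no encoder
    "GTX 1050": (True,  True),
    "GTX 1060": (True,  True),
}

def can_accelerate_decoding(card: str) -> bool:
    """Camera hardware acceleration only exercises the decode side."""
    encode, decode = CAPS.get(card, (False, False))
    return decode

print(can_accelerate_decoding("GT 1030"))  # True
```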
 

Flintstone61

Known around here
Joined
Feb 4, 2020
Messages
6,613
Reaction score
10,950
Location
Minnesota USA
I think there is a chart somewhere that shows Nvidia model numbers with H.264/H.265 columns and encode/decode columns. The 1030 and 1060 were listed with similar yes/no entries, but I think the 1060 had yeses for H.265 and the 1030 may not. I have to crash... I might try again tomorrow to google that matrix.
 

wittaj

IPCT Contributor
Joined
Apr 28, 2019
Messages
24,983
Reaction score
48,722
Location
USA
Many of us run BI without a graphics card, and it isn't needed.

There is something else going on with your system. Not every optimization is being used, and possibly a memory leak, judging by how much RAM is in use.
 

SouthernYankee

IPCT Contributor
Joined
Feb 15, 2018
Messages
5,170
Reaction score
5,320
Location
Houston Tx
Over-allocation will cause high CPU!


1) Turn on the Intel GPU in the BIOS. The Intel driver should load, and the Windows task manager should show two GPUs.
2) Use the acceleration on Intel; use the Nvidia for the video.
3) Set the i-frame interval on your cameras to twice the frame rate.
4) Implement substreams.
5) Fix the allocation problem on the AUX drive. You are over-allocated.
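The i-frame rule above is simple arithmetic: one keyframe every two seconds. A sketch (the helper name is made up, not a BI or camera API):

```python
# The camera's i-frame (keyframe) interval, in frames, should be twice
# the frame rate, giving one keyframe every two seconds.

def target_iframe_interval(fps: int) -> int:
    return 2 * fps

print(target_iframe_interval(15))  # 30: a 15 fps camera gets an interval of 30
```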

What are the make and model numbers of some of your cameras?

=======================================
My Standard allocation post.

1) Do not use time (limit clip age) to determine when BI video files are moved or deleted; only use space. Using time wastes disk space.
2) If New and Stored are on the same disk drive, do not use Stored: set the Stored size to zero and set the New folder to delete, not move. Moving just wastes CPU time and increases the number of disk writes. You can leave the Stored folder on the drive, just do not use it.
3) Never allocate over 90% of the total disk drive to BI.
4) If using continuous recording, on the BI camera settings record tab, set the combine and cut video to 1 hour or 3 GB. Really big files are difficult to transfer.
5) It is recommended NOT to store video on an SSD (the C: drive).
6) Do not run the disk defragmenter on the video storage disk drives.
7) Do not run virus scanners on BI folders.
8) An alternate way to allocate space on multiple drives is to assign different cameras to different drives, so there is no file movement between New and Stored.
9) Never use an external USB drive for the New folder. Never use a network drive for the New folder.


Advanced storage:
If you are using a complete disk for large video file (BVR) continuous-recording storage, I recommend formatting the disk with a Windows cluster size of 1024 KB (1 megabyte). This is an increase from the 4 KB default. It will reduce the physical number of disk writes, decrease disk fragmentation, and speed up access.
Hint:
On the Blue Iris status (lightning bolt graph) clip storage tab, if there is any red on the bars you have an allocation problem. If there is no green, you have no free space; this is bad.
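The 90% rule above is easy to check with a one-liner. A sketch; the numbers in the example are illustrative, not a reading of any real system:

```python
# Flag over-allocation: BI should get at most 90% of a drive's
# formatted capacity, per the rule above.

def allocation_ok(allocated_gb: float, capacity_gb: float) -> bool:
    """True if BI is allocated no more than 90% of the drive."""
    return allocated_gb <= 0.9 * capacity_gb

print(allocation_ok(5000, 5450))  # False: 5000/5450 is about 91.7%
print(allocation_ok(4900, 5450))  # True: under the 90% ceiling
```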
 

Cue

n3wb
Joined
Jun 12, 2018
Messages
8
Reaction score
1
Location
Iceland
These are mostly Unifi cameras (G3 Dome, G3 PRO) and some Mobotix 360° cameras.
I will enable the Intel GPU when I can access the computer with a monitor; as is, the computer is headless.
I have two 6TB disks. I had allocated 5000GB of a formatted capacity of 5450GB, which is about 91%. Yesterday I changed that to 4900GB, which is under 90%.
I don't have any way of reformatting the disks for a bigger cluster size (I had no idea a 1MB cluster was possible); this is an active environment. But I will do that when we add or swap out disks. Although, no camera is on continuous recording.
I don't think there is a memory leak, except if it is in BI. However, I doubt it; when I took the screenshots, the computer had been active for less than one day since restart.
I disabled defrag on all drives; I thought W10 disabled that by default on SSDs.

Is there any way to change settings globally, or for a selection of cameras? Testing and changing this many cameras takes a long time.
 

SouthernYankee

IPCT Contributor
Joined
Feb 15, 2018
Messages
5,170
Reaction score
5,320
Location
Houston Tx
The F drive has other stuff on it; it is over-allocated. Sometimes this is caused by the BI DB not matching what is actually on the disk.

I do not know how to make global changes to cameras.

Implementing substreams will make a major change in your CPU load. Also, putting 50% of your cameras on the Intel GPU will improve your CPU load.
 

Flintstone61

Known around here
Joined
Feb 4, 2020
Messages
6,613
Reaction score
10,950
Location
Minnesota USA
If you have a number of identical cameras, you can get one configured, then export the config file as a name-it-yourself file, like maybe Global_Unifi_5Mpcams.bin.
Then import that config into the remaining cams that are identical.
 