Nvidia CUDA acceleration not working

BassTeQ

Young grasshopper
Joined
May 10, 2017
Messages
62
Reaction score
8
Hi, using BI v4.8.6.3, I'm trying to get hardware acceleration working on my system.
I have a GeForce GT 430. When I enable "Nvidia CUDA acceleration" in BI and check the GPU usage in Task Manager, I don't see any "video decoding" activity.

MB : Gigabyte Z68X-UD4-B3
CPU : i7 2600
GPU : NVidia GeForce GT 430

Any ideas why it may not be working?

Also, reading here (Choosing Hardware for Blue Iris | IP Cam Talk), it says my CPU (i7-2600) should support Quick Sync; however, if I enable "Intel" decoding I still don't see any GPU decoding occurring.

Cheers
 

Mikk36

Getting the hang of it
Joined
Aug 21, 2018
Messages
105
Reaction score
42
Location
Estonia
You can check if it's hardware accelerated from the camera list view.
A hash sign (#) before the MP value indicates that the camera is currently using hardware decoding.
 

BassTeQ

Young grasshopper
Joined
May 10, 2017
Messages
62
Reaction score
8
Thanks for the reply!

Video Encode and Decode GPU Support Matrix
Check the NVDEC Support Matrix (full GeForce list)

Sandy Bridge supports H.264 only in a limited fashion.
I can't see the GT 430 on that list; does that mean it's not supported?
If I'm using the GPU for hardware decoding, why does it matter that the CPU only supports it in a limited fashion?

You can check if it's hardware accelerated from the camera list view.
I do see a # next to the cameras, but I still don't see any GPU usage, so I'm not sure.
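For what it's worth, Task Manager isn't the only way to check: `nvidia-smi dmon -s u` prints per-GPU utilisation including a `dec` (decoder) column. A minimal sketch of parsing that output follows; the exact column layout is assumed from common driver versions and may differ on yours, and the sample text below is made up for illustration.

```python
# Minimal sketch: parse sample `nvidia-smi dmon -s u` output and report
# NVDEC (decoder) utilisation per GPU. Column layout assumed, may vary
# by driver version; SAMPLE is fabricated example output.
SAMPLE = """\
# gpu    sm   mem   enc   dec
# Idx     %     %     %     %
    0     4     2     0    23
"""

def decoder_usage(dmon_output: str) -> dict:
    """Map GPU index -> decoder utilisation (%) from dmon text."""
    header = None
    usage = {}
    for line in dmon_output.splitlines():
        if line.startswith("#"):
            cols = line[1:].split()
            if "dec" in cols:          # the column-name header row
                header = cols
            continue
        fields = line.split()
        if header and len(fields) == len(header):
            gpu = int(fields[header.index("gpu")])
            usage[gpu] = int(fields[header.index("dec")])
    return usage

print(decoder_usage(SAMPLE))  # a non-zero 'dec' value means NVDEC is active
```

A sustained non-zero `dec` percentage while BI is running would confirm the GPU really is decoding, independently of what Task Manager shows.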
 

BassTeQ

Young grasshopper
Joined
May 10, 2017
Messages
62
Reaction score
8
If I upgrade only my GPU to a new GTX 1050/1060/1070/1080, will hardware offload work?
 

Mikk36

Getting the hang of it
Joined
Aug 21, 2018
Messages
105
Reaction score
42
Location
Estonia
When using H.264, do you see a drop in CPU usage when you set all cameras to use Intel acceleration?
As a comparison: I normally use H.265+ for my cameras. When I reverted all six 1080p 15 fps camera streams to H.264 and enabled acceleration, CPU usage dropped from about 30% to about 20%. I still run in software mode, though, because with H.265+ I get about 80% lower bandwidth usage (at least during the daytime: ~2500 KB/s vs ~500 KB/s). On the downside, I can't fast-forward quickly without acceleration. An Nvidia card is on the buy list.
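The bandwidth figures quoted above work out as follows (a quick sanity check of the ~80% claim):

```python
# Quick check of the daytime bandwidth figures quoted above.
h264_rate = 2500   # KB/s, all streams on H.264
h265_rate = 500    # KB/s, all streams on H.265+

saving = (1 - h265_rate / h264_rate) * 100
print(f"H.265+ uses about {saving:.0f}% less bandwidth")  # → about 80% less
```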
Yes, the 10-series supports hardware acceleration in BI, including H.265.
 

BassTeQ

Young grasshopper
Joined
May 10, 2017
Messages
62
Reaction score
8
When using H.264, do you see a drop in CPU usage when you set all cameras to use Intel acceleration?
My motherboard doesn't have onboard graphics outputs, so I believe this means Quick Sync won't work.

Not all of my cameras support H.265+, so I'll have to stick with H.264 for now.

Yes, the 10-series supports hardware acceleration in BI, including H.265.
Great, so just by adding a new video card I'll be able to offload decoding and lower CPU usage.

Cheers
 

Mikk36

Getting the hang of it
Joined
Aug 21, 2018
Messages
105
Reaction score
42
Location
Estonia
You don't have to limit yourself to H.264 if some of the cameras don't support H.265(+). You can have a mix of H.264, H.264+, H.265, H.265+ and even MJPEG.
 

BassTeQ

Young grasshopper
Joined
May 10, 2017
Messages
62
Reaction score
8
I did tinker with H.265 some time ago, but from memory BI didn't fully support it; it couldn't do motion detection, etc.
 

Mikk36

Getting the hang of it
Joined
Aug 21, 2018
Messages
105
Reaction score
42
Location
Estonia
The only limitation is hardware acceleration: H.265 isn't supported via Intel QuickSync, so it needs Nvidia CUDA for that. No other limitations.
 