4.7.8 - August 28, 2018 - Support for Nvidia CUDA hardware decoding

Joined
Sep 2, 2018
Messages
6
Reaction score
0
Location
Atlanta, GA
In your opinion, what is reasonable CPU usage? Sorry for asking so many questions, but I know you know better than me :)
 

FireRock

n3wb
Joined
Sep 3, 2018
Messages
6
Reaction score
0
Location
44310
Is it possible to use Intel Quick Sync to decode H.264 streams (i7-8700K) while simultaneously decoding H.265 streams on an Nvidia graphics card (1070)?
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,897
Reaction score
21,250
Is it possible to use Intel Quick Sync to decode H.264 streams (i7-8700K) while simultaneously decoding H.265 streams on an Nvidia graphics card (1070)?
You should be able to. However, with an Nvidia 1070 costing as much as a second system, it makes little sense.
You can also try H.265 decoding with Intel; it is supported by the software and your hardware, but apparently it is finicky about the driver.
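For anyone who wants to confirm both decode engines work on their hardware before committing Blue Iris to them, here is a rough sketch using ffmpeg (assumes an ffmpeg build with QSV and CUDA support on the PATH; the RTSP URLs are placeholders, not real cameras):

```python
import subprocess

# Placeholder RTSP URLs -- substitute your own camera streams.
H264_STREAM = "rtsp://user:pass@192.168.1.10:554/h264"
H265_STREAM = "rtsp://user:pass@192.168.1.11:554/h265"

def decode_cmd(hwaccel, url, seconds=10):
    """Build an ffmpeg command that decodes `seconds` of a stream on the
    given hardware engine and throws the frames away; ffmpeg reports the
    achieved decode speed when it exits."""
    return [
        "ffmpeg", "-hide_banner",
        "-hwaccel", hwaccel,   # "qsv" = Intel Quick Sync, "cuda" = NVDEC
        "-i", url,
        "-t", str(seconds),
        "-f", "null", "-",     # decode only, no output file
    ]

# Run both decodes concurrently: Quick Sync on the H.264 stream and
# the Nvidia card (NVDEC) on the H.265 stream.
procs = [
    subprocess.Popen(decode_cmd("qsv", H264_STREAM)),
    subprocess.Popen(decode_cmd("cuda", H265_STREAM)),
]
for p in procs:
    p.wait()
```

If both processes report realtime or better decode speed, the two engines are working side by side.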
 

peterfram

Young grasshopper
Joined
Nov 4, 2015
Messages
37
Reaction score
21
An update on my GPU offload power tests. I was just doing similar tests with different GPU HA options in the K-Lite codec pack and realized that in my earlier tests I had measured my entire media power strip, not just my PC. (The previous results included a 34" monitor, routers, hubs, and other gear.)

Actual power results with BI CUDA on/off:
CUDA HA off: 118 watts
CUDA HA on: 148 watts

Still a 30 watt difference. That is a constant 30 watts of GPU overhead with 8 cameras streaming, console open or closed. With the BI console open, CPU utilization and total PC power increase, but the GPU draw stays constant at around 30 watts.

Also, CUDA HA on the web server is a no-go for me. Streaming is fine, but recording playback in UI3 or the Android app just freezes on load at whatever frame I loaded, whether at play start or when sliding forward. I haven't thought it through yet, but it seems to stem from my recording settings (MPEG-4, Direct to Disk). There's no problem when I switch recording to BVR.
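For anyone who wants to watch the GPU-side draw directly rather than at the wall, here is a rough sketch that logs the card's reported board power via nvidia-smi (assumes the Nvidia driver's nvidia-smi tool is on the PATH; the 5-minute window is arbitrary):

```python
import subprocess
import time

# Poll the Nvidia board power draw once a second for 5 minutes.
# power.draw is a standard nvidia-smi query field; nvidia-smi ships
# with the Nvidia driver.
for _ in range(300):
    watts = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(f"{time.strftime('%H:%M:%S')}  {watts} W")
    time.sleep(1)
```

Leaving this running while opening and closing the console should show whether the overhead really is flat.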
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,897
Reaction score
21,250
An update on my GPU offload power tests. I was just doing similar tests with different GPU HA options in the K-Lite codec pack and realized that in my earlier tests I had measured my entire media power strip, not just my PC. (The previous results included a 34" monitor, routers, hubs, and other gear.)

Actual power results with BI CUDA on/off:
CUDA HA off: 118 watts
CUDA HA on: 148 watts

Still a 30 watt difference. That is a constant 30 watts of GPU overhead with 8 cameras streaming, console open or closed. With the BI console open, CPU utilization and total PC power increase, but the GPU draw stays constant at around 30 watts.
Still very high numbers for your load...
 

peterfram

Young grasshopper
Joined
Nov 4, 2015
Messages
37
Reaction score
21
It's a flat 100 watts if I stop the BI service, and 84 watts if I also turn off my weather station software. It's a 4-year-old PC...
 

FireRock

n3wb
Joined
Sep 3, 2018
Messages
6
Reaction score
0
Location
44310
You should be able to. However, with an Nvidia 1070 costing as much as a second system, it makes little sense.
You can also try H.265 decoding with Intel; it is supported by the software and your hardware, but apparently it is finicky about the driver.
Hi @fenderman. I am putting together a fairly advanced new Blue Iris build for my business with 35 Dahua cams and I want to document the process. Where is the best place to post build pics and details as well as performance statistics on H.264 and H.265 decoding? I have read Cliff's Notes and CUDA notes as well as multiple forum threads. I'll be ready to post the build soon.
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,666
Reaction score
14,006
Location
USA
Hi @fenderman. I am putting together a fairly advanced new Blue Iris build for my business with 35 Dahua cams and I want to document the process. Where is the best place to post build pics and details as well as performance statistics on H.264 and H.265 decoding? I have read Cliff's Notes and CUDA notes as well as multiple forum threads. I'll be ready to post the build soon.
Perhaps NVR's, DVR's & Computers
or General BI Talk
 

DebrodeD

Young grasshopper
Joined
Feb 10, 2017
Messages
55
Reaction score
4
Location
Utah
If you are running a server with an Nvidia card in it, does this update make it easier to select Intel Quick Sync for Blue Iris and ignore the Nvidia card?
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,666
Reaction score
14,006
Location
USA
If you are running a server with an Nvidia card in it, does this update make it easier to select Intel Quick Sync for Blue Iris and ignore the Nvidia card?
No, this update would have no effect at all on how easy or difficult it is to use Quick Sync while a discrete GPU is installed.
 

Bavery

n3wb
Joined
Feb 27, 2018
Messages
10
Reaction score
8
Just for another data point, I did some testing before and after installing a GTX 1070 Ti card in my roughly 6-year-old system, running 9 cameras. The i7-4770K CPU supports H.264 Quick Sync, but I've only been able to use it successfully with my 4 (newer) Dahua cams. The other 5 cameras are older low-end Insteon/Foscam 720p cameras that end up with horrible artifacts all over the screen if I enable Intel GPU decoding for them. Power usage was tested with a "DT500 power monitor" device and should be very accurate.

9 cameras ranging from 0.9 to 6.3 MP (17.1 MP total)
Total bitrate according to BI: ~3200 kB/s, 295 MP/s

Before (H.264, with the Intel GPU decoding the Dahua cameras):
~20% CPU load / 92 W average power consumption

After (CUDA for all cameras, plus switching the Dahua cameras to H.265):
~14% CPU & ~15% GPU / 108 W average power consumption

The reduction in CPU usage likely came from being able to enable CUDA for all cameras instead of using Quick Sync for only 4 of the 9. I was a little concerned about the possible extra power cost after reading this thread while waiting for the new graphics card to arrive, but I'm happy to pay the extra 16 W to be able to use GPU decoding for all of my cameras now. Also, I noticed there's now a CUDA encoding option for the web server, which is nice to see. My wife did comment that videos seem to start playing faster now through the Android app, but I'm not sure whether that was due to the upgrade/CUDA changes or to some antivirus/firewall changes that occurred around the same time.
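For context, BI's MP/s number is just the sum over cameras of megapixels times frame rate. A quick sketch with illustrative values (assumed resolutions and frame rates, not the exact settings above):

```python
# Megapixels and frame rate per camera -- illustrative values only,
# chosen to land near the totals reported above.
cameras  = [(6.3, 15), (2.1, 15), (2.1, 15), (2.1, 15)]  # 4 Dahua cams
cameras += [(0.9, 25)] * 5                               # 5 Foscam 720p cams

total_mp  = sum(mp for mp, _ in cameras)
total_mps = sum(mp * fps for mp, fps in cameras)
print(f"{total_mp:.1f} MP total, {total_mps:.0f} MP/s")
# -> 17.1 MP total, 302 MP/s (in the same ballpark as BI's reported 295 MP/s)
```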
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,897
Reaction score
21,250
Just for another data point, I did some testing before and after installing a GTX 1070 Ti card in my roughly 6-year-old system, running 9 cameras. The i7-4770K CPU supports H.264 Quick Sync, but I've only been able to use it successfully with my 4 (newer) Dahua cams. The other 5 cameras are older low-end Insteon/Foscam 720p cameras that end up with horrible artifacts all over the screen if I enable Intel GPU decoding for them. Power usage was tested with a "DT500 power monitor" device and should be very accurate.

9 cameras ranging from 0.9 to 6.3 MP (17.1 MP total)
Total bitrate according to BI: ~3200 kB/s, 295 MP/s

Before (H.264, with the Intel GPU decoding the Dahua cameras):
~20% CPU load / 92 W average power consumption

After (CUDA for all cameras, plus switching the Dahua cameras to H.265):
~14% CPU & ~15% GPU / 108 W average power consumption

The reduction in CPU usage likely came from being able to enable CUDA for all cameras instead of using Quick Sync for only 4 of the 9. I was a little concerned about the possible extra power cost after reading this thread while waiting for the new graphics card to arrive, but I'm happy to pay the extra 16 W to be able to use GPU decoding for all of my cameras now. Also, I noticed there's now a CUDA encoding option for the web server, which is nice to see. My wife did comment that videos seem to start playing faster now through the Android app, but I'm not sure whether that was due to the upgrade/CUDA changes or to some antivirus/firewall changes that occurred around the same time.
Your base load of 92 W is crazy high... it should be closer to 30 W under that load...
You then spent $200 on a video card to reduce consumption from 20% to 14%; this makes no sense to me... you are now paying a perpetual tax for that power use...
A smarter solution would be to buy an efficient $100 i5 system and not use HA on the Foscams, saving you $50-100 a year in power costs...
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,666
Reaction score
14,006
Location
USA
Yeah, 20% is already a low load. There's no reason to throw in CUDA decoding to reduce that. Interesting note about the encoding possibly making the web server more responsive, though. I haven't been able to try that yet, since none of my BI systems have an Nvidia card capable of encoding.
 

Bavery

n3wb
Joined
Feb 27, 2018
Messages
10
Reaction score
8
Your base load of 92 W is crazy high... it should be closer to 30 W under that load...
You then spent $200 on a video card to reduce consumption from 20% to 14%; this makes no sense to me... you are now paying a perpetual tax for that power use...
A smarter solution would be to buy an efficient $100 i5 system and not use HA on the Foscams, saving you $50-100 a year in power costs...
I don't know what the typical idle load is for a 4770K, but you're right that this is probably a bit high. It's been overclocked since day 1, which is definitely going to increase the power cost a bit. Plus, after your comment I looked up typical idle power draw for hard drives, and it sounds like having 4-5 old drives still plugged in and idling away all the time could well be a contributing factor too.

Anyway, you're right that there are more cost-effective ways to go, and for a computer dedicated only to Blue Iris, an Nvidia GPU may never make financial sense. In my case, that "tax" for the extra power use works out to something like $20/year, which isn't going to break the bank and is small enough that I'm willing to give the new CUDA settings a try for a while and see how things go.
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,897
Reaction score
21,250
I don't know what the typical idle load is for a 4770K, but you're right that this is probably a bit high. It's been overclocked since day 1, which is definitely going to increase the power cost a bit. Plus, after your comment I looked up typical idle power draw for hard drives, and it sounds like having 4-5 old drives still plugged in and idling away all the time could well be a contributing factor too.

Anyway, you're right that there are more cost-effective ways to go, and for a computer dedicated only to Blue Iris, an Nvidia GPU may never make financial sense. In my case, that "tax" for the extra power use works out to something like $20/year, which isn't going to break the bank and is small enough that I'm willing to give the new CUDA settings a try for a while and see how things go.
Actually, it's much more than $20. Even if you only pay $0.10/kWh, the 70 W difference between running your PC and a $100 PC that consumes 30 W comes to about $60 a year.
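For anyone following along, the arithmetic is simple; a quick sketch using the $0.10/kWh rate and ~70 W delta above:

```python
# Annual cost of a constant power draw, in dollars:
#   watts / 1000 (kW) * 24 * 365 (hours/year) * rate ($/kWh)
def annual_cost(watts, rate_per_kwh=0.10):
    return watts / 1000 * 24 * 365 * rate_per_kwh

delta_watts = 70  # ~100 W existing PC vs. a ~30 W replacement build
print(f"${annual_cost(delta_watts):.2f}/year")   # -> $61.32/year
```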
 