Performance benefit from hardware decoding

rkn

Young grasshopper
Joined
May 8, 2017
Messages
41
Reaction score
9
I appreciate that it varies by resolution, frame rate and CPU, but I wonder what sort of drop in CPU usage people see when enabling hardware decoding where it's supported?

The reason is that I am planning the next generation of servers for my home infrastructure. Currently I have two PCs in a server role running VMs for core services such as file serving, media streaming, version control, network controller/firewall and BI.

In the tests I've tried to date, running in a VM has prevented the use of hardware decoding. Hence I am wondering whether to add a third, dedicated PC for BI, but in practice the drop in CPU usage would need to be significant to justify the overhead of yet another device. I am not CPU-limited with BI currently, so that's not a factor either.
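
As an aside, a rough way to check from inside a guest whether an Intel iGPU is even visible to the OS (a prerequisite for hardware decoding) would be something like the Python sketch below. It assumes a Windows guest with the third-party wmi package installed, so treat it as illustrative rather than definitive.

import wmi  # pip install wmi; Windows-only, assumed to be available here

c = wmi.WMI()
gpus = [gpu.Name for gpu in c.Win32_VideoController()]

print("Video controllers seen by the OS:")
for name in gpus:
    print(" -", name)

# Without iGPU passthrough, a VM usually only shows the hypervisor's virtual adapter.
if any("Intel" in (name or "") for name in gpus):
    print("An Intel iGPU is visible, so Quick Sync decoding may be possible.")
else:
    print("No Intel iGPU visible; BI hardware decoding won't be available here.")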
 

Dasstrum

IPCT Contributor
Joined
Nov 4, 2016
Messages
578
Reaction score
736
Location
Florida
On my personal setup I have an i5-3570 with 8GB of RAM.
I have 7 cameras pulling 250 MP/sec.

Before HA I was averaging around 60% CPU load. This was with Blue Iris running as a service on Windows 7.
After installing Windows 10 and turning on HA I am averaging around 35-38% CPU load.

I don't know how much upgrading to Windows 10 itself played a role in the reduction in load, other than allowing Blue Iris to run as a service and enabling HA.
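
For anyone not familiar with the MP/sec figure: as I understand it, it's just the total pixel throughput BI is decoding, i.e. width x height x frame rate summed across the cameras. A quick back-of-the-envelope in Python, using a made-up camera mix purely to show the arithmetic (not my actual setup):

# Hypothetical camera mix, only to illustrate how MP/sec adds up.
cameras = [
    # (name, width, height, fps)
    ("front door", 1920, 1080, 15),
    ("driveway",   1920, 1080, 15),
    ("backyard",   2560, 1440, 12),
    ("side gate",  1280,  720, 15),
]

total = 0.0
for name, w, h, fps in cameras:
    mp_per_sec = w * h * fps / 1_000_000  # megapixels decoded per second
    total += mp_per_sec
    print(f"{name:>10}: {mp_per_sec:6.1f} MP/sec")

print(f"{'total':>10}: {total:6.1f} MP/sec")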
 

rkn

Young grasshopper
Joined
May 8, 2017
Messages
41
Reaction score
9
Thanks for the info.

That's a reasonable drop, but I'm not sure it justifies the overhead of another PC in my case at the moment, although there are other factors that make a separate PC easier for this role. I've just realised one of my old spare PCs also has an Intel CPU suitable for HA, so I'll give that a try before deciding.
 

Sev

n3wb
Joined
Jan 10, 2019
Messages
7
Reaction score
0
Location
Los Angeles
Yes, it is possible to pass through an Intel iGPU. After 4 days of toiling I got it working, but it doesn't seem to want to work with BI, which is annoying.
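
One way to narrow down whether the problem is the passthrough or BI itself is to test Quick Sync decoding in the guest with something other than BI, e.g. ffmpeg. A rough Python wrapper is below; the sample clip is a placeholder, and it assumes an ffmpeg build with QSV support is on the PATH.

# Try a Quick Sync H.264 decode in the guest without involving BI.
# "sample.mp4" is a placeholder for any H.264 clip on disk.
import subprocess

cmd = [
    "ffmpeg", "-hide_banner",
    "-hwaccel", "qsv",      # request Intel Quick Sync decoding
    "-c:v", "h264_qsv",     # QSV H.264 decoder
    "-i", "sample.mp4",
    "-f", "null", "-",      # decode only, discard the output
]

result = subprocess.run(cmd, capture_output=True, text=True)
if result.returncode == 0:
    print("QSV decode worked - the passed-through iGPU is usable, so the issue is likely BI-side.")
else:
    print("QSV decode failed - the iGPU/driver isn't usable in this guest yet.")
    print(result.stderr[-1000:])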
 