Performance benefit from hardware decoding

Discussion in 'General BI Talk' started by rkn, Feb 14, 2018 at 5:59 AM.

  1. rkn

    rkn n3wb

    Joined:
    May 8, 2017
    Messages:
    19
    Likes Received:
    9
    I appreciate that it varies by resolution, frame rate and CPU, but I'm wondering what sort of drop in CPU use people see when enabling hardware decoding where it's supported?

    The reason I ask is that I'm planning the next generation of servers for my home infrastructure. Currently I have two PCs in a server role running VMs for core services such as file serving, media streaming, version control, network controller/firewall and BI.

    In the tests I've tried to date, running in a VM prevents hardware decoding from being used. Hence I'm wondering whether to add a third, dedicated PC for BI, but in practice the drop in CPU would need to be significant to justify the overhead of yet another device. I'm not CPU-limited with BI currently, so that isn't a factor either.
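    One quick way to see whether a guest VM actually exposes Quick Sync before committing to a third box is to probe it from inside the guest. Below is a minimal sketch, assuming Python 3 and an ffmpeg build are installed in the VM, and using a placeholder clip name (sample.mp4) exported from one of the cameras; it only tests generic QSV decode availability, not Blue Iris itself:

        # Ask the local ffmpeg which hardware acceleration methods it can see,
        # then attempt a decode-only QSV run on a sample clip. If "qsv" is not
        # listed inside the guest, Intel hardware decoding won't be usable there.
        import subprocess

        def list_hwaccels() -> list[str]:
            out = subprocess.run(
                ["ffmpeg", "-hide_banner", "-hwaccels"],
                capture_output=True, text=True, check=True
            ).stdout
            # The first line of the output is the header "Hardware acceleration methods:"
            return [line.strip() for line in out.splitlines()[1:] if line.strip()]

        def try_qsv_decode(sample: str) -> bool:
            # Decode-only test: discard the decoded frames with the null muxer.
            result = subprocess.run(
                ["ffmpeg", "-hide_banner", "-hwaccel", "qsv",
                 "-i", sample, "-f", "null", "-"],
                capture_output=True, text=True
            )
            return result.returncode == 0

        if __name__ == "__main__":
            methods = list_hwaccels()
            print("hwaccels:", methods)
            if "qsv" in methods:
                print("QSV decode OK:", try_qsv_decode("sample.mp4"))

    If the QSV decode test never succeeds inside the guest, some form of GPU passthrough would be needed before a VM-hosted BI could benefit from hardware decoding at all.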
     
  2. Dasstrum

    Dasstrum Pulling my weight

    Joined:
    Nov 4, 2016
    Messages:
    237
    Likes Received:
    189
    On my personal setup I have an i5-3570 with 8 GB of RAM.
    I have 7 cameras pulling 250 MP/s in total.

    Before HA I was averaging around 60% CPU load - this was with Blue Iris running as a service on Windows 7.
    After installing Windows 10 and turning on HA I am averaging around 35-38% CPU load.

    I don't know how much upgrading to Windows 10 played a role in the reduction in load, other than allowing Blue Iris to run as a service and enabling HA.
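
    For context, the 250 MP/s figure is an aggregate pixel rate, and it lines up with a handful of 1080p streams. A rough worked example in Python, with the per-camera resolution and frame rate purely assumed, since the post doesn't give them:

        # Hypothetical numbers: seven 1080p cameras at ~17 fps each.
        def mp_per_sec(width: int, height: int, fps: float) -> float:
            # Megapixels decoded per second for one stream.
            return width * height * fps / 1e6

        per_cam = mp_per_sec(1920, 1080, 17)   # ~35 MP/s per camera
        total = 7 * per_cam                    # ~247 MP/s, close to the reported 250
        print(f"{per_cam:.1f} MP/s per camera, {total:.0f} MP/s total")

    Against that load, the reported change from roughly 60% to 35-38% CPU works out to roughly a 40% relative reduction, though, as noted above, some of that may come from the Windows 10 change rather than hardware decoding alone.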