Hi,
Can anyone help explain what appears to me to be very strange CPU/GPU usage behaviour when I turn on Intel hardware decode?
With the exact same camera/settings/system, setting hardware decode to Intel increases CPU usage by ~10%, and GPU usage goes from 0 to ~70%. There is no noticeable difference in camera FPS, quality, or anything else. I feel like I'm either being incredibly stupid and missing something, or my system has gone haywire.
CPU/GPU usage with Hardware Decode set to "no":
CPU/GPU usage with Hardware Decode set to "Intel":
Specs / Settings:
i7-9700 @ 3.00 GHz, 8 cores
32 GB RAM
Samsung SSD for the OS and 2x WD Purple drives for the cameras.
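In case it helps, here's a rough Python sketch of how I'd try to isolate the decoder from the recording software itself: decode a minute of one camera's stream with ffmpeg in software, then again with Quick Sync, and watch CPU/GPU usage during each run. The RTSP URL is just a placeholder and it assumes an ffmpeg build with QSV support, so treat it as a sketch rather than exactly what I ran:

```python
import subprocess
import time

# Placeholder RTSP URL -- substitute the real camera address/credentials.
RTSP_URL = "rtsp://user:pass@192.168.1.100:554/stream1"

def timed_decode(hwaccel_args, label):
    """Decode ~60 s of the stream and discard the frames.

    Watch CPU and GPU usage in Task Manager while each run executes;
    only wall-clock time is printed here.
    """
    cmd = [
        "ffmpeg", "-hide_banner", "-loglevel", "error",
        *hwaccel_args,                 # empty for software decode
        "-rtsp_transport", "tcp",
        "-i", RTSP_URL,
        "-t", "60",                    # limit to one minute of video
        "-f", "null", "-",             # discard the decoded output
    ]
    start = time.time()
    subprocess.run(cmd, check=True)
    print(f"{label}: finished in {time.time() - start:.1f} s")

# Software (CPU-only) decode.
timed_decode([], "software decode")

# Intel Quick Sync hardware decode.
timed_decode(["-hwaccel", "qsv"], "Quick Sync decode")
```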
Thanks!
Tom.