OK, some thoughts on power consumption; take this with a grain of salt.
I ran three 2-minute tests with my PC plugged into a Z-Wave power outlet that reports energy use in watts, configured to report every 1 second. These devices are notoriously not scientifically accurate, but they are generally consistent. I left everything else idle and tried to minimize background activity, to at least get a ballpark of GPU power consumption with CUDA and Blue Iris.
Results (8 cameras, GTX 1080; see my post above):
- CUDA on, BI GPU utilization around 15%: 242 watts, pretty static, 2-5 watt fluctuations.
- CUDA off, Intel HA on, BI GPU utilization = 0: 212 watts, pretty static, similar fluctuations.
So, unscientifically, CUDA offload in BI costs about 30 watts of GPU power in my case on a GTX 1080 (console closed, no streaming). That's roughly a 14% increase in power draw to cut my BI CPU utilization in half, or, in my idle case, to cut total system CPU utilization from about 28% to 20%.
For reference, 8% of my CPU's 140 watt TDP is 11.2 watts, so the GPU route costs about 3x as much power, but it might be worth it if you need to free up CPU.
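For anyone who wants to plug in their own numbers, the back-of-envelope math above can be sketched as follows. Note the usual caveat: scaling TDP by utilization percentage is only a rough proxy for actual CPU power draw, which is how the estimate above was done too.

```python
# Back-of-envelope comparison using the measurements from this post.
cuda_on_watts = 242    # wall power with CUDA decode on, BI GPU util ~15%
cuda_off_watts = 212   # wall power with CUDA off, Intel HA on
cpu_tdp_watts = 140    # CPU TDP from the spec sheet

# Extra power the GPU offload costs at the wall.
gpu_extra_watts = cuda_on_watts - cuda_off_watts            # 30 W
percent_increase = 100 * gpu_extra_watts / cuda_off_watts   # ~14%

# Total CPU utilization dropped ~28% -> ~20%, i.e. 8 points of TDP.
cpu_saved_watts = 0.08 * cpu_tdp_watts                      # 11.2 W

ratio = gpu_extra_watts / cpu_saved_watts                   # ~2.7x
print(f"GPU offload: +{gpu_extra_watts} W (+{percent_increase:.0f}%), "
      f"CPU savings ~{cpu_saved_watts:.1f} W, ratio ~{ratio:.1f}x")
```

Swap in your own wattage readings and TDP to see whether the trade makes sense for your setup.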