My actual bill is even less interpretable because I have solar on the roof. PG&E has been raising rates lately to pay for the wildfire habit they have developed in the past few years.
I have a Kill A Watt already plugged into my machine and a Quadro P400 card in it. The listed P400 TDP is 30 W. I just ran a GPU benchmark and watched the power usage jump between 10 and 30 W while it was running. I'd say the average increase in usage was 15 W, so that's ~50% of the TDP; with the 1650 that would be ~37 W. With the new numbers: $.45 * 24 hrs * 365 days * 37 W = $146 yearly cost.
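If anyone wants to sanity-check that or plug in their own numbers, here's a quick Python sketch of the same calculation (the $0.45/kWh rate, the 15 W / 37 W averages, and the 24/7 duty cycle are just the assumptions from above):

```python
# Rough yearly electricity cost for a card drawing a given average wattage 24/7.
# The rate and wattages below are the assumptions from the post, not measurements of your setup.
RATE_PER_KWH = 0.45        # $/kWh (the PG&E-ish rate used above)
HOURS_PER_YEAR = 24 * 365

def yearly_cost(avg_watts: float) -> float:
    kwh_per_year = (avg_watts / 1000.0) * HOURS_PER_YEAR
    return kwh_per_year * RATE_PER_KWH

print(f"P400 @ 15 W avg: ${yearly_cost(15):.2f}/yr")   # roughly $59
print(f"1650 @ 37 W avg: ${yearly_cost(37):.2f}/yr")   # roughly $146
```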
This also assumes you're running the GPU inference flat out the whole year, which you may not be. And when I've done the math in the past, the GPU is still about 20x more energy efficient than the CPU. But this is one of the reasons I've been reworking the Coral TPU implementation for CPAI: Coral TPUs run at only 2 W each.
Your math is a little bit mixed up: you missed a few decimal points; it should be .0037. The correct calculation would be .0037 * 24 * 365 * .45 = $14.59 per year. However, in LA County, where I own property outside of Lancaster, the rate is .35 for the generation charge plus added fees. That calculation would look like this: .0037 * 24 * 365 * .35 = $11.34 per year. Out here in Wyoming we pay .11 per kWh, so that would look like this: .0037 * 24 * 365 * .11 = $3.57 per year.
One misplaced decimal point will make your calculation look really bad.
As far as the TPU using less power, that is true, but the models are lagging behind the GPU, as is the speed (slightly). However, the TPU is still faster than the CPU and uses less power.
While using a GTX 1650, my remote PC averages 2-5% load when all the cameras fire off an AI call at the same time. That being said, using MSI Afterburner (and confirmed through GeForce Experience), the card reports pulling only 26 watts at that load. If you were to run the card headless it would pull even less, as it would not be sending power down the cable to a monitor to display an image.
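If anyone wants to double-check what Afterburner shows, NVIDIA's NVML exposes the same power sensor. Here's a small sketch using the pynvml bindings; the 30-second sample window and GPU index 0 are just placeholder choices:

```python
# Sample GPU power draw and utilization via NVML (pip install nvidia-ml-py).
# Assumes a single NVIDIA card at index 0; run it while the cameras trigger AI calls.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

for _ in range(30):                                             # ~30 one-second samples
    power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0   # NVML reports milliwatts
    util = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu     # percent GPU utilization
    print(f"{power_w:5.1f} W  {util:3d}%")
    time.sleep(1)

pynvml.nvmlShutdown()
```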
No way the discrete GPU is more efficient than a CPU... but with the .NET module you can use the Intel built-in GPU.
As far as power usage goes, the GPU is still more efficient based on the speed of confirmed alerts and the power used. On my secondary machine, with 12 cameras recording continuously and firing off AI requests all at the same time, my Ryzen 7 5700X (rated at 65 watts) shows roughly 65% usage in Windows Task Manager at that point in time. So I ran some tests with PBO off, and the average power draw reported by Ryzen Master was on the order of 42.25 watts for the CPU at that load. Using the math from the OP, that would equate to $16.56 per year to run the CPU to analyze the AI requests.
The CPU is using more power to process the events (although only marginally more). On top of that, it takes the CPU 250 ms on average to analyze an image, equating to a slower rate of identification. IMO this gives the GPU a slight advantage in both power usage and analysis time, making the GPU the more efficient of the two methods.
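To make that comparison concrete, what really matters is energy per analyzed image (average watts * seconds per inference), not watts alone. A rough sketch using the figures from this thread; the 42.25 W and 250 ms CPU numbers are from the posts above, while the ~100 ms GPU inference time is only an assumption since it wasn't measured here:

```python
# Energy per analyzed image = average power draw (W) * inference time (s) = joules.
# CPU figures come from the thread; the GPU latency is an assumed placeholder.
def joules_per_image(avg_watts: float, seconds_per_image: float) -> float:
    return avg_watts * seconds_per_image

cpu_j = joules_per_image(42.25, 0.250)   # ~10.6 J per image
gpu_j = joules_per_image(26.0, 0.100)    # ~2.6 J per image (assumed 100 ms)

print(f"CPU: {cpu_j:.1f} J/image")
print(f"GPU: {gpu_j:.1f} J/image")
print(f"GPU is roughly {cpu_j / gpu_j:.1f}x more energy efficient per image under these assumptions")
```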
Give me some time and I will figure out the math on using the iGPU on my 5820K and report the results back. I believe it will be more of the same due to the higher power draw of the CPU.