Recommend an Economical GPU for CPAI

mvoss

n3wb
Joined
Oct 21, 2023
Messages
20
Reaction score
21
Location
Texas
Hello,

I'm running the following machine, dedicated to BI:

System Manufacturer Dell Inc.
System Model Vostro 3671
System Type x64-based PC
Processor Intel(R) Core(TM) i5-9400 CPU @ 2.90GHz, 2904 Mhz, 6 Core(s), 6 Logical Processor(s)
Memory 24 GB
6TB WD Purple
Windows 11

I have 14 cameras, each 4MP. I use substreams and everything is fine. I want to use CPAI on more cameras than I have been, a total of 7.

Can anyone recommend an economical GPU to help offload some of the CPU usage, in the $100-$150 range?

I don't want to rely on the cameras' built-in AI, as we have a lot of wildlife and I like catching that also.

Thank you for your help,

Michael
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,903
Reaction score
21,275
You will likely need to upgrade your power supply - you will need to check whether the Vostro supports standard power supply form factors...
Have you tried using the YOLOv5 .NET model and setting it to use the integrated GPU?
 

kaltertod

Getting the hang of it
Joined
Jul 30, 2022
Messages
65
Reaction score
46
Location
BFE
A GTX 1650 4GB will be plenty for that setup, and it runs off the PCIe bus for power, so there's no need to upgrade the power supply. They can be had gently used on eBay in the $100 range. I currently use one at a remote site analyzing twelve 5MP cameras, with more to be installed shortly.
 

mvoss

n3wb
Joined
Oct 21, 2023
Messages
20
Reaction score
21
Location
Texas
Thank you both!

Gentlemen,

Is this the one?

GIGABYTE GeForce GTX 1650 D6 OC 4G Graphics Card, 170mm Compact Size, 4GB 128-Bit GDDR6, GV-N1656OC-4GD REV2.0 Video Card

Thanks again,

Michael
 

mailseth

Getting the hang of it
Joined
Dec 22, 2023
Messages
126
Reaction score
87
Location
California
I recently realized that if I'm running a GPU continuously, like a low-power NVIDIA 1650 (~$150) with a TDP of 75 watts, it costs me ~$200 per year in electricity (at $0.30 per kWh in California). Several Coral TPUs reduce that to closer to $10-$20 per year.

(Not saying you should do differently. I just picked up a Quadro P400 and wish I had bought a 1650.)
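For anyone who wants to sanity-check that estimate, here is a minimal sketch of the arithmetic, assuming the card really does sit at its full 75 W TDP around the clock (something later replies question):

```python
# Rough annual electricity cost of a constant load, using the figures from the
# post above: a 75 W GPU and $0.30/kWh. Assumes the load runs 24/7 at that draw.
HOURS_PER_YEAR = 24 * 365

def annual_cost_usd(watts: float, usd_per_kwh: float) -> float:
    kwh_per_year = (watts / 1000) * HOURS_PER_YEAR
    return kwh_per_year * usd_per_kwh

print(round(annual_cost_usd(75, 0.30)))  # ~197 -> the "~$200 per year" figure
print(round(annual_cost_usd(2, 0.30)))   # ~5 per Coral TPU; a few TPUs -> $10-$20
```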
 

jrbeddow

Getting comfortable
Joined
Oct 26, 2021
Messages
374
Reaction score
489
Location
USA
You are absolutely right to be aware of the problem, but it will likely not really hit the $200/year mark that the simplified math would indicate. Keep in mind that the 75 watt TDP is the upper limit, at full-time 100% utilization, something that shouldn't come close to happening in a BI CPAI scenario. A more useful number would be the long-term average additional watts drawn by having this card installed over just the baseline computer using CPU only, which would likely be more on the order of 25-35 watts. Not insignificant, but not quite so alarmingly high.
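Plugging that 25-35 W long-term average into the same arithmetic, at the $0.30/kWh rate quoted earlier, gives a rough idea of the range:

```python
# Annual cost of a 25-35 W average additional draw at $0.30/kWh, running 24/7.
HOURS_PER_YEAR = 24 * 365

for watts in (25, 35):
    cost = (watts / 1000) * HOURS_PER_YEAR * 0.30
    print(f"{watts} W average -> ~${cost:.0f}/year")  # ~$66 and ~$92 per year
```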
 

mailseth

Getting the hang of it
Joined
Dec 22, 2023
Messages
126
Reaction score
87
Location
California
You’re not wrong that it’s more complicated than that, but also I low-balled the kWh rate. According to this site it averages $.45, and ranges from $.34-$.72 per kWh for PG&E.
 

Webfont

Pulling my weight
Joined
Sep 6, 2018
Messages
138
Reaction score
179
Location
Canada
Yeah, a Kill A Watt-type device would determine the actual running cost by subtracting the power usage with/without the GPU, but a 1650 will not use much.
It's only 4W at idle (MSI GeForce GTX 1650 Gaming X 4 GB Review), and the CUDA AI isn't that much more power hungry. You'll never reach 75W usage.
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,903
Reaction score
21,275
No one is paying $0.45 unless it's a commercial installation somewhere in California...
Otherwise, in the Northeast it's closer to $0.23...
Regardless, with the YOLOv5 .NET module and the IP cam model, the OP can run everything without any additional card...
 

mailseth

Getting the hang of it
Joined
Dec 22, 2023
Messages
126
Reaction score
87
Location
California
My actual bill is even less interpretable because I have solar on the roof. PG&E has been raising rates lately to pay for the wildfire habit they have developed in the past few years.

I have a Kill A Watt already plugged into my machine and a Quadro P400 card in it. The listed P400 TDP is 30 W. I just ran a GPU benchmark and watched the power usage jump between 10 and 30 W while it was running. I'd say the average increase in usage was 15 W, so that's ~50% of the TDP; with the 1650 that would be ~37 W. With the new numbers: $.45 * 24 hrs * 365 days * 37 W = $146 yearly cost.

This is also making the assumption that you're running the GPU inference flat out the whole year, which you may not be. And when I've done the math in the past, the GPU is still about 20x more energy efficient than the CPU. But this is one of the reasons why I've been reworking the Coral TPU implementation for CPAI. Coral TPUs run at only 2 W each.
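As a quick check of those revised numbers (a 37 W average draw at $0.45/kWh), converting the watts to kilowatts first gives the same ballpark figure:

```python
# 37 W average draw at $0.45/kWh, running continuously for a year.
watts = 37
rate_usd_per_kwh = 0.45
kwh_per_year = (watts / 1000) * 24 * 365   # 0.037 kW * 8760 h = ~324 kWh
print(kwh_per_year * rate_usd_per_kwh)     # ~145.85, i.e. the ~$146 above
```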
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,903
Reaction score
21,275
No way that the discrete GPU is more efficient than a CPU.... But with the .net module you can use the Intel built-in GPU
 

kaltertod

Getting the hang of it
Joined
Jul 30, 2022
Messages
65
Reaction score
46
Location
BFE
Your math is a little bit mixed up; you missed a few decimal places. It would be .0037... The correct calculation would be .0037 * 24 * 365 * .45 = $14.5854 per year. However, in LA County, where I own property outside of Lancaster, the rate is .35 with the generation charge and added fees. That calculation would look like this: .0037 * 24 * 365 * .35 = $11.3442 per year. Out here in Wyoming we pay .11 per kWh, so that would look like this: .0037 * 24 * 365 * .11 = $3.56532 per year.

One decimal point will make your calculation look really bad :D

As far as the TPU using less power, that is true, but the models are lagging behind the GPU, as is the speed (slightly). However, the TPU is faster than the CPU and uses less power.

While using a GTX 1650, my remote PC averages 2-5% load when all the cameras fire off an AI call at the same time. That being said, per MSI Afterburner, and confirmed through GeForce Experience, the card reports pulling only 26 watts at this load... If you were to run the card headless it would pull less, as it would not be sending power down the cable to a monitor to display an image.

No way that the discrete GPU is more efficient than a CPU.... But with the .net module you can use the Intel built-in GPU
As far as power usage, the GPU is still more efficient based on the speed of confirmed alerts and the power used. On my secondary machine, with 12 cameras recording continuously and firing off AI requests all at the same time, my Ryzen 5 5700X is rated at 65 watts, and Windows Task Manager reports roughly 65% usage at that point in time. So I ran some tests with PBO off, and the average power draw reported by Ryzen Master was on the order of 42.25 watts for the CPU at that time. The same math from above would equate to $16.5564 per year to run the CPU to analyze the AI requests.

The CPU is using more power to process the events (although only marginally more). However, it takes the CPU 250 ms on average to analyze an image, equating to a slower rate of identification. IMO this gives the GPU a slight advantage in both power usage and analysis time, making the GPU the more efficient of the two methods.
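One way to make that comparison concrete is energy per image (average power times inference time). The 42.25 W and 250 ms CPU figures are from the posts above; the GPU inference time is not stated in this thread, so the 100 ms used below is only a placeholder assumption:

```python
# Energy per inference = average power draw (watts) * time per image (seconds).
# CPU figures (42.25 W, 250 ms) come from the post above; the GPU's 100 ms
# inference time is a placeholder assumption, not a measured value.
def joules_per_image(watts: float, seconds: float) -> float:
    return watts * seconds

print(joules_per_image(42.25, 0.250))  # CPU: ~10.6 J per image
print(joules_per_image(26.0, 0.100))   # GPU: ~2.6 J per image (assumed 100 ms)
```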

Give me some time and I will figure out the math on using the iGPU on my 5820K and report the results back. I believe it will be more of the same due to the higher power draw of the CPU.
 

mailseth

Getting the hang of it
Joined
Dec 22, 2023
Messages
126
Reaction score
87
Location
California
I don’t think I agree. 0.0037 kW is 3.7 watts, but I had intended 37 watts, so I did it correctly. It’s not too far off from your 26 watts.
 

kaltertod

Getting the hang of it
Joined
Jul 30, 2022
Messages
65
Reaction score
46
Location
BFE
37 watts is 3.7 percent of 1000; you have to use the thousandths place.

Therefore I respectfully disagree. .0037 would be correct.
 

mailseth

Getting the hang of it
Joined
Dec 22, 2023
Messages
126
Reaction score
87
Location
California
I'm not sure how the math works as a percentage or why you'd want to use it. I'm converting half of the 75-watt TDP to kW because the utility charges in kWh. Even my low-power card wasn't running at 3.7 watts; it was running at 10-30 W.
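For what it's worth, the watts-to-kilowatts conversion at the heart of the disagreement is just a divide-by-1000; restating it with the 37 W figure:

```python
# Watts to kilowatts: divide by 1000.
print(37 / 1000)      # 0.037 kW  (0.0037 kW would be 3.7 W, not 37 W)
print(0.0037 * 1000)  # 3.7 W
```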
 