Interesting Deepstack Challenge

CCTVCam

Known around here
Joined
Sep 25, 2017
Messages
2,674
Reaction score
3,505
I've had an interesting thought recently that might be of wide interest to the community. Unfortunately I don't have the means to test it, but if anyone does, it could make a good project for someone... and it may settle the debate of CPU vs GPU...

So the thought is: in these times of sky-high and rocketing energy prices, which is the more energy efficient, DS + GPU or DS + CPU?

You might think the answer is obvious - CPU, but I don't think it's that clear cut.

Here's why and what would be interesting to test.

Power Consumption depends on load.

So running a CPU + a 50W GPU such as a Quadro P620 might seem to be a simple case of CPU + 50W. However, DS + CPU will put the CPU under load, so the CPU will use more power, whilst DS + GPU will remove that load from the CPU but add load to the GPU. And secondly, the amount of power the GPU draws will also depend on load, so if the GPU isn't under much load, the draw could be a lot less than the 50W specified for the card.

This raises the question: is the energy requirement of DS + a heavily loaded CPU less than the energy requirement of DS + a lightly loaded CPU + a lightly loaded GPU? (I'm assuming here a P620 is going to coast along unless DS is processing a lot of cameras.) Even if the GPU does make it less efficient, what is the gap? From the specs alone you'd assume 50W, but what if, considering the different loads and power draws, it came out to only 5-10W? It might then be worth swinging one way or the other because the difference was so small.

So the question is: does anyone want to test this for the community and discover a definitive answer? I would, but my CCTV isn't installed yet (I have other projects physically blocking the installation atm), plus I don't have any way of measuring power draw. If someone was in a position to test it, though, the results could be interesting at best, or at worst would at least draw a line under the debate.
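For anyone who does take this on, below is a rough sketch of how the two scenarios could be logged and compared. It's a sketch under some assumptions, not a definitive method: it assumes a Linux box with an Intel CPU exposing the RAPL energy counter at the path shown, and an Nvidia card with nvidia-smi on the PATH; the sampling window is arbitrary. Run it once per scenario (DS on CPU, then DS on GPU) under the same camera load.

```python
# Minimal power-logging sketch (Linux, Intel RAPL + nvidia-smi assumed).
import subprocess
import time

# CPU package energy counter in microjoules; path is Intel/Linux-specific
# and may require root to read.
RAPL_PATH = "/sys/class/powercap/intel-rapl:0/energy_uj"

def read_cpu_energy_uj() -> int:
    with open(RAPL_PATH) as f:
        return int(f.read().strip())

def read_gpu_power_w() -> float:
    # Instantaneous board power as reported by the Nvidia driver.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"]
    )
    return float(out.decode().strip().splitlines()[0])

def sample(duration_s: int = 600, interval_s: float = 1.0) -> None:
    start_energy = read_cpu_energy_uj()
    start_time = time.time()
    gpu_samples = []
    while time.time() - start_time < duration_s:
        gpu_samples.append(read_gpu_power_w())
        time.sleep(interval_s)
    elapsed = time.time() - start_time
    # The RAPL counter wraps around; ignored here for a short test window.
    cpu_avg_w = (read_cpu_energy_uj() - start_energy) / 1e6 / elapsed
    gpu_avg_w = sum(gpu_samples) / len(gpu_samples)
    print(f"CPU package average: {cpu_avg_w:.1f} W")
    print(f"GPU board average:   {gpu_avg_w:.1f} W")
    print(f"Combined:            {cpu_avg_w + gpu_avg_w:.1f} W")

if __name__ == "__main__":
    sample()
```

Note that RAPL only covers the CPU package, not the whole system, so a wall meter like a Kill-a-Watt is still the gold standard for total draw.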
 

jrbeddow

Getting comfortable
Joined
Oct 26, 2021
Messages
374
Reaction score
489
Location
USA
Indeed, an interesting question. I only run DS on CPU and haven't tried GPU, but I strongly suspect that my setup is more power efficient (though admittedly lower performance). Here's why: in either scenario the CPU is running very lightly loaded (3-6% CPU use on mine, 4 cameras @ 4MP each) about 99% of the time. Obviously it gets very brief spikes when DS is processing, but overall the whole computer has a total draw of about 23-24W on average, long term. Now if I add an Nvidia GPU, even a Quadro P400 that has a maximum draw of 30W, my understanding is that it will always add at least 8-10 watts, all the time, even at a nominal "idle". That will effectively always put me up over roughly 32-34 watts.
As they say, "prove me wrong" :) ; in this case I'd take it as good news.
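For anyone wanting to sanity-check that, the back-of-envelope arithmetic is below; the electricity tariff is an assumed figure for illustration, so substitute your own.

```python
# Cost of the ~8-10W always-on GPU overhead quoted above.
IDLE_GPU_OVERHEAD_W = 9        # midpoint of the 8-10W idle figure
PRICE_PER_KWH = 0.30           # assumed tariff; substitute your own

extra_kwh_year = IDLE_GPU_OVERHEAD_W * 24 * 365 / 1000
print(f"Extra energy: {extra_kwh_year:.0f} kWh/year")                  # ~79 kWh/year
print(f"Extra cost:   {extra_kwh_year * PRICE_PER_KWH:.2f} per year")  # ~23.65 at the assumed tariff
```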
 

Swampledge

Getting comfortable
Joined
Apr 9, 2021
Messages
210
Reaction score
469
Location
Connecticut
I really don't think there can be a "definitive" answer without specifically defining the usage. Although I have 6 cameras, I'm only running AI on two of them: one during daylight, and one at night on a different camera that is prone to spider webs. If I had a scenario where the second camera didn't need spider-proofing and was using GPU, fully 50% of the time I'd be powering the GPU to do nothing.
 

jrbeddow

Getting comfortable
Joined
Oct 26, 2021
Messages
374
Reaction score
489
Location
USA
Yes, I do have a Kill-a-Watt meter; the numbers I previously posted were based on that. I just don't have a spare Nvidia card to throw in there to compare against my baseline CPU-only setup, but I am fairly certain that the long-term average power draw will always be higher with a dedicated video card. Ignore the momentary spikes in power draw when DS is processing; in the long run they are insignificant background "noise" compared to the continuous added power draw of the card.
 

sebastiantombs

Known around here
Joined
Dec 28, 2019
Messages
11,511
Reaction score
27,695
Location
New Jersey
I am sure that a machine with a GPU for DS will definitely use more power, but how much more? Even a high-powered GPU like an RTX 2070 or RTX 3070 only uses about 10 watts at idle. That amounts to 240 watt-hours per day of idling, and it will draw more power when processing DS images. The additional load for detection is a moving target based on how many cameras, how many triggers, and how many images per trigger end up getting processed, along with the actual power required by the GPU based on its model.

The short answer is that yes, a GPU will need more power. How much more is an unpredictable and variable number; as a guesstimate I'd venture about a kWh a day, plus or minus. So if you're into saving power, don't use a GPU. If you're into fast detection times and have a number of cameras using DS, a GPU may well be worth that power penalty.
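To keep watts (power) and watt-hours (energy) straight, here's a quick conversion of those figures, taking the ~1 kWh/day guesstimate as given:

```python
# Converting the idle and guesstimate figures above into comparable numbers.
idle_w = 10                    # GPU idle draw quoted above
print(f"Idle alone: {idle_w * 24 / 1000:.2f} kWh/day")  # 0.24 kWh/day

guess_kwh_day = 1.0            # the ~1 kWh/day guesstimate above
print(f"1 kWh/day implies ~{guess_kwh_day * 1000 / 24:.0f} W of average extra draw")  # ~42 W
```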
 