3MP Camera System

zebrock

Getting the hang of it
Joined: Dec 3, 2016 · Messages: 163 · Reaction score: 91
What is idiotic is your name-calling and general derision for points of view that aren't your own. Rein it in some and have a normal discussion like a regular person.
I wish I could like this more than once.
 

nayr

IPCT Contributor
Joined: Jul 16, 2014 · Messages: 9,329 · Reaction score: 5,325 · Location: Denver, CO
Well, I'm here with one of the largest systems on this board I'm aware of.
Really? lol... I've recently worked on setups with three Blue Iris servers, each handling 64 cameras... and I listen to what fenderman says because he's built and maintains dozens of Blue Iris servers; he's forgotten more about Blue Iris than you'll ever know.

The difference between a 4-year-old system loaded 24/7/365 and a new, modern system idling 24/7/365 is indeed several hundred dollars in electricity savings...
 

spencnor

Getting the hang of it
Joined: May 25, 2015 · Messages: 127 · Reaction score: 56
This discussion has me considering upgrading my (meager ;)) system if the cost savings warrant it. I agree that energy usage and efficiency are a hidden cost, so I got to wondering what my actual energy numbers are and plugged in my Kill A Watt meter.

HP Elite 8300 SFF, i7-3770, 12gb RAM, 500gb and 4tb HDDs

(all values approx, as numbers hover)
Computer idling = 35 watts
Blue Iris app w/console minimized = 55 watts
Blue Iris app w/console open = 73 watts
CPU w/Blue Iris minimized (10 cams @2mp, avg. 15fps) = 18%
CPU w/Blue Iris open = 35%

HP ProCurve 2610-24/12PWR (12-port PoE)
Switch idle with no ports connected = 25.2w
9 cams connected w/no IR = 55.9w (my PTZ is powered by a 2A wall wart)
9 cams connected w/IR = 67.1w

It would be great to see what others are experiencing :).
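The from-the-wall readings above translate into an annual dollar figure with a few lines of Python. This is a minimal sketch; the $0.12/kWh rate is an illustrative assumption, not a figure from this thread, so plug in your own utility rate:

```python
# Rough annual electricity cost for a constant 24/7 load.
# RATE_PER_KWH is a hypothetical rate; check your own bill.
RATE_PER_KWH = 0.12  # dollars per kilowatt-hour (assumed)

def annual_cost(watts, rate=RATE_PER_KWH):
    """Dollars per year to run a constant load of `watts`."""
    kwh_per_year = watts * 24 * 365 / 1000
    return kwh_per_year * rate

# PC with Blue Iris minimized (55w) plus the PoE switch with IR on (67.1w):
print(f"${annual_cost(55 + 67.1):.2f} per year")  # -> $128.35 per year
```

At this assumed rate the whole rig costs on the order of $10-11 a month to keep running around the clock.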
 

fenderman

Staff member
Joined: Mar 9, 2014 · Messages: 36,897 · Reaction score: 21,250
For comparison, my i7-6700 EliteDesk with an SSD and two 2tb drives:
idle = 21w
28% CPU = 52w
50% CPU = 57w
60% CPU = 61w
85% CPU = 63w
Unless you live in a very high-priced area, the 25w or so saved is not worth the upgrade...
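The "not worth the upgrade" point is really a payback-period calculation. A minimal sketch, assuming a hypothetical $0.12/kWh rate and a $300 upgrade cost (neither figure appears in this post):

```python
# Years for an upgrade's electricity savings to repay its purchase cost.
def payback_years(watts_saved, upgrade_cost, rate_per_kwh=0.12):
    # Savings from shaving `watts_saved` off a 24/7 load, per year.
    annual_savings = watts_saved * 24 * 365 / 1000 * rate_per_kwh
    return upgrade_cost / annual_savings

# ~25w saved vs. a $300 newer machine: roughly 11 years to break even.
print(round(payback_years(25, 300), 1))  # -> 11.4
```

At those assumptions a 25w saving takes over a decade to pay for a new box, which is the shape of the argument being made here.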
 

zero-degrees

Known around here
Joined: Aug 15, 2015 · Messages: 1,359 · Reaction score: 847
Threads like this leave me like...

Whoa... Someone's talking shit to Fender... Oh, look... He's only been a member for a month and has like 20 posts... Ah, who cares, he'll have moved on within a month and it isn't worth the time. :)

Kudos to you @fenderman for the continued support of this forum and education of the uninformed, even if slamming your head in a door would be more productive at times... o_O
 

bobfather

Getting the hang of it
Joined: Jan 17, 2017 · Messages: 103 · Reaction score: 26
I only see an attempt at a discussion taking place, along with attempts to shut down or derail that discussion!

Since I, too have a Kill-A-Watt, I also did some measurements:

i7-3770 with a GTX 1070 (my gaming computer): 64 watts idle, 129 watts running Prime95 with all cores loaded, and probably well over 200w while playing an intensive game. I understand the GTX 1070 adds 20-30 watts of additional power consumption at idle, so doing that math brings the figures down to ~30-40 watts idle (with no video card) and ~100 watts fully loaded, in line with what spencnor observed above with his system.

i5-3570K as a Blue Iris server, 24 cameras @ 2.1MP @ 10 fps: 68 watts with Blue Iris closed, 74 watts with it open. This is the computer that is constantly at 46-58% CPU usage.

Core2Quad Q9550, the computer Fender claimed idled at well over 150w! It actually idles at 48 watts and consumes a whopping 123 watts running Prime95 with all cores loaded.

Moral of the story: the 3rd-generation K processor that Fender claims is a huge and hungry POWER HOG is not. If we are to believe his (probably spurious) claim that an i5-6500 can run 24 cameras at 25 watts, then the 3rd-generation i5 consumes 1,032 more watt-hours per day (43w x 24h), and in my area that costs an extra $0.09 (9 cents) per day. If in reality the i5-6500 system consumes 40-50 watts with the same setup, the difference in power consumption is marginal and nearly unnoticeable.

Fender, feel free to doubt my math, my abilities, my whatever. Everyone else, continue to blindly believe him. Personally, I'm satisfied with my proof: a 3rd gen i5 is as capable as a 6th gen i5, but costs about $34 a year more to run, if you believe a 6th gen i5 uses less power under load than Intel's Atom systems do.
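The watt-hour arithmetic above checks out: the gap between the measured 68w and the claimed 25w is 43w, and 43w over 24 hours is the 1,032 Wh/day cited. A quick sketch (the per-kWh rate is implied by the $0.09/day figure, not stated outright):

```python
# Reproducing the post's arithmetic: measured 3rd-gen draw vs. claimed i5-6500 draw.
delta_w = 68 - 25                       # 43w difference between the two systems
extra_wh_per_day = delta_w * 24         # 1032 Wh/day, matching the post
extra_kwh_per_year = extra_wh_per_day * 365 / 1000
annual_cost = 0.09 * 365                # the quoted $0.09/day, annualized
print(extra_wh_per_day, round(extra_kwh_per_year, 1), round(annual_cost, 2))
# -> 1032 376.7 32.85
```

The annualized $32.85 is consistent with the "about $34 a year" figure claimed in the same post.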
 

fenderman

Staff member
I don't understand why you feel the need to lie when my statements are right there in black and white. I stated that they consume 150w under moderate/minimal load; I never said anything about idle load. I don't believe your numbers. More importantly, you fail to state how much power that system uses under its current load. Likely 100w+ (even with your skewed numbers)...
I NEVER stated that 3rd gen processors are POWER HOGS. That is lie #2.
I further NEVER stated that the i5-6500 system can run 24 cameras at 25w. ANOTHER LIE. What I did say was that the i5-6500 system will consume about 25w under the same load you are applying to the Core2Quad. The 75-100w savings is well worth an upgrade and will pay for itself in less than 2 years (even with your skewed numbers)...
Not only are you inept at mathematics, but your reading comprehension is beyond poor.
Now tell us what video card you are using in your Q9550 so I can prove you wrong.
 

bobfather

Getting the hang of it
Believe what you'd like. I took pictures as evidence for all my claims when I did my testing earlier today. Here's a link to the album.

Not sure what video card the Q9550 is running, but it's a Dell Optiplex 760 SFF chassis, so you can research it if you'd like. Surprisingly, it's hard for me to remember a time when processors didn't have onboard video.

Meanwhile, I've yet to see anyone with an i5-6500 running Blue Iris with any number of cameras step up to share some from-the-wall power consumption figures. I figure it's because Intel is really good at marketing, so people believe they've made progress over the last 4 years, when really they peaked at Ivy Bridge.
 

fenderman

Staff member
Check your numbers again...every other online test shows you are WAY off. I have tested these machines and know they idle much much higher.
There is a HUGE improvement in the second-gen-and-later processors over the Core2Quad and first-gen Core i processors. No one said there is a big difference between Ivy Bridge and Skylake when it comes to power consumption, just that there is improvement. Again, you need to be a complete moron to buy a 5-year-old system when for a few more dollars you can get a modern system with a warranty.
 

bobfather

Getting the hang of it
I just spent 2 minutes searching on the internet.

Here's a guy with a C2Q 9550 idling at 69 watts.

Honestly, let's just drop it. Your pride, or reputation, or whatever will never ever let you believe what I say.

I'm sure you'd rather be doing other things than calling me mean names over and over on the internet.
 

fenderman

Staff member
69 vs 48... makes total sense. It's not about pride, it's about you giving folks shit advice. Penny wise, pound foolish.
 

bobfather

Getting the hang of it
Yeah, that guy has a Plex server with who knows how many hard drives full of files he serves. Our C2Q is a lowly, stock Optiplex 760, probably with the original hard drive from 8 years ago. Truth is, I never suggested anyone get a C2Q - you created that narrative yourself.

My first and only recommendation was to get a 3rd-generation Intel processor. At this point multiple data points have been presented to you indicating that a 3rd-generation i5 or i7 is more than capable of running Blue Iris for the majority of users (i.e., those with ~20 cameras or fewer). Those processors perform almost identically to newer processors, and while newer processors are indeed more power efficient, your claims that any of the processors named, whether C2Q or 3rd-generation i5, cost $132-$262 more per year to run have turned out to be wildly false. Like Trump alternative-facts-level false.

Now you're just going to cling to your belief and say, "well, an i5-6500 is $320 on eBay, why pay $220 for an i7-3770?" Here's the answer to that: because the i7 is a vastly superior processor, and because it will take 2-3 years for it to burn through the $100 cost difference in extra electricity.
 

fenderman

Staff member
I never stated that you suggested a Core2Quad. You have a problem stating truthful facts.
No one disputes that the i7-3770 is a powerful processor (the number of cameras it can handle will depend on resolution, bitrate AND frame rate), so simply stating 20 cameras is useless.
Once again you LIE about my statements. I NEVER stated that a 3rd gen i5 will cost $132 more to run. I stated that a Core2Quad will, and that is FACT substantiated by myself and others on the internet, real websites, not forum members.
Finally, the i7-3770 is not a superior processor to an i5-6500 when you factor in Intel HD graphics and power consumption. An i5-4590 at 100 percent load consumes only 75 watts. The i5-6500 will be better.
PC NVR Power Consumption Sample
Your math is off again: 300 - 250 is 50. Easy peasy. You conveniently forgot the extra cost of memory again, and added 20 dollars to the price. For $310 (actually less, because it was best offer) there was an i5-6500 with a 256gb SSD.
There is a 50w difference, which for some is $100 per year, for others $50 or so.
It's amazing how you distort the facts.
Oh, and since you HAD to bring politics into this, Trump is our president and he will make America great again. Despite your damn protests.
 

bobfather

Getting the hang of it
I would almost rather move to a political discussion than continue on this course, and I hate politics.

You and I pretty much only agree to disagree, so I'm still waiting for someone to produce power consumption figures of Blue Iris running on an i5-6500 with any number of cameras. Come on guys, the i5-6500 is the only acceptable processor to use with Blue Iris. Surely someone has some solid numbers!

And since you're bringing cost back into it, eBay has a $325 i5-6500, a $229 i7-3770, and a $129 i5-3570. The 3570 is just ridiculous bang for the buck, and since we know it uses 74 watts to run 24 cameras, I'm really eager to see the i5-6500 folks and their power consumption figures.

Here's a hint: if the i5-6500 draws 0 watts, the i5-3570 is still the cheaper option to use for the next 3 years.
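That break-even hint can be made concrete as a three-year total-cost comparison. This sketch borrows the ~$0.09/kWh rate implied earlier in the thread and the eBay prices quoted above; even granting the i5-6500 an impossible zero-watt draw, the cheaper i5-3570 comes out ahead over three years:

```python
def three_year_cost(price, watts, rate_per_kwh=0.09):
    """Purchase price plus three years of 24/7 electricity at the given rate."""
    return price + watts * 24 * 365 * 3 / 1000 * rate_per_kwh

i5_3570 = three_year_cost(129, 74)  # 74w measured under the 24-camera load
i5_6500 = three_year_cost(325, 0)   # best case for the 6500: zero power draw
print(round(i5_3570, 2), round(i5_6500, 2))  # -> 304.02 325.0
```

So under these assumptions the i5-3570's $196 price advantage outlasts three years of its extra electricity (~$58/year at 74w), which is the claim being made.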
 