I only see an attempt at a discussion taking place, along with attempts to shut it down or derail it!
Since I, too, have a Kill-A-Watt, I did some measurements of my own:
i7-3770 with a GTX 1070 (my gaming computer): 64 watts idle, 129 watts running Prime95 with all cores loaded, and probably well over 200 watts while playing an intensive game. I understand the GTX 1070 adds 20-30 watts of power draw at idle, so subtracting that brings the system down to roughly 30-40 watts idle (with no video card) and ~100 watts fully loaded, right in line with what spencnor observed above with his system.
i5-3570K as a Blue Iris server, running 24 cameras @ 2.1 MP @ 10 fps: 68 watts with Blue Iris closed, 74 watts with it open. This is the computer that sits constantly at 46-58% CPU usage.
Core 2 Quad Q9550 - the computer Fender claimed idled at well over 150 watts! It actually idles at 48 watts and consumes a whopping 123 watts running Prime95 with all cores loaded.
Moral of the story: the 3rd generation K processor that Fender claims is a huge, hungry POWER HOG is not. If we are to believe his (probably spurious) claim that an i5-6500 can run 24 cameras at 25 watts, then my 3rd gen i5 at 68 watts draws 43 more watts, which works out to 1,032 extra watt-hours per day; in my area that costs an extra $0.09 (9 cents) per day. If, in reality, the i5-6500 system consumes 40-50 watts with the same setup, the difference in power consumption is marginal and nearly unnoticeable.
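For anyone who wants to check the arithmetic against their own setup, here is a minimal sketch of the cost math in Python. The 68-watt and 25-watt figures come from the measurements above; the electricity rate is my assumption, back-solved from the 9-cents-per-day figure, so substitute your own $/kWh.

# Sketch of the power-cost math above. Wattages are the figures from this
# post; RATE_PER_KWH is an assumed rate back-solved from ~9 cents/day --
# plug in your own utility rate.
IVY_BRIDGE_WATTS = 68    # i5-3570K Blue Iris server, measured at the wall
SKYLAKE_WATTS = 25       # i5-6500 figure Fender claims
RATE_PER_KWH = 0.09      # assumed $/kWh -- adjust for your area

delta_watts = IVY_BRIDGE_WATTS - SKYLAKE_WATTS   # 43 W
wh_per_day = delta_watts * 24                    # 1,032 Wh/day
kwh_per_day = wh_per_day / 1000                  # 1.032 kWh/day

cost_per_day = kwh_per_day * RATE_PER_KWH        # ~$0.09/day
cost_per_year = cost_per_day * 365               # ~$34/year

print(f"Extra draw: {delta_watts} W ({wh_per_day} Wh/day)")
print(f"Extra cost: ${cost_per_day:.2f}/day, ${cost_per_year:.2f}/year")

Run it and you get 43 W extra draw, about $0.09/day and $33.90/year, which is where my "$34 a year" below comes from.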
Fender, feel free to doubt my math, my abilities, my whatever. Everyone else, continue to blindly believe him. Personally, I'm satisfied with my proof: a 3rd gen i5 is as capable as a 6th gen i5 and costs about $34 a year more to run, and that's only if you believe a 6th gen i5 uses less power under load than Intel's Atom systems do.