What's your Blue Iris Setup like? (List Specifications)

When I run the XTU stress test, I see the Package TDP going up to 120 watts. Is this the same as the actual power draw at the wall? I've seen the same problem with the current 10th gen Intel CPUs as well. It's frustrating, and I even spoke with Intel about it; they keep sending me back to the motherboard manufacturer. There's even a video by Gamers Nexus about current 10th gen Intel CPUs running hotter than expected. It could be that this behaviour from third-party board manufacturers has been plaguing Intel for years, since I see the same thing on my 7th gen Intel CPU. I know that most BI setups I come across use OEM hardware (from Dell, IBM, HP, etc.). I don't know if the problem is related to doing custom builds. Either way, I'm interested in other users' specifications :)
You need a meter like a Kill-A-Watt. A normal i5-7500 PC will draw about 30W under that load.
 
You said your Kill-A-Watt shows 120W? And this is just the computer plugged into it? No monitors or anything else? And you're sure it's on W and not V?
 
I plugged the surge protector into the Kill-A-Watt (silly me). Plugging the desktop directly into the Kill-A-Watt yields ~38 watts at idle. This is not an ATX nor micro-ATX form factor.

That's more like it :)
 
I'm still concerned about 10th gen CPUs though: why does the package power exceed the TDP specified for a 65W or 125W CPU? But that's a topic for a separate discussion :)
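From what I've read, the reason the package power can sit above the rated TDP is that the TDP only corresponds to the long-term limit (PL1); the CPU is allowed to draw up to a higher short-term limit (PL2) for a window set by Tau, and many third-party boards raise PL2 or remove the limits entirely. A rough sketch of how those limits interact (the numbers are illustrative, not pulled from my board):

```python
# Illustrative model of Intel's PL1/PL2/Tau power limits. Numbers are made up
# for a 65W-class part; many boards ship with PL2 raised or the limits removed.
PL1, PL2, TAU = 65.0, 120.0, 28.0   # sustained limit (W), boost limit (W), time constant (s)
demand = 120.0                      # what an all-core stress test "wants" to draw
dt, ewma = 1.0, 0.0                 # 1-second steps; running average starts at zero

for t in range(120):
    # draw up to PL2 until the exponentially weighted average power catches up
    # to PL1, then fall back to the sustained limit
    power = min(demand, PL2) if ewma < PL1 else PL1
    ewma += (power - ewma) * (dt / TAU)
    if t % 20 == 0:
        print(f"t={t:3d}s  package={power:5.1f} W  avg={ewma:5.1f} W")
```

If the package power never falls back after the boost window, the board has effectively set PL1 equal to PL2, which would explain seeing 120 watts sustained on a 65W-rated part.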

So it's 38W at idle. What about when the cameras are streaming to BI? What's the load then?
 
I just completed an upgrade this weekend from an i9-9900K system.

In a 4U rackmount chassis with a Seasonic Titanium 750W PSU...
Intel Core i9-10900K
Onboard Intel UHD 630 graphics only
ASUS PRIME Z490-A
16GB DDR4-4400 RAM, CAS18
Mellanox ConnectX-3 10GBase-SR NIC
2x Samsung SM961 NVMe SSDs (boot)
2x Intel S3510 SATA SSDs (images)
LSI MegaRAID 9266-8i RAID HBA, plus LSI 24-port SAS expander
8x 6TB WD Blue/Red/Purple (RAID5)
8x 8TB WD Red (RAID5)

All cameras recording 24/7 D2D, with motion detection, not using substreams or limit decoding.
Right around 8000 kB/s of camera feed, and just under 950 MP/s.
BI 5.2.9.23 is running about 15% CPU, consuming 5.7GB RAM, with UI and all viewers closed.
Video Decode 40%
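For anyone comparing numbers, the MP/s figure is just each camera's resolution multiplied by its frame rate, summed across cameras. A quick back-of-the-envelope sketch (the camera mix below is hypothetical, only meant to show how a number near 950 MP/s adds up):

```python
# Back-of-the-envelope MP/s: sum of (megapixels x frames per second) per camera.
# The camera mix here is hypothetical, only to show how ~950 MP/s accumulates.
cameras = [("8MP @ 15 fps", 8.3, 15)] * 6 + [("4MP @ 12 fps", 4.0, 12)] * 4
total = sum(mp * fps for _, mp, fps in cameras)
print(f"{len(cameras)} cameras, ~{total:.0f} MP/s to decode")   # ~939 MP/s
```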
Kill-A-Watt says 215 watts, 120V source. I normally run off 240V but I have it in a build area right now.
Without BI running (system background), I'm pulling 160 watts. Sixteen rust spinners, the storage controller, and the industrial chassis fans take their toll.
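If anyone wants to translate that into a power bill, the BI share is roughly the 215 watt reading minus the 160 watt baseline. A quick estimate (the electricity rate is a placeholder, plug in your own):

```python
# Annual energy/cost estimate for the whole box and for BI's share of it.
# The $/kWh rate is a placeholder; substitute your own utility rate.
total_w, baseline_w = 215.0, 160.0
rate = 0.13                      # USD per kWh (assumed)
hours = 24 * 365

for label, watts in [("whole system", total_w), ("BI share only", total_w - baseline_w)]:
    kwh = watts * hours / 1000.0
    print(f"{label:>13}: {kwh:6.0f} kWh/yr  ~${kwh * rate:,.0f}/yr")
```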


Interestingly, my usage went down far more than the core/clock increase over the 9900K would suggest. With my normal 24/7 viewers (via UI3) running, I was hitting 70% or higher before, which is why I did the upgrade. I did reinstall the OS from scratch and used newer GPU drivers, but still, going from 70+% to 30-35% is far more improvement than I expected. One possible reason may be something that has been speculated here before: memory speed may really matter. I went from DDR4-3733C17 to DDR4-4400C18.
 
I don't know how reliable Kill-A-Watt meters are, but mine shows about 120 watts. I was exploring other options, and oddly enough a new system may consume three times that amount depending on specs. I don't know why anyone would want to put a dedicated GPU in their system for BI.
I used a Kill-A-Watt to measure mine. My UPS also lists total wattage load. They are within one watt of each other every time I've compared them.

I use a GPU card and my system only uses 107 watts. That dedicated GPU dropped my CPU usage by more than half.
 
Thanks. If you don't mind telling me, what GPU are you using?

 
It's in post #9 in this thread.

Graphics Card: EVGA GeForce GTX 1050 Ti Gaming, 4GB memory
 
Jesus, some of the systems here make mine look a bit Mickey Mouse.

Edit: In the pic above there are 76 hard drive bays; assuming they are all populated, that is a shitload of storage.
 
BI 5.2.9.23 on an i7-6700 with 16 GB of RAM, a 256 GB SSD, and 3 TB for storage.

10 cameras with 6 recording 24/7 and the others depending on the status of our house - away, awake or asleep.

CPU right around 25% and power consumption at about 22 watts.
 
With the ability to use substreams, I'm actually looking to massively downsize my system, if for nothing more than a kWh saving. Though there's that age-old problem of spending to save: a new setup would most likely never pay for itself with the savings, but as a techie, new toys are nice :)
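To put rough numbers on that spending-to-save problem, this is the sum I'm doing in my head (the wattages, hardware price, and electricity rate below are guesses, not quotes):

```python
# Rough payback calculation for downsizing: years until the energy saving
# covers the price of the new box. All numbers here are illustrative guesses.
old_watts, new_watts = 60.0, 15.0     # current desktop vs a small NUC-class box
rate = 0.15                           # GBP per kWh (assumed)
new_box_cost = 400.0                  # GBP (guessed)

saving_per_year = (old_watts - new_watts) * 24 * 365 / 1000 * rate
print(f"Saves ~{saving_per_year:.0f} GBP/yr; payback in about {new_box_cost / saving_per_year:.1f} years")
```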

I currently have BI 5.3.0.3 running 7x Hik DS-2CD2385FWD-I 8MP cameras on an i7-7700 CPU / 20GB RAM / 128GB SSD for the OS, with a 2TB surveillance disk for storage (custom build, with an 80+ Gold PSU). Since switching to substreams, I average 3-5% CPU usage and 1-2GB RAM, down from around 15-20% CPU and 8GB RAM.

I have a couple of 6th gen i5 Intel NUC devices laying around, which could suit. I just need to benchmark power consumption and CPU usage when multiple cams are recording/triggered.

Top graph shows CPU usage over the last 7 days.
Bottom graph shows CPU temp and room temp over the same 7 days.

[Attachment: Screenshot 2020-07-15 at 20.59.51.png]
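For anyone wanting to collect the same sort of data, the graphs are just periodic samples of CPU load and temperature. A minimal logging sketch (this isn't the tooling behind my graphs, just one way to do it with psutil; the temperature readout is platform-dependent and may be unavailable):

```python
# Minimal once-a-minute CPU/temperature logger writing to a CSV for later graphing.
# Not the tooling behind the graphs above; psutil's temperature sensors are only
# exposed on some platforms, so the temperature column may stay empty.
import csv, datetime
import psutil

with open("cpu_log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    while True:
        cpu = psutil.cpu_percent(interval=60)            # average over the last minute
        temps = psutil.sensors_temperatures() if hasattr(psutil, "sensors_temperatures") else {}
        first = next(iter(temps.values()), [None])[0]    # first sensor reading, if any
        writer.writerow([datetime.datetime.now().isoformat(), cpu,
                         first.current if first else ""])
        f.flush()
```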
 
Your room has a low of 10 and a high of 38 in the same day? That's 50-100F. How do you do that?

So this shows that the biggest impact on CPU temp for you is room temp and not CPU usage.
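For reference, the conversion behind that 50-100F figure:

```python
# Celsius to Fahrenheit: 10 C and 38 C work out to 50 F and roughly 100 F.
def c_to_f(c):
    return c * 9 / 5 + 32

for c in (10, 38):
    print(f"{c} C = {c_to_f(c):.0f} F")
```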
 
I live in the UK... where we use real money (Celsius). And we have stupid ass weather; welcome to our summer time!

And correct, CPU temp is not something I'm worried about at all. Though on our hottest days (30 degrees C outside) the room can reach 50, which means the CPU gets to 60-70, which is not ideal. It's purely monitored because I have a series of additional fans that kick in when the room temp hits 45.
 
I live in Texas. About 40 years ago I was doing some camping in British Columbia, Canada, in June. It was in the mountains and I was in a small tent with a sleeping bag good to about freezing. Well, I froze my butt off the last night I was there. On the way to the airport I stopped at a gas station and was talking to the old, crusty guy working there. We chatted and he wanted to know what I was doing there. I told him about camping and mentioned that I was really cold the night before. He said, "yeah, it got a little below zero last night". Since I was in Canada, I was thinking Celsius, so I figured a little below zero would be about 30F. I said wow, it sure seemed colder than just below freezing. He looked at me kind of strangely and said, "I'm talking below zero, FAHRENHEIT! A real man's scale. Not that sissy scale where zero isn't even cold!"
 
Windows 10 Virtual machine running on my NAS:
i3-8100
ASRock Z370M/ITX
16GB RAM
5x 8TB drives
250GB NVMe SSD
250GB SATA SSD

The VM itself has 3 pinned cores, 4GB RAM, and 100GB on the SATA SSD. More than enough, especially with BI5 substreams.
 
I used to run it in a VM on a Xeon. I tried loads of times to pass the Intel graphics through (this Xeon version has it) and it doesn't work. I ended up essentially building a similar machine and running Windows natively. I like the Xeons as they have ECC memory, and I use RAID to ensure the system is stable. Everything gets backed up onto a ZFS NAS.

How many cameras do you have on that machine to get that kind of usage on the VM (and how many megapixels)?
 
5 cameras. Blue Iris reports a total of 20 MP/s at 1700 kB/s, using 4% CPU and 830 MB RAM.

I'm not passing through the Intel graphics at the moment since there's an 8GB RAM requirement; I'd need to get a 32GB kit for the NAS to give some overall headroom.

KVM on this NAS has been rock solid and gives close to bare metal performance.