Blue Iris on AMD 1800x

mezger

Young grasshopper
Joined
Dec 4, 2017
Messages
31
Reaction score
12
Do you want me to screenshot the i7 7700 with the MP/s and task manager CPU and HWMonitor? If so, can do.
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,903
Reaction score
21,275
Do you want me to screenshot the i7 7700 with the MP/s and task manager CPU and HWMonitor? If so, can do.
Don't need a screenshot, just curious what the CPU consumption and power consumption are under the same load, Intel vs AMD.
 

mezger

Young grasshopper
Joined
Dec 4, 2017
Messages
31
Reaction score
12
Here's what I got; I edited an earlier post to put the Intel and AMD numbers next to each other. Unsure if I'm defining things differently, but I take it you mean CPU consumption (I called it 'CPU utilization') and at-the-plug power consumption under the same MP/s load, which is what is below (860 MP/s BI load). I also intend to grab similar numbers for the smaller MP/s load sometime in the next week, to give numbers for a lighter use case.

Bumped back up to ~860 MP/s (my current config).

Console closed, run as service:
1800x: CPU ~33% utilization, GPU 0% utilization. CPU in boost. System power @ the plug is ~110W to ~120W. HWMonitor says ~62W to the Ryzen.
i7 7700: CPU ~37% utilization, GPU 50% utilization. System power @ the plug is ~60W to ~65W. HWMonitor says ~35W to the i7.

Console open, active local session:
1800x: CPU ~41% utilization, GPU ~8% utilization. System power @ the plug is ~125W to ~135W. HWMonitor says ~72W to the Ryzen.
i7 7700: CPU ~60% utilization, GPU 50% utilization. System power @ the plug is ~75W to ~80W. HWMonitor says ~50W to the i7.

It is notable that when remoted in via RDC, CPU utilization rises less for the Ryzen than for the i7 7700. With the console open I see around 75% - 80% CPU utilization when running RDC on the i7 7700, compared with ~50% for the 1800x, which is also offloading some work to its discrete GPU. I do demand fast refresh rates when running RDC, and this is my primary mode for watching the BI server, so YMMV. You can get less utilization with lower refresh rates, etc.
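For anyone new to the MP/s numbers in this thread: Blue Iris's load figure is just the sum of (megapixels × frames per second) across all cameras. A minimal sketch, using a hypothetical camera mix chosen to land near the ~860 MP/s load above (the names, resolutions, and frame rates are invented for illustration):

```python
# MP/s = sum of (megapixels x FPS) over all cameras.
# Camera mix below is hypothetical, picked to land near ~860 MP/s.

cameras = [("4MP cam", 4.0, 25)] * 6 + [("2MP cam", 2.1, 25)] * 5

def total_mps(cams):
    """Aggregate pixel throughput in megapixels per second."""
    return sum(mp * fps for _, mp, fps in cams)

print(total_mps(cameras))  # ~862.5 MP/s, roughly the load discussed above
```

The same arithmetic also shows why dropping frame rates is the usual first lever: halving FPS halves each camera's MP/s contribution.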
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,903
Reaction score
21,275
Thanks...that's about a 55W difference....between the two, which is very significant...
as far as remoting in consider using UI3 instead of RDP...hopefully Ken will keep his promise and provide a true windows client at some point....
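To put that 55W delta in money terms, a quick back-of-envelope for a box running 24/7 (the $0.12/kWh electricity price is an assumed US-average figure, not from the thread):

```python
# Annual cost of a ~55 W constant power difference, 24/7 operation.
# The $0.12/kWh rate is an assumption for illustration.

delta_watts = 55
hours_per_year = 24 * 365                   # 8760
kwh_per_year = delta_watts * hours_per_year / 1000
annual_cost = kwh_per_year * 0.12
print(kwh_per_year, round(annual_cost, 2))  # 481.8 kWh, ~$57.82/year
```

Over a typical 4-5 year service life, that difference can approach the price gap between the CPUs themselves.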
 

SFrame

n3wb
Joined
Mar 26, 2018
Messages
1
Reaction score
0
As a casual green newbie to BI with a spare Ryzen 1700 lying around and the will to brave a mild necro, I just want to say I really appreciated you starting a thread and sharing your data, Mezger. I also greatly appreciated the experienced counterarguments against Ryzen/AMD CPUs, from the Intel side, by Fenderman.

With that, and some rather incessant/repetitive postings, what I gathered from this thread was:

Inferences:
- While the Ryzen CPUs work decently (significantly better than potatoes),
- they run terribly inefficiently,
- and an Intel lineup will easily do the same job for less power consumption.

Implications (derived guesses):
- BI is poorly optimized to run on AMD CPUs (as with many things, AMD),
- For very small loads (e.g. 1~3 1080p cams), some standalone NVRs might be more power efficient (~15-40W) and/or have a lower upfront-cost (~$300) than the average BI PC-NVR (~50-80W, $400-600),
- and there seems to be a positive correlation between seniority and the tendency to type with ellipses, and a negative correlation between those and trigger threshold (I was so moved by the flow of the thread that I had Google find me a few related articles to hypothesize about the phenomenon: (1) (2)).


As a first-post, it'd probably be asking too much for raw data rather than subjective input and simplified data, but serious thanks nonetheless for the info and discussion, guys!
 

mezger

Young grasshopper
Joined
Dec 4, 2017
Messages
31
Reaction score
12
Hey SFrame, my goal was to put some objective data out there; glad it helped someone. I came away with the same implications, though the inferences apply to BI but not necessarily to other tasks.

As an update, I have since deployed all my cameras, but found myself compute-limited by the i7 7700. Given my criterion of high-refresh-rate RDC, I have found that its limit is about 970 MP/s, and I have had to reduce camera frame rates to 15 f/s to keep things happy. When I run as a service and am not watching via RDP, its CPU utilization is ~55% and 10GB RAM, but it's >90% when viewing via RDP, depending on the resolution of the client. One could run more MP/s if they were OK with a lower-refresh-rate RDC, used a different client, or logged in directly on the computer.

WRT camera frame rates, I understand the arguments for lower frame rates, but one reason for a higher frame rate is the higher probability that I'll catch the perp mid-step and therefore get more chances at a crisp face shot at a distance and after dark. I have to say that copious supplemental IR makes a remarkable difference when one lives in a street-light-free hood and runs no outside lights.

Given my objectives, I also found that I needed a larger drive for my /new folder and have upgraded its drive to a 1TB SSD ($$$$).
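For anyone sizing their own /new drive, the arithmetic is simple: capacity divided by aggregate bitrate gives the retention window. A sketch, where the 80 Mbit/s total camera bitrate is an assumed figure (not mezger's actual number):

```python
# How long a 1 TB /new drive holds continuous recordings at a given
# aggregate bitrate. The 80 Mbit/s total is an assumption, not from
# the thread.

drive_bytes = 1_000_000_000_000            # 1 TB, decimal
total_mbps = 80                            # aggregate camera bitrate (assumed)
bytes_per_second = total_mbps * 1_000_000 / 8
hours = drive_bytes / bytes_per_second / 3600
print(round(hours, 1))                     # ~27.8 hours
```

At higher MP/s loads the aggregate bitrate climbs with it, which is why the /new folder can outgrow a small SSD quickly.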

Finally, I have put all cameras, the NVR, and the NAS on a VLAN and have firewalled it from any communication except incoming RDC to, and outgoing SMTP from, the BI box. As such, I haven't updated anything for a few months. Got to learn a lot about the ERL and its CLI.
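mezger doesn't share his actual rules, but for readers heading the same way, here is a hypothetical EdgeOS (EdgeRouter Lite) sketch of that kind of policy: drop everything by default, allow RDP in to the BI box, and SMTP out from it. The interface, VLAN ID, and the BI box address (eth1 vif 50, 192.168.50.10) are invented for illustration; adapt to your own network before committing.

```shell
# Hypothetical EdgeOS sketch -- interface, VLAN ID, and addresses are
# made up for illustration only.
configure

# Ruleset for traffic heading INTO the camera VLAN: only RDP to the BI box,
# plus replies to connections the BI box initiated.
set firewall name TO_CAMVLAN default-action drop
set firewall name TO_CAMVLAN rule 10 action accept
set firewall name TO_CAMVLAN rule 10 description 'RDP to BI server'
set firewall name TO_CAMVLAN rule 10 protocol tcp
set firewall name TO_CAMVLAN rule 10 destination address 192.168.50.10
set firewall name TO_CAMVLAN rule 10 destination port 3389
set firewall name TO_CAMVLAN rule 20 action accept
set firewall name TO_CAMVLAN rule 20 state established enable
set firewall name TO_CAMVLAN rule 20 state related enable

# Ruleset for traffic coming FROM the camera VLAN: only SMTP from the BI box.
set firewall name FROM_CAMVLAN default-action drop
set firewall name FROM_CAMVLAN rule 10 action accept
set firewall name FROM_CAMVLAN rule 10 description 'SMTP from BI server'
set firewall name FROM_CAMVLAN rule 10 protocol tcp
set firewall name FROM_CAMVLAN rule 10 source address 192.168.50.10
set firewall name FROM_CAMVLAN rule 10 destination port 25

# In EdgeOS, "in" matches traffic entering the router on the interface
# (i.e. from the VLAN) and "out" matches traffic leaving toward it.
set interfaces ethernet eth1 vif 50 firewall in name FROM_CAMVLAN
set interfaces ethernet eth1 vif 50 firewall out name TO_CAMVLAN

commit; save; exit
```

Note the default-drop stance also blocks the cameras from phoning home, which is the main point of isolating them in the first place.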

It's been running rock solid ever since.

I will add some cameras and upgrade the computer whenever the 8th gen shows up for workstations on outlet.dell.com. Might experiment with seeing if a small GPU will help with the RDC situation; if so, I could do significantly more MP/s.

Although not enjoyable, this site has definitely been useful for putting together this setup. Still have a ton to learn; I know there is a lot left on the table in several respects.

Caveats: I have a different set of demands-vs-money and time-vs-money thresholds than others. Everything could be done more cheaply, and with better power and hardware efficiency, than my implementation.
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,676
Reaction score
14,024
Location
USA
@mezger

I don't know if you saw this already but since you were considering an upgrade to i7-8700K, you would probably be interested in the tests I did between that and AMD Ryzen 7 1800X:

AMD Versus Intel

In a nutshell, I found the two CPUs to be extremely close in performance running identical Blue Iris loads (HWVA disabled on both systems) but once HWVA was enabled the 8700K pulled way ahead.
 

krp70s

n3wb
Joined
Jun 7, 2017
Messages
9
Reaction score
1
So I decided on a 1000 MP/s build: 20 cameras in total, some 4MP and some 2.1MP, some running at 15 FPS and some at 25 FPS.

I work in IT Security for a defence contractor, so they have all these new parts, and I decided to build with whatever was lying around and found interesting results.

Decided to use an i7 8700 (6 cores, 12 threads) with 16 GB RAM; for the OS a Samsung 960 NVMe 256GB M.2, and for storage WD Purple drives connected directly to SATA ports. I have gone through many of Fenderman's tips and most of them were of use to me: direct-to-disc only, reduce to 15 fps, etc. However, even at 15 fps and only 780 - 800 MP/s, the system was getting choked at 100% utilization, even with Quick Sync hardware acceleration enabled in Windows 10 Pro. No change I made to the system would reduce the utilization and pressure on the CPU. I feel GPU CUDA acceleration is the future of Blue Iris. So, against all the wisdom provided here, I went to the shelf and built another server using an i7 7820x.

Used an i7 7820x (8 cores, 16 threads) with 16GB RAM, an LSI RAID 5 controller -> 7 x 3TB WD Purple drives for storage, and a Samsung 960 256GB NVMe M.2 for the OS. Got slightly past 1000 MP/s at 66% utilization when not minimised, showing 86% CPU usage in Blue Iris; it was drawing 77W at this point. When minimised and just running as a service, it dropped down to 38%, with the CPU drawing 69W. Also decided to run Windows Server 2016 instead of Windows 10, since the i7 7820x doesn't have Quick Sync support anyway. The power supply is an HX850 Platinum-spec unit, currently running at 91% efficiency.
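One side note on that array: RAID 5 dedicates one drive's worth of space to parity, so a quick sanity check on the usable capacity of the setup above:

```python
# RAID 5 usable capacity: (N - 1) drives' worth of space, with one
# drive's worth consumed by distributed parity.

drives = 7
drive_tb = 3
usable_tb = (drives - 1) * drive_tb
print(usable_tb)   # 18 TB usable out of 21 TB raw
```

With 7 spindles, a single-drive failure is survivable but rebuild times on 3TB drives are long, which is worth weighing for surveillance storage.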

However, I'm really eager to see if I can get a i7 8700 running >1000MP/s with no problems. Going to try to use different camera manufacturers and see if I can get there.

Also will be testing with i9 7980XE, 18 core, 36 thread CPU for kicks.

 
Last edited:
Joined
Apr 23, 2018
Messages
1
Reaction score
0
I have installed an NVR along with a switch and 4 IP cameras. When I reboot the switch, the cameras work smoothly and I can see all 4 of them. After a while, I see 'server broken' and a black screen.

Is it possible that the problem is with the switch itself?

Note that the cameras are powered from their own adapters, not through the network.
 

krp70s

n3wb
Joined
Jun 7, 2017
Messages
9
Reaction score
1
I have installed an NVR along with a switch and 4 IP cameras. When I reboot the switch, the cameras work smoothly and I can see all 4 of them. After a while, I see 'server broken' and a black screen.

Is it possible that the problem is with the switch itself?

Note that the cameras are powered from their own adapters, not through the network.
Hussein, I think you are posting your issue in the wrong thread and the wrong area; you should post it in the NVR section for more assistance. However, your issue is still quite vague. You said only "server broken and a black screen", which can mean many things. If you just see a black screen, my thinking is your NVR may have crashed, possibly because of CPU overheating or possibly a bug. What I would suggest is to make sure your NVR model has the most recent firmware update and try again from there.

All the best. K
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,676
Reaction score
14,024
Location
USA
So I decided on a 1000 MP/s build: 20 cameras in total, some 4MP and some 2.1MP, some running at 15 FPS and some at 25 FPS.
I don't have a lot of evidence to back this up, but I'm suspicious that higher FPS (e.g. 25 FPS) can result in disproportionately worse performance. What I'm saying is that maybe one camera at 30 FPS creates more of a load for Blue Iris than two otherwise identical cameras at 15 FPS. I'd suggest running everything at 15 FPS or below.

I work in IT Security for a defence contractor, so they have all these new parts, and I decided to build with whatever was lying around and found interesting results.

Decided to use an i7 8700 (4 cores, 8 threads) 16 GB ram, and for OS used Samsung 960 NVME 256GB M2, for storage WD Purple drives connected directly to SATA ports. I have gone through many of Fenderman's tips and most of them were of use to me, direct to disc only, reduce to 15 fps etc etc. However, even with at 15 fps and only at 780 - 800 MP/s the system was getting choked with 100% utilization even with cpu/hardware quicksync acceleration enabled in Windows 10 Pro. Every change I made to the system, would not help reduce the utilization and pressure from the CPU.
You've got something configured wrong. Check this guide; though it sounds like you've attempted all the important stuff already, I'm betting something isn't working as you expected: Optimizing Blue Iris's CPU Usage | IP Cam Talk

i7-8700 is 6 cores, 12 threads, not 4/8. Double-check which model you used. Though I would expect even a 4 core to be able to handle 800 MP/s better than that.

Was it reaching 100% even with BI in service mode?

Because my i7-8700K here is running 820 MP/s at just... well... let me show you...

(Task Manager screenshot)
That is with BI running in service mode, console closed. I don't know why usage always shows so much higher on the Processes tab, but it always does. Drives me nuts, honestly, because you never know where someone else is reading their % numbers from.

Below is with the console open. Still way below your usage levels.

(Task Manager screenshot)
Note the GPU usage statistics too (requires a fairly recent Win10 feature update). GPU 0 in this case is the Intel GPU that is built in to the CPU. Seeing a load on the "Video Decode" engine indicates that Quick Sync is working. GPU 1 here is an Nvidia GT 1030 card I added specifically to offload rendering work while the console is open, and it makes a substantial difference in CPU usage when Blue Iris's console is open at 4K resolution. That is the only benefit of a discrete GPU, and it doesn't appear to matter how fast the GPU is, so I suggest using the lowest power GPU you have available for that purpose. E.g. a GT 710 or 1030. Some low-end AMD card may work the same, but I haven't tested one.

One other thing, make sure you've done the hack to fix quick sync. On 8th gen, it is a little more complicated than just having the right driver installed. Memory Leak: Quick Sync (Hardware Acceleration) | IP Cam Talk

Oh and it couldn't hurt to check Windows's power options advanced settings, under the "Processor power management" section, to make sure there isn't somehow an artificial limit on your CPU speed there.

I feel GPU CUDA acceleration is the future of Blue Iris. So, against all the wisdom provided here, I went to the shelf and built another server using an i7 7820x.

Used an i7 7820x (8 cores, 16 threads) with 16GB RAM, an LSI RAID 5 controller -> 7 x 3TB WD Purple drives for storage, and a Samsung 960 256GB NVMe M.2 for the OS. Got slightly past 1000 MP/s at 66% utilization when not minimised, showing 86% CPU usage in Blue Iris; it was drawing 77W at this point. When minimised and just running as a service, it dropped down to 38%, with the CPU drawing 69W. Also decided to run Windows Server 2016 instead of Windows 10, since the i7 7820x doesn't have Quick Sync support anyway. The power supply is an HX850 Platinum-spec unit, currently running at 91% efficiency.
This does kind of make it sound like hardware acceleration wasn't working on your other system. You should be aware that H.265 acceleration doesn't work currently, and you need to use H.264.

However, I'm really eager to see if I can get a i7 8700 running >1000MP/s with no problems. Going to try to use different camera manufacturers and see if I can get there.
Different camera manufacturers shouldn't make any difference; they're all producing basically the same thing. Make sure any vendor-specific encoding extensions are disabled, though: things like "Smart codec", "Smart coding", or "H.264+". "H.264H", however, is fine if you see it in the options, since that just means High profile, which is pretty standard.

Also will be testing with i9 7980XE, 18 core, 36 thread CPU for kicks.
That should easily blow away everything else, though it is overkill for sure.
 

krp70s

n3wb
Joined
Jun 7, 2017
Messages
9
Reaction score
1
i7-8700 is 6 cores, 12 threads, not 4/8. Double-check which model you used. Though I would expect even a 4 core to be able to handle 800 MP/s better than that.
Yeah, I meant 6 cores, 12 threads, so I thought it would be able to handle it okay. I actually had an i7 4770 running fine at 750 MP/s, but I feel there is still much I need to work on here. I'll read the thread you provided and any other threads I may have missed.

Thank you all.
 