AMD Versus Intel

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,666
Reaction score
14,005
Location
USA
I've had an AMD Ryzen 7 1800X for quite a while now and have been putting off testing Blue Iris on it. Yesterday my Intel i7-8700K arrived, and this gave me just the excuse I needed to compare their performance.

Predictably, Intel destroys AMD when hardware acceleration is enabled (Blue Iris only supports hardware acceleration with Intel CPUs):



However, with hardware acceleration turned off on the Intel system, they are on basically even footing:




These stats were captured by BiUpdateHelper on two systems running identical Blue Iris configurations: same cameras, same frame rates, same settings.
 

bp2008

I've made some power consumption measurements using a Kill A Watt meter.

There are some differences between the systems. Both use a 256 GB SSD for the OS/boot disk, but the Intel system has an additional 2 TB Hitachi hard drive. The AMD system has a GeForce GT 710 graphics card, since its CPU doesn't have integrated graphics.

I connected both to a 4K display for the "GUI Open" tests, and the live preview rate was unthrottled, making it a very CPU-intensive task.

It is the same 23 cameras as before, but I've increased frame rates somewhat since last time, resulting in 801.1 MP/s of video data.
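The MP/s figure is just each camera's resolution multiplied by its frame rate, summed over all cameras. A minimal sketch (the camera list below is hypothetical, for illustration only, not the actual 23-camera setup):

```python
# Megapixels/second = sum over cameras of (width * height * fps) / 1,000,000.
# These three cameras are made-up examples, not the real configuration.
cameras = [
    {"width": 1920, "height": 1080, "fps": 15},  # 1080p @ 15 FPS
    {"width": 2560, "height": 1440, "fps": 10},  # 1440p @ 10 FPS
    {"width": 3840, "height": 2160, "fps": 8},   # 4K @ 8 FPS
]

def total_mps(cams):
    """Total video throughput in megapixels per second."""
    return sum(c["width"] * c["height"] * c["fps"] for c in cams) / 1_000_000

print(round(total_mps(cameras), 1))  # 134.3
```

Raising any camera's frame rate increases this number linearly, which is why the MP/s total crept up between test runs.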

| CPU | Workload | Power Consumption (Watts) | Blue Iris CPU Usage % | Overall CPU Usage % |
| --- | --- | --- | --- | --- |
| Intel Core i7-8700K | Idle | 31 | 0 | 0 |
| Intel Core i7-8700K | Blue Iris (service) | 98 | 31 | 32 |
| Intel Core i7-8700K | Blue Iris (GUI open) | 129 | 73 | 75 |
| Intel Core i7-8700K | Blue Iris (service), hardware acceleration | 79 | 15 | 18 |
| Intel Core i7-8700K | Blue Iris (GUI open), hardware acceleration | 119 | 49 | 56 |
| AMD Ryzen 7 1800X | Idle | 40 | 0 | 0 |
| AMD Ryzen 7 1800X | Blue Iris (service) | 99 | 29 | 30 |
| AMD Ryzen 7 1800X | Blue Iris (GUI open) | 130 | 49 | 51 |

Edit: I had some numbers mixed up in the table above.
 

bp2008

I'm pleasantly surprised by how well the Ryzen box handled having the GUI open (this was actually because the AMD/Ryzen box had a dedicated GPU during those tests while the Intel box did not). Next, I guess I'll have to transplant a graphics card into the Intel box, since that should reduce "GUI open" CPU usage. I wonder what effect it will have on power consumption.
 

tigerwillow1

Known around here
Joined
Jul 18, 2016
Messages
3,815
Reaction score
8,424
Location
USA, Oregon
Thanks for posting the power numbers. I know it took a good bit of time and effort. Having used an older AMD Phenom system until recently, I'm impressed with how well power consumption has been cut in the newer systems. The old AMD system pulled 100 watts at idle and about 120 watts heavily loaded. I set up two newer Intel-based systems and measured the idle power draw. A Dell tower with an i7-4770 pulls 30 watts at idle, and an HP desktop with an i5-4590, 15 watts at idle. I had a hard time believing it! This is with SSD and internal video. Benchmark-wise, the i7 system is about 6x faster than the old system, and the i5 system about 3x faster.
 

bp2008

I re-ran the Intel tests with dedicated graphics installed. I had two cards on hand, so I tested each of them: the fanless GT 710 from the AMD build, and the GTX 950 from my old Blue Iris server, which I had been using for its HDMI 2.0 output. The GTX 950 is total overkill, even requiring additional power from the PSU. A GT 1030 would be a modern compromise, but I don't have one.

| GPU | Workload | Power Consumption (Watts) | Blue Iris CPU Usage % | Overall CPU Usage % |
| --- | --- | --- | --- | --- |
| GTX 950 | Idle | 43 | 0 | 0 |
| GTX 950 | Blue Iris (service) | 114 | 33 | 34 |
| GTX 950 | Blue Iris (GUI open @ 4K) | 150 | 53 | 54 |
| GTX 950 | Blue Iris (service), hardware acceleration | 90 | 15 | 20 |
| GTX 950 | Blue Iris (GUI open @ 4K), hardware acceleration | 131 | 33 | 36 |
| GTX 950 | Blue Iris (GUI open @ 4K, monitor on motherboard output), hardware acceleration | 129 | 46 | 52 |
| GT 710 | Idle | 34 | 0 | 0 |
| GT 710 | Blue Iris (service) | 105 | 33 | 35 |
| GT 710 | Blue Iris (GUI open @ 4K) | 132 | 53 | 54 |
| GT 710 | Blue Iris (service), hardware acceleration | 83 | 15 | 17 |
| GT 710 | Blue Iris (GUI open @ 4K), hardware acceleration | 113 | 31 | 34 |
| GT 710 | Blue Iris (GUI open @ 4K, monitor on motherboard output), hardware acceleration | 123 | 47 | 52 |


As expected, with the GUI closed, there is no benefit to having a dedicated GPU.

With the GUI open, CPU usage is significantly reduced by connecting the monitor to the graphics card's video output. Both graphics cards produced a virtually identical CPU usage drop, despite one of the cards (the GTX 950) being newer and much more powerful than the other.

These results also show that it is important to use a very low-power graphics card; otherwise the GPU's additional power consumption will negate your energy savings, as it did in every GTX 950 test case. In fact, the only test where adding a graphics card reduced power consumption was the GT 710 with the GUI open and hardware acceleration enabled in Blue Iris. Remember, I was using a 4K monitor! With a more common 1080p monitor, the graphics card would have been much less beneficial.
 

bp2008

I made some charts. Since these are showing resource consumption, shorter bars are better.

These first two charts compare the Intel and AMD computers. Both systems were tested with 80 Plus Gold power supplies, an SSD for the OS, and an Nvidia GeForce GT 710 graphics card installed.

The Intel machine has a 2 TB HDD that the AMD machine does not, which increases the Intel system's reported power consumption by about 5 watts. Essentially, Intel and AMD are on even ground in power consumption when running the same load, but in these charts Intel looks a little worse because of that hard drive.

Note how Intel takes a clear lead in all aspects when Hardware Video Acceleration is enabled.





This last chart is kind of hard to follow, but it shows the effect on power consumption and CPU usage when the Intel system had each of the following installed:

* No graphics card
* GeForce GT 710
* GeForce GTX 950

Again, shorter bars are better.


Some takeaway points from the GPU tests:
  1. Dedicated GPUs consume additional power.
  2. Dedicated GPUs only help when the Blue Iris GUI is open.
  3. It is possible, in the right circumstances, for a very efficient dedicated GPU to save more energy than it costs to run.
  4. The very low-end GT 710 has the same effect on Blue Iris as the newer and more power-hungry GTX 950.
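Takeaway 3 can be checked directly against the table above: a quick sketch comparing the Intel system's measured power draw with and without the GT 710 (hardware acceleration enabled in both cases):

```python
# Measured power draw (watts) of the Intel system with hardware
# acceleration enabled, taken from the tables in this thread.
no_gpu = {"service": 79, "gui_open_4k": 119}  # no dedicated GPU
gt710 = {"service": 83, "gui_open_4k": 113}   # GT 710 installed

# Positive delta = the GPU costs power; negative = it saves power overall.
for workload in no_gpu:
    delta = gt710[workload] - no_gpu[workload]
    print(workload, f"{delta:+d} W")  # service +4 W, gui_open_4k -6 W
```

With the GUI closed, the card is a pure 4 W cost; with the GUI open at 4K, the CPU savings outweigh the card's draw by 6 W.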
 

awsum140

Known around here
Joined
Nov 14, 2017
Messages
1,254
Reaction score
1,128
Location
Southern NJ
Wonder how those charts would look with a 10XX-series card? I'm running a 1060 and a 970 with no onboard graphics, so I'm stuck with cards and no HWA. Power consumption is something I live with to be able to do more than just BI on the 24/7 machine (blasphemy, I know), but it would be interesting to see.
 

pcunite

Young grasshopper
Joined
Jan 28, 2018
Messages
84
Reaction score
24
@bp2008

I would like to better understand the Quick Sync vs Core comparison.

How would you feel about having forum members donate to raise funds to test these various platforms? We need a Blue Iris expert who can flesh this out. I would like to know: power draw, CPU utilization, and a metric of performance (frames per second encoded?). Keep it simple. This way performance and cost could be understood.

Suggested processors:
  • i7-8700K 6-core with Quick Sync
  • Threadripper 1950X 16-core
  • Core i9-7980XE 18-core
  • Epyc 7401P 24-core
  • Epyc 7551P 32-core
Maybe even create a test EXE that can simulate different workloads: 1080p from 30 cams, 10 remote viewers watching cams, etc.
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,897
Reaction score
21,250
pcunite said: "How would you feel about having forum members donate to raise funds to test these various platforms?"
Lol, you are looking at over $6k in gear... if you are stressing an i7-8xxx, then you need to move to an enterprise solution...
 

nejakejnick

Getting the hang of it
Joined
Aug 30, 2015
Messages
138
Reaction score
24
Have you tried the third option for HW acceleration, "Yes+VideoPostproc"?
I found that it lowers total CPU usage even further (you must also count kernel time for this, as it is half of the total usage). I'm not sure what the disadvantages are, but if I remember correctly, that option did not work while also using an NVIDIA card.
 

fenderman

nejakejnick said: "Have you tried the third option for HW acceleration, 'Yes+VideoPostproc'?"
VPP is not recommended with high-bitrate throughput... just use Yes, without VPP.
 

nejakejnick

I have 4500-5500 kB/s at 350 MP/s, and VideoPostproc cuts my total CPU usage from 30% to 20% on an i5-3570K @ 3.7 GHz. So I'm not sure if I qualify for high-bitrate throughput.
 

pcunite

fenderman said: "Lol you are looking at over 6k in gear... if you are stressing an i7-8xxx then you need to move to an Enterprise solution."
I really enjoy building. If an Epyc processor works well, why not? What Enterprise solution are you speaking of?

This system comes to $2,990. Add as many 12 TB drives ($490 each) as you need.
  • $1,200 EPYC 7401P 24-Core
  • $650 GIGABYTE MZ31-AR0
  • $430 32GB RAM
  • $150 Power supply
  • $150 Samsung 960 EVO M.2
  • $270 Case
  • $140 Windows 10 PRO x64
 

fenderman

pcunite said: "I really enjoy building. If an Epyc processor works well, why not? What Enterprise solution are you speaking of? This system only comes to $2,870, which is a far cry from $6,000."
The $6k was for the parts listed for comparison...
I can think of much better ways to enjoy the $1k+ extra you'd spend building such a system... the Epyc will be a power hog, a perpetual tax you pay to your electric company... makes no sense...
There are lots of great enterprise/commercial VMS solutions, like Avigilon (just acquired by Motorola the other day) or, on the cheaper side, Network Optix (sold as DW Spectrum in North America, with free lifetime upgrades to new versions)...
 

bp2008

nejakejnick said: "I have 4500-5500 kB/s, 350 MP/s and VideoPostproc saves me total CPU from 30% to 20% on an i5-3570K @ 3.7 GHz."
That is 40 Mbps (give or take), so I'm impressed if VPP is indeed being beneficial. The last time I tried running VPP, I could barely move the mouse cursor to turn it back off. Granted, I had at least twice that bit rate going at the time.
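For reference, the conversion from a camera bitrate in kilobytes per second to megabits per second is just a factor of 8/1000. A quick sketch:

```python
def kBps_to_Mbps(kBps):
    """Convert kilobytes/second to megabits/second (1 kB/s = 0.008 Mb/s)."""
    return kBps * 8 / 1000

# The 4500-5500 kB/s range quoted above works out to roughly 40 Mbps.
print(kBps_to_Mbps(4500), kBps_to_Mbps(5500))  # 36.0 44.0
```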
 

pcunite

@fenderman

A custom server that we control is very important to us. An extra 200 W of power draw is within our operating budget. Performance is very important. Given that, could I run Avigilon Standard or Network Optix software? Sounds like a win. I guess you're saying that these two software packages are more CPU-efficient than Blue Iris, so I could build less of a server?
 

fenderman

pcunite said: "A custom server that we control is very important to us. An extra 200w power draw is within our operating budget. Performance is very important."
You have the same control over a $500 server as you do over a $3k server... a 200 W extra draw equates to $200-400 per year depending on your rates... if you are willing to spend that kind of money plus $3k to build a system, you should not be looking at Blue Iris...
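The arithmetic behind that yearly estimate, sketched with assumed electricity rates of $0.12-$0.23/kWh (the rates are assumptions, not figures from this thread):

```python
def annual_cost(extra_watts, dollars_per_kWh):
    """Annual electricity cost of a continuous extra load, in dollars."""
    kWh_per_year = extra_watts * 24 * 365 / 1000  # 200 W -> 1752 kWh/year
    return kWh_per_year * dollars_per_kWh

# Assumed rates: $0.12/kWh (cheap) to $0.23/kWh (expensive).
print(round(annual_cost(200, 0.12)), round(annual_cost(200, 0.23)))  # 210 403
```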
Avigilon is lighter and has lots more features useful for your commercial needs...
 

asilva54

n3wb
Joined
Oct 25, 2017
Messages
16
Reaction score
2
Man, that CPU usage chart BP provided makes you think... what if VCE, or whatever it's called, was supported.
 