Why no love for AMD CPUs?

Before Ryzen, AMD had nothing unless you went all the way back to Phenom/Phenom II. Bulldozer was a joke. Intel owned the desktop CPU segment from the end of Phenom II until the first Ryzen chips.

It's not a stretch to say that AMD was pretty poor in terms of price to performance for those 10+ years it spent wandering through the valley.

If you are strictly setting up a PC for just BI, then it's a no-brainer to use an Intel CPU. But I can't bring myself to build out an entire server for a single purpose in 2020. Virtualization is simply too good, and that is where AMD is truly beating the pants off Intel in the price-to-performance and core-count category.

The Phenom II was the last I had used. It's been a while.
I was talking more in terms of quality, failures, problems, etc. Many over the years have claimed Intel was a higher-quality product, regardless of its performance.

To me, a security camera system should be dedicated to just that.
But I've been wrong before, and I'm sure I will be again.
 
Many over the years have claimed Intel was a higher-quality product, regardless of its performance.

To me, a security camera system should be dedicated to just that.
But I've been wrong before, and I'm sure I will be again.

I can't speak to quality. I'm not sure how you'd determine "quality" to begin with. If we are talking failure rates, I'd guess they are about the same for both companies: pretty damn low. It's been my experience that motherboards and RAM fail more than other parts. Even hard drives seem to last ages...

It's fine if you want to build out a dedicated machine for BI. It's just not for me. Virtualization offers too many pros compared to the cons.
 
My last AMD system build was back when the Athlon 64 was gathering steam. Always disliked receiving a CPU with bent pins :rofl:

My last AMD system build was back when the Athlon 64 was gathering steam. Always disliked receiving a CPU with bent pins :rofl:

I had an Athlon 64 X2 6000+ that I ran for years, but it wasn't a good overclocker. Right after that I built a Phenom II system for a coworker, then built an i5-750 for myself specifically for overclocking and gaming, and right after I bought the parts I found out a ton of people were having socket problems. I got lucky: the EVGA motherboard I used was one of the very few with a good socket, and it went many years without an issue. It's still sitting at home, unused. It started having odd issues around 2017, probably the motherboard.
 
No question you improperly set up your Intel system. 307 MP/s is nothing for the i5-6500. You wasted your money and will now be paying an increased power bill in perpetuity. You are also comparing an Intel processor that can be purchased as a complete PC with OS for under 200 bucks to a processor that costs 250 alone.
The number of cameras is irrelevant. I have 10-camera systems on weaker processors.
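
For reference, that 307 figure is the total decode load in megapixels per second, which you can sanity-check by hand. A minimal sketch in Python, with a made-up camera list purely for illustration:

Code:
# Rough Blue Iris decode load in megapixels per second (MP/s).
# The cameras below are made-up examples, not anyone's real setup.
cameras = [
    {"name": "driveway", "width": 2688, "height": 1520, "fps": 15},
    {"name": "porch",    "width": 1920, "height": 1080, "fps": 15},
    {"name": "backyard", "width": 3840, "height": 2160, "fps": 12},
]

total = 0.0
for cam in cameras:
    mp = cam["width"] * cam["height"] / 1_000_000  # megapixels per frame
    mps = mp * cam["fps"]                          # megapixels per second
    total += mps
    print(f"{cam['name']}: {mp:.1f} MP x {cam['fps']} fps = {mps:.1f} MP/s")

print(f"Total decode load: {total:.1f} MP/s")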

While it might seem that way, keep this in mind. I went from a Dell XPS 9020 with an Intel 4790K to an HP ProDesk 400 G3 i5-6500 to gain H.265 acceleration. Both systems choked. From the ProDesk 400 I moved the drives and memory over to the Ryzen system, exported my BI settings to it, and out of the box the CPU usage dropped dramatically, almost like BI isn't even running. Same exact settings and some of the same hardware, vastly different result.

Yes, I spent more, but the power and money don't bother me and the extra power usage is minimal. I produce excess solar energy credits as well. The GTX 1650 idles at 8 watts and goes up to 75 watts if it's pushed, and it's not being pushed. I'm sorry, but that extra power consumption is peanuts, and I'd rather pay a few bucks extra each month to have a smooth-running BI system than be cheap.

I followed the common advice on here of buying a used eBay desktop with QuickSync and it didn't work for me. The web interface stuttered and CPU usage would frequently go over 30%, and when you are trying to work with a Dahua Starlight PTZ that just wasn't going to cut it, especially since I'm adding two Hikvision ColorVu cameras and the big-sensor Dahua PTZ. H.265 seemed to punish the older systems. My other system has 20 cameras on an i7-8700K that I built, so I'm quite familiar with setting these up.
 
While it might seem that way, keep this in mind. I went from a Dell XPS 9020 with an Intel 4790K to an HP ProDesk 400 G3 i5-6500 to gain H.265 acceleration. Both systems choked. ...


Curious what frame rate you're running the cameras at?
 
@therealdeal74

Did you by chance use the Intel+VPP hardware acceleration option? I've known that to cause performance problems (in older BI versions - it changed a couple months ago). Having single-channel memory could also be a limiting factor, although at 307 MP/s I wouldn't expect that to be a problem.
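
For rough intuition on why 307 MP/s shouldn't be memory-bound, you can compare the raw decoded-pixel traffic against DDR4's theoretical bandwidth. A back-of-the-envelope sketch with illustrative figures (real traffic is higher once you count reads, writes, and scaling, but the orders of magnitude hold):

Code:
# Decoded-video memory traffic vs. theoretical DDR4 bandwidth.
load_mps = 307           # Blue Iris decode load, megapixels/second
bytes_per_pixel = 1.5    # YUV 4:2:0 decoder output

video_gbs = load_mps * 1e6 * bytes_per_pixel / 1e9  # ~0.46 GB/s

channel_gbs = 19.2       # DDR4-2400 theoretical peak per channel
print(f"Decoded video traffic:    ~{video_gbs:.2f} GB/s")
print(f"Single-channel DDR4-2400:  {channel_gbs:.1f} GB/s")
print(f"Dual-channel DDR4-2400:    {2 * channel_gbs:.1f} GB/s")

Even single-channel leaves a lot of headroom on paper, which is why I wouldn't expect it to be the problem at that load.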
 
@therealdeal74

Did you by chance use the Intel+VPP hardware acceleration option? I've known that to cause performance problems. Having single-channel memory could also be a limiting factor, although at 307 MP/s I wouldn't expect that to be a problem.
No, I just used Intel. I also use dual-channel memory; the same DDR4 modules were moved to the Ryzen system. The system ran decently when I wasn't using H.265.
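
One way to verify whether QuickSync was actually hardware-decoding the H.265 streams on that box (outside of BI) is a decode-only pass through ffmpeg's QSV path. A sketch, assuming ffmpeg is on the PATH with QSV support built in and that test.h265 is a short clip saved from one of the cameras (both assumptions):

Code:
# Decode-only benchmark of the QuickSync HEVC path via ffmpeg.
# Assumes an ffmpeg build with QSV support and a sample clip
# named test.h265 -- both assumptions for illustration.
import subprocess

result = subprocess.run(
    ["ffmpeg", "-hwaccel", "qsv", "-c:v", "hevc_qsv",
     "-i", "test.h265", "-f", "null", "-"],
    capture_output=True, text=True,
)

# ffmpeg writes its progress and speed stats to stderr; the last
# lines show the achieved decode speed.
for line in result.stderr.splitlines()[-3:]:
    print(line)

If that path errors out or runs slowly, decoding is falling back to the CPU. Skylake-era QuickSync in particular only has hybrid (CPU-assisted) decode for 10-bit HEVC, which may be part of why H.265 punished the older boxes.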
 
While it might seem that way, keep this in mind. I went from a Dell XPS 9020 with an Intel 4790K to an HP ProDesk 400 G3 i5-6500 to gain H.265 acceleration. Both systems choked. ...
Once again, you are claiming that two systems easily capable of handling a much higher load than yours somehow choked. Also, 30 percent is not a high load. This is plain and simple user error.
If you are getting solar credits, that doesn't make the power free; you lose the extra credits. There are thousands of users here running weaker systems at higher loads. Don't just compare the power usage of the card; you must compare both systems under load. Your first post focusing on the number of cameras rather than the actual load tells me you are not setting these up properly.
 
Once again, you are claiming that two systems easily capable of handling a much higher load than yours somehow choked. Also, 30 percent is not a high load. This is plain and simple user error.
If you are getting solar credits, that doesn't make the power free; you lose the extra credits. There are thousands of users here running weaker systems at higher loads. Don't just compare the power usage of the card; you must compare both systems under load. Your first post focusing on the number of cameras rather than the actual load tells me you are not setting these up properly.

30% is not a high load, but it makes the interface painful with H.265 enabled. The card maxes out at 75 watts and the CPU is more efficient than the Intel chips, so it's a bit of an offset. I can afford to run a 75-watt bulb 24x7 regardless, and I'd rather do that than be cheap and miserable. That's fine if you think I'm not setting them up correctly; I know what happened when I exported the BI settings to the new machine and ran it in the exact same configuration. If it was configured wrong, there would be minimal difference, but instead it's night and day.

One other thing I forgot to mention: most of my machines have 4K monitors, and there is a noticeable difference in performance when using a lower-resolution screen.
 
Love is just a four-letter word these days... :smoking:
 
Intel: higher core counts with an integrated GPU.
AMD: run integrated graphics with a generation-behind processor core, or run dedicated graphics for more watts consumed without necessarily needing them. Both are potentially bad options for a 24x7 system.

Honestly, now with sub-streams, I think we'll see much more modest CPU picks for Blue Iris. I have a nice AMD system I built before sub-streams, but AMD options with an integrated GPU won't currently boost Blue Iris performance for you, and a dedicated graphics card is a less effective choice from a performance-per-dollar standpoint. The fact that the business world has loaded up on Intel for generations, and keeps dumping those machines onto the second-hand market, also potentially saves you a lot. For the foreseeable future you can get the parts for a system (CPU, motherboard & memory) or a complete working off-lease system for the same ballpark price.

Maybe if AMD retains the node lead, and businesses start buying them up, and you can wait the 3-4 years for typical leases to end, you might have an alternative that consumes less power and produces less heat for the right price. If Blue Iris suddenly got VCE support or something, that would also put AMD back into the conversation.

But right now cores are where AMD is crushing it, and you don't really need lots of cores for Blue Iris anymore, especially if you can use sub-streams. I'm excited to use the spare cores for other things (like AI & LPR), but those are just side projects, not absolutely necessary. My testing showed that a CPU with quad-channel memory was adequate for even very high loads, and then all of that went out the window with the release of sub-streams.
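
To put rough numbers on the sub-stream point, here is a sketch with illustrative stream sizes (not measurements). As I understand it, with sub-streams BI continuously decodes only the small stream per camera and records the main stream direct-to-disc without decoding it:

Code:
# Back-of-the-envelope decode load with and without sub-streams.
# Stream sizes below are illustrative, not measurements.
num_cams = 20
main_mp, main_fps = 8.3, 15   # 4K main stream per camera
sub_mp, sub_fps = 0.3, 15     # D1-ish sub-stream per camera

main_only = num_cams * main_mp * main_fps
sub_only = num_cams * sub_mp * sub_fps

print(f"Decoding main streams 24/7: {main_only:.0f} MP/s")
print(f"Decoding sub-streams 24/7:  {sub_only:.0f} MP/s")
print(f"Continuous load reduction:  ~{main_only / sub_only:.0f}x")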
 
Intel: higher core counts with an integrated GPU.
AMD: run integrated graphics with a generation-behind processor core, or run dedicated graphics for more watts consumed without necessarily needing them. Both are potentially bad options for a 24x7 system.

Are you saying AMD processors are a generation behind?
 
Are you saying AMD processors are a generation behind?

He's talking specifically about the CPUs with integrated graphics being behind compared to the CPUs without integrated graphics. Only on the AMD side of course. Intel has been releasing most of their latest tech first in the form of CPUs with integrated graphics. AMD has been doing it later, such that the architecture is falling behind a bit by the time it is released.
 
He's talking specifically about the CPUs with integrated graphics being behind compared to the CPUs without integrated graphics. Only on the AMD side of course. Intel has been releasing most of their latest tech first in the form of CPUs with integrated graphics. AMD has been doing it later, such that the architecture is falling behind a bit by the time it is released.

I see, that all changes with Renoir.

And thanks to crw030 for mentioning sub-streams; I missed the clamor and the announcement. I'm switching my 20-camera system over to sub-streams and watching the CPU usage fall like a house of cards. Time to buy more cameras!
 
I see, that all changes with Renoir.

I believe Renoir will have Zen 2 cores, being released in 2020, while the latest non-APU chips will be Zen 3. That's my point: the AMD APUs, besides not being supported by Blue Iris for decode acceleration at this time, incorporate a generation-behind CPU core.

I have Zen 2 in my EPYC chip; it's solid performance, but it will be beaten by Zen 3 within a year. So once again you have to choose a Zen 2 APU, or a Zen 3 CPU plus dedicated graphics. Or you just buy a 1-2 generation older Intel system that supports QuickSync and call it a day.

I'm no longer a huge Intel fan after getting gouged for slight performance improvements for the past 8 years; it's really nice to have a competitive alternative. But for this specific use case (Blue Iris) I don't think AMD will be the best choice.
 
I see dual-channel memory mentioned a lot here.
Do camera systems benefit from quad-channel memory on Xeon systems, or not really?
 
I believe Renoir will have Zen 2 cores, being released in 2020, while the latest non-APU chips will be Zen 3. That's my point: the AMD APUs, besides not being supported by Blue Iris for decode acceleration at this time, incorporate a generation-behind CPU core.

I see what you mean now. The Renoir chips just came out, OEM-only, so that's a move into pre-built desktops that could lead to some cheap, power-efficient boxes. The Zen 2 platform is still quite good and more than capable. The really nice thing about AMD is the long-standing support they have kept for the AM4 socket, whereas Intel changes sockets like they're going out of style. If it wasn't for AMD we'd be paying a fortune for first-gen Pentiums right now, so I'm happy to support them and get the performance I want.
 