Is the i9-9900K still the king?

abo

Young grasshopper
Joined
Mar 21, 2018
Messages
44
Reaction score
2
I currently have an i7-8700K config with 33 2MP cams, and I would like to upgrade my system.
I would like to add 5 additional 2MP cams.
According to the wiki, the i9-9900K is currently the suggested best CPU for BI.
Is this really true, or is there a better CPU available for BI?
Are none of the 10th-gen and 11th-gen CPUs better for BI than the old i9-9900K?
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,901
Reaction score
21,270
I currently have an i7-8700K config with 33 2MP cams, and I would like to upgrade my system.
I would like to add 5 additional 2MP cams.
According to the wiki, the i9-9900K is currently the suggested best CPU for BI.
Is this really true, or is there a better CPU available for BI?
Are none of the 10th-gen and 11th-gen CPUs better for BI than the old i9-9900K?
You can easily run that load on your current system using the substream function.
 

abo

Young grasshopper
Joined
Mar 21, 2018
Messages
44
Reaction score
2
You can easily run that load on your current system using the substream function.
Thank you for your suggestion!
I prefer to use the main streams.
I would like to upgrade my system regardless of whether it can handle the new cams.
What do you think about my question about CPUs?
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,901
Reaction score
21,270
Thank you for your suggestion!
I prefer to use the main streams.
I would like to upgrade my system regardless of whether it can handle the new cams.
What do you think about my question about CPUs?
I don't think you understand the substream function. See the wiki.
 

abo

Young grasshopper
Joined
Mar 21, 2018
Messages
44
Reaction score
2
I don't think you understand the substream function. See the wiki.
I think I understand the substream function.
Cons for me:
I would have to reconfigure the motion detection on my 33 old cams.
Furthermore, I cannot find any information on how this new function affects the precision of motion detection.
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,901
Reaction score
21,270
I think I understand the substream function.
Cons for me:
I would have to reconfigure the motion detection on my 33 old cams.
Furthermore, I cannot find any information on how this new function affects the precision of motion detection.
It has zero effect on motion detection.
So you would rather upgrade the PC and keep it under load than use a fantastic new function that significantly reduces CPU consumption?
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,674
Reaction score
14,020
Location
USA
@biggen lol, that disclaimer has been in there (more or less) for months already. Way down at the bottom in one of the paragraphs about sub streams.

You could run, for example, dozens of 4K cameras on a midrange system as long as you are careful with how you use Blue Iris.
Alas, the page has grown way too big. Back when I originally wrote it, Intel was still on the 7th gen, so the whole CPU landscape was much simpler. Sub streams were not supported by Blue Iris, and in fact even H.265 decoding didn't work yet.

Now at the very high end, the CPU isn't even the bottleneck anymore as far as I have been able to tell.
 

abo

Young grasshopper
Joined
Mar 21, 2018
Messages
44
Reaction score
2
It has zero effect on motion detection.
So you would rather upgrade the PC and keep it under load than use a fantastic new function that significantly reduces CPU consumption?
With the lower-quality substream, doesn't the motion detection algorithm have less information to work with?
I would buy a new PC for BI anyway, independently of the camera upgrade.
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,901
Reaction score
21,270
With the lower-quality substream, doesn't the motion detection algorithm have less information to work with?
I would buy a new PC for BI anyway, independently of the camera upgrade.
It already uses very few pixels; otherwise the system would be bogged down.
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,674
Reaction score
14,020
Location
USA
Thank you!
What other component could be the bottleneck if I build an i9-10900K config for BI?
When I tested with a Ryzen 3950X under increasing load, Blue Iris started having performance problems long before the CPU reached 50% utilization. When I tried again with lower memory speed, it got much worse. There's a link to the thread I created about this near the end of the "choosing hardware" wiki page.

With an i9-10900K, you'd probably max out the Quick Sync decoder at around 1500 MP/s. Then you would have to run some additional cams with hardware acceleration disabled. At some point, likely around 2000 MP/s, Blue Iris would start dropping frames even though CPU usage was not yet near 100%. This is just a guess based on my past experience; I have never run a 9th or 10th gen Intel PC with Blue Iris.

Fortunately, since sub stream support was added to Blue Iris, the need for increasingly powerful computers has gone away. You could probably run 64x 4K cameras on a 10900K (or even a lesser CPU) just fine, as long as they were all configured with their sub streams.
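For a rough sense of the numbers, here is a back-of-the-envelope sketch (the per-camera frame rates and sub stream resolution are illustrative assumptions, not measurements):

```python
# Rough Blue Iris decoding load in megapixels per second (MP/s).
# MP/s per camera = resolution in megapixels * frames per second.
# All figures below are illustrative assumptions, not measured values.

def mp_per_sec(megapixels: float, fps: float, count: int = 1) -> float:
    """Total decoding load for `count` identical cameras."""
    return megapixels * fps * count

# 38x 2MP cameras at 15 fps on main streams:
main_2mp = mp_per_sec(2.0, 15, 38)   # 1140 MP/s -- near the ~1500 MP/s ceiling
# The same cameras on D1 (704x480, ~0.34MP) sub streams:
sub_2mp = mp_per_sec(0.34, 15, 38)   # ~194 MP/s

# 64x 4K (~8.3MP) cameras, main streams vs. sub streams:
main_4k = mp_per_sec(8.3, 15, 64)    # ~7970 MP/s -- far past any single decoder
sub_4k = mp_per_sec(0.34, 15, 64)    # ~326 MP/s  -- easy by comparison

print(f"38x 2MP: main {main_2mp:.0f} MP/s, sub {sub_2mp:.0f} MP/s")
print(f"64x 4K:  main {main_4k:.0f} MP/s, sub {sub_4k:.0f} MP/s")
```

The point is that sub streams cut the decoding load by an order of magnitude, which is why the choice of CPU matters so much less now.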
 
Joined
Aug 8, 2018
Messages
7,415
Reaction score
26,000
Location
Spring, Texas
With an i9-10900K, you'd probably max out the Quick Sync decoder at around 1500 MP/s. Then you would have to run some additional cams with hardware acceleration disabled
Just as a discussion: if one did max out the i9-10900K Quick Sync decoder, could you add an Nvidia graphics card for the additional cams? So, say, 50 cams on the i9-10900K Quick Sync and 10 more on an Nvidia graphics card?
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,674
Reaction score
14,020
Location
USA
Just as a discussion: if one did max out the i9-10900K Quick Sync decoder, could you add an Nvidia graphics card for the additional cams? So, say, 50 cams on the i9-10900K Quick Sync and 10 more on an Nvidia graphics card?
Yes, but it probably wouldn't increase the amount of video the PC could handle. When I tested on AMD, I found the same limit with or without Nvidia decoding.
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,674
Reaction score
14,020
Location
USA
With the sub-stream functionality, would you say that these requirements have become obsolete?
Not entirely obsolete, but mostly. Sub streams are a compromise. They have downsides too (discussed in the sub stream guide in the wiki). I still run many of my cameras without sub streams so they will be slightly more reliable, more responsive when maximized, and more responsive when capturing main stream snapshots.
 

abo

Young grasshopper
Joined
Mar 21, 2018
Messages
44
Reaction score
2
It already uses very few pixels; otherwise the system would be bogged down.
Theoretically, the motion detection algorithm should give better results from higher-quality streams, even if it used the same number of pixels from the lower- and the higher-quality streams.
If this is not the case, the motion detection algorithm has a lot of room for improvement.
 

whoami ™

Pulling my weight
Joined
Aug 4, 2019
Messages
230
Reaction score
224
Location
South Florida
Theoretically, the motion detection algorithm should give better results from higher-quality streams, even if it used the same number of pixels from the lower- and the higher-quality streams.
If this is not the case, the motion detection algorithm has a lot of room for improvement.
IMHO motion detection is so 2019... AI Tool, or AI in general, is the future...

[attached screenshot: aii.PNG]
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,674
Reaction score
14,020
Location
USA
Theoretically, the motion detection algorithm should give better results from higher-quality streams, even if it used the same number of pixels from the lower- and the higher-quality streams.
Yes, but only if the lower quality stream was really bad quality. Like, so bad that the compression artifacts still had a significant impact on the image after it was downscaled to the ridiculously low resolution Blue Iris uses for motion detection.

Blue Iris's help file hints at this.

By default, to save CPU and smooth-out noise, the image is reduced by considering it in blocks. The High definition option actually increases the number of motion detection blocks that are used by typically 4x.
I've noticed that the "High definition" option shrinks the size of the blocks in the motion zone editor. This implies that the grid in the motion zone editor is equal to the resolution of the frames fed into the motion detector. If that is true, then the frames received by the motion detector would look something like these:
[two comparison images: the 4K main stream frame and the D1 sub stream frame, both downscaled as described below]
Just to prove a point, one of these images was captured from a 4K source and the other from its D1 sub stream. I downscaled each to 120x68 using linear scaling, then upscaled to 640x363 using nearest neighbor so you can see the details more easily without browser zoom. Unfortunately it is snowing right now, so the falling snowflakes differ between the frames, but otherwise they are virtually identical because the downsampling has masked what used to be a very substantial difference in image quality.
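If you want to try this yourself, here is a minimal sketch using Pillow that mirrors the steps above (the file names are placeholders for your own frame captures):

```python
# Simulate the view the motion detector gets: linear downscale to a tiny
# frame, then nearest-neighbor upscale so the result is easy to inspect.
# File names are placeholders for your own captured frames.
from PIL import Image

def simulate_detector_view(src_path: str, out_path: str) -> None:
    img = Image.open(src_path)
    small = img.resize((120, 68), Image.BILINEAR)   # linear downscale
    big = small.resize((640, 363), Image.NEAREST)   # blocky upscale for viewing
    big.save(out_path)

simulate_detector_view("main_4k_frame.png", "main_view.png")
simulate_detector_view("sub_d1_frame.png", "sub_view.png")
```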

That said, there is also a difference in motion zone editor grid size between 4K resolution and D1 sub stream resolution, so it IS likely that Blue Iris is feeding smaller frames to the motion detector when you use a sub stream. If that is concerning, you could enable the "High definition" option, and that should more than make up for the loss of detail fed into the motion detector.
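To make the block-reduction idea from the help file concrete, here is a toy sketch of block-based frame differencing. To be clear, this is my own illustration of the general technique, not Blue Iris's actual algorithm, and the block size and threshold are made-up parameters:

```python
# Toy block-based motion detection: reduce each grayscale frame to a coarse
# grid of block averages, then count grid cells that changed between frames.
# This illustrates the general technique only; it is NOT Blue Iris's code,
# and the block size and threshold are made-up parameters.
import numpy as np

def block_reduce(frame: np.ndarray, block: int = 16) -> np.ndarray:
    """Average each block x block tile of a 2D grayscale frame into one value."""
    h, w = frame.shape
    h, w = h - h % block, w - w % block  # crop to a multiple of the block size
    tiles = frame[:h, :w].reshape(h // block, block, w // block, block)
    return tiles.mean(axis=(1, 3))

def changed_blocks(prev: np.ndarray, cur: np.ndarray,
                   block: int = 16, threshold: float = 12.0) -> int:
    """Count grid cells whose average brightness changed past the threshold."""
    diff = np.abs(block_reduce(cur, block) - block_reduce(prev, block))
    return int((diff > threshold).sum())

# Fake D1-sized frames with a bright patch "moving" into the second frame.
prev = np.random.randint(0, 256, (480, 704)).astype(float)
cur = prev.copy()
cur[100:200, 300:400] += 60.0   # simulated motion region
print(changed_blocks(prev, cur))  # nonzero count -> motion detected

# A smaller `block` value (like the "High definition" option's ~4x more
# blocks) gives the detector a finer grid to work with.
```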
 