Considering Intel NUC 5 i7 (RYH) for BI Server - feedback?

MartyO

Banned
Joined
Jun 4, 2015
Messages
589
Reaction score
20
I don't set the bit rate of the camera; I set fps, compression strength, and pixels. Based on those, the bit rate is determined.
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,906
Reaction score
21,282
I don't set the bit rate of the camera; I set fps, compression strength, and pixels. Based on those, the bit rate is determined.
I'm surprised at your response.
Not sure why you are surprised... your camera is low end and probably uses MPEG compression with a fixed ratio. Any decent camera uses H.264 and allows the user to set the bitrate. You seem to be under the impression that since you can run 8 VGA cameras on your Blue Iris system, you can run 8 3MP cameras with very little effect on the processor. This is simply wrong. I urge you to see for yourself before providing this misinformation; it does no one any good to put out inaccurate or misleading information.
 

MartyO

Banned
Joined
Jun 4, 2015
Messages
589
Reaction score
20
When I drop camera fps, pixels, and compression, the bitrate drops. When I increase fps, compression, and pixels, the bitrate increases.

- - - Updated - - -

Admit you're wrong or you're a loser.

- - - Updated - - -

My posts were for reference.
 

MartyO

Banned
Joined
Jun 4, 2015
Messages
589
Reaction score
20
Everyone can see I didn't say what you're saying; I posted data purely for reference.
 

MartyO

Banned
Joined
Jun 4, 2015
Messages
589
Reaction score
20
The NUC is not for all situations; I just showed what it can do based on how camera data is consumed by the BI PC. All honest, not misleading.
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,906
Reaction score
21,282
I don't set the bit rate of the camera; I set fps, compression strength, and pixels. Based on those, the bit rate is determined.
I'm surprised at your response.
Are you going to admit you're wrong?
OK, once again: Blue Iris CPU consumption is NOT dependent on the amount of data sent by the camera. This is where you are mistaken. The biggest factor is the RESOLUTION of the camera, NOT the bitrate. This is what you are missing, completely.
I don't mean to insult you, but your misunderstanding of how bitrates are generally set, and your limited experience with Blue Iris and IP cameras in general, are evident in your comments.
Please take some time to learn how cameras work and how various resolutions and frame rates affect Blue Iris. There are MANY threads on this, with comments from others, not just me.
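To put rough numbers on the resolution point, here is a minimal back-of-envelope sketch in Python. It assumes software decode cost scales roughly with pixels decoded per second; the resolutions and frame rates are illustrative, not measurements from any setup in this thread.

# Rough sketch: software decode work scales roughly with pixels per second,
# which is why resolution (times fps) dominates Blue Iris CPU load far more
# than bitrate does. Figures below are illustrative assumptions.

CAMERAS = {
    "VGA (640x480) @ 15 fps": (640 * 480, 15),
    "1080p (1920x1080) @ 15 fps": (1920 * 1080, 15),
    "3MP (2048x1536) @ 15 fps": (2048 * 1536, 15),
}

baseline_pixels, baseline_fps = CAMERAS["VGA (640x480) @ 15 fps"]
baseline_pps = baseline_pixels * baseline_fps

for name, (pixels, fps) in CAMERAS.items():
    pps = pixels * fps
    print(f"{name}: {pps / 1e6:6.1f} Mpixels/s decoded "
          f"(~{pps / baseline_pps:.1f}x the VGA workload)")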
 

MartyO

Banned
Joined
Jun 4, 2015
Messages
589
Reaction score
20
I've built camera systems for thin-film etching. I will also admit when I'm wrong; it took me longer to learn that skill.
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,906
Reaction score
21,282
I don't set the bit rate of the camera; I set fps, compression strength, and pixels. Based on those, the bit rate is determined.
I'm surprised at your response.
Are you going to admit you're wrong?
When I drop camera fps, pixels, and compression, the bitrate drops. When I increase fps, compression, and pixels, the bitrate increases.

- - - Updated - - -

Admit you're wrong or you're a loser.

- - - Updated - - -

My posts were for reference.
This is because you are using a cheap MPEG camera that has a set compression ratio; that is not my problem. Please take a look at ANY decent camera with H.264 compression: the user can adjust the camera's bitrate. Again, it is evident that you only have experience with one camera. What I said is completely accurate. Take any Hikvision or Dahua, for example; the user can set the bitrate independently of the fps and resolution. Calling me a loser will not change that fact; you simply have a basic misunderstanding of how IP cameras and their compression work.
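As a practical aside on why a user-settable bitrate matters, here is a minimal sketch of the arithmetic for turning a constant bitrate into storage per camera per day. The bitrate values are illustrative examples, not recommendations from this thread.

# Minimal sketch: storage consumed by one camera running at a user-set
# constant bitrate (CBR). Substitute whatever your camera's encoder is
# actually configured to.

def gb_per_day(bitrate_kbps: float) -> float:
    """Storage consumed per day at a constant bitrate, in decimal GB."""
    bits_per_day = bitrate_kbps * 1000 * 60 * 60 * 24
    return bits_per_day / 8 / 1e9

for kbps in (1024, 2048, 4096, 8192):
    print(f"{kbps:>5} kbps -> {gb_per_day(kbps):6.1f} GB/day per camera")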
 

wseaton

n3wb
Joined
Jun 20, 2015
Messages
23
Reaction score
0
If the BI server goes in a data center or closet, then I would not get a NUC, but under a TV or in a study, it's great if it can do the job. Presently my measly little NUC i5-4250U is handling 9 cameras at 5% CPU. Total direct-to-disk video coming into the NUC's LAN port is 12 Mbit/s, with up to 2 Mbit/s per camera.
The i5-4250U has a Sysmark score of 3470, which is right up there in used AMD territory and pretty much garbage in my book. I'd rather buy a cheap laptop and put it on my shelf, because the laptop is certainly more flexible and likely to be faster. Again, I've had to support hundreds of these micro-form-factor PCs over the years and pretty much hated them all because they don't last. By the time you stuff RAM and storage into them, it's still more expensive than the i7-2600 I bought nearly three years ago and has half the speed of my 2600, so I don't get what the obsession over these things is. i3 speed for i7 desktop prices. Sorry... guess I like to get my money's worth. Also, I'm better off virtualizing BI on my current desktop with a functioning UPS than populating my entertainment center with a PC wannabe gadget that's going to take a dive when the third or fourth thunderstorm rolls through.

Tape drives are still in use quite a bit for enterprise backups, and tape storage was replaced more by storage virtualization and block-level replication than anything else. SSDs sure didn't do it, because their use outside the consumer sphere is sparse.

Glad you like it, and it shows Intel is doing some innovation, but the poor performance of the things is a turnoff and a bit of a waste of money.

Oh yeah... back to the Mac thing. H.264 performance should be identical on OS X or Windows because both are running on Intel now. Back when Macs were running on Motorola or PowerPC (IBM) there was the AltiVec angle, but such is not the case anymore.
 

40th Floor

n3wb
Joined
Apr 19, 2015
Messages
19
Reaction score
3
fenderman said:
CPU consumption is NOT dependent on the amount of data sent by the camera. ... biggest factor is the RESOLUTION of the camera, NOT the bitrate ...
As you write, that is how that app behaves. It's not ideal. Use the hardware, Luke.

The IP camera is a computer made just for this. It has access to the image before any compression, and so it is in the best position to detect (whatever it is you want it to detect -- line crossing and field intrusion being two that H* cams do very well with very few false positives and no false negatives). Why do this after the image is compressed, at the PC, and then using software running on the CPU?

On the PC side: the CPU won't care whether the image is SD, HD, or full HD; the GPU is doing the work. You could say that, in this case, the bitrate does matter; all the CPU needs to do is move the bytes off the network stack and into the GPU. It doesn't matter what the image H x W is. Well, not if you use the GPU. If you have software that uses the CPU, then you are exactly right (the pixel count is what matters most by far, not the bitrate).

Anyway, you will never have enough CPU down the road, when 4K cams become mainstream (watch for when 4K displays become mainstream), if you use software for event processing and/or rendering. The camera should do the event processing, not the PC's CPU, and the GPU should do the decoding/rendering, not the CPU doing the decode to system memory and then copying to the GPU (incredibly slow). CPUs, in case you haven't been watching the past few years, are not getting faster. Microsoft's way of "it'll be fast enough on NEXT year's CPU" doesn't fly anymore.
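To put a rough number on the decode-to-system-memory-then-copy concern, here is a minimal sketch. It assumes 4:2:0 chroma subsampling (about 1.5 bytes per pixel of decoded frame); the frame rates are illustrative.

# Rough sketch of the raw (decoded) frame bandwidth a CPU has to shuffle
# if it decodes in software and then copies frames to the GPU for display.
# Assumes 4:2:0 chroma subsampling, i.e. ~1.5 bytes per pixel per frame.

BYTES_PER_PIXEL = 1.5

def raw_mb_per_sec(width: int, height: int, fps: int) -> float:
    """Decoded-frame bandwidth the CPU must move, in MB/s."""
    return width * height * BYTES_PER_PIXEL * fps / 1e6

for label, (w, h, fps) in {
    "1080p @ 30 fps": (1920, 1080, 30),
    "4K    @ 30 fps": (3840, 2160, 30),
}.items():
    print(f"{label}: ~{raw_mb_per_sec(w, h, fps):.0f} MB/s of raw frames per camera")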

BTW, most mods would have taken the ruler out by now. Congrats.
 

wseaton

n3wb
Joined
Jun 20, 2015
Messages
23
Reaction score
0
The IP camera is a computer made just for this.
Sure... if you like to spend a significant portion of your day hunting down firmware updates from countless Asian vendors. Some of us also have a mix of cameras, and all those different brands of cameras have different capabilities, processors, software, and tech support. For that reason I prefer to have my cameras as 'stupid as possible' and do most of the processing and auditing from a single point I have absolute control over, vs. having every single camera be its own island and require its own support headaches. There's also a big difference between <$200 consumer IP cams and >$1,000 commercial ones in this respect.

I do agree that recompressing H.264 streams can be a bit illogical, but it depends on the final workflow.

H.265 is extremely encoding-intensive, but less so on the decoding side. I would guess H.265 would be really good for Ultra HD security video and would allow much smaller storage and bandwidth requirements, but the encoding overhead would be insane.

CPUs, in case you haven't been watching the past few years, are not getting faster. Microsoft's way of "it'll be fast enough on NEXT year's CPU" doesn't fly anymore.
Per-core CPU performance has been leveling off for years, mostly because there isn't the software demand there once was to push CPU processing levels higher. Desktop sales are also pretty stagnant, and from Vista onward Microsoft OSes have been pretty much the same in terms of hardware requirements. Basically, if you have a machine that runs Win 7 really well, it will also run Win 10 really well. Not so in the past, when new operating system releases were painful on existing hardware.
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,682
Reaction score
14,043
Location
USA
Per-core CPU performance has been leveling off for years, mostly because there isn't the software demand there once was to push CPU processing levels higher. Desktop sales are also pretty stagnant, and from Vista onward Microsoft OSes have been pretty much the same in terms of hardware requirements. Basically, if you have a machine that runs Win 7 really well, it will also run Win 10 really well. Not so in the past, when new operating system releases were painful on existing hardware.
I think a pretty big factor in this was AMD losing their ability to compete at the high end. Intel hasn't had a serious competitor for high end CPUs in many years now, and the result is that most users would not feel the speed difference between a 2nd generation i7 purchased in January 2011 and a 6th generation i7 purchased in September 2015. Compare this to any other computer market: phones, tablets, laptops, NUCs, wristwatches (lol), even solid state drives and graphics cards have improved tremendously in the same years. But not midrange to high end desktop CPUs.
 

MartyO

Banned
Joined
Jun 4, 2015
Messages
589
Reaction score
20
As you write, that is how that app behaves. It's not ideal. Use the hardware, Luke.

The IP camera is a computer made just for this. It has access to the image before any compression, and so it is in the best position to detect (whatever it is you want it to detect -- line crossing and field intrusion being two that H* cams do very well with very few false positives and no false negatives). Why do this after the image is compressed, at the PC, and then using software running on the CPU?

On the PC side: the CPU won't care whether the image is SD, HD, or full HD; the GPU is doing the work. You could say that, in this case, the bitrate does matter; all the CPU needs to do is move the bytes off the network stack and into the GPU. It doesn't matter what the image H x W is. Well, not if you use the GPU. If you have software that uses the CPU, then you are exactly right (the pixel count is what matters most by far, not the bitrate).

Anyway, you will never have enough CPU down the road, when 4K cams become mainstream (watch for when 4K displays become mainstream), if you use software for event processing and/or rendering. The camera should do the event processing, not the PC's CPU, and the GPU should do the decoding/rendering, not the CPU doing the decode to system memory and then copying to the GPU (incredibly slow). CPUs, in case you haven't been watching the past few years, are not getting faster. Microsoft's way of "it'll be fast enough on NEXT year's CPU" doesn't fly anymore.

BTW, most mods would have taken the ruler out by now. Congrats.
Thanks for the detailed info. My 9-camera system is very stable with BI, and all my cameras' firmware is rock solid. So adding a few more cameras won't require me to change; I guess I can grow with BI.
 

jbeletti

n3wb
Joined
May 30, 2015
Messages
8
Reaction score
2
I took advice from feedback on this thread and subscribed to the Dell Outlet Twitter feed. After weeks of waiting, I finally saw that some OptiPlex 9020s were being offered with a coupon. I ended up snagging a 9020 as follows:
- Mini Tower
- Core i7-4790
- 16GB DDR3 @ 1600 MHz
- 500 GB HDD @ 7200 RPM (will use for OS and BI)
- Win 8.1 Pro
- $573 w/ free shipping (not including sales tax)

I'll be adding a 4TB WD Purple HDD in the second HDD bay for BI video storage.
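For a rough idea of retention on that 4TB Purple, here is a minimal sketch. The 12 Mbit/s aggregate bitrate is only an example, borrowed from a figure mentioned earlier in the thread, not a measurement from this particular build.

# Rough retention sketch for a dedicated surveillance drive. The 12 Mbit/s
# aggregate bitrate is an assumed example; substitute your own cameras'
# combined bitrate.

DRIVE_TB = 4.0            # WD Purple capacity, decimal terabytes
AGGREGATE_MBPS = 12.0     # assumed combined bitrate of all cameras

drive_bits = DRIVE_TB * 1e12 * 8
seconds = drive_bits / (AGGREGATE_MBPS * 1e6)
print(f"~{seconds / 86400:.0f} days of continuous recording "
      f"on a {DRIVE_TB:.0f} TB drive at {AGGREGATE_MBPS:.0f} Mbit/s")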

Thanks for the feedback in this thread - much appreciated.

Jim
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,682
Reaction score
14,043
Location
USA
Unless they included a spare hard drive caddy, you may have to buy that separately. Just so you know. It is the little plastic piece that helps mount the drive to the case. I'm not sure you can properly mount a hard drive without that piece in these cases.
 