I don't set the bit rate on the camera; I set the fps, the compression strength and the pixel count, and the bit rate is determined from those. I'm surprised at your response.

Not sure why you are surprised. Your camera is low end and probably uses MPEG compression with a fixed ratio; any decent camera uses H.264 and lets the user set the bitrate. You seem to be under the impression that since you can run 8 VGA cameras on your Blue Iris system, you can run 8 3MP cameras with very little effect on the processor. This is simply wrong. I urge you to see for yourself before providing this misinformation; it does no one any good to put out inaccurate or misleading information.
Ok, once again: Blue Iris CPU consumption is NOT dependent on the amount of data sent by the camera. This is where you are mistaken. The biggest factor is the RESOLUTION of the camera, NOT the bitrate. This is what you are missing, completely. Are you going to admit you're wrong?
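To put rough numbers on the resolution point (just a back-of-the-envelope sketch in Python, not Blue Iris internals): the decoder has to produce every pixel of every frame, so the work grows with resolution times fps no matter how small the compressed stream is. The resolutions and frame rate below are only examples.

[CODE]
# Rough back-of-the-envelope: decode work scales with pixels * fps,
# not with the compressed bitrate. Numbers are illustrative only.

def pixels_per_second(width, height, fps):
    """Raw pixel throughput the decoder has to produce."""
    return width * height * fps

vga = pixels_per_second(640, 480, 15)          # one VGA camera at 15 fps
three_mp = pixels_per_second(2048, 1536, 15)   # one 3 MP camera at 15 fps

print(f"VGA @ 15 fps : {vga / 1e6:6.1f} Mpixels/s")
print(f"3MP @ 15 fps : {three_mp / 1e6:6.1f} Mpixels/s")
print(f"ratio        : {three_mp / vga:.1f}x more decode work per camera")
print(f"8 x 3MP      : {8 * three_mp / 1e6:6.1f} Mpixels/s to decode")
[/CODE]

A 3MP stream is roughly ten times the decode work of a VGA stream at the same frame rate, which is why 8 VGA cameras and 8 3MP cameras are very different loads on the same box.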
When I drop the camera's fps, pixels and compression, the bitrate drops; when I increase the fps, compression and pixels, the bitrate increases.

This is because you are using a cheap MPEG camera that has a set compression ratio; that is not my problem. Please take a look at ANY decent camera with H.264 compression: the user can adjust the camera's bitrate. Again, it is evident that you only have experience with one camera. What I said is completely accurate. Take any Hikvision or Dahua, for example: the user can set the bitrate independent of the fps and resolution (see the rough sketch a few lines below). Calling me a loser will not change that fact; you simply have a basic misunderstanding of how IP cameras and their compression work.
- - - Updated - - -
Admit you're wrong or you're a loser.
- - - Updated - - -
My posts were for reference.
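For anyone following the argument, here is the difference between the two bitrate models in rough numbers (a sketch only; the compression ratio and bits-per-pixel figures are made-up illustrative values, not any particular camera's firmware). With a fixed compression ratio, the bitrate falls straight out of resolution times fps, so dropping fps or pixels drops the bitrate. With H.264 in CBR mode you set the target bitrate directly and the encoder varies quality to hit it.

[CODE]
# Sketch of the two bitrate models being argued about (illustrative numbers,
# not any specific camera's firmware).

def fixed_ratio_bitrate(width, height, fps, compression_ratio=20, bits_per_pixel=24):
    """Fixed compression ratio: bitrate follows resolution * fps directly."""
    raw_bps = width * height * bits_per_pixel * fps
    return raw_bps / compression_ratio

def h264_cbr_bitrate(target_bps):
    """CBR H.264: the user sets the target; the encoder varies quality to hit it."""
    return target_bps

# Dropping fps/pixels on the fixed-ratio camera drops the bitrate...
print(fixed_ratio_bitrate(1280, 720, 15) / 1e6, "Mbit/s")   # ~16.6
print(fixed_ratio_bitrate(640, 480, 5) / 1e6, "Mbit/s")     # ~1.8
# ...while the CBR camera stays wherever you set it, at any resolution/fps.
print(h264_cbr_bitrate(4_000_000) / 1e6, "Mbit/s")          # 4.0
[/CODE]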
If the BI server goes in a data center or closet, then I would not get a NUC, but under a TV or in a study it's great if it can do the job. Presently my measly little NUC i5-4250U is handling 9 cameras at 5% CPU; total direct-to-disk video coming into the NUC's LAN port is 12 Mbit/s, with up to 2 Mbit/s per camera.
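For reference, 12 Mbit/s aggregate works out to roughly the following for continuous direct-to-disk recording (simple arithmetic, container overhead ignored):

[CODE]
# Storage arithmetic for the 12 Mbit/s aggregate mentioned above
# (continuous recording assumed; overhead ignored).

aggregate_mbit_s = 12
bytes_per_day = aggregate_mbit_s * 1e6 / 8 * 86_400   # bits -> bytes -> per day
gb_per_day = bytes_per_day / 1e9

print(f"{gb_per_day:.0f} GB/day")                      # ~130 GB/day
print(f"{1000 / gb_per_day:.1f} days per 1 TB drive")  # ~7.7 days
[/CODE]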
Per-core CPU performance has been leveling off for years, mostly because there hasn't been the software demand to push CPU processing higher. Desktop sales are also pretty stagnant, and from Vista onwards Microsoft's operating systems have stayed pretty much the same in terms of hardware requirements. Basically, if you have a machine that runs Windows 7 really well, it will also run Windows 10 really well. Not so in the past, when new operating system releases were painful on existing hardware.
fenderman said: "CPU consumption is NOT dependent on the amount of data sent by the camera. ... The biggest factor is the RESOLUTION of the camera, NOT the bitrate ..."

As you write, this affects that app. It's not ideal. Use the hardware, Luke.
The IP camera is a computer made just for this. It has access to the image before any compression, and so it is in the best position to detect (whatever it is you want it to detect -- line crossing and field intrusion being two that H* cams do very well with very few false positives and no false negatives). Why do this after the image is compressed, at the PC, and then using software running on the CPU?
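As a sketch of what the listening side can look like when the camera does the detection, assuming the camera pushes its events over a simple HTTP notification stream; the URL, credentials and payload below are placeholders, not any real camera's API:

[CODE]
# Hypothetical example: let the camera do the detection and just listen for
# its event notifications, instead of decoding video on the CPU to detect motion.
# The URL, credentials and payload format are placeholders, not a real camera's API.
import requests

CAMERA_EVENT_URL = "http://192.168.1.64/events/stream"  # placeholder endpoint

with requests.get(CAMERA_EVENT_URL, auth=("admin", "password"),
                  stream=True, timeout=30) as resp:
    for line in resp.iter_lines():
        if line and b"lineCrossing" in line:   # placeholder event type
            print("camera reported a line-crossing event:", line[:80])
            # trigger recording / alerting here; no frame decoding was needed
[/CODE]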
On the PC side: the CPU won't care whether the image is SD, HD, or full HD if the GPU is doing the work. You could say that, in this case, what matters is the bitrate; all the CPU needs to do is move the bytes off the network stack and into the GPU. It doesn't matter what the image's H x W is. Well, not if you use the GPU. If you have software that decodes on the CPU, then you are exactly right: the pixel count is what matters most by far, not the bitrate.
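As an illustration (a sketch, not how Blue Iris itself works): this is how you would hand the H.264 decode to the GPU with ffmpeg from a script. The RTSP URL is a placeholder, and the right -hwaccel value depends on your hardware (cuda, qsv, dxva2, videotoolbox, ...).

[CODE]
# Sketch: hand the H.264 decode to the GPU via ffmpeg's -hwaccel option and
# leave the CPU to shuffle bytes. The RTSP URL is a placeholder.
import subprocess

RTSP_URL = "rtsp://192.168.1.64:554/stream1"  # placeholder camera stream

subprocess.run([
    "ffmpeg",
    "-rtsp_transport", "tcp",   # more reliable than UDP for most cameras
    "-hwaccel", "cuda",         # or qsv / dxva2 / videotoolbox, per your GPU
    "-i", RTSP_URL,
    "-f", "null", "-",          # decode only; discard output, just to watch the load
], check=True)
[/CODE]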
Anyway, you will never have enough CPU down the road, when 4K cams become mainstream (watch for that when 4K displays become mainstream), if you use software for event processing and/or rendering. The camera should do the event processing, not the PC's CPU, and the GPU should do the decoding/rendering, rather than the CPU decoding to system memory and then copying to the GPU (incredibly slow). CPUs, in case you haven't been watching the past few years, are not getting faster. Microsoft's approach of "it'll be fast enough on NEXT year's CPU" doesn't fly anymore.
BTW, most mods would have taken the ruler out by now. Congrats.

Thanks! I take pride in that. It's all about discussion and sharing ideas and points of view.