I don't set the bit rate of the camera; I set the fps, compression strength, and pixel count. Based on those, the bit rate is determined.
> I'm surprised at your response.

Not sure why you are surprised. Your camera is low-end and probably uses MPEG compression with a fixed ratio; any decent camera uses H.264 and allows the user to set the bitrate. You seem to be under the impression that since you can run 8 VGA cameras on your Blue Iris system, you can run 8 3MP cameras with very little effect on the processor. This is simply wrong. I urge you to see for yourself before providing this misinformation; it does no one any good to put out inaccurate or misleading information.
> I don't set the bit rate of the camera; I set the fps, compression strength, and pixel count. Based on those, the bit rate is determined.
> I'm surprised at your response.

OK, once again: Blue Iris CPU consumption is NOT dependent on the amount of data sent by the camera. This is where you are mistaken. The biggest factor is the RESOLUTION of the camera, NOT the bitrate. This is what you are missing, completely. Are you going to admit you're wrong?
> I don't set the bit rate of the camera; I set the fps, compression strength, and pixel count. Based on those, the bit rate is determined.
> I'm surprised at your response.
> Are you going to admit you're wrong?
> When I drop the camera's fps, pixels, and compression, the bitrate drops. When I increase fps, compression, and pixels, the bit rate increases.

This is because you are using a cheap MPEG camera that has a set compression ratio; that is not my problem. Please take a look at ANY decent camera with H.264 compression: the user can adjust the camera's bitrate. Again, it is evident that you only have experience with one camera. What I said is completely accurate. Take any Hikvision or Dahua, for example; the user can set the bitrate independent of the fps and resolution. Calling me a loser will not change that fact; you simply have a basic misunderstanding of how IP cameras and their compression work.
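The two sides are actually describing two different encoder modes. As a back-of-the-envelope sketch (the bits-per-pixel numbers below are illustrative assumptions, not specs for any real camera), a fixed-compression camera produces a bitrate driven by resolution, fps, and compression strength, while an H.264 camera in CBR mode holds whatever cap the user sets:

```python
# Hedged sketch of the two bitrate models being argued about.
# The bits-per-pixel constants are made-up illustrative values.

def vbr_bitrate_mbps(width, height, fps, bits_per_pixel):
    """Fixed-compression-ratio model: bitrate follows resolution,
    fps, and compression strength directly."""
    return width * height * fps * bits_per_pixel / 1e6

def cbr_bitrate_mbps(cap_mbps):
    """H.264 CBR model: the user-set cap wins, independent of fps
    and resolution (image quality adjusts instead)."""
    return cap_mbps

# Dropping pixels/fps/compression drops the bitrate, as one poster reports:
high = vbr_bitrate_mbps(2048, 1536, 15, 0.10)   # 3MP @ 15 fps  -> ~4.7 Mbit/s
low  = vbr_bitrate_mbps(640, 480, 10, 0.05)     # VGA @ 10 fps  -> ~0.15 Mbit/s
assert low < high

# While a CBR camera simply holds the configured cap:
assert cbr_bitrate_mbps(4.0) == 4.0
```

Both observations can be true at once: one poster's camera only exposes the first model, while Hikvision/Dahua-class H.264 cameras also expose the second.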
- - - Updated - - -
Admit you're wrong, or you're a loser.
- - - Updated - - -
My posts were for reference.
> If the BI server goes in a data center or closet, then I would not get a NUC, but under a TV or in a study, it's great if it can do the job. Presently my measly little NUC i5 4250U is handling 9 cameras at 5% CPU. Total direct-to-disk video coming into the NUC's LAN port is 12 Mbit/s, with up to 2 Mbit/s per camera.

The i5 4250U has a SYSmark of 3470, which is right up there in used AMD territory and pretty much garbage in my book. I'd rather buy a cheap laptop and put it on my shelf, because the laptop is certainly more flexible and likely to be faster. Again, I've had to support hundreds of these micro-form-factor PCs over the years and pretty much hated them all, because they don't last. By the time you stuff RAM and storage into them, it's still more expensive than the i7 2600 I bought nearly three years ago, and it has half the speed of my 2600, so I don't get what the obsession over these things is. i3 speed for i7 desktop prices. Sorry... I guess I like to get my money's worth. Also, I'm better off virtualizing BI on my current desktop with a functioning UPS than populating my entertainment center with a PC-wannabe gadget that's going to take a dive when the third or fourth thunderstorm rolls through.
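For what it's worth, the 12 Mbit/s figure quoted above is easy to sanity-check. A quick sketch using only the numbers from that post (no assumptions beyond them):

```python
# Sanity-check of the quoted NUC load: 9 cameras, 12 Mbit/s aggregate,
# up to 2 Mbit/s per camera, recording direct-to-disk.

total_mbits_per_s = 12                 # aggregate stream into the LAN port
cameras = 9

avg_per_cam = total_mbits_per_s / cameras          # ~1.33 Mbit/s average
disk_write_MBps = total_mbits_per_s / 8            # megabits -> megabytes
daily_storage_GB = disk_write_MBps * 86400 / 1024  # seconds/day, MB -> GB

print(f"avg per camera: {avg_per_cam:.2f} Mbit/s")
print(f"disk write:     {disk_write_MBps:.1f} MB/s")
print(f"per day:        {daily_storage_GB:.0f} GB")
```

So the recording side is only about 1.5 MB/s of sequential writes (roughly 127 GB/day): trivial for any disk, which is consistent with direct-to-disk recording barely touching the CPU.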
> CPU consumption is NOT dependent on the amount of data sent by the camera. ... The biggest factor is the RESOLUTION of the camera, NOT the bitrate ...

As you write, this affects that app. It's not ideal. Use the hardware, Luke.
> The IP camera is a computer made just for this.

Sure... if you like to spend a significant portion of your day hunting down firmware updates from countless Asian vendors. Some of us also have a mix of cameras, and all those different brands of cameras have different capabilities, processors, software, and tech support. For that reason I prefer to have my cameras as 'stupid as possible' and do most of the processing and auditing from a single point I have absolute control over, versus having every single camera be its own island and require its own support headaches. There's also a big difference between <$200 consumer IP cams and >$1,000 commercial ones in this respect.
> CPUs, in case you haven't been watching the past few years, are not getting faster. Microsoft's way of "it'll be fast enough on NEXT year's CPU" doesn't fly anymore.

Per-core CPU efficiency has been leveling off for years, mostly because there isn't the software demand there used to be to push CPU processing levels higher. Desktop sales are also pretty stagnant, and from Vista onwards Microsoft's OSes have been pretty much the same in terms of hardware requirements. Basically, if you have a machine that runs Win 7 really well, it will also run Win 10 really well. Not so in the past, when new operating system releases were painful on existing hardware.
> BTW, most mods would have taken the ruler out by now. Congrats.

Thanks! I take pride in that... It's all about discussion and sharing ideas and points of view...
> Thanks! I take pride in that... It's all about discussion and sharing ideas and points of view...

I applaud your patience. Some people just don't get it.
> Per-core CPU efficiency has been leveling off for years, mostly because there isn't the software demand there used to be to push CPU processing levels higher. Desktop sales are also pretty stagnant, and from Vista onwards Microsoft's OSes have been pretty much the same in terms of hardware requirements. Basically, if you have a machine that runs Win 7 really well, it will also run Win 10 really well. Not so in the past, when new operating system releases were painful on existing hardware.

I think a pretty big factor in this was AMD losing their ability to compete at the high end. Intel hasn't had a serious competitor for high-end CPUs in many years now, and the result is that most users would not feel the speed difference between a 2nd-generation i7 purchased in January 2011 and a 6th-generation i7 purchased in September 2015. Compare this to any other computer market: phones, tablets, laptops, NUCs, wristwatches (lol), even solid state drives and graphics cards have improved tremendously in the same years. But not midrange-to-high-end desktop CPUs.
> As you write, this affects that app. It's not ideal. Use the hardware, Luke.

Thanks for the detailed info. My 9-camera system is very stable with BI, and all my cameras' firmware is rock solid. Adding a few more cameras won't require me to change anything, so I guess I can grow with BI.
The IP camera is a computer made just for this. It has access to the image before any compression, and so it is in the best position to detect (whatever it is you want it to detect -- line crossing and field intrusion being two that H* cams do very well with very few false positives and no false negatives). Why do this after the image is compressed, at the PC, and then using software running on the CPU?
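To make the cost concrete, here is a minimal sketch of the kind of per-pixel work the PC must do after decode if it handles detection itself: a naive frame-differencing motion detector (NumPy stands in for a real pipeline; the thresholds are arbitrary assumptions, not anyone's actual algorithm):

```python
# Minimal frame-differencing motion detector. Cost is O(width * height)
# per frame: the CPU pays for resolution, whereas an on-camera detector
# sees the raw sensor image for free, before compression.
import numpy as np

def motion_detected(prev, curr, pixel_thresh=25, area_frac=0.01):
    """Flag motion when more than 1% of pixels changed by more than
    pixel_thresh gray levels (both thresholds are arbitrary)."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    changed = np.count_nonzero(diff > pixel_thresh)
    return changed > area_frac * curr.size

rng = np.random.default_rng(0)
frame = rng.integers(0, 255, size=(1536, 2048), dtype=np.uint8)  # 3MP grayscale
still = frame.copy()
moved = frame.copy()
moved[:300, :300] = 255          # simulate an object entering one corner

assert not motion_detected(frame, still)
assert motion_detected(frame, moved)
```

Every frame means touching all ~3.1 million pixels here, which is exactly the work the post argues belongs on the camera's dedicated hardware.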
To the PC side: the CPU won't care whether the image is SD, HD, or full HD, because the GPU is doing the work. You could say that, in this case, it is the bitrate that matters, since all the CPU needs to do is move the bytes off the network stack and into the GPU; the image's height and width don't matter. Well, not if you use the GPU. If you have software that decodes on the CPU, then you are exactly right: the pixel count is what matters most by far, not the bitrate.
Anyway, you will never have enough CPU down the road, when 4K cams become mainstream (watch for when 4K displays become mainstream), if you use software for event processing and/or rendering. The camera should do the event processing, not the PC's CPU, and the GPU should do the decoding/rendering, rather than the CPU decoding to system memory and then copying to the GPU (incredibly slow). CPUs, in case you haven't been watching the past few years, are not getting faster. Microsoft's way of "it'll be fast enough on NEXT year's CPU" doesn't fly anymore.
BTW, most mods would have taken the ruler out by now. Congrats.