Hardware Req for 16-20 3MP Cams

nbstl68

Getting comfortable
Joined
Dec 15, 2015
Messages
1,399
Reaction score
322
How are you getting your 6700K up to 4.8GHz?
That is a bit beyond Intel's specs for it.
Running at absolute max, what kind of fans (and noise) would be required to cool it, and how could that be stable long term?

Sent from my SM-P900 using Tapatalk
 

Masejoer

Getting the hang of it
Joined
Mar 28, 2016
Messages
148
Reaction score
26
How are you getting your 6700K up to 4.8GHz?
That is a bit beyond Intel's specs for it.
Running at absolute max, what kind of fans (and noise) would be required to cool it, and how could that be stable long term?

Sent from my SM-P900 using Tapatalk
The CPU runs 4.8GHz at 1.37V; 4.9GHz requires almost 1.45V. Temperatures are in the upper 70s under load with Prime95 and IntelBurnTest, and the lower 70s during real workloads that max out all 8 threads. Skylake runs much cooler than Haswell and Ivy Bridge did. I'm just using my old Thermalright Ultra Extreme - a standard 120mm tower cooler from close to a decade ago. Extra-large coolers don't make much difference nowadays - the bottleneck is getting the heat from the core to the heat spreader, and then to the heatsink. It should be stable forever (10+ years most likely). In most cases, silicon doesn't just degrade. Even the fastest Pentium 4 will still run fine today, and those are over 10 years old. The P4 was where they first ran into the limitations of silicon.
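To put rough numbers on why that last 100MHz costs so much heat: dynamic CPU power scales roughly with V²·f (the standard CMOS approximation), so a small voltage bump has an outsized thermal cost. A minimal sketch using the figures above:

# Dynamic CPU power scales roughly as P ~ C * V^2 * f (standard CMOS
# approximation); the capacitance C cancels when comparing two settings.

def relative_power(v1, f1, v2, f2):
    """Power of setting 2 relative to setting 1."""
    return (v2 ** 2 * f2) / (v1 ** 2 * f1)

# Figures quoted above: 4.8 GHz @ 1.37 V vs. 4.9 GHz @ ~1.45 V
ratio = relative_power(1.37, 4.8, 1.45, 4.9)
print(f"~{100 * (ratio - 1):.0f}% more power for "
      f"{100 * (4.9 / 4.8 - 1):.1f}% more frequency")
# -> ~14% more power for 2.1% more frequency

That disproportion is why the last 100MHz demands so much more cooling headroom than the frequency gain is worth.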

My prior 6600K did 4.8GHz fine also.

On that note, I've seen people remove the IHS, replace Intel's thermal paste under it, and reinstall it - still something like a 10-15°C drop under load while keeping the heat spreader on top of the core.
 

nbstl68

Getting comfortable
Joined
Dec 15, 2015
Messages
1,399
Reaction score
322
What about water-cooled options? Do those really work better in their current iterations, or are they just the next gimmick?

Sent from my SM-P900 using Tapatalk
 

Masejoer

Getting the hang of it
Joined
Mar 28, 2016
Messages
148
Reaction score
26
What about water-cooled options? Do those really work better in their current iterations, or are they just the next gimmick?
Don't get me started!

Watercooling hasn't really been as effective since the days when heatsinks were smaller and watercooling didn't come in kits (people used Eheim pumps and 1/2" tubing instead). The biggest benefit today is being able to take weight off the CPU or GPU (say you were to remove the heat spreader - the core might get crushed under a traditional heatsink) and mount the radiator to the case, rather than having a large thermal radiator attached directly to the CPU. Personally, I don't see much point in watercooling today - we have large heatsinks that hold 120mm or 140mm fans. If someone is going completely fanless/passive, liquid cooling with a larger external radiator still has its place.

I had a watercooling loop back around 2000. It made a lot of sense back then since heatsinks were terrible, especially with the large 180x400mm heatercore I used for thermal dissipation. Today, I see no reason for it. Even with overclocking, what is the point of pushing voltage much higher for an extra 2-5% CPU frequency? 4.8GHz to 5GHz is about 4% faster. Even 4.8GHz is less than 10% faster than 4.4GHz. It isn't like back in the day when 50% overclocking gains were common.
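Those percentages are easy to verify; a quick sketch of the arithmetic:

# Quick check of the overclock percentages quoted above.
for base, oc in [(4.8, 5.0), (4.4, 4.8)]:
    print(f"{base} GHz -> {oc} GHz: +{100 * (oc / base - 1):.1f}%")
# 4.8 GHz -> 5.0 GHz: +4.2%
# 4.4 GHz -> 4.8 GHz: +9.1%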

IMO, liquid cooling kits have always been gimmicky. If it's worth doing, it's worth doing right with large radiator(s), large-ID tubing, high-flow pump(s), and an efficient waterblock :eek:
 

nbstl68

Getting comfortable
Joined
Dec 15, 2015
Messages
1,399
Reaction score
322
I'd recommend, for the CPU, i7-4790 or i7-6700 (or their respective overclockable versions). 8-16 GB RAM. Windows 8 or 10. No need for a dedicated graphics card.

Just noticed your note about NOT needing ANY graphics card, and I guess I don't really understand...

Why no need for a dedicated graphics card at all? I'd think BI would use the onboard acceleration for processing/recording to disk, but not for screen display without a lot more overhead.

I would typically be viewing one or more cameras on my computer's monitor on a regular basis, live or from recordings, to see what went on during the day - maybe in a multi-cam layout on screen.

I thought I read elsewhere that this is where your computer/BI would need a graphics card: to display all those images, especially when viewing them at high resolution on-screen, offloading video playback from the CPU to the graphics card so the CPU doesn't max out while viewing.

...

In my planned setup, I'll probably have a BI computer running in the basement and rarely go to it to view anything directly. Instead I'd remote into it (something like TeamViewer from my Mac upstairs, the web interface, or the BI Android app on a set-top box connected to my 50"). Would that not also still spike the CPU, or does remote viewing somehow not make use of a graphics card?
 

PSPCommOp

Getting the hang of it
Joined
Jun 17, 2016
Messages
694
Reaction score
92
Location
Northeastern PA
Just noticed your note about NOT needing ANY graphics card, and I guess I don't really understand...

Why no need for a dedicated graphics card at all? I'd think BI would use the onboard acceleration for processing/recording to disk, but not for screen display without a lot more overhead.
From what I've read, BI can't utilize discrete graphics card hardware (yet). The integrated graphics is more than enough. If you do have a discrete card installed, most people pull it out and use the integrated graphics to save energy.
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,902
Reaction score
21,274
From what I've read, BI can't utilize discrete graphics card hardware (yet). The integrated graphics is more than enough. If you do have a discrete card installed, most people pull it out and use the integrated graphics to save energy.
Using Intel HD graphics not only saves power, but Blue Iris can also utilize the Quick Sync function for hardware acceleration.
 

nbstl68

Getting comfortable
Joined
Dec 15, 2015
Messages
1,399
Reaction score
322
Using Intel HD graphics not only saves power, but Blue Iris can also utilize the Quick Sync function for hardware acceleration.
Are Intel HD graphics and Quick Sync something you set or "turn on," or are they automatically utilized by BI?
If my system has a separate video card, will it try to use that anyway?
 

Masejoer

Getting the hang of it
Joined
Mar 28, 2016
Messages
148
Reaction score
26
Are Intel HD graphics and Quick Sync something you set or "turn on," or are they automatically utilized by BI?
If my system has a separate video card, will it try to use that anyway?
You set a setting in BI. I don't believe the hardware acceleration works if you have any discrete graphics card installed - you must be using integrated graphics only. That's not a limitation of the hardware - in many cases with other apps (video transcoding), you can use Quick Sync from the IGP while having an add-in board installed.

Your CPU usage should be 60-80% lower using only integrated graphics with hardware acceleration enabled.
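If you want to confirm Quick Sync decoding works on your machine before relying on it in Blue Iris, one way is to run a clip through ffmpeg's QSV decoder and watch CPU usage. This is only a sketch: it assumes an ffmpeg build compiled with QSV support on your PATH, and "sample.mp4" is a placeholder for any H.264 clip from one of your cameras.

import subprocess

# Hypothetical sanity check: decode a clip with Intel Quick Sync (QSV)
# outside of Blue Iris, discarding the output (-f null).
result = subprocess.run(
    ["ffmpeg", "-hwaccel", "qsv", "-c:v", "h264_qsv",
     "-i", "sample.mp4", "-f", "null", "-"],
    capture_output=True, text=True)

# Exit code 0 means the IGP decoded the stream in hardware; watch CPU
# usage while it runs to compare against a plain software decode.
print("QSV decode OK" if result.returncode == 0 else result.stderr[-500:])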
 

Q™

IPCT Contributor
Joined
Feb 16, 2015
Messages
4,990
Reaction score
3,990
Location
Megatroplis, USA
Are Intel HD graphics and Quick Sync something you set or "turn on," or are they automatically utilized by BI?
If my system has a separate video card, will it try to use that anyway?
There is a switch in Blue Iris to turn on Intel HD Graphics acceleration on a camera-by-camera basis at...

Camera > Properties > Video > Hardware Decoding

Or on a system wide basis at...

Blue Iris Properties > Cameras > Intel HD Hardware Acceleration (Restart)
 