Blue Iris PC and video cards

easttnemc

n3wb
Joined
Jan 28, 2021
Messages
9
Reaction score
8
Location
Athens TN
I have spent a lot of time reading through the WIKI and all the hardware threads and need some help understanding something.
I am confused (and maybe OVERTHINKING IT) about graphics cards and Intel Quick Sync.

I understand that most (if not all) Intel CPUs have the Quick Sync GPU built into the processor, and I have seen some of the advantages with BI and how BI uses it.

How does the Quick Sync GPU relate to any graphics card installed in a PC? Does Quick Sync still work if I have an NVIDIA or AMD graphics card?

I can probably get myself a tower PC with an i9-10900 processor and multiple HDD bays, but they all have either NVIDIA or AMD graphics cards in them.

I realize that with the use of substreams an i9-based system may be overkill, but (cue the Tim-the-Tool-Man grunts) I can get one for a reasonable price. :)

Thanks
 

SouthernYankee

IPCT Contributor
Joined
Feb 15, 2018
Messages
5,170
Reaction score
5,320
Location
Houston Tx
I run an old Intel CPU, an i7-4790. I also have an NVIDIA GeForce GT 710 graphics card. I use the graphics card for my display and the Intel GPU (Quick Sync) to process my H.264 video for BI. The motherboard did not provide an HDMI connector for the Intel graphics, and I needed HDMI for my monitor, so I used a cheap GT 710. Works with no problem, just uses more electricity.

If the Intel motherboard provides an HDMI output, I recommend using an HDMI dummy plug.

[Attached screenshots: GPU1.jpg, GPU2.jpg, GPU3.jpg, GPU4.jpg]
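To double-check that Quick Sync is really doing the decoding while a discrete card drives the display, a minimal sketch along these lines can help. It assumes an ffmpeg build with Quick Sync (QSV) support is on the PATH; the clip name sample.mp4 is hypothetical, and the input needs to be H.264 for the h264_qsv decoder.

```python
# Rough Quick Sync sanity check: decode the same clip once in software
# and once with the Intel QSV decoder, then compare wall-clock times.
# Assumes ffmpeg (built with QSV support) is on the PATH and that
# "sample.mp4" (hypothetical) contains H.264 video.
import subprocess
import time

SAMPLE = "sample.mp4"  # hypothetical test clip

def time_decode(extra_args, label):
    """Decode SAMPLE to the null muxer and report elapsed seconds."""
    cmd = ["ffmpeg", "-v", "error", *extra_args, "-i", SAMPLE, "-f", "null", "-"]
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    print(f"{label}: {time.perf_counter() - start:.2f}s")

time_decode([], "CPU (software) decode")
time_decode(["-hwaccel", "qsv", "-c:v", "h264_qsv"], "Intel Quick Sync decode")
```

If the second run completes (and Task Manager shows the Intel GPU's decode engine busy), Quick Sync is working even with the NVIDIA card installed.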
 

gurpal2000

n3wb
Joined
Dec 9, 2021
Messages
9
Reaction score
2
Location
UK
It's interesting. I have a similar use case. Onboard is Intel, and I have an additional NVIDIA card. The server is headless, but I have stuck a dummy dongle in one of the onboard HDMI ports only. I only intend to access the server via Remote Desktop when I need to. What settings should I use? Also, does the "global" decoder setting become the default for each of the camera screens? Why have you effectively specified it twice?

I also write directly to disk, so as far as I know there is no re-encoding. Is there any value in me actually having the extra NVIDIA card in the PC case?

thanks
 

wittaj

IPCT Contributor
Joined
Apr 28, 2019
Messages
25,210
Reaction score
49,112
Location
USA
Personally, I haven't seen a big CPU improvement with hardware acceleration (HA) since the substream feature came out. Before substreams, hardware acceleration was absolutely a must, but now it equates to a few percent of CPU. If you have followed the optimizations in the wiki, you can get by without hardware acceleration now.

Plus, keep in mind that recent BI updates have caused issues for some people using hardware acceleration (it may actually be a Windows update not playing nice with BI), and turning off HA fixes it.

Several of my cams still use HA, but the moment they start going wonky (usually after a BI update), I simply turn it off.

Some here can run all their cams with the latest BI version and HA no problem, while others have had issues. So many variables with everyone's system.

The "global" decoder does not override the specific HA in the camera. If you leave the camera at Default it will, but you can certainly specify one in the global and a different one in the camera.

GPUs tend to be energy hogs, and with substreams they are not worth the effort....UNLESS...you use DeepStack - then you will see improved performance with DeepStack. Folks here are seeing a 5-10x reduction in make (detection) times with an NVIDIA card.
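For anyone who wants to measure those make times directly: DeepStack exposes an HTTP detection endpoint, so a short script can average the round-trip. A sketch, assuming DeepStack is listening on localhost port 80 (adjust to your install) and using a hypothetical snapshot.jpg:

```python
# Rough DeepStack "make time" measurement: post the same snapshot to the
# detection endpoint several times and average the round-trip.
# Assumes DeepStack on localhost:80 (adjust host/port to your install)
# and a test image "snapshot.jpg" (hypothetical).
import time
import requests

URL = "http://localhost:80/v1/vision/detection"  # DeepStack detection endpoint
IMAGE = "snapshot.jpg"  # hypothetical snapshot exported from BI

with open(IMAGE, "rb") as f:
    data = f.read()

times = []
for _ in range(5):
    start = time.perf_counter()
    r = requests.post(URL, files={"image": data}, timeout=30)
    r.raise_for_status()
    times.append((time.perf_counter() - start) * 1000)

print(f"average make time: {sum(times) / len(times):.0f} ms")
for p in r.json().get("predictions", []):
    print(p["label"], round(p["confidence"], 2))
```

Running the same script against the CPU build and then the GPU (CUDA) build of DeepStack shows the speedup on your own hardware.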
 

sebastiantombs

Known around here
Joined
Dec 28, 2019
Messages
11,511
Reaction score
27,697
Location
New Jersey
An NVIDIA-based graphics card can be useful if you use DeepStack, the artificial intelligence tool, which is easily integrated with Blue Iris in the 5.x versions. It does need to be a CUDA-capable card, however.
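If you're not sure whether a given card is CUDA-capable, one quick sketch (Windows-only as written, since it loads nvcuda.dll, which ships with the NVIDIA driver; on Linux the library would be libcuda.so) is to ask the CUDA driver API directly:

```python
# Count CUDA-capable devices by calling the CUDA driver API via ctypes.
# nvcuda.dll is installed with the NVIDIA driver; if it is missing or
# cuInit fails, the card (or driver) cannot run CUDA workloads.
import ctypes

try:
    cuda = ctypes.WinDLL("nvcuda.dll")
except OSError:
    raise SystemExit("No NVIDIA CUDA driver found")

if cuda.cuInit(0) != 0:  # CUDA_SUCCESS == 0
    raise SystemExit("CUDA driver present but failed to initialize")

count = ctypes.c_int()
cuda.cuDeviceGetCount(ctypes.byref(count))

name = ctypes.create_string_buffer(100)
for dev in range(count.value):
    cuda.cuDeviceGetName(name, len(name), dev)  # CUdevice is a plain int ordinal
    print(f"CUDA device {dev}: {name.value.decode()}")
```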
 

gurpal2000

n3wb
Joined
Dec 9, 2021
Messages
9
Reaction score
2
Location
UK
@sebastiantombs, @wittaj - thanks, I intend to use DeepStack later in my journey for sure. I have a GT 730, which does have CUDA cores.
And yes, I need to follow the guide you mention. Sorry, I should have said I'm a newbie.
 

sebastiantombs

Known around here
Joined
Dec 28, 2019
Messages
11,511
Reaction score
27,697
Location
New Jersey
I'm not sure how much a 730 will help. It's kind of long in the tooth in the CUDA/NVIDIA world, but it should make some difference.
 

gurpal2000

n3wb
Joined
Dec 9, 2021
Messages
9
Reaction score
2
Location
UK
sebastiantombs said: I'm not sure how much a 730 will help. It's kind of long in the tooth in the CUDA/NVIDIA world, but it should make some difference.
True. When I check out some comparison sites, there doesn't seem to be much difference between it and my onboard graphics, which is an Intel HD 530 (on an i5-6500).
 

sebastiantombs

Known around here
Joined
Dec 28, 2019
Messages
11,511
Reaction score
27,697
Location
New Jersey
It's not a question of how it processes video per se. It needs enough CUDA cores, and enough processor to power them, to analyze the snapshots Blue Iris sends. I run a 970 with DeepStack and get >100ms detection times. People running a Quadro get <100ms times. On a CPU (Quick Sync can't process the snapshots), it might take well over 1000ms by comparison.
 

gurpal2000

n3wb
Joined
Dec 9, 2021
Messages
9
Reaction score
2
Location
UK
OK, understood. I think I'll find out when I start playing around with DeepStack. For now, I'm not experiencing any issues with just three cameras (so far) without any acceleration specified. I don't think Remote Desktop even notices. The dummy dongle is new to me - I didn't know those things existed or what their purpose was. For me, it's just a way of seeing the GPU stats in the process monitor so I can work out whether a GPU is even being used at all.
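As a scriptable alternative to watching Task Manager, you can poll nvidia-smi (installed alongside the NVIDIA driver) to see whether the card is actually doing anything. A minimal sketch; note the Intel iGPU won't show up here, only NVIDIA cards:

```python
# Poll nvidia-smi once a second and print GPU utilization and VRAM use.
# Assumes a single NVIDIA card; stop with Ctrl+C.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=utilization.gpu,memory.used",
         "--format=csv,noheader,nounits"]

while True:
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True)
    gpu_pct, mem_mib = [s.strip() for s in out.stdout.strip().split(",")]
    print(f"GPU {gpu_pct}% busy, {mem_mib} MiB VRAM in use")
    time.sleep(1)
```

If the utilization stays at zero while BI and DeepStack are busy, the card isn't being used at all.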
 