New setup for BI5, Nvidia or Intel GPU?

Abula

Young grasshopper
Joined
Jul 20, 2014
Messages
51
Reaction score
19
Location
Guatemala
Hi,

I'm getting ready to upgrade my BI server to an i7 9700K. From what I've seen, an Intel iGPU with Quick Sync has been the recommendation for a long time, but upon contacting BI for some insight, I got the following answer:

Definitely a newer chip is preferred, however maybe even better for performance would be a newer Nvidia card, which will offer not only decoding but encoding assist as well. Please see this matrix for details

Video Encode and Decode GPU Support Matrix
So this leads me to think that BI5 will benefit from an Nvidia GPU. So what are you guys doing? Sticking with the Intel iGPU or going with a dedicated Nvidia GPU?
 

awsum140

Known around here
Joined
Nov 14, 2017
Messages
1,254
Reaction score
1,128
Location
Southern NJ
An Nvidia GPU will easily process video, H264 or H265. The problem is that Nvidia cards are power hogs. I have a 1060 that uses about 100 watts under load, a 970 that uses about 135 watts under load, and a 2070 that uses about 160 watts under load. The onboard Intel doesn't come anywhere close to that kind of power consumption. So, if you want some efficiency, stick with Intel.
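
If it helps put those numbers in perspective, here's a rough sketch of the yearly energy math, assuming the card sits loaded 24/7 and an electricity rate of about $0.13/kWh (both of those are my own assumptions, just for illustration):

[CODE]
# Rough annual energy cost of keeping a GPU loaded around the clock.
# Wattage figures are the ones quoted above; the rate and 24/7 duty
# cycle are assumptions for illustration only.

HOURS_PER_YEAR = 24 * 365
RATE_USD_PER_KWH = 0.13  # assumed electricity rate

cards = {"GTX 1060": 100, "GTX 970": 135, "RTX 2070": 160}

for name, watts in cards.items():
    kwh_per_year = watts * HOURS_PER_YEAR / 1000
    cost = kwh_per_year * RATE_USD_PER_KWH
    print(f"{name}: ~{kwh_per_year:.0f} kWh/year, ~${cost:.0f}/year")
[/CODE]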
 

Abula

Young grasshopper
Joined
Jul 20, 2014
Messages
51
Reaction score
19
Location
Guatemala
@awsum140 thanks for the reply, your explanation was very detailed and it put me back on the path of going with the Intel iGPU.

But more for informational purposes, how much better or more powerful is Nvidia than Intel here, and does it scale with the GPU? There's very little info on how it works in BI with Nvidia.

For example, OBS supports Nvidia NVENC. Before the RTX line and OBS 23, using Nvidia had its pros and cons: since streamers were already using the GPU to render the game, using the separate encoder chip offloaded both the CPU and the GPU. The problem was quality. NVENC was a very good encoder at very high bitrates (for example local recordings with huge files) but was not as good as CPU encoding at low bitrates (streaming); x264 medium or slow was noticeably better than NVENC. This has somewhat changed with OBS 23 plus RTX cards and the new encoder, where the quality is comparable to x264 medium. The mid to high end GPUs (I believe GTX 1060 and above) have two encoding chips, so they can handle two separate streams, but I'm not sure how that works in BI.

So how is it with BI? Maybe someone can answer some questions to start learning about BI and Nvidia.
- On recording, is there a noticeable difference in quality between Intel and Nvidia? Does it depend on the codec used?
- On playback, will it give better quality than the Intel iGPU? Would scrubbing through timelines be easier?
- Can Nvidia handle multiple streams, or is it limited to one per chip?
- Is BI designed, like OBS 23, to take advantage of the new Turing NVENC encoders?
 

awsum140

Known around here
Joined
Nov 14, 2017
Messages
1,254
Reaction score
1,128
Location
Southern NJ
I've noticed no difference between Nvidia and Intel in terms of recording or playback. Keep in mind that's anecdotal and based on the observations of 70-year-old eyes. Nvidia does use some CPU to achieve the processing, so there is some, very minor, overhead there. Nvidia can handle H265 with no problems, from what I've heard and not based on my experience since I don't use H265, while Intel has problems with it. I have had as many as six cameras running on a 970 and the video engine load sits around 10% on that card with that load. Multiple cameras are not a big problem. The cameras I am using are all 2MP, 1080P, running at 10FPS with a max bit rate of 4096, for scaling comparisons. To my knowledge, BI is not using the Turing encoders at this time.
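
For the scaling comparison, that setup works out like this in the megapixels-per-second terms used later in this thread (a quick sketch, using only the camera numbers quoted above):

[CODE]
# Aggregate decode load of the setup described above:
# six 2MP (1080p) cameras at 10 FPS, 4096 kbps max bit rate each.

cameras = 6
megapixels = 2       # per camera
fps = 10             # per camera
bitrate_kbps = 4096  # per camera, maximum

decode_load_mps = cameras * megapixels * fps        # 120 MP/s total
total_bitrate_mbps = cameras * bitrate_kbps / 1000  # ~24.6 Mbit/s worst case

print(f"Decode load: {decode_load_mps} MP/s")
print(f"Worst-case incoming bitrate: {total_bitrate_mbps:.1f} Mbit/s")
[/CODE]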
 

bp2008

Staff member
Joined
Mar 10, 2014
Messages
12,666
Reaction score
14,007
Location
USA
The amount of video you can decode with an Nvidia card does scale with the GPU to some extent. Based on one guy's test, a high-end Nvidia card is roughly equivalent to Intel in decoding capacity (about 1500 MP/s). A GT 1030 can only handle a few hundred MP/s (maybe 300).
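
To make that scaling concrete, here's a minimal sketch of how those decode budgets translate into camera counts. The ~1500 MP/s and ~300 MP/s budgets are the rough figures above; the camera specs are just example assumptions:

[CODE]
# Rough camera-count estimate from a decode budget in megapixels/second.
# Budgets are the approximate figures quoted above; camera specs are
# example assumptions, not measurements.

def cameras_supported(budget_mps, megapixels, fps):
    per_camera = megapixels * fps  # MP/s each camera needs decoded
    return int(budget_mps // per_camera)

budgets = {
    "Intel iGPU / high-end Nvidia (~1500 MP/s)": 1500,
    "GT 1030 (~300 MP/s)": 300,
}

for name, budget in budgets.items():
    print(name)
    print("  2MP @ 10 FPS cameras:", cameras_supported(budget, 2, 10))
    print("  8MP @ 15 FPS cameras:", cameras_supported(budget, 8, 15))
[/CODE]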

The main differences are that:

1) The Intel GPU is practically free. Nvidia cards are expensive.
2) Using Intel hardware acceleration reduces your system's energy consumption slightly. Using Nvidia hardware acceleration will likely more than double your system's energy consumption!
3) Nvidia is way better at writing drivers than Intel. Intel's acceleration has been plagued with memory leaks and continuously rising CPU usage in a lot of driver releases, and it has never worked with H.265 video in BI despite the option having been available for over a year. With Nvidia it just works, H.264 and H.265 alike. And Nvidia works for accelerated H.264 encoding too.

Hardware accelerated decoding of any type does not affect video quality. Encoding on the other hand can affect quality somewhat, but not usually in a major way.

Nvidia cards can decode multiple streams no problem. Supposedly they can only encode a limited number of streams depending on the card (see the support matrix).

No idea about Turing encoders.
 