Nvidia helps BI?

Abula

Young grasshopper
Jul 20, 2014
Guatemala
I have always preferred building around Intel iGPUs, but recently I rechecked the BI recommended specs and saw:

nVIDIA graphics adaptor for hardware decoding

Can BI use CUDA cores, or any other Nvidia technology, to help the CPU with BI tasks?
 
This has not been implemented yet. The difference between using Nvidia or Intel may be negligible; Intel is preferred because of the power savings, as well as one less point of failure.
 

Is there any word on when it will be implemented and if any support for Intel is planned in addition to nVidia?

Also, are there any estimates on how much reduction in CPU usage we might see when this is implemented?

Thanks,
Carlton
 
We have no idea. He never said Intel won't be supported, just that it may be optimized for Nvidia. Let's wait and see.
 
Personally, I would love it if BI supported Quick Sync, but I'm not against it supporting Nvidia CUDA either. If they do, my next build will probably be in a micro ATX case.