CPU usage and processor temp with BI

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,908
Reaction score
21,296
To be clear, you won't see a 60 W increase, because that assumes the GPU is working at max... also, the lower CPU usage will drop consumption as well. It's all a guessing game until the update is released. Hopefully soon.
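For anyone wanting to sanity-check that reasoning, here is a minimal back-of-the-envelope sketch in Python. Every figure in it (card power, utilization, CPU savings) is an assumption for illustration, not a measurement:

# Rough net-power estimate for offloading decode to a discrete GPU.
# All numbers below are assumptions for illustration, not measurements.
GPU_MAX_POWER_W = 60.0   # assumed full-load draw of a 750 Ti class card
GPU_UTILIZATION = 0.3    # assumed: decode offload rarely pegs the GPU
CPU_SAVINGS_W = 10.0     # assumed: CPU power saved by offloading decode

gpu_draw = GPU_MAX_POWER_W * GPU_UTILIZATION
net_delta = gpu_draw - CPU_SAVINGS_W

print(f"GPU draw at {GPU_UTILIZATION:.0%} utilization: {gpu_draw:.0f} W")
print(f"Net change in system power: {net_delta:+.0f} W")
# -> GPU draw at 30% utilization: 18 W
# -> Net change in system power: +8 W

The point is only that the net delta depends on both assumed numbers, which is why it stays a guessing game until the update ships and real consumption can be measured.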
 

wcrowder

Getting the hang of it
Joined
Oct 8, 2015
Messages
294
Reaction score
53
Location
French Lick, Indiana 47432
I think your calculations are way off. Where are you getting 34 W for the P4600? A system with that processor should idle at 20-25 W all in (assuming a single drive). The P4600 likely uses close to zero at idle, as it's integrated into the processor. I don't know how the hardware acceleration will be implemented; Nvidia might be a complete waste. Benchmarks don't paint a complete picture. I would wait until it's released, then buy the card and test, so you can return it if it doesn't help.
You are correct, I got the 34 watts from a random benchmark site "at load", which compared the 750 Ti "at load" at 60 watts. And if it's implemented, we are not talking "idle". I do understand that Nvidia will use the Intel API to some extent. I've been looking at the Nvidia API and CUDA; decoding/encoding takes place entirely on the card. I don't think I'm that far off. I will say the enterprise system (30fps/6kbs) I use at work uses Nvidia Quadro cards; without them a workstation or display wall will work, sorta, but is useless...

Not arguing, I just see a HUGE benefit either way.
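On the "buy the card and test" suggestion: a cheap way to check decode offload once the card is in hand is to push a sample clip through ffmpeg with and without NVDEC and watch CPU load. This sketch assumes an ffmpeg build with CUDA/NVDEC support on the PATH; "sample.mp4" is a placeholder file name:

# Compare software vs. NVDEC decode of a sample clip.
# Assumes an ffmpeg build with CUDA/NVDEC support; "sample.mp4" is a placeholder.
import subprocess
import time

def timed_decode(extra_args):
    """Decode the clip to a null sink and return wall-clock seconds."""
    cmd = ["ffmpeg", "-v", "error", *extra_args,
           "-i", "sample.mp4", "-f", "null", "-"]
    start = time.monotonic()
    subprocess.run(cmd, check=True)
    return time.monotonic() - start

sw_time = timed_decode([])                    # software (CPU) decode
hw_time = timed_decode(["-hwaccel", "cuda"])  # NVDEC decode on the card
print(f"software: {sw_time:.1f}s  hardware: {hw_time:.1f}s")

Wall time alone won't show much on a fast CPU, so watch Task Manager (or nvidia-smi) during each run; the drop in CPU usage is the number that matters for Blue Iris.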
 