GPUs that fit an OptiPlex 3070 SFF?

truglo

Pulling my weight
Joined
Jun 28, 2017
Messages
275
Reaction score
103
I may have jumped the gun on purchasing an SFF box. Now I'm looking at GPUs to use for AI, and it seems like the 1050s are a good value. However, looking at the service manual for my yet-to-arrive OptiPlex 3070 SFF, it appears the x16 slot only fits single-slot GPUs!

Has anyone put a GPU in their SFF OptiPlex?
Is there a particular 1050 that would fit?
If 1050s won't fit, what's the best GPU I could use with this thing?

[edit: It also has to be NVIDIA... I'm using it for CPAI CUDA.]
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,903
Reaction score
21,275
I may have jumped the gun on purchasing an SFF box. Now I'm looking at GPUs to use for AI, and it seems like the 1050s are a good value. However, looking at the service manual for my yet-to-arrive OptiPlex 3070 SFF, it appears the x16 slot only fits single-slot GPUs!

Has anyone put a GPU in their SFF OptiPlex?
Is there a particular 1050 that would fit?
If 1050s won't fit, what's the best GPU I could use with this thing?

[edit: It also has to be NVIDIA... I'm using it for CPAI CUDA.]
Your bigger problem is going to be the power supply. It will likely not be sufficient for your GPU.
You can run AI just fine on the CPU alone - in the future there is likely to be Coral support as well.
 

truglo

Pulling my weight
Joined
Jun 28, 2017
Messages
275
Reaction score
103
Thanks for the quick reply. I was afraid that would be the case, and Dell's proprietary garbage PSUs are always a problem. My current PC is on a 3rd-gen Intel CPU, and I wanted H.265 hardware acceleration as well as OS support beyond 2025 (Windows 11). I'm glad I asked about the 1050 before buying one.

I swear I came across something saying the OptiPlex 3070 PSU can supply something like 40W for the GPU... but I can't find the thread. I think that would allow something like a Quadro P400 or so. From what I've read, those are already a huge help for AI, and they fit the SFF case too. How far off do you think Coral support is? I don't want to spend $50 on a GPU if it's just a few months out.
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,903
Reaction score
21,275
Thanks for the quick reply. I was afraid that would be the case, and Dell's proprietary garbage PSUs are always a problem. My current PC is on a 3rd-gen Intel CPU, and I wanted H.265 hardware acceleration as well as OS support beyond 2025 (Windows 11). I'm glad I asked about the 1050 before buying one.

I swear I came across something saying the OptiPlex 3070 PSU can supply something like 40W for the GPU... but I can't find the thread. I think that would allow something like a Quadro P400 or so. From what I've read, those are already a huge help for AI, and they fit the SFF case too. How far off do you think Coral support is? I don't want to spend $50 on a GPU if it's just a few months out.
Why do you think you will need GPU support at all?
 

truglo

Pulling my weight
Joined
Jun 28, 2017
Messages
275
Reaction score
103
My make times average around 200-250 ms, which is fine most of the time. When several triggers come in at once, though, the CPU sometimes gets pegged. My thought was that taking the AI processing load off the CPU and cutting make times from 250 ms down to 25 ms would reduce how often that happens.
 

fenderman

Staff member
Joined
Mar 9, 2014
Messages
36,903
Reaction score
21,275
My make times average around 200-250 ms, which is fine most of the time. When several triggers come in at once, though, the CPU sometimes gets pegged. My thought was that taking the AI processing load off the CPU and cutting make times from 250 ms down to 25 ms would reduce how often that happens.
Are you using the substream for the AI?
 

truglo

Pulling my weight
Joined
Jun 28, 2017
Messages
275
Reaction score
103
I have used both the substream and the mainstream for AI, and found only a slight difference with the substream. Makes with the mainstream average 280-320 ms, versus 220-260 ms with the substream.
 

truglo

Pulling my weight
Joined
Jun 28, 2017
Messages
275
Reaction score
103
FWIW, I completed the build and got CPAI CUDA GPU acceleration working. Make times now average ~80 ms with all the same settings otherwise. Not the 20 ms I was hoping for, but still a big improvement. I will play with the substream for AI, as well as the medium and smaller models, and see how the accuracy/performance trade-off works out.
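
In case anyone wants to compare make times the same way (different models, substream vs. mainstream), here's a rough timing sketch against a local CodeProject.AI server. It assumes the default port 32168 and the /v1/vision/detection endpoint, plus a test snapshot saved as snapshot.jpg, so adjust for your install:

```python
# Rough sketch: time repeated object-detection requests against a local
# CodeProject.AI server to compare average detection latency.
# Assumptions: server on the default port 32168, /v1/vision/detection
# endpoint, and a sample frame saved as snapshot.jpg.
import time
import requests

URL = "http://localhost:32168/v1/vision/detection"
RUNS = 20

with open("snapshot.jpg", "rb") as f:
    image_bytes = f.read()

times_ms = []
for _ in range(RUNS):
    start = time.perf_counter()
    resp = requests.post(URL, files={"image": image_bytes}, timeout=30)
    times_ms.append((time.perf_counter() - start) * 1000)
    resp.raise_for_status()

print(f"avg {sum(times_ms) / len(times_ms):.0f} ms, "
      f"min {min(times_ms):.0f} ms, max {max(times_ms):.0f} ms")
```

Grab one snapshot from the mainstream and one from the substream (or swap the model size in the CPAI settings between runs) and the averages line up roughly with the make times Blue Iris reports.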

I will add that I had to modify the full-size GPU back plate to fit the SFF case: I cut it with a Dremel and bent it into an SFF-sized bracket. Sorta OT... I also had to order a 3.5" drive cage for $8 on Amazon, since the machine came with a 2x2.5" cage that did not fit my WD Purple HDD. I am also able to use H.265 without much CPU penalty thanks to the 9th-gen CPU. The old PC ran Windows 10 off the same WD Purple drive the BVR files were on... so it was very slow in general. Now everything is super snappy with a 970+ NVMe SSD for Windows 11 and BI (just the BVR files stay on the Purple drive, but even playing back clips is much faster now).
 