Which nVidia GPUs are you using?

I'm moving my Blue Iris installation to a computer where I can actually install a video card, and I'd like to get one that will work with CodeProject.AI. I don't have a ton of cash for this, so I'm looking for the most inexpensive option that will run a useful version of CUDA. There is a refurbished Quadro K4000 available on Amazon for approximately $100 CAD, which is pretty much my price range. From what I can find, it will only support CUDA 9.1. Is this good enough?

Which cards are everyone else using?
 
So, I'll answer my own question: the Quadro K4000 will NOT work with CodeProject.AI because it only supports compute capability 3.0, and the PyTorch build included with the AI package requires compute capability 3.7 or newer. I'm sending that card back and have another inexpensive card on the way.
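If anyone else wants to check a card before buying, here's a minimal sketch that asks PyTorch for the GPU's compute capability (assumes any CUDA-enabled PyTorch install; the 3.7 cutoff is the one mentioned above):

```python
import torch  # any CUDA-enabled PyTorch build will do for this check

if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability(0)  # first GPU
    print(f"GPU: {torch.cuda.get_device_name(0)}, compute capability {major}.{minor}")
    # Recent PyTorch builds generally need 3.7 or newer; the K4000 reports 3.0
    print("Meets the 3.7 requirement:", (major, minor) >= (3, 7))
else:
    print("No CUDA-capable GPU visible to PyTorch")
```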
 
I must be in the Wayback Machine... running an NVIDIA GTX 1060 6GB card. It seems to handle basic CodeProject duties. I was running it on 7 cams, then I moved 4 cams to their internal IVS, so now only 3 cams are running CodeProject.
 
I have a GTX 1080 Ti, which I don't use for BI, but even though it's about 13 years old, it's a beast. You have to get pretty fancy card$ to outperform it. Overkill, unless you run lots of AI cameras, but if you do, it could probably handle the load despite its age.
 
 
NVIDIA Tesla P4 here, love it!
 
I have a GTX 1080 Ti, which I don't use for BI, but even though it's about 13 years old, it's a beast

The 1080 came out in 2017, so it's just 7 years old.
 
Launch dates are in the 2nd column (if Wikipedia is accurate):
[Screenshot: Wikipedia table of GPU launch dates]
 
The challenge is to find an inexpensive, low-profile, low-power (70 watts or less) Nvidia card. Most of the reliable sellers are out of stock right now, at least for the RTX 3050 cards.
Amazon has low-profile RTX 3050s.



 
I'm using an RTX 3050 w/ CUDA 11.8.89. Been running this setup for about a year w/ 9 cameras on CPAI. No issues. I have been wondering if I could move up to CUDA 12. Anyone running that on an RTX 3050?
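If it helps, here's a quick sketch to confirm which CUDA build PyTorch is actually using; run it with the Python interpreter from your CPAI install (the exact path varies by setup):

```python
import torch

print("PyTorch:", torch.__version__)
print("Built against CUDA:", torch.version.cuda)    # e.g. "11.8" or "12.1"
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
```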

Michael
 
Thanks for all your suggestions. Most of them were out of my (admittedly too low) price range. I got a Quadro M2000 (with compute capability 5.2) and I think it's working. I'm not seeing the CPU blip when it's doing object detection. But it does seem to be missing some detections, and there have been a few crashes. I hope to do more troubleshooting after the holidays.
 
Up until a few days ago I was running an old GTX 970, and the CUDA CP.AI performance was very good. I upgraded to an RTX 4060 on Christmas day and the CP.AI results are a bit quicker, but more importantly my power consumption has dropped by about 30-40W (the rest of the system remained the same). The power saving will cover the cost of the card in a few years, as I'll save about £70-£80 per year in energy costs just on the card change; I paid £239 for the RTX 4060. I also use the system as a Plex server and for live viewing in BI, so now having HW encode/decode for things like H.265 is a big bonus over the older card.

Below is my BI system's 30-day power usage. Ignore all the high spikes, as that will be me using the system for other stuff and having the monitor/screen turned on. The lows around the 100-110W mark are BI and CP.AI, and you can clearly see the drop to ~65W after I installed the new card on Christmas day afternoon.

[Chart: 30-day BI system power usage]
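For anyone who wants to sanity-check the payback estimate, the rough arithmetic looks like this (35W is the midpoint of the drop above; the unit rate is an assumed ~£0.25/kWh, so plug in your own tariff):

```python
# Rough payback check for the GTX 970 -> RTX 4060 swap described above
watts_saved = 35             # midpoint of the reported 30-40W drop
tariff_gbp_per_kwh = 0.25    # assumed unit rate; adjust for your supplier
card_cost_gbp = 239

kwh_saved_per_year = watts_saved * 24 * 365 / 1000        # ~307 kWh
annual_saving = kwh_saved_per_year * tariff_gbp_per_kwh   # ~£77
print(f"~{kwh_saved_per_year:.0f} kWh/year saved, about £{annual_saving:.0f}/year")
print(f"Payback on the card: roughly {card_cost_gbp / annual_saving:.1f} years")
```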
 
Up until a few days ago I was running an old GTX 970, and the CUDA CP.AI performance was very good. I upgraded to an RTX 4060 on Christmas day...
I was looking into potentially upgrading my P1000 to either the RTX 3050s that @MikeLud1 mentioned or the RTX 4060 you are using here. I also want to stay with a low-power, PCIe-slot-powered GPU only, as the box I'm using is an SFF Dell that doesn't have any 6- or 8-pin GPU power cable. The P1000 is not bad at all, but I'm thinking why not put in something more powerful if I can stay at the same wattage, and then run larger model sets on it.
 
I want to replace my Coral TPU with a GPU. What's your idle power on the P1000? Which model, size, and times are you getting?
 
I want to replace my Coral TPU with a GPU. What's your idle power on the P1000? Which model, size, and times are you getting?

I'll have to put the computer on a Kill A Watt meter to see. Right now it's on a UPS which also powers my PoE switch for the cams, so I can only see total wattage.
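In the meantime, the card will report its own draw over NVML, which at least gives a GPU-only idle number (a minimal sketch, assuming the nvidia-ml-py package is installed; it won't include the rest of the system):

```python
import pynvml  # provided by the nvidia-ml-py package

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)              # first GPU
name = pynvml.nvmlDeviceGetName(handle)
power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # reported in milliwatts
print(f"{name}: {power_w:.1f} W")
pynvml.nvmlShutdown()
```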

I'm using the default YOLOv8 setup with the model size set to Large. Avg times I see are about 125 ms for my cams, which are sending it a 720p sub stream.
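If you want a ballpark of what your own card would do, here's a rough timing sketch using the standalone ultralytics package. CP.AI bundles its own YOLOv8 module, so treat this as an approximation rather than the exact CP.AI pipeline, and "frame.jpg" is just a placeholder for a 720p snapshot from one of your cams:

```python
import time
from ultralytics import YOLO

model = YOLO("yolov8l.pt")        # the "Large" YOLOv8 model
model("frame.jpg", device=0)      # warm-up pass on GPU 0

start = time.perf_counter()
model("frame.jpg", device=0)      # timed pass
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"Inference: {elapsed_ms:.0f} ms")
```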

