Didn't you see my Task Manager image above, on a GTX 970 with 4GB, using only 1.2GB of VRAM? Memory is only going to be an issue with loads of AI modules and/or custom models loaded. Most BI users are only going to need 1-3 of them running. The 3060 will be fine; only get the Ti if you have other use cases (gaming) beyond BI AI.
Hey guys
Let's not get carried away regarding GPU power. Presently many Nvidia GPUs are simply incompatible because the software is still in the early stages of development. I suspect this issue will be resolved in due course, and as long as you are running a CUDA-based GPU with enough memory, it will suffice for most modest installations of up to 10 cameras.
That brings us to memory requirements. I'm operating 8 HD cameras, 2 of which are 12 MP, and the GPU is a 2060. Even with significant activity the GPU usage is never more than 10-15 percent. Memory, on the other hand, can reach the 6GB limit of the card when I send high-resolution snaps for analysis, which happens if you choose to use 'switch to HD stream if available'. The development team believe the jump from a baseline of 4GB to 6GB might be a software error; only time will tell.
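If you want to watch the VRAM usage yourself while BI is sending HD snaps, a throwaway script along these lines will log it. This is just a rough sketch that shells out to nvidia-smi (which ships with the Nvidia driver); the 5-second interval is arbitrary, stop it with Ctrl+C:

# Rough sketch: poll nvidia-smi every few seconds and print VRAM usage.
import subprocess
import time

while True:
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total", "--format=csv,noheader"],
        capture_output=True, text=True,
    )
    print(result.stdout.strip())  # e.g. "5890 MiB, 6144 MiB"
    time.sleep(5)  # arbitrary polling interval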
So, IMHO, unless you have a significant number of cameras and are planning for the future, most CUDA-based GPUs with 320-plus cores (once the compatibility issues are ironed out) will probably return fair results.
One thing to bear in mind is that the bigger the GPU, the higher your standby power consumption. My modest 2060 is already constantly gobbling 17 watts and is clearly way too powerful for my present needs, so I will probably swap it for an older 1060 6GB card I have.
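For what it's worth, you can check what your own card idles at without any extra tools; nvidia-smi will report board power and temperature on most recent cards. A rough sketch (the exact fields available depend on your card and driver):

# Rough sketch: one-off query of GPU power draw and temperature via nvidia-smi.
import subprocess

result = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,power.draw,temperature.gpu", "--format=csv,noheader"],
    capture_output=True, text=True,
)
print(result.stdout.strip())  # e.g. "NVIDIA GeForce RTX 2060, 17.03 W, 40"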
Caveats to the above: face recognition may eventually become a reality (read that as reliable) and will most probably consume more GPU resources, and needless to say all bets are off if you want to play games.
The above are just my rambling thoughts after reading posts from members considering much more powerful GPUs for CodeProject AI.
17 watts isn't much in the scheme of things; my 3060 Ti seems to draw around the same, ~19W (I think each of my CCTV cams consumes 12-14 watts). My GPU is about 200 watts when mining ETH. Yeah, for about a year I was running both mining and DeepStack, and they happily co-existed.
@Village Guy - have you tried changing the fan curve setting so the GPU fans only fire up at about 60 degrees? Even on a hot day my GPU stays below 60C without the fan on.
No, I did look for an adjustment setting but couldn't find one, so I assumed I would need a third-party app. I installed the Studio driver and the tuning features appeared to be non-existent.
MSI Afterburner is the world’s most recognized and widely used graphics card overclocking utility. It provides detailed overview of your hardware and comes with some additional features like customizing fan profiles, benchmarking and video recording.
www.msi.com
Don't worry that it's by MSI; it works for all Nvidia cards (I have it working on EVGA, MSI and ASUS cards) and works great to set clocks, power and fans. Everyone with an Nvidia card should have this installed.
Being from the school of "more is better", and always anticipating that more horsepower, raw processing as well as memory, will be needed as things advance, I think it's prudent to use a card as powerful as you can afford or find at a reasonable price. "Reasonable price" is a budget-based variable, however. All the Nvidia cards I have fooled with (970, 1060, 2070 and 3070) seem to draw around 15 watts, +/-, at idle. It's only under load that you see differences. Of the 970 and 1060, the 1060 seemed to be the more efficient: it ran at about 130 watts under load while the 970 runs at about 150 watts. Both the 2070 and 3070 run around 200 watts under load, but the 3070 can accomplish more "work", making it more efficient than the 2070.
Assuming "face" is a custom model and you don't want to detect face on that camera, it needs to be commented as "face:0" in the custom models box, not the objects box.
Assuming "face" is a custom model and you don't want to detect face on that camera, it needs to be commented as "face:0" in the custom models box, not the objects box.
@sebastiantombs Good suggestion. Given your advice, I realized my config was not correct. Here is the new config below. I still see the face processing count go up though...
Face is not a custom model so you don't need to include it in the custom models box. You just need to enable the facial recognition option within the AI tab in your BI settings. Also in the "To confirm" box you need to include the face names that you want to recognize.
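For example, if the faces you've trained are saved under the names John and Jane (placeholder names, obviously), the "To confirm" box would contain something like:

John,Jane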
@Vettester I've already enabled face recognition (my settings match the screenshots you shared). I indicated this in my post above. The problem is turning off face recognition on a per-camera basis.
Are you really sure? I ask because if I leave the 'To Confirm' box blank with Face Recognition on, BI will post a captured face in the unknown folder if there is no match with a face on file. In addition, my .dat file analysis will report no faces found if it is unable to find one.
Up until now I have assumed that you cannot turn face processing off on a per-camera basis.
@MikeLud1 That is some great detail. I guess my original question still remains: if I only have "ipcam-general" specified as a custom model, why do I see the Face Processing counter increment in the CodeProject AI admin screen?