Specs are:
- Ryzen 5800X
- X570 Mobo
- 32GB (2 x 16GB DDR4 2666MHz)
- ROG Strix 3090 (I have also tried an A2000 6GB, a 2080ti 12GB)
I currently have 27 cameras and am consuming 19GB of the 24GB of VRAM on a 3090. I have Deepstack, but stopping Deepstack and rebooting the system (after disabling it at startup) only changes the VRAM usage by 1-2GB. Below is a screenshot of all my cameras in play, showing resolution, substreams, bitrate, etc. I am aware some of them are at 30 fps and a high bitrate, which I will be reducing, but so far that hasn't made much difference to the overall usage. I am just wondering if this is normal or if there is something I need to tweak. I have been testing out multiple graphics cards and have had some BSODs, which might have messed up the system and caused the high usage, so I might reinstall the OS just to make sure everything is fresh (because I have been trying different drivers when swapping GPUs).
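For a rough sanity check on whether 19GB is even plausible for decode surfaces alone, here is a back-of-envelope sketch. The camera mix and the per-stream surface count in it are made-up illustration numbers, not my actual setup; real NVDEC allocations depend on codec, profile, and how the NVR software manages its buffers.

```python
# Rough, back-of-envelope estimate of decoder surface memory per stream.
# The surface count and camera mix below are assumptions for illustration only.

BYTES_PER_PIXEL_NV12 = 1.5      # decoded frames are typically NV12 (YUV 4:2:0)
SURFACES_PER_SESSION = 16       # assumed decode/reference surface pool per stream

def stream_vram_mb(width: int, height: int) -> float:
    """Estimated VRAM (MB) held by one hardware-decoded stream."""
    frame_bytes = width * height * BYTES_PER_PIXEL_NV12
    return frame_bytes * SURFACES_PER_SESSION / (1024 ** 2)

# Hypothetical camera mix: (width, height, count)
cameras = [
    (2688, 1520, 12),   # 4MP main streams
    (3840, 2160, 5),    # 4K main streams
    (1920, 1080, 10),   # 1080p main streams
]

total_mb = sum(stream_vram_mb(w, h) * n for w, h, n in cameras)
print(f"Estimated decode-surface VRAM: {total_mb / 1024:.1f} GB")
```

With numbers like these the decode surfaces come out to a few GB, nowhere near 19GB, which is part of why the usage surprised me.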
EDIT: After reinstalling everything, I got the same results. However, I set Hardware accelerated decode (restart cameras) under Settings > Cameras from NVDEC to No, and it stopped using the VRAM. Performance looks the same as far as I can tell; I am going to do more testing in the meantime. I was aware that NVIDIA decoding was supposed to be inefficient, but needing at least a 3090 to use it with 27 cameras, versus not using it at all, seems a bit broken in my book.
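For that extra testing, something like the sketch below can log VRAM and NVDEC/NVENC load once per second while toggling the setting, so the before/after numbers are easy to compare. It assumes the pynvml NVML bindings (pip install nvidia-ml-py) are installed alongside the NVIDIA driver; this is just a monitoring sketch, not anything specific to the NVR software.

```python
# Minimal sketch: poll GPU memory and NVDEC/NVENC utilization once per second.
# Assumes the pynvml bindings (pip install nvidia-ml-py) and an NVIDIA driver with NVML.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU; adjust the index if needed

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        dec, _ = pynvml.nvmlDeviceGetDecoderUtilization(handle)
        enc, _ = pynvml.nvmlDeviceGetEncoderUtilization(handle)
        print(f"VRAM {mem.used / 1024**3:.1f}/{mem.total / 1024**3:.1f} GB | "
              f"decoder {dec}% | encoder {enc}%")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```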