If you're like me and you have your IP cameras recording 24/7 to a local FTP server, you've run into the problem of deleting old files before the drive fills up. I've found a great tool that will help us out.
The project is called py-direg. It's a Python project that lets you regulate directories, i.e. specify how much space each camera can have, and it keeps your recordings within those limits. AWESOME! It would probably work on OS X with some MacPorts help, and if you're determined enough it might even work on Windows under Cygwin.
https://github.com/jakejohns/py-direg
Requirements:
Python 2 (Python 3 breaks it, but you can have both installed at the same time)
pip for Python2 (pip2)
To get it working on Arch Linux I had to install python2 and python2-pip, then run:
Code:
pip2 install docopt
pip2 install humanfriendly
Then edit the top line of the direg.py script (because I have both Python 2 and Python 3 installed):
Code:
#!/usr/bin/python2
After that the script ran perfectly. Here is a sample of my ~/.direg.py configuration file:
Code:
# cat .direg.py
directories = [
    {
        'path' : '/srv/ftp/cameras/EAST-IPC',
        'test': 'max_size',
        'max_size': '850GB'
    },
    {
        'path' : '/srv/ftp/cameras/North-IPC',
        'test': 'max_size',
        'max_size': '850GB'
    },
    {
        'path' : '/srv/ftp/cameras/West_PTZ',
        'test': 'max_size',
        'max_size': '850GB'
    },
    {
        'path' : '/srv/ftp/cameras/south-ipc',
        'test': 'max_size',
        'max_size': '850GB'
    }
]
This gives each camera 850 GB; 850 GB x 4 cameras = 3400 GB, which is the formatted size of my 4 TB disk with a 250 GB partition used elsewhere.
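If you want to sanity-check that the per-camera quotas actually fit on the disk, here is a minimal sketch (not part of direg) that reads the same ~/.direg.py file and sums the max_size values using the humanfriendly library direg already pulls in. It assumes the config is plain Python defining a directories list, exactly as in the sample above; the 3400GB figure is just my disk's usable space.
Code:
#!/usr/bin/python2
# Sanity-check sketch: sum the max_size quotas in ~/.direg.py and
# compare them against the usable space on the recording disk.
# Assumes the config file is plain Python defining 'directories',
# as in the sample above. Not part of direg itself.
import os
import humanfriendly

config = {}
execfile(os.path.expanduser('~/.direg.py'), config)

total = sum(humanfriendly.parse_size(d['max_size'])
            for d in config['directories'])
usable = humanfriendly.parse_size('3400GB')  # usable space on my 4TB disk

print('Total quota: %s' % humanfriendly.format_size(total))
print('Fits on disk: %s' % (total <= usable))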
If you want to keep tight tolerances, set it to run every 10 minutes and set your recording bundle size to 10 minutes.
Here is my crontab; the second command removes, every hour, the empty folders that direg.py leaves behind:
Code:
*/10 * * * * ~/py-direg-master/direg.py
@hourly find /srv/ftp/cameras/* -type d -empty -exec rmdir {} \; > /dev/null 2>&1
And here is an example of the debug output of it cleaning up a camera folder that was too big. As you can see, it deletes the oldest files until the space is back within the allowance.
http://pastebin.com/aXkwARfF
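If you're curious how that cleanup works conceptually, here is a rough sketch of the same idea: walk the directory, sort files by modification time, and delete the oldest ones until the total is back under the quota. This is not direg's actual code, just an illustration; the path and limit are example values from my setup.
Code:
#!/usr/bin/python2
# Conceptual sketch of "delete oldest files until under quota".
# Not direg's implementation; path and limit are example values.
import os

def trim_to_quota(path, max_bytes):
    # Collect (mtime, size, filename) for every file under path.
    files = []
    total = 0
    for root, dirs, names in os.walk(path):
        for name in names:
            full = os.path.join(root, name)
            st = os.stat(full)
            files.append((st.st_mtime, st.st_size, full))
            total += st.st_size
    # Remove the oldest files first until the directory fits its quota.
    for mtime, size, full in sorted(files):
        if total <= max_bytes:
            break
        os.remove(full)
        total -= size

trim_to_quota('/srv/ftp/cameras/EAST-IPC', 850 * 10**9)  # 850GB example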
Getting this working has been a godsend. Deleting files older than X days was unreliable, because on some days the space used was much bigger due to alerts/alarms going off, so that approach was only workable if I left ~500 GB free as a fudge factor.
Cheers,
-R