IPCam FTP Cleanup (*nix)

nayr

If you're like me and have your IP cameras recording 24/7 to a local FTP server, you've run into the problem of deleting old files before the drive fills up. I've found a great tool that will help us out.

The project is called py-direg. It's a Python project that lets you regulate directories, i.e. specify how much space each camera can have, and it keeps your recordings within those limits. AWESOME! It'd probably work on OSX with some MacPorts help, and if you're determined enough it might work on Windows under Cygwin.

https://github.com/jakejohns/py-direg
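
If it helps, here's one way to fetch the project so the paths later in this post line up (my crontab below assumes the script lives at ~/py-direg-master/direg.py, which is where GitHub's master tarball extracts to):
Code:
# grab the master branch and unpack it into ~/py-direg-master
cd ~
curl -L https://github.com/jakejohns/py-direg/archive/master.tar.gz | tar xz
# make the script executable so cron can call it directly
chmod +x ~/py-direg-master/direg.py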

Requirements:
Python 2 (Python 3 breaks it, though you can have both installed at the same time)
pip for Python 2 (pip2)


To get it working on Arch Linux I had to install python2 and python2-pip, then run:
Code:
pip2 install docopt
pip2 install humanfriendly
Then edit the shebang (top line) of the direg.py script, because I have both python2 and python3 installed:
Code:
#!/usr/bin/python2
After that the script ran perfectly fine. Here is a sample of my ~/.direg.py configuration file:
Code:
# cat .direg.py 
directories = [
    {
        'path': '/srv/ftp/cameras/EAST-IPC',
        'test': 'max_size',
        'max_size': '850GB'
    },
    {
        'path': '/srv/ftp/cameras/North-IPC',
        'test': 'max_size',
        'max_size': '850GB'
    },
    {
        'path': '/srv/ftp/cameras/West_PTZ',
        'test': 'max_size',
        'max_size': '850GB'
    },
    {
        'path': '/srv/ftp/cameras/south-ipc',
        'test': 'max_size',
        'max_size': '850GB'
    }
]
This gives each camera 850GB; 850GB x 4 cameras = 3400GB, and that's the formatted size of my 4TB disk with a 250GB partition used elsewhere.

If you want to keep tight tolerances, set it to run every 10 minutes and set your camera's recording bundle size to 10 minutes.
Here is my crontab; the second entry runs hourly and removes the empty folders that direg.py leaves behind:
Code:
*/10 * * * * ~/py-direg-master/direg.py
@hourly find /srv/ftp/cameras/* -type d -empty -exec rmdir {} \; > /dev/null 2>&1
And here is an example of the debug output as it cleans up a camera folder that was too big; as you can see, it deletes the oldest files until the usage is back within the allowance.
http://pastebin.com/aXkwARfF

Getting this working has been a godsend. Deleting files older than X days had become unreliable, because on some days the space used was much bigger due to alerts/alarms going off, so that approach wasn't workable unless I left ~500GB free as a fudge factor.
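
For comparison, the age-based cleanup I was replacing boiled down to something like the snippet below (the 14-day cutoff is just an example); it works fine until a busy day of alerts blows past whatever headroom you left:
Code:
# old approach: delete recordings older than 14 days, then prune any folders left empty
find /srv/ftp/cameras -type f -mtime +14 -delete
find /srv/ftp/cameras/* -type d -empty -exec rmdir {} \;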

Cheers,
-R
 

nayr

This has been working wonderfully in production for over a week now. So much better than finding files older than X days and removing them, and I really like that I can set a quota for each camera, letting the more important cameras keep a longer history than others depending on need.

I have since tweaked the config; here is my current setup:
Code:
directories = [
    {
        'path': '/srv/ftp/cameras/EAST-IPC',
        'test': 'max_size',
        'max_size': '700GB'
    },
    {
        'path': '/srv/ftp/cameras/North-IPC',
        'test': 'max_size',
        'max_size': '940GB'
    },
    {
        'path': '/srv/ftp/cameras/West_PTZ',
        'test': 'max_size',
        'max_size': '940GB'
    },
    {
        'path': '/srv/ftp/cameras/south-ipc',
        'test': 'max_size',
        'max_size': '700GB'
    }
]
And it's keeping them right where I need them. It goes slightly over for 10 minutes and then comes back down, never by more than a few hundred MB, though the output below rounds up to the next GB.
Code:
[nayr@dispatch cameras]# du -hs *
701G    EAST-IPC
941G    North-IPC
941G    West_PTZ
701G    south-ipc

[nayr@dispatch cameras]# df -h ./
Filesystem      Size  Used Avail Use% Mounted on
/dev/sda2       3.4T  3.3T   13G 100% /srv/ftp
I could probably chew up even more of that 13GB avail, but I'm leaving it in place with an SMS alert that goes to me if it drops below 10GB, just in case the script fails, so I'm notified before the disk actually runs out of space.
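
The alert itself is nothing fancy; a cron'd free-space check along these lines does the job (sendsms here is just a stand-in for whatever notification tool you use):
Code:
#!/bin/bash
# warn me when /srv/ftp drops below 10GB free; sendsms is a placeholder for your own alerting command
FREE_GB=$(df --output=avail -BG /srv/ftp | tail -1 | tr -dc '0-9')
if [ "$FREE_GB" -lt 10 ]; then
    sendsms "WARNING: /srv/ftp is down to ${FREE_GB}GB free, check direg.py"
fi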

Running the script every 10 minutes has not added any noticeable load to my tiny lil ARM-powered FTP server (load average: 0.68, 0.43, 0.44, basically ~90% idle).

Doing a lil research, it seems it should be possible to run this script in a jail on a FreeNAS server. I don't know how hard or easy that would be; perhaps I'll give it a shot and document it if anyone needs it.
 