Just finished up a simple method for doing a MySQL database dump and uploading the output to Amazon S3 for Django projects. You can grab the code from GitHub. There are two files, one is the Python code to generate the dump and push it over to S3 and the other is a bash script to let the Python script be run as a cron job.
The Python script shouldn't need any modification to work with any Django project, assuming you're already using S3 as your storage backend. If you're not, you'll need to install the Python S3 library and add the following to your settings.py file:
AWS_ACCESS_KEY_ID = 'your_key_goes_here'
AWS_SECRET_ACCESS_KEY = 'your_secret_key_goes_here'
AWS_STORAGE_BUCKET_NAME = 'your_bucket_name_goes_here'
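For anyone curious what the two steps look like before grabbing the repo, here's a minimal sketch of the same idea: run mysqldump, gzip the result, and push the file to S3. The function and file names here are my own illustration, not the repo's, and I'm using boto3 for the upload for the sake of the example, whereas the actual script uses the older "Python S3 library" (boto).

```python
import datetime
import subprocess


def dump_filename(db_name, when=None):
    """Timestamped name for the dump file, e.g. mydb-2009-05-01.sql.gz."""
    when = when or datetime.date.today()
    return "%s-%s.sql.gz" % (db_name, when.isoformat())


def mysqldump_command(db_name, user, password, outfile):
    """Shell pipeline that dumps the database and gzips the result."""
    return "mysqldump -u%s -p%s %s | gzip > %s" % (user, password, db_name, outfile)


def backup(db_name, user, password, bucket_name, access_key, secret_key):
    """Dump the database, then upload the file to S3 (needs valid credentials)."""
    outfile = dump_filename(db_name)
    subprocess.check_call(mysqldump_command(db_name, user, password, outfile),
                          shell=True)
    # boto3 shown for illustration; the original script uses the older boto library.
    import boto3
    s3 = boto3.client("s3", aws_access_key_id=access_key,
                      aws_secret_access_key=secret_key)
    s3.upload_file(outfile, bucket_name, outfile)
```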
This code was inspired by Rama Vadakattu's backup script.
The shell script is set up to work with MediaTemple's Grid Service; if you host with another company, YMMV. Once you have the shell script configured, you can test it by running bash cron.sh in the directory it lives in. Then you'll want to set up a cron job to run the script at regular intervals.
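For example, a crontab entry along these lines (added with crontab -e) would run the backup nightly; the path here is a placeholder for wherever you put cron.sh:

```
# Run the backup script every night at 3:00 AM
0 3 * * * /bin/bash /home/youruser/scripts/cron.sh
```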
If anyone would like to help out, I'm thinking it would be nice to modify the Python script a bit so that a full dump is saved once per week, and throughout the week only diffs against that latest full version are saved. That would significantly lower the overall storage size, and reassembling a current snapshot wouldn't take much effort. If you come up with modifications like that, leave a link to the code in the comments.
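To sketch what I mean (just an illustration, with hypothetical filenames): since mysqldump output is plain text, the Unix diff tool can produce a small patch against the week's full dump, and patch can replay those diffs later to rebuild a current snapshot.

```python
import datetime


def backup_mode(when):
    """Take a full dump on Sundays, a diff against the last full dump otherwise."""
    return "full" if when.weekday() == 6 else "diff"


def backup_command(db_name, when):
    """Shell command for today's backup (filenames are hypothetical)."""
    full = "%s-full.sql" % db_name
    if backup_mode(when) == "full":
        return "mysqldump %s > %s" % (db_name, full)
    # `diff` emits a patch that `patch` can apply to the weekly full dump
    # to reconstruct the current state of the database.
    daily = "%s-%s.diff" % (db_name, when.isoformat())
    return "mysqldump %s | diff %s - > %s" % (db_name, full, daily)
```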