Cedeus DB backups
>> return to Cedeus_IDE
How to set up Automated Backups
The objective of this exercise is to have an automated backup process for user profiles and user-contributed data, with the backups copied to a portable medium at least once a week.
General Workflow to Create the Backups
Creating the backups involves several steps. Usually they consist of:
- create a script that contains commands to
- create a database dump =or= tar/zip the files in a particular folder (see the sketch after this list)
- copy this dump file or zip archive to another machine from where it can easily be copied to a portable medium, e.g. tape
- create a crontab entry that runs the backup script(s) at some set interval, e.g. each night at 1am
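The database-dump variant is covered in detail below; for the tar/zip variant, a minimal sketch of such a script, where all paths are placeholders rather than the actual CEDEUS folders:
#!/bin/bash
# sketch: archive an upload folder with a timestamp in the file name (all paths are placeholders)
backup_dir="/home/ssteinig/file_backups"
timeslot=`date '+%Y%m%d-%H%M'`
tar -czf $backup_dir/uploaded_files-$timeslot.tar.gz /path/to/the/uploaded/files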
Below are some personal notes on how to set things up:
Notifications
To get notified about the backups via email, the backup shell script may send emails via "mailx", i.e. Nail => see http://klenwell.com/press/2009/03/ubuntu-email-with-nail/
Btw., Postfix may work as well.
=> ToDo: Install a mail program
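Once a mail program is installed, the notification could be as simple as piping the end of the log file to mailx after the backup has run; a minimal sketch (the recipient address is a placeholder):
# mail the last log lines once the backup script has finished (recipient is a placeholder)
tail -16 /home/ssteinig/geonode_db_backups/pgsql.log | mailx -s "GeoNode DB backup" someone@example.org
This mirrors the commented-out mailx line at the end of the dump script below.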
Example: cron Job that makes a Dump of the GeoNode DB
- create a shell script that contains the pg_dump instructions - see for example /home/ssteinig/pgdbbackup.sh on CedeusDB
- test if the script or the script execution actually works. A simple script for testing may be this one (/home/ssteinig/touchy.sh):
#!/bin/bash
touch /home/ssteinig/ftw.text
- create a crontab entry for user ssteinig with "crontab -e", then add an entry such as "00 01 * * * sh /home/ssteinig/geonodegisdb93backup.sh" to run the dump script daily at 1am
- => when using the user "postgres" to do the db dump:
- check if the postgres user has a password assigned already (use ALTER... to do so: http://wiki.geosteiniger.cl/mediawiki-1.22.7/index.php/Setting_up_geonode#Some_PostgreSQL_commands )
- create a .pgpass file to provide the password: http://wiki.postgresql.org/wiki/Pgpass (see the sketch after this list)
- the .pgpass file should have chmod 0600
- check also if the cron daemon is running: "sudo service cron status", otherwise start it...
- to see what the crontab contains use "crontab -l"
Dump example script geonodegisdb93backup.sh
#!/bin/bash
logfile="/home/ssteinig/geonode_db_backups/pgsql.log"
backup_dir="/home/ssteinig/geonode_db_backups"
touch $logfile
echo "Starting backup of databases " >> $logfile
dateinfo=`date '+%Y-%m-%d %H:%M:%S'`
timeslot=`date '+%Y%m%d-%H%M'`
/usr/bin/vacuumdb -z -h localhost -U postgres geonodegisdb93 >/dev/null 2>&1
/usr/bin/pg_dump -U postgres -i -F c -b geonodegisdb93 -h 127.0.0.1 -f $backup_dir/geonodegisdb93-backup-$timeslot.backup
echo "Backup and Vacuum complete on $dateinfo for database: geonodegisdb93 " >> $logfile
echo "Done backup of databases " >> $logfile
# sstein: email notification not used at the moment
# tail -16 /home/ssteinig/geonode_db_backups/pgsql.log | mailx blabla@blub.cl
This example is based on the shell script posted here: http://stackoverflow.com/questions/854200/how-do-i-backup-my-postgresql-database-with-cron . For a better Postgres dump script it may be worth looking here: https://wiki.postgresql.org/wiki/Automated_Backup_on_Linux
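Since the dump uses Postgres' custom format (-F c), restoring it would go through pg_restore rather than psql; a minimal sketch, where the timestamp part of the file name is just the pattern the script produces:
# restore one of the custom-format dumps into an existing (empty) database
pg_restore -U postgres -h 127.0.0.1 -d geonodegisdb93 /home/ssteinig/geonode_db_backups/geonodegisdb93-backup-YYYYMMDD-HHMM.backup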
File transfer
To transfer files I decided, for safety reasons, to create a new cedeus backup user on the receiving computer (20xxb...p).
A file transfer can be accomplished using scp or, better, rsync, e.g.:
- "scp /home/ssteinig/ftw.txt user@example.com:/home/backup_user/dbbackups/"
- However, an ssh key should be generated first so that no password needs to be provided. A detailed description can be found at: http://troy.jdmz.net/rsync/index.html
- in short, do "ssh-keygen -t rsa -b 2048 -f /home/thisuser/cron/thishost-rsync-key". But do not provide a passphrase when generating it, otherwise it will always ask for it when establishing a connection.
- Then copy the key to the other server's user's .ssh folder (using scp), and add it to the authorized_keys. (Note: authorized_keys should be chmod 600, and the .ssh folder chmod 700.)
- Then we would use "scp -i /home/ssteinig/cron/thishost-rsync-key /home/ssteinig/ftw.txt user@example.com:/home/backup_user/dbbackups/" - note that it is probably necessary to initialize a server connection once (with whatever file), so the connection gets an ECDSA key fingerprint.
- having my ssh keys set up, the code for syncing the cedeusdb directory with rsync would be (see the sketch after this list):
- "...ToDo..."
Performed CEDEUS Observatory Backups
Dump of the GeoNode DB - on CedeusDB
- server: CedeusDB
- cron job running nightly at 1:00am
- using the script geonodegisdb93backup.sh
- copies the PG dump file to CedeusGeoNode into folder /home/cedeusdbbackupuser/geonodedbbackups/
- => ToDo: perhaps change this last step and copy it to cedeusgis1 for straight backup on a drive
Dump of the GeoNode user db - on CedeusGeonode VM
blabla
tar/zip of the (uploaded) GeoNode file data and docs - on CedeusGeonode VM
blabla
MySQL dump for Elgg miCiudad - on CedeusGeonode VM
blabla
tar/zip of the (uploaded) Elgg miCiudad files - on CedeusGeonode VM
blabla
MySQL dump for Mediawiki(s) - on CedeusGeonode VM
blabla
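The MySQL-based sections above are still placeholders; for reference, a generic mysqldump sketch along the lines of the Postgres script (database name, user, and paths are assumptions, not the actual Elgg/Mediawiki setup):
#!/bin/bash
# sketch: nightly MySQL dump, analogous to geonodegisdb93backup.sh (all names are placeholders)
backup_dir="/home/ssteinig/mysql_db_backups"
timeslot=`date '+%Y%m%d-%H%M'`
# credentials would normally come from a ~/.my.cnf with chmod 0600 instead of the command line
mysqldump --single-transaction -u backupuser some_database | gzip > $backup_dir/some_database-$timeslot.sql.gz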