I’m setting up a backup system on my Linux web server. The backup will run nightly to an external hard drive. Typically, what is the best way to handle backups? I see two possibilities: back up the complete server system, including operating-system files and configs (everything under /), or back up just the website files in the user directories (for example, /home/www). Is one approach or the other bad practice?
With our corporate backups we usually back up all website data, including a compressed copy of any databases. Along with the website data, we grab the configuration files for any services that would need to be set up again after a server failure; these include cPanel, MySQL, PostgreSQL, and others. If customers host email with us, we grab that as well. There should be no need to back up an entire web server; just focus on what you need to get back up and running quickly. In our company's disaster-recovery test runs we can usually provision a server and be back up within an hour after a failure, because we only grab what we need. Sorting through things after a failure is always time-consuming.
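As a rough illustration of that approach, here is a minimal sketch of such a nightly script. The paths (/home/www, /mnt/backup, the /etc directories) and the assumption that mysqldump can authenticate via a ~/.my.cnf file are mine, not details from the answer above; adjust them to your own layout.

    #!/bin/bash
    # Minimal "grab only what you need" nightly backup sketch.
    set -euo pipefail

    STAMP=$(date +%F)            # e.g. 2024-01-31
    DEST="/mnt/backup/$STAMP"    # external drive mount point (assumed)
    mkdir -p "$DEST"

    # 1. Website data
    tar -czf "$DEST/www.tar.gz" -C / home/www

    # 2. Compressed dump of all MySQL databases
    #    (--single-transaction gives a consistent snapshot of InnoDB tables)
    mysqldump --all-databases --single-transaction | gzip > "$DEST/mysql-all.sql.gz"

    # 3. Config files for the services you would have to rebuild
    tar -czf "$DEST/configs.tar.gz" -C / etc/mysql etc/apache2 etc/postfix

A crontab entry such as "0 2 * * * /usr/local/bin/nightly-backup.sh" (the path is hypothetical) would run it nightly at 2 a.m.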
Another related question pertaining to the backup of databases: on my server I have a shell script that runs mysqldump on all databases and stores the resulting SQL files. These files then get backed up along with the website.
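For reference, a script along those lines might look something like the sketch below. The output directory and the list of internal MySQL schemas to skip are assumptions, since the original script isn't shown.

    #!/bin/bash
    # Dump each database to its own .sql file so the site backup picks them up.
    set -euo pipefail

    OUT="/home/www/backups/db"   # assumed location inside the backed-up tree
    mkdir -p "$OUT"

    # List every database except MySQL's internal schemas, then dump each one.
    for DB in $(mysql -N -e 'SHOW DATABASES' \
                | grep -Ev '^(information_schema|performance_schema|mysql|sys)$'); do
        mysqldump --single-transaction "$DB" > "$OUT/$DB.sql"
    done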
Is it possible to back up MySQL databases without doing a dump, or is the way the data is stored on disk inaccessible by any means other than a dump?