A plan for Unix backup of site and database

I have found a couple of resources on backing up my sites (mostly Joomla), but I also want this to be somewhat automated. Akeeba Backup for Joomla is excellent, and it's cheap to get the automated version, but I wanted something that covers all of my sites and their databases automatically, or at least just the folders that I change.

I came across the Unix commands tar and dump. Dump was deemed better in some research and tests done by the author, but I never take a single source's word for things, especially online, and time moves on.
So here I am asking what others use. Also, since Joomla is a CMS, I only need the folders where user-generated content gets added, so something like the following, if automated, could help:
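As a starting point, a minimal sketch of archiving one content folder with tar (the site name and paths here are placeholders for illustration; point them at your real folders):

```shell
#!/bin/sh
# Sketch: tar up one user-content folder into a dated archive whose name
# encodes the site and the folder, so different sites' archives can't
# collide. SITE, DIR and BACKUP_DIR are assumptions -- substitute your own.
SITE=mysite
DIR=$HOME/public_html/images       # e.g. Joomla's uploaded images
BACKUP_DIR=$HOME/backups
STAMP=$(date +%Y-%m-%d)

mkdir -p "$BACKUP_DIR" "$DIR"      # $DIR created here only so the sketch runs
ARCHIVE="$BACKUP_DIR/$SITE-$(basename "$DIR")-$STAMP.tar.gz"

# -c create, -z gzip-compress, -f write to the named file;
# -C changes into the parent first so the archive holds relative paths
tar -czf "$ARCHIVE" -C "$(dirname "$DIR")" "$(basename "$DIR")"
echo "wrote $ARCHIVE"
```

Restoring is then just `tar -xzf` of the archive from the site's parent directory.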


That could be set up for each folder I want to cover, and then I just need the database backing up - I hope there is a command that can dump that too. I have seen something in phpMyAdmin that can output all the data as the SQL commands needed to rebuild the database from scratch, but it isn't automated…
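There is such a command: mysqldump produces exactly that kind of rebuild-from-scratch SQL, but from the command line, so cron can run it. A minimal sketch (the database name and backup directory are placeholders; credentials are assumed to live in your `~/.my.cnf` so they never appear on the command line):

```shell
#!/bin/sh
# Sketch: dump one MySQL database to a dated, compressed file.
# DB_NAME and BACKUP_DIR are assumptions -- substitute your own.
DB_NAME=joomla_db
BACKUP_DIR=$HOME/backups
STAMP=$(date +%Y-%m-%d)
OUTFILE=$BACKUP_DIR/$DB_NAME-$STAMP.sql.gz

mkdir -p "$BACKUP_DIR"
# --single-transaction gives a consistent snapshot of InnoDB tables
# without locking them while the dump runs
mysqldump --single-transaction "$DB_NAME" | gzip > "$OUTFILE"
```

Restoring is the reverse: `gunzip < file.sql.gz | mysql joomla_db`.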

So with your help maybe we can come up with a script for us all and run it from cron daily. If you are on a shared server, chances are this is done for you. In this case I am doing it for myself and need to get it automated.

1.) Back up the folders that hold user-generated content into an archive file per folder, named so the archives won't clash and overwrite each other; also cover any site changes that I make, like CSS mods etc.
2.) Back up the SQL database.
3.) Perhaps delete older archives automatically, for example never keep anything older than a month.

This is my suggested starting point. I would like to know the limits and how to achieve these things using, for example, cron and Unix commands…
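Putting the three steps together, one possible sketch of the whole nightly job (the folder list, database name, and paths are all assumptions, not a finished solution):

```shell
#!/bin/sh
# Sketch of a nightly backup covering the three steps above: archive each
# content folder, dump the database, prune archives older than a month.
BACKUP_DIR=$HOME/backups
STAMP=$(date +%Y-%m-%d)
mkdir -p "$BACKUP_DIR"
mkdir -p "$HOME/site1/images"      # demo folder so this sketch runs end-to-end

# 1) one dated archive per folder; the path is flattened into the name
#    (slashes -> underscores) so two sites' "images" folders can't clash
for dir in "$HOME/site1/images" "$HOME/site2/media"; do
    [ -d "$dir" ] || continue
    name=$(echo "${dir#$HOME/}" | tr '/' '_')
    tar -czf "$BACKUP_DIR/$name-$STAMP.tar.gz" \
        -C "$(dirname "$dir")" "$(basename "$dir")"
done

# 2) dump the one database this site uses (credentials in ~/.my.cnf)
mysqldump --single-transaction joomla_db | gzip \
    > "$BACKUP_DIR/joomla_db-$STAMP.sql.gz"

# 3) delete dated archives older than roughly a month
find "$BACKUP_DIR" -name '*-????-??-??.*' -mtime +30 -delete
```

Saved as, say, `~/bin/nightly-backup.sh` and made executable, a crontab line like `30 2 * * * $HOME/bin/nightly-backup.sh` would run it at 02:30 every day.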

Some other resources:

Here is a very similar post, but with closer answers. I will look into tailoring it for my needs; I don't want to back up all databases, but I think my solution can be built from it.