Rsync beats FileZilla hands down

Eventually got round to replacing FileZilla with rsync and it is fantastic :slight_smile: A single click now pushes all local file changes to the server and takes about five seconds!

It is well worth taking the trouble to set up this method because it completely eliminates the need for FileZilla or other upload file-transfer programs.

Rsync stands for “remote sync”; it is a remote and local file synchronization tool. It uses an algorithm that minimizes the amount of data copied by only transferring the portions of files that have changed.

A link on the localhost home page sets $_GET['rsync'], which is tested, and if true then rsync:

  1. checks which local files need updating (see the dry-run sketch after this list)
  2. compresses the files that need updating for transfer
  3. uploads, decompresses and overwrites ONLY the outdated files
  4. sets all server file permissions, time stamps, etc.
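For reference, a minimal dry-run sketch of step 1 - the paths are the same placeholders used in the COMMON FILE below, and it assumes SSH access is already working:

<?php
// Dry-run sketch: list the files rsync WOULD transfer, without changing
// anything on the server. Placeholder paths, as in the COMMON FILE below.
  $HERE  = '/var/www/EXAMPLE.COM/src_files/';
  $THERE = 'SSH_ROOT_USER@123.123.123.123:/var/www/EXAMPLE.COM/src_files/';

// -n (--dry-run) previews only; --itemize-changes shows why each file is
// considered outdated (size, timestamp, permissions, etc.)
  exec("/usr/bin/rsync -ratlzn --itemize-changes $HERE $THERE", $output);
  echo '<pre>' . htmlspecialchars(implode("\n", $output)) . '</pre>';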

Platform:

Desktop: Linux Ubuntu 18.04 with SSH setup
Remote: Linux Ubuntu Server
Linux SSH: Setup
RSYNC: Setup

Usage:

I created the following link in home-page.php:

<?php 
// RSYNC PAGE LINK or Version
  $rsync = '<i class="flr fss"> &nbsp; Ver: 5.505 &nbsp; </i>';
  if(LOCALHOST):
    $rsync = '<a class="flr fss" href="?rsync"> rSync &nbsp; </a>';
  endif;  
  echo '<h5>' . $rsync . '</h5>';

COMMON FILE

<?php
// COMMON TO ALL PAGES - mine is in the SuperClass
  $rsync = isset($_GET['rsync']) ? true : false;
  if(LOCALHOST && $rsync):
   // SET LOCAL and REMOTE PATHS 
    $HERE   = '/var/www/EXAMPLE.COM/src_files/'; 
    $THERE  = 'SSH_ROOT_USER@123.123.123.123:/var/www/EXAMPLE.COM/src_files/';
    $USER   = 'SSH_USER';  
    $PWORD  = 'SSH_PASSWORD';

    // double quotes so that $PWORD, $USER, $HERE and $THERE are interpolated
    $tmp    = "/usr/bin/rsync -ratlz --rsh=\"/usr/bin/sshpass -p $PWORD ssh -o StrictHostKeyChecking=no -l $USER\" $HERE $THERE";

    $ok = exec($tmp);
  endif;

SPECIAL NOTE:

To prevent the SSH password from being entered every time, I used sshpass with the password supplied via the --rsh option, as shown in the command above.


So, what’s the use case? Looks good for backups, but I prefer using Git for deployment.

And why do you disable the security feature? It’s relatively easy to just create the keys on both sides and use key-based authentication.

I develop websites locally and, instead of using FileZilla for uploading them to the live server, have started to use rsync, which is a lot quicker.

I use the online files as my GitHub.

The security feature is disabled temporarily because I have yet to correctly set SSH… maybe tomorrow :slight_smile:
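Once the keys are in place (e.g. generated with ssh-keygen and copied over with ssh-copy-id), something along these lines should work without sshpass or disabled host-key checking - a sketch only, using the same placeholder paths as above:

<?php
// Sketch: with SSH keys exchanged between desktop and server, rsync can
// authenticate without a password, so sshpass and
// StrictHostKeyChecking=no are no longer needed.
  $HERE  = '/var/www/EXAMPLE.COM/src_files/';
  $THERE = 'SSH_USER@123.123.123.123:/var/www/EXAMPLE.COM/src_files/';

  $tmp = "/usr/bin/rsync -ratlz $HERE $THERE";
  $ok  = exec($tmp);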

I use rsync as well, but I never overwrite the current website with new files, because while you’re uploading you’re in an intermediate state where files of the old version are mixed with files from the new version, and you can’t be sure at all what will happen when a request comes in at that time.

Instead I have a folder releases that contains releases of the website, and a symlink current that points to the currently active version:

For example:

/releases
  /20190219-201211
  /20190222-092912
  /20190301-125432
/current # Symlink to releases/20190301-125432

Then when I upload a new site I create a new directory in releases, upload all files to it, and switch the symlink to it, so the entire site changes as one atomic action.

An additional benefit here is that if there is something wrong with the current version you can just point the symlink to a previous release.
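A minimal PHP sketch of the switch itself, assuming the layout above; the key point is that rename() replaces the symlink atomically, so a request never sees a half-switched site:

<?php
// Sketch: point "current" at the freshly uploaded release, atomically.
// Paths are placeholders matching the layout above.
$release = '/var/www/EXAMPLE.COM/releases/20190301-125432';
$current = '/var/www/EXAMPLE.COM/current';

// Create the new symlink under a temporary name, then rename() it over
// "current"; rename() is atomic on the same filesystem.
$tmpLink = $current . '-new';
@unlink($tmpLink);
symlink($release, $tmpLink);
rename($tmpLink, $current);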


There are also tools to automate this, like Capistrano for Rails. I’m sure there’ll be something similar for PHP.

*: As long as your situation is narrowly defined as ‘you have SFTP access to the remote server’, since this program uses that protocol.


Does it mean you have a whole website in each of the release folders? If so then you need to upload all files to it, or at least make a copy from a previous release and upload the changes. This might work well if a site is small, but deploying large sites might be time consuming and resource hungry.

I’m not sure I want to use rsync for server uploads. Maybe very occasionally, when I want to make sure the server matches my local copy - just in case I failed to upload a file in the past for some reason, or some other corruption happened. When I upload a site initially I just send a zip file and unzip it on the command line. When I make an update I export the changed files from git and send them via FTP or SSH - and if I need a very fast update I can unzip a file with the changes, which usually takes a split second.

Rsync looks like a tool more for backups than site deployments. Often there are some files and folders which differ between the local server and the remote one (like configs or user data), so I’d need to configure rsync to handle those exceptions - and while I have all of those already configured in git, I’d need to keep those configurations mirrored in my rsync setup.

Probably for cases when I need an exact remote replica of my local files I imagine rsync would be a good solution. But I hardly ever have that use case.

I bitterly learnt a long time ago that having different config files could be a problem and have adopted the following work-around:

<?php 
   defined('LOCALHOST') 
   || 
   define('LOCALHOST', 'localhost'===$_SERVER['SERVER_NAME'] );

// usage
   $uName = 'onlineUserName'; // default
   $pWord = 'onlinePassword'; // default
   if(LOCALHOST):
     $uName = 'localhostUserName'; 
     $pWord = 'localhostPassword'; 
     $uData  = ['special', 'user', 'data', 'goes', 'here'];
   endif;

// similar for debugging, backup, etc
   $debugStuff = '';
   if(LOCALHOST):
     // displayed beneath banner or in the footer?  
     $debugStuff = '
       <div class="btns">
         <a href="?rsync=application"> Application </a>
         <a href="?rsync=writable"> Writable </a>
         <a href="?rsync=assets"> Assets </a>
       </div> 
      ';
   endif;
   echo $debugStuff;

   $rsync = isset($_GET['rsync']) ? true : false;
   if(LOCALHOST && $rsync): 
     // RSYNC
   endif;

The above three $debugStuff links cater for the CodeIgniter framework, which has two frequently used main sections plus the ./assets/ folder. Fewer resources are needed because FileZilla or a terminal program is not required.


Perhaps try having the localhost files mirror the online server; I find it makes the setup easy to recognise, and updating can be done safely without having to check for any differences.

I use this strategy for Rails apps and yup, you have the whole app in each of the folders, with the caveat that you symlink anything which is shared between the releases (such as config files, log files, the public directory etc).

It’s a great strategy really, as if you notice that your latest deploy has broken something, you can roll everything back with a single command.

You also need to configure the deploy tool to say how many releases you want to keep on the server, as otherwise you will run out of disk space at some point.

Yes

Actually the entire site is packed as a .tar.gz and extracted on the host

No, I like there to be one canonical copy of the software, impossible to be mixed with any older versions. Uploading everything anew keeps the process simple and predictable. Plus, with internet speeds being what they are nowadays, uploading isn’t as much of an issue as it used to be (actually with ~500Mbit up/down at the office nobody really cares).

Time consuming no (see above), resource hungry yes, which is why I only keep the latest 5 releases. As soon as a 6th is released the oldest is removed. That still gives some leeway to roll back a bit, while at the same time not cramming the hard disk with ancient builds.
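For example, a small sketch of the pruning step (the timestamp-named release folders sort chronologically, so the oldest come first; the path is a placeholder):

<?php
// Sketch: keep only the newest five releases and delete the rest.
$keep     = 5;
$releases = glob('/var/www/EXAMPLE.COM/releases/*', GLOB_ONLYDIR);
sort($releases); // oldest first

foreach (array_slice($releases, 0, max(0, count($releases) - $keep)) as $old) {
    exec('rm -rf ' . escapeshellarg($old));
}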

If anyone is interested, we’re using Ansible deploy helper to manage what I’ve described in my earlier posts.

Oh, this is the way I used to do it in the old days and I don’t really want to go back to it - it gets very messy unless you are the only developer using a single dev environment. Different config files are no problem at all and are essential for several reasons:

  1. There may be many developers working on an application or one developer may have several different testing environments - if we went your way then we’d need to have a long list of all of the config data in the application code, inserted within creative if statements that do their best to identify the current environment.

  2. Having database credentials, etc. in the source code might be a security hazard. There are scenarios where you might want to send the code to someone without them knowing the production credentials, or when putting the code into a VCS.

Have you looked at dotenv? It follows a similar idea of having individual config files which are separate from the source code.
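A minimal sketch with vlucas/phpdotenv (v4 or later), where each environment keeps its own .env file out of version control - the DB_USER/DB_PASS keys are just example names:

<?php
// Sketch using vlucas/phpdotenv (composer require vlucas/phpdotenv).
// The .env file holds this environment's credentials and is not committed.
require __DIR__ . '/vendor/autoload.php';

$dotenv = Dotenv\Dotenv::createImmutable(__DIR__);
$dotenv->load();

// usage - DB_USER and DB_PASS are whatever keys you define in .env
$uName = $_ENV['DB_USER'];
$pWord = $_ENV['DB_PASS'];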

Also, user data like file uploads may be different on the live server, so it doesn’t make sense to synchronize them with rsync.


[off-topic]
A work-around for each developer is to create their own config-files above-the-root:

<?php 
$config    = 'config.php'; // default to online server
$aboveRoot = dirname($_SERVER['DOCUMENT_ROOT'] ) 
           . '/developerPath/' 
           . $config;
if( file_exists($aboveRoot) ): // NOT RSYNCED
  $config = $aboveRoot; 
endif;
require $config;

[/off-topic]

I have only looked briefly, and it appears comprehensive and quite involved.

The supported version seems quite expensive, and I am not sure of the terms, conditions and expiry date of the trial version.

Edit:
Further investigation shows the GitHub source is available and it is just the support which is expensive.

[off-topic]
I like…
Releases are named after Led Zeppelin songs. (Releases prior to 2.0 were named after Van Halen songs.)
[/off-topic]

Do you use Ansible for private and personal websites?

The simple sites I develop using the CodeIgniter PHP framework have similar deployment structures, which can be changed in the common index.php. I have even used a GET parameter to temporarily set the online development version before hard-coding the latest paths. Fun and games with browser-cached file versions :frowning:

Currently only for work.

I don’t have any private website running at the moment, but if I did I would probably use Ansible there too.

By making a copy from a previous release I mostly had in mind the user data like uploads, etc., which can sometimes take up most of the disk space. How do you handle that? @James_Hibbard mentioned sharing the public folder between the releases - how can this be done? Symlinking to the common public folder from each of the release folders?

This looks like a nice strategy if atomic deployments are critical. But if I had to have several copies of the whole site including the uploads then I might not have enough disk space for more than one copy of the site.

I wonder if someone hasn’t already come up with a solution to the disk space problem - something like an incremental file system where you can add a new directory with files and specify that those files are not to be stored physically in whole but as a delta difference against the previous directory. Then it would be possible to store many versions of a project with minimal disk space requirements.

Exactly that, yes.

/releases
  /20190219-201211
    /uploads # Symlink to shared/uploads
  /20190222-092912
    /uploads # Symlink to shared/uploads
  /20190301-125432
    /uploads # Symlink to shared/uploads
/current # Symlink to releases/20190301-125432
/shared
   /uploads

Or upload your files to something like AWS S3 or GCP Cloud Storage etc., and serve them directly from there. It solves your disk problem, and gets around the problem of having to have all shared files on all servers in a multi-server setup. Sadly this is not really plug-and-play in most systems yet, but for custom builds it’s not that hard, especially with packages like Flysystem.

This also fits well with immutable infrastructure, which is an awesome thing to have.
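As a rough sketch of that approach with league/flysystem v1 and its AWS S3 adapter - bucket name, region and file paths are made up, and credentials are assumed to come from the environment:

<?php
// Sketch: store uploads in S3 via Flysystem instead of a local shared folder.
require __DIR__ . '/vendor/autoload.php';

use Aws\S3\S3Client;
use League\Flysystem\AwsS3v3\AwsS3Adapter;
use League\Flysystem\Filesystem;

$client = new S3Client([
    'version' => 'latest',
    'region'  => 'eu-west-1', // assumption
]);

$filesystem = new Filesystem(new AwsS3Adapter($client, 'example-uploads'));

// uploads go straight to the bucket, not to shared/uploads on the server
$filesystem->put('avatars/avatar.png', file_get_contents('/tmp/avatar.png'));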


Yup. Exactly that :slight_smile:

