Synchronizing Composer's vendor folder via FTP when deploying

I’ve searched for this but couldn’t find any established method. Quite often I have only FTP access to a server and can’t use SSH or run Composer on the remote site. If my dependencies change and I need to synchronize the contents of the vendor folder, how do I do it? Are there any ready-made tools for that? Uploading thousands of files each time via FTP is slow as hell. Some FTP programs offer synchronization tools, but they can’t be relied upon if they only take time or file size into account - and they are slow, too.

When I upload files from my projects I usually do a git diff to export only the changed files and upload them in one go - this works well since there usually aren’t many of them. But the vendor folder is outside version control, so I don’t know which files have changed, nor can I export them for upload.
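
For reference, even that export step can be scripted - roughly something like this sketch, where the diff range (HEAD~1) and the export/ folder are just examples:

    <?php
    // Sketch: copy files changed since a given commit into export/
    // so they can be uploaded over FTP in one go.
    exec('git diff --name-only HEAD~1', $changed);
    foreach ($changed as $file) {
        if (!is_file($file)) {
            continue; // skip files that were deleted in the diff
        }
        $dest = 'export/' . $file;
        if (!is_dir(dirname($dest))) {
            mkdir(dirname($dest), 0755, true);
        }
        copy($file, $dest);
    }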

At the moment I’m thinking of making a simple PHP script that will extract a zipped vendor folder on the server. Uploading a single zip is an order of magnitude faster than uploading thousands of files, and this might work well enough. However, maybe there is a ready-made tool for that?

Maybe you can ask your IDE to do that. I use NetBeans IDE and it can sync project files with a remote server via FTP. It tracks only the changed files and uploads them in the background.

Interesting idea, I hadn’t thought of that. I’m not fond of synchronization in the background because I may easily lose control over what goes to the live server - I’ve already experienced unexpected background file uploads that broke things, and I don’t want that any more.

But I’ve tried the Synchronize option in NetBeans and it was slow as hell and unreliable. As a test I used a vendor folder that is 7 MB in size, with 2317 files and 476 folders (a small one, BTW!). Synchronizing this folder while there were no changes between the local and remote copies, it first took NetBeans over 3 minutes just to analyse everything and display the info dialogue. Then, after hitting Synchronize, it went on downloading a bunch of files that didn’t need to be downloaded (because they were identical), and the whole thing took 20 minutes.

Then I added a temp folder into vendor and copied a couple of small files into it to see how fast they would be uploaded by synchronizing. Again I waited 3 minutes for the results window, and this time NetBeans figured out that the new files should be deleted from my local copy instead of uploaded to the remote - totally wrong, though I understand that based on timestamps alone it can’t know any better. Synchronizing again took 20 minutes, and this time a bunch of files that hadn’t changed at all were uploaded to the server. All in all, a pretty useless tool for me.

Finally, I whipped up a simple PHP script that takes a zipped archive and extracts it on the server, first recursively deleting the folders that are present in the zip (a sketch of it follows the timings below). The results are much better for the same vendor content:

  • zipping vendor (with 7-Zip) - 1 second
  • uploading the zip file via FTP - 2 seconds
  • deleting the existing vendor on the server (with PHP) - 2.7 seconds
  • extracting the zipped vendor on the server (with PHP) - 1.1 seconds
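
For anyone interested, the script is essentially this trimmed-down sketch. The file names (the script sitting next to vendor.zip) are assumptions of this example, and a real version needs error handling and some protection against being run by strangers:

    <?php
    // Recursively delete a directory and everything in it.
    function rrmdir(string $dir): void
    {
        $items = new RecursiveIteratorIterator(
            new RecursiveDirectoryIterator($dir, FilesystemIterator::SKIP_DOTS),
            RecursiveIteratorIterator::CHILD_FIRST
        );
        foreach ($items as $item) {
            $item->isDir() ? rmdir($item->getPathname()) : unlink($item->getPathname());
        }
        rmdir($dir);
    }

    $zip = new ZipArchive();
    if ($zip->open(__DIR__ . '/vendor.zip') !== true) {
        exit('Cannot open vendor.zip');
    }

    // Collect the top-level folder names contained in the archive...
    $topLevel = [];
    for ($i = 0; $i < $zip->numFiles; $i++) {
        $topLevel[strtok($zip->getNameIndex($i), '/')] = true;
    }

    // ...wipe them first so no stale files survive, then extract.
    foreach (array_keys($topLevel) as $name) {
        if (is_dir(__DIR__ . '/' . $name)) {
            rrmdir(__DIR__ . '/' . $name);
        }
    }
    $zip->extractTo(__DIR__);
    $zip->close();
    echo 'Done';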

It looks like I answered my own question :slight_smile:

I was thinking the process would involve comparing:

  • existence of files by filename
  • file sizes and/or last-modified dates and/or hash values

But zipping - uploading - deleting - extracting looks to be much simpler and efficient enough.

My guess is that the reason NetBeans is slow is that it does something similar to what I was thinking would be involved.

Have you found that permission settings are maintained?

Yes, I was originally thinking about doing the same, and certainly with hash values, because last-modified dates can’t be relied upon. But such a tool would be much more complicated, and now, seeing the performance of uploading everything, I don’t think it’s worth the effort. It certainly would be worth it if I added large files (like multimedia) into the mix, but for plain PHP code this is good enough for me.
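
If I ever did build such a tool, the hashing half would be simple enough - an untested sketch of a manifest builder that could run both locally and (as a small script) on the server, with the two JSON outputs then diffed to decide what to upload or delete:

    <?php
    // Untested sketch: map every file under vendor/ to its MD5 hash.
    $manifest = [];
    $files = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator('vendor', FilesystemIterator::SKIP_DOTS)
    );
    foreach ($files as $file) {
        if ($file->isFile()) {
            $manifest[$file->getPathname()] = md5_file($file->getPathname());
        }
    }
    echo json_encode($manifest, JSON_PRETTY_PRINT);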

Yes, it does, except there is no way to learn the hash of a file via FTP without downloading the file first (which would make the hash pointless). So it relies on timestamps, and since file timestamps themselves are not reliable, depending on the server, the net result is unreliable. The worst part is that via FTP the only option is to fetch the listing of all files in all directories and then upload/delete files one by one - with many files and directories this is extremely slow. And my case was just a very simple website based on Silex; if I required more dependencies, the vendor folder would grow much larger!

NetBeans has an option for maintaining permissions (slower). My PHP tool of course does not do that, because it first wipes out everything and creates new files. But I don’t think I need to worry about permissions in most cases - the defaults are usually fine for PHP code. However, on one of my shared servers I noticed a difference in permissions between ZipArchive extraction and a regular FTP upload:

  • FTP upload creates folders with 0770 (rwxrwx---) and files with 0660 (rw-rw----)
  • ZipArchive creates folders with 0755 (rwxr-xr-x) and files with 0644 (rw-r--r--), just like other PHP functions

I don’t know if that poses any problems, but I think everything should be fine.
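
If the different defaults ever did cause trouble, the extraction script could normalize permissions afterwards - an untested sketch that simply mirrors the FTP defaults listed above:

    <?php
    // Untested sketch: walk the extracted tree and apply the FTP-style
    // defaults (0770 for folders, 0660 for files) noted above.
    $items = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator(__DIR__ . '/vendor', FilesystemIterator::SKIP_DOTS),
        RecursiveIteratorIterator::SELF_FIRST
    );
    foreach ($items as $item) {
        chmod($item->getPathname(), $item->isDir() ? 0770 : 0660);
    }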

You may not be able to do this, but I have had luck with the following on shared hosts:

echo shell_exec("php /path/to/composer.phar update");

or try exec()
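
For example, a sketch like this - the 2>&1 redirect folds error output into the captured lines, and the exit code tells you whether Composer succeeded:

    <?php
    // Run Composer and capture both its output lines and its exit code.
    exec('php /path/to/composer.phar update 2>&1', $output, $exitCode);
    echo implode("\n", $output), "\nExit code: $exitCode\n";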

Interesting - indeed, sometimes this may work if phar execution is enabled on the server. Thanks for the idea!
