
Wget is a tasty utility on Linux and Mac OS X systems that web system administrators will find many uses for.

Wget, found on the GNU.org site, is a command-line application for file retrieval over FTP, HTTP and HTTPS connections.

I find it useful for downloading files directly to a server I am working on in a shell session, which saves me from downloading to my local desktop and then uploading. Additionally, since it can pass usernames and passwords, it is powerful for use in website migrations, setting up mirrored sites and more.

Finally, Wget can be scheduled using cron, so if a file or directory needs to be replicated on a regular basis, it can be set to do so without administrator intervention.
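
For example, a crontab entry along these lines would pull down a fresh copy every night at 2 a.m. (the schedule, local path and URL are only placeholders):

0 2 * * * wget -q -O /var/backups/remotefilename.tar.gz http://somedomain.com/public/remotefilename.tar.gz

The -q flag suppresses output, and -O writes the download to the specified local file.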

Here are some useful examples of Wget in action:

1) Downloading a remote file – Perhaps you are downloading an update to an application and have been sent the URL. In this case you could use either FTP or HTTP to retrieve it:


wget http://somedomain.com/public/remotefilename.tar.gz
or wget ftp://somedomain.com/public/remotefilename.tar.gz

Wget over FTP defaults to binary ('i' mode in FTP lingo); however, if you need to use ASCII mode, you simply add ';type=a' (without the quotes) onto the end of the FTP URL in the example above.
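
For instance, to pull a hypothetical text file in ASCII mode (the quotes around the URL are for the shell's benefit, since the semicolon is a shell metacharacter):

wget "ftp://somedomain.com/public/readme.txt;type=a"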

2) Downloading with authentication – You may be updating a registered application that requires a username and password to access. Change the syntax as shown below:


wget http://username:password@somedomain.com/reg/remotefilename.tar.gz
or wget ftp://username:password@somedomain.com/reg/remotefilename.tar.gz
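
If you would rather keep the credentials out of the URL, recent versions of Wget also accept them as options; the host and path here are the same placeholders as above:

wget --user=username --password=password http://somedomain.com/reg/remotefilename.tar.gz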

3) Inserting custom ports into the wget request – Perhaps your download requires a custom port along with authentication. Wget easily handles this as well: insert a colon and the port number after the host name and before the path to the file(s):


wget http://username:password@somedomain.com:portnumber/reg/remotefilename.tar.gz
or wget ftp://username:password@somedomain.com:portnumber/reg/remotefilename.tar.gz

4) Entire directories can also be migrated from one server to another – for example, when moving a website to new hardware. I have found FTP access to be most effective for this. I also log the transfer (the -o option) in case debugging or verification of file retrieval is needed, and use the recursive option (-r) to recreate the directory structure on the new server.

So if I am moving mydomain.com, I would use:


wget -o mylogfile -r ftp://myuser:mypass@mydomain.com/

If you have an FTP user that can see more than one domain, ensure you specify the path to the files and directories for the domain you are moving, as in the sketch below.
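
For instance, if the account's FTP root holds several sites, something like the following restricts the retrieval to one domain's directory (the public_html layout is only an illustration; your server's layout may differ):

wget -o mylogfile -r ftp://myuser:mypass@mydomain.com/public_html/mydomain.com/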

There are several other interesting and useful options, including the following (see the combined example after the list):

--passive-ftp: for using Wget from behind firewalls

-nd: does not recreate the remote machine's directory structure locally, and instead saves all retrieved files into the current local directory.

--cookies=on/off: if the remote site requires cookies to be on or off to retrieve files (helpful with authentication at times)

--retr-symlinks: will retrieve files pointed to by symbolic links.
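
These options can be combined as needed. For instance, the following sketch (the host, credentials and path are placeholders) recursively grabs a directory from behind a firewall, follows symbolic links, logs the transfer and drops everything into the current directory without recreating the remote structure:

wget -r -nd --passive-ftp --retr-symlinks -o mylogfile ftp://myuser:mypass@mydomain.com/public/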

There are several other powerful features in Wget, and fortunately, the included manual offers excellent examples. Simply run man wget on the command line to review them.