
Are FTP Programs Secure?

By Matthew Setter

Do you deploy or transfer files using FTP? Given the age of the protocol and its enduring popularity among hosting companies, it’s fair to say you might.

But are you aware of the security issues this may open up for you and your business? Let’s consider the situation in-depth.

Programs such as FileZilla, CyberDuck, Transmit, or Captain FTP can be secure. They may implement measures such as obscuring passwords from the view of anyone around you. But if you’re transferring data with FTP, those measures are effectively undone.

I’ll cut to the chase: the reason I’m writing this is an interesting discussion on SitePoint back in August. The discussion focused quite heavily on FileZilla, making a range of assertions as to how insecure it is (or isn’t).

A key aspect of the debate centered on whether you should store your passwords with FileZilla. One of the comments linked to a quite descriptive article which showed that, although the software obscures your credentials on screen, if you save them they’re quite easy to retrieve.

If you’ve not read the article, FileZilla stores connection details in a simple XML file, an example of which you can see below.

<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
<FileZilla3>
    <Servers>
        <Server>
            <Host>localhost</Host>
            <Port>21</Port>
            <Protocol>0</Protocol>
            <Type>0</Type>
            <User>anonymous</User>
            <Pass>user</Pass>
            <Logontype>1</Logontype>
            <TimezoneOffset>0</TimezoneOffset>
            <PasvMode>MODE_DEFAULT</PasvMode>
            <MaximumMultipleConnections>0</MaximumMultipleConnections>
            <EncodingType>Auto</EncodingType>
            <BypassProxy>0</BypassProxy>
            <Name>test site</Name>
            <Comments />
            <LocalDir />
            <RemoteDir />
            <SyncBrowsing>0</SyncBrowsing>
        </Server>
    </Servers>
</FileZilla3>

You can see that it’s storing a lot of information about the connection, so that you don’t need to remember it. But did you notice that it stores your password in the clear too?

FileZilla Site Manager

Sure, when you use the program, it obscures the password, as shown in the screenshots above, so that it can’t be read over your shoulder.

But there’s little point in that when anyone with access to the computer can simply lift the password from a file. To be fair, in the latest version of FileZilla, storing passwords is disallowed by default.
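
For example, on a typical Linux machine the saved details live in a plain XML file under your home directory (the exact path varies by platform and FileZilla version), so anyone with access to your account could read them with a one-liner like this:

# Reading saved FileZilla credentials on a Linux machine; the path is an
# assumption and may differ by platform and FileZilla version.
grep -A1 "<User>" ~/.config/filezilla/sitemanager.xml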

What About Encrypted Configuration Files?

People suggested that at the very least, the configuration files should be encrypted or be set up in such a way as to ask for a master password before access was granted, much like 1Password and KeePassX do.

Louis Lazaris then linked to a discussion on Stack Exchange, which attempted to counter the position. Here’s the core of the post:

You see, encrypting the credentials requires an encryption key which needs to be stored somewhere. If a malware is running on your user account, they have as much access to what you (or any other application running at the same level) have. Meaning they will also have access to the encryption keys or the keys encrypting the encryption keys and so on.

I believe the above assertion doesn’t fully appreciate the design considerations of programs such as the two listed above. Applications which are specifically designed to be a secure vault for passwords, and other secure information, would likely not be as easy to crack as this answer implies.

For example, a recent blog post from 1Password lists a number of the key mechanisms employed in the fight against crackers.

These include 128- and 256-bit symmetric keys, SHA-512 hashing, and PBKDF2 key derivation, along with a range of other features employed to protect the data files being accessed, all while keeping them simple and easy to use.

So the suggestion that employing a securely encrypted vault is not really any more secure is incorrect, especially given all of these available techniques.

It’s FTP, Not Your App!

But the argument over whether credentials should or shouldn’t be saved is moot, as there’s a key point which using FTP in the first place overlooks: your credentials and data are sent in the clear. Don’t believe me? Have a read of Why is FTP insecure, on the Deccanhosts blog.

If you weren’t aware of it, using a simple packet sniffer such as Wireshark, you can retrieve not only the username and password used, but also any credentials stored in the files being sent, along with the algorithms, database structures, and anything else kept in them.
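
If you want to see this for yourself, on a network and account you own, something along the lines of the following tshark command (Wireshark’s command-line sibling) is enough; the interface name is an assumption and will differ on your machine:

# Capture FTP control traffic and display the USER and PASS commands sent in the clear.
# "eth0" is an assumed interface name; adjust it for your machine.
tshark -i eth0 -f "tcp port 21" -Y 'ftp.request.command == "USER" || ftp.request.command == "PASS"'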

Given that, for quite some time, it’s been common practice to store this information in .ini and config files, I’d suggest that a large amount of readily downloaded software, such as WordPress and Joomla, is built in just this fashion.

FTP was never designed with security in mind; it was designed as a public service. Inherent in this design were a series of further assumptions, which also didn’t take security into consideration. Enrico Zimuel, senior software engineer at Zend, even goes so far as to say: Never use FTP – ever!

Yes, security changes came later, but they were added on — not built-in. There’s no protection against brute-force attacks and while SSH tunneling is possible, it’s difficult, as you need to encrypt both the command and data channels. As a result, your options are limited. And when you seek to implement them, the difficulty factor isn’t always trivial.

Are you a webmaster? Do you enable a chroot jail for your FTP users? If you’re not familiar with the term chroot, it’s a way of limiting user movement and access: from the directory they log into, users can descend into any sub-directory, but can’t move outside of it.
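
As a rough sketch, if your server happens to run vsftpd, jailing users to their home directories is a couple of lines of configuration; the option names below assume vsftpd and will differ for other FTP daemons:

# /etc/vsftpd.conf fragment; assumes the vsftpd daemon, options vary between FTP servers.
# Jail local users to their home directories once they log in.
chroot_local_user=YES
# Newer vsftpd versions refuse a writable jail root without this.
allow_writeable_chroot=YES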

Alternate Options to FTP

Lest I convince you that it’s all doom and gloom – it’s not. A number of the FTP programs around today – especially the ones referenced earlier – also support more secure derivatives of, and alternatives to, FTP. Let’s have a look at them.

FTPS and SFTP

FTPS is FTP Secure, much like HTTPS is secure HTTP; it runs over SSL (Secure Sockets Layer) or its successor, TLS (Transport Layer Security). The user credentials and data are no longer sent in the clear; instead they are encrypted before they’re transmitted.

Client software also has the flexibility, if it’s allowed by the server, to encrypt only parts of the communication, not all of it. This might seem counterintuitive based on the discussion so far.

But if the files being transferred are already encrypted, or if no information of a sensitive nature is being transferred, then it’s likely ok not to incur the overhead that encryption requires.
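
A command-line client such as lftp exposes this choice directly. The settings below are a sketch based on lftp’s option names, so check your own client’s documentation:

# ~/.lftprc sketch; assumes lftp's setting names, so check your client's docs.
# Refuse to send credentials over an unencrypted control channel.
set ftp:ssl-force true
# Encrypt the data channel too; set this to false to skip encrypting file contents.
set ftp:ssl-protect-data true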

However, switching to FTPS does come at a cost, in both effort and money. Using FTPS involves generating a self-signed SSL certificate, or purchasing one from a trusted certificate authority. So better security is available, but there’s more work and expense involved.
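
Generating a self-signed certificate, for instance, is a one-off OpenSSL command along these lines; the file names and validity period are only placeholders:

# Generate a self-signed certificate and key, valid for one year, that an
# FTPS-capable server can be configured to use; file names are placeholders.
openssl req -x509 -nodes -days 365 -newkey rsa:2048 \
    -keyout /etc/ssl/private/ftps.key -out /etc/ssl/certs/ftps.crt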

But before you shy away, ask yourself how much your information is worth to your business. That might convince you to persevere.

Now let’s look at SFTP. SFTP, or SSH File Transfer Protocol, works differently to FTPS. Designed as an extension of SSH 2.0, SFTP creates a normal FTP connection but executes it over an already encrypted connection. The FTP data stream itself is no more secure than normal FTP, however, the connection over which it operates is more secure.
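
Using it from the command line looks much like a classic FTP session. A minimal example, assuming an OpenSSH server on the remote host and placeholder names throughout, might be:

# A minimal SFTP session, assuming OpenSSH on the remote host;
# the host name, account, and paths below are placeholders.
sftp deploy@example.com
sftp> put build/index.html /var/www/html/index.html
sftp> ls -l /var/www/html
sftp> exit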

SSH, SCP and Other Login Shells

If you’re going to move away from FTP, why take half measures? Why use FTP at all? If you’ve installed SFTP, you’ve installed the SSH tools; these give you access to a wide array of functionality.

Starting at the top with SSH itself, this provides full user access to the remote system, letting users do more than standard FTP ever would, or could. The connection is secure, and data can be copied from one system to another quite easily.

If you’re a bit of a command line guru, you can even use a tool such as Rsync over SSH.

In a simple use case, it can be used to recursively copy all files from a local directory to a directory on a remote machine. The first time it’s run, all files are copied over.

The second and subsequent times, it checks for file differences, transferring only the differences, newer files, and optionally removing files and directories on the remote machine no longer present locally.
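
A sketch of that use case, with hypothetical paths and host names, looks like this:

# Recursively mirror a local directory to a remote one over SSH:
# -a preserves permissions and timestamps, -z compresses in transit,
# and --delete removes remote files that no longer exist locally.
rsync -avz --delete -e ssh ./site/ deploy@example.com:/var/www/site/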

The problem is that granting this kind of access is in itself a security issue waiting to happen. But the effects can be mitigated. OpenSSH allows for a number of configuration choices, such as disallowing root access, limiting the users who can login remotely, and chroot’ing users to specific directories.
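
A hedged sshd_config fragment showing those three measures, with placeholder user names and paths, might look like this:

# /etc/ssh/sshd_config fragment; illustrative only, user names and paths are placeholders.
# Disallow direct root logins and limit which accounts may connect at all.
PermitRootLogin no
AllowUsers deploy designer
# Jail one account below its own directory and restrict it to SFTP.
Match User designer
    ChrootDirectory /srv/sites/designer
    ForceCommand internal-sftp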

Perhaps users don’t need to be on the remote machine in the first place or don’t need many privileges while they’re there. If that’s the case, and it likely is, you can pick from a number of shells, designed to accommodate these situations.

Two of the best are scponly and rssh. Scponly only allows a user to copy files to a remote machine.

The user can’t log in, move around, look at, or change files. What’s great is that it still works with rsync (and other tools). rssh goes a bit further, allowing access to SCP, SFTP, rdist, rsync, and CVS.

To implement it, a systems administrator need only change the user’s shell, with their tool of choice, and then edit /etc/rssh.conf, listing the allowed protocols. Here’s an example configuration:

allowscp
allowsftp

This configuration allows users to use only SCP and SFTP.
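
Changing the user’s shell is the other half of the job; on most Linux systems that’s a single command, with the username below being just an example:

# Point an existing account at the restricted shell; the username is a placeholder,
# and /usr/bin/rssh usually needs to be listed in /etc/shells as well.
usermod -s /usr/bin/rssh designer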

SSH Keys

Next, let’s consider SSH keys. The process takes a bit of explanation, but I’ll try to keep it quick and concise, paraphrasing heavily from this answer on Stack Exchange:

First, the public key of the server is used to construct a secure SSH channel, by enabling the negotiation of a symmetric key which will be used to protect the remainder of the session, providing channel confidentiality, integrity protection, and server authentication. After the channel is functional and secure, authentication of the user takes place.

The server next creates a random value, encrypts it with the public key of the user and sends it to them. If the user is who they’re supposed to be, they can decrypt the challenge and send it back to the server, who then confirms the identity of the user. It is the classic challenge-response model.

The key benefit of this is that the private key never leaves the client, nor is any username or password ever sent. If someone intercepts the SSH traffic and is able to decrypt it (using a compromised server private key, or if you accept a wrong public key when connecting to the server), your private details will never fall into the hands of the attacker.

When used with SCP or SFTP, this further reduces the effort required to use them, while increasing security. SSH keys can require a passphrase to unlock the private key, and this may seem to make them more difficult to use.

But there are tools around which can link this to your user session, when you log in to your computer. When set up correctly, the passphrase is automatically supplied for you, so you have the full benefit of the system.
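
Setting this up typically takes three commands. Here’s a sketch, assuming OpenSSH and a placeholder remote account:

# Generate a key pair (you'll be prompted for a passphrase), copy the public
# half to the server, then let the agent hold the unlocked key for your session.
ssh-keygen -t ed25519 -C "work laptop"
ssh-copy-id deploy@example.com
ssh-add ~/.ssh/id_ed25519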

What About Continuous Delivery?

Perhaps you’ve not heard the term before, but it’s been floating around for some time now. We’ve written about it on SitePoint before, as recently as last week. Martin Fowler defines Continuous Delivery as:

A software development discipline where you build software in such a way that the software can be released to production at any time.

There are many ways to implement it, but services such as Codeship and Beanstalk go a long way towards taking the pain away.

Here’s a rough analogy of how they work. You set up your software project, including your testing code and deployment scripts, and store it all under version control. I’ll assume you’re using an online service, such as GitHub or Bitbucket.

When a push is made to either of these services, after a commit or release in your code branch, the service runs your application’s tests. If the tests pass, a deployment of your application is made, whether to a test or production environment.

Assuming everything went well, the service rolls out the deployment for you automatically, and you’re notified afterwards whether it succeeded or failed.

If it succeeded, then you can continue on with the next feature or bug fix. If something went wrong, you can check it out to find the cause of the issue. Have a look at the short video below, showing the deployment of a test repository in action with Codeship.

What did you have to do? Push a commit to the GitHub repository – that’s it! You don’t need to remember to run scripts, where they are, or what options and switches to pass to them (especially not late on a Friday evening, when you’d rather be anywhere but doing work).

I appreciate that’s rather simplistic and doesn’t cover all the options and nuances, but you get the idea.
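
To make the idea a little more concrete, the deployment step such a service runs on your behalf is often nothing more exotic than a small script kept in your repository. The following is a hypothetical sketch, not any particular service’s syntax:

#!/bin/sh
# deploy.sh: a hypothetical script a CI service might run once the tests pass.
# The host, user, paths, and the post_deploy.sh hook are all assumptions.
set -e                                    # abort on the first error
rsync -az --delete -e ssh ./public/ deploy@example.com:/var/www/site/
ssh deploy@example.com 'cd /var/www/site && ./post_deploy.sh'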

The Problem of Human Error

Let’s finish up by moving away from the basic security concerns of FTP to the effectiveness of using it on a day-to-day basis. Say, for example, that you’re developing an e-commerce site, and your deployment process makes use of FTP, specifically FileZilla.

There are a number of inherent issues here, relating to human error:

  • Will all the files be uploaded to the right locations?
  • Will the files retain or obtain the required permissions?
  • Will one or two files be forgotten?
  • Is there a development hostname or setting that needs to be changed for production?
  • Are there post deployment scripts that need to be run?

All of these are valid concerns, but all are easily mitigated when using continuous delivery tools. If it’s late, if the pressure’s on, or if the person involved is moving on from the company or keen to get away on holiday, manually transferring files over FTP is asking for trouble.

Ok, manually transferring files, period, is asking for trouble. Human error is just too difficult to remove.

Quick Apology to FileZilla

I don’t want to seem like I’m picking on FileZilla. It’s a really good application, one I’ve made good use of over a number of years, and steps have been taken to make it more secure.

My key issue is with FTP itself, not necessarily with FileZilla alone.

Wrapping Up

So this has been my take on the FTP security debate. My recommendation — just don’t use it; what’s more, when managing deployments, keep security in mind. After all, it’s your data.

But what are your thoughts? Do you still use FTP? Are you considering moving away? Share your experiences in the comments and what solutions you’ve tried so we can all work towards a solution that’s practical and easy to use.

Comments

  • Dagwood

    I’m sure I’m not the only person using Filezilla specifically because of the fact that it stores site passwords in plain text :/

  • Chris

    I used to be a heavy user of FileZilla a year or two ago but have since switched to SSH for all of my transfer tasks along with using a different port other than just the default of 21, for services such as Amazon I use IAM so I can log in without needing to use my master account credentials and anything else that doesn’t offer this I use an alternate login to my main account but keep my passwords locked away within 1Password.

    For me it’s just easier and avoids any security risks posed, and for me (not sure about others) the transfer speeds are just as fast.

    Just my 2 cents.

    • http://www.matthewsetter.com/ Matthew Setter

      @ChrisUpjohn I hear you mate. After a bit of work to transition to a different approach, whether that’s using something like Putty Agent, IAM, 1Password or something else entirely, it often becomes second nature. One of the best things about doing so, as you rightly say, is peace of mind.

  • Jameos

    I learned this the hard way with WS_FTP, and stupidly saving the passwords for each site. The .ini file that houses all of the info got swiped, and roughly 18 sites I maintained got very badly hacked. I will NEVER make that kind of mistake again!

  • http://www.zacksdomain.com/ Zack Wallace

    I admit I use FileZilla because it’s an awesome FTP client in its own right. And while I do use WinSCP for file operations as well, plain FTP is just so much faster with transfer speeds.

    Maybe I’m being nuts, but if someone literally gets onto my PC locally, then why am I worried about plain text passwords? They will have instant access to my entire lastpass directory, all my spreadsheets and data, any “always-on” services that run in the background like dropbox and mediafire and email which also contain private data. They will already be able to wreak complete havoc. So why worry about one file with plaintext passwords in it?

    The bigger issue is to make sure your entire system itself is secure. Make sure nobody remotely takes control, or can sit at your desk and fool around if you’re away.

    Let’s face it, you can only make your computer so secure when you’re sitting there and logged in. Am I supposed to have to type my lastpass password EVERY time I want to log in somewhere? Type a password each time I open a cloud folder? Type a password each time a private service that I keep open, refreshes its page, type my password each time email wants to check for messages? One can only be so paranoid at the end of the day.
    You can also open Firefox or IE and see a plain text list of saved passwords there. If you are logged in to any websites, you can usually change the password without having to go through any other steps. Email clients are usually automatically logged in, so a bad guy sitting at my computer could reset any number of passwords once they can get in my email.

    So is the fact that FileZilla has an XML file with passwords in the clear really that big of a deal considering? After all, you have to enter those passwords in at some point, and if they are hidden and encrypted, this just means you’ll store those passwords somewhere else in a file anyway.

    Do you have PuTTY? Or WinSCP? Or any program that also can save connections and auto-login? You’re screwed there too then.

    Maybe I’m a weird one too, but once I’ve logged in to my own computer, I want all manner of conveniences. I don’t want to be bothered with “mother may I” password prompts 149 times a day because every tool I use wants to be uber secure and hide everything from me.

    Of course, none of my comments have to do with programs encrypting their own data. But even if FileZilla kept its data encrypted, all you have to do is export the data and you’re good to go anyway so what difference does it make? I don’t need your LastPass password but if you’re already logged in, I’ll just export it (unless it asks for password again :)

    • http://www.matthewsetter.com/ Matthew Setter

      Hi Zack Wallace thanks for such a detailed reply. I do understand the point you’re making about the convenience factor.

      On the point of entering passwords repeatedly, I don’t use Windows often, but when I do, the Windows Authorisation dialogs really annoy me, so I understand your pain. I think the way that Windows approaches higher level authentication’s wrong btw — but that’s a story for another day.

      There are ways to avoid repeatedly entering credentials; services such as Putty Agent only require you to log in once, and for the life of the session (which can be tied to your being logged in) they retain your credentials and transparently supply them on your behalf. So you get the benefit of convenience and security in one.

      Regarding speed, the reason that FTP is (and likely always will be) faster than any form of secure FTP, or file transfer, is because there’s no overhead involved in encrypting the connection or information transmitted in the session.

      As a result, plain FTP will always win a raw speed test. But if you use plain FTP, you leave yourself wide open to people sitting between your computer and the server you’re connecting to, sniffing the information from the wire. You’d never know they were there. So using secure protocols is a trade-off I’m prepared to make.

      I believe several changes need to happen. Firstly, the vendors need to take more care about information which their applications store and how it’s accessed. And then certain protocols need to be phased out, namely FTP.

      Thanks again for sharing, and I hope this has helped cast a light on why FTP isn’t the best choice and what can be done to mitigate some of the inconveniences of using alternative options.

    • Marc

      “Maybe I’m being nuts, but if someone literally gets onto my PC locally, then why am I worried about plain text passwords?”

      I can answer this. I administer multiple servers and I had to give access to a designer on one of them so she could upload some changes she had to do. I initially didn’t want to do it, but my client, whom I administer the server for, said I _had_ to do it. I told him I wouldn’t be responsible if anything “bad” happened.

      Just a week later, some spam-sending scripts were uploaded to the vhost I gave the designer access to. The scripts started sending spam like crazy, filling up the Postfix send queue and exhausting all available disk space in the VPS, making all the sites in there fail, email not working, etc… The server’s IP was also added to most of the anti-spam lists on the internet. The attacker also automatically added malicious code at the end of all javascript files he could find, making the site be marked as “malware” by IE and Chrome and services like Google and Yandex.

      After investigation, I saw the files were uploaded with her FTP account. I immediately contacted her and asked if she knew what happened. She said a “virus” had infected her PC some days ago. Apparently one of the things this “virus” did was check for stored FileZilla passwords (stored in plain text) and send them to a remote server. Then the saved FTP credentials were used to upload the spam-sending scripts, which were called periodically from Russian IPs.

      Do you know who had to fix everything and try to get the server off the anti-spam lists? Yes, it was ME. Do you know who will never give FTP credentials to anyone again or resign to continue administering a server for a client? Yes, it’s me too.

      Hope this helped answering your question and helped you start using a FTP client that stores encrypted passwords.

      • http://www.zacksdomain.com/ Zack Wallace

        That’s quite a story. It’s exactly the thing people argue can happen with FileZilla. Their defense is that the “bad guy”, having infected your computer, can simply “watch over your shoulder” and steal the passwords anyway once you’ve unlocked them.

        Another issue, if the passwords are sent over plain FTP, it is cleartext anyway, so the baddie can steal them in transit if not over SSL.

        If there is some sort of private key that is stored locally, well the bad program can just find THAT and unencrypt them. The list goes on.

        At some point, the issue really is protecting your computer itself, just don’t get infected. But at the end of the day, sophisticated programs that have infected your computer and can do anything you can do except literally type in a password to unlock something.

        So what are we to do? Be required to retype our password every time ANY program with ANY kind of sensitive data performs ANY viewing or use of said private data? Nobody will live like that. Simply having encrypted passwords in FileZilla would not solve all security issues. But I agree it’s a start.

        This conversation is still going on with FZ, and they are still refusing to add encryption.

        https://forum.filezilla-project.org/viewtopic.php?f=2&t=30765

  • http://www.matthewsetter.com/ Matthew Setter

    Jameos, yeah, not a fun position to be in I’m sure. Thanks for taking the time to share.

  • http://www.matthewsetter.com/ Matthew Setter

    It is a good app, and they’ve made improvements as time’s gone on. I want to be fair to them. The issue I have really is with the use of FTP itself and the risk that it exposes you to, irrespective of whichever front end tool you’re using, even if that’s in the terminal.

  • Matt M.

    Wow, I had no idea FileZilla stored passwords in plain text. I swear it wasn’t always that way (ie. they used to encode the password in the xml file). And if my memory is correct, what a bizarre step back.

  • lucasrolff

    You write: “Now let’s look at SFTP. SFTP, or SSH File Transfer Protocol, works differently to FTPS. Designed as an extension of SSH 2.0, SFTP creates a normal FTP connection but executes it over an already encrypted connection. The FTP data stream itself is no more secure than normal FTP, however, the connection over which it operates is more secure.”

    Please when writing a blog post on sitepoint, learn about the different protocols, how they work and what they do.

    SFTP is not a normal FTP connection with encrypted authentication. SFTP is it’s own protocol, and has NOTHING to do with FTP, also SFTP doesn’t have authentication, that’s why it’s a subsystem of SSH (for same reason you enable it in sshd config, and you don’t need an FTP server to use SFTP).

    You can look up the drafts and RFCs for SSH File Transfer Protocol.

    TL;DR:
    SFTP != SSH + FTP – Correct article please.

    • LouisLazaris

      Thank you. Can you clarify your view a little more here? I don’t think Matthew is saying that it’s the same protocol. I believe what he’s saying is that in this unique protocol there is a file transfer connection created, and it’s secure. Is that not correct? Wikipedia says that SFTP was designed…

      as an extension of the Secure Shell protocol (SSH) version 2.0 to provide secure file transfer capabilities.

      I didn’t do the primary technical review for this article, and I don’t claim to be an expert, I’m just trying to understand exactly what it is you think he said wrong. It could be his explanation is semantically incorrect, but maybe he had the correct thought in mind. I’m sure Matthew will respond and if necessary we’ll make a correction to the article.

      In the meantime, I’ll do some further research (I know Wikipedia isn’t always the best source). Thanks.

      • lucasrolff

        Once again, he writes: “SFTP creates a normal FTP connection but executes it over an already encrypted connection. The FTP data stream itself is no more secure than normal FTP”

        Normal FTP connection – this is not true, since it doesn’t create a normal FTP (File Transfer Protocol) connection over an already encrypted connection.

        The data stream you also have with SFTP (SSH File Transfer Protocol) is actually encrypted, since the data stream is sent over SSH, and SSH by default is more secure than FTP (File Transfer Protocol).

        So please don’t write FTP (File Transfer Protocol) anywhere related to SFTP (SSH File Transfer Protocol), neither write that the data stream is not secure – since it is secure, and encrypted with the encryption ciphers defined in your sshd config.

        Surely you create a connection to transfer files when you do SFTP (SSH File Transfer Protocol) since else you wouldn’t be able to transfer any files, but it really needs to be written correct, else people like me will come and complain, and people without much knowledge will then get the feeling that SFTP (SSH File Transfer Protocol) only encrypts authentication, but all data is transferred normally.

        And yes, wikipedia writes that SFTP (SSH File Transfer Protocol) is an extension for SSH to provide file transfer capabilities, and thats completely true, since it’s provided as a subsystem in SSH, but that doesn’t make it use “FTP” (File Transfer Protocol).

        To be short:
        SFTP is not SSH with FTP, the data stream using SFTP is secure using encryption ciphers defined on the server, the text should be corrected, so no one is mistaken, and thinks that SFTP uses or relates to FTP, it’s completely standalone.

  • Richard

    Not sure why the Continuous Delivery section was in this article – did it get kept in by mistake?
