I'm developing a web app that will let a recording artist share recorded sound files with his clients.
Ideally, the artist's clients would upload their large files to the server via FTP (the files are around 1GB, so a browser-based uploader won't work). The artist would then record his part, upload it to the server, and his clients could log in to the site and download it via HTTP or FTP.
My question is what would be the best way to manage the incoming and outgoing files? I have two possible ideas:
1. Set up a PHP or bash script that is triggered whenever a new file arrives in the uploads folder, and have the clients name their files in a set way so they get sorted (e.g. client01-recording.zip would get moved to client01/, where the artist could pick it up, and the uploads folder would stay clean and anonymous).
2. Somehow wrangle the server into allowing different FTP users and sort based on who uploaded (e.g. a file from firstname.lastname@example.org would land in client01/).
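For option #1, I'm imagining something like the following (a rough sketch; the folder paths and the `clientNN-` naming pattern are just placeholders for whatever convention we settle on):

```python
import re
import shutil
from pathlib import Path

# Hypothetical layout: one shared drop folder, plus one folder per client.
UPLOADS = Path("/srv/uploads")
CLIENTS = Path("/srv/clients")

# Clients would name files like "client01-recording.zip".
PATTERN = re.compile(r"^(client\d+)-.+")

def sort_uploads(uploads: Path = UPLOADS, clients: Path = CLIENTS) -> list:
    """Move correctly named files into per-client folders; return moved names."""
    moved = []
    for entry in uploads.iterdir():
        if not entry.is_file():
            continue
        match = PATTERN.match(entry.name)
        if not match:
            continue  # leave oddly named files in place for manual review
        dest = clients / match.group(1)
        dest.mkdir(parents=True, exist_ok=True)
        shutil.move(str(entry), str(dest / entry.name))
        moved.append(entry.name)
    return moved
```

The script could run from cron every minute or so, since true "on file arrival" triggering would need something like inotify.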
What would be the best (and most secure) way to manage something like this?
Right now the only way to do it is to use cPanel and add accounts manually, linking each one to a different home folder. Sadly, it looks like that's my only option unless I roll my own FTP system...
If you roll your own FTP server, then you'll probably need a dedicated server. If you do choose this route, there are some FTP server libraries you can use to hack together a reliable FTP server quickly. Python has pyftpdlib, for example.
If you're going to stick with a shared hosting environment, then you probably want option #1, because it's much easier and more maintainable.