Results 26 to 35 of 35
  1. #26
    SitePoint Wizard bronze trophy
There are ways to make it more difficult for someone to write to your images directory, but you have much more glaring issues on a shared server where PHP runs as the webserver user. For example, you could store images in the db, but your db isn't really protected either: anyone on the shared server can easily read your PHP files to get the db credentials, and then modify the db all they want.

Now, some automated virus-type script that penetrated some other user's website on your server is not likely to go this far. So by adding this "hoop" you at least gain some protection against that, but don't give yourself a false sense of security.

  2. #27
    SitePoint Zealot
Quote Originally Posted by YuriKolovsky
    so any ideas how to upload images using php on a "shared server" without SSH or root access, without compromising security?
    Without suPHP or an equivalent PHP CGI wrapper, there's no way to avoid using 777 folders for PHP. You could have a directory created that is owned by the Apache user (nobody) and set the permissions to 755, but you're still not gaining anything. Other PHP scripts from other users on the shared server are running as nobody and would still have write access to that folder.

If you do have to have open directories, I suggest that you place these folders outside of your DocumentRoot (outside your public_html folder). This way, even if someone does write a malicious script into the open folder, they at least won't be able to run it from the web. This is far from ideal, but it does help alleviate some potential problems.
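A minimal sketch of that approach. The private directory path below is a placeholder (adjust it to your own account's layout), and `safe_dest()` is a hypothetical helper added here for illustration:

```php
<?php
// Build a destination path inside the private dir, stripping any
// directory components an attacker might smuggle into the filename.
function safe_dest(string $uploadDir, string $clientName): string
{
    return rtrim($uploadDir, '/') . '/' . basename($clientName);
}

// Hypothetical path OUTSIDE public_html.
$uploadDir = '/home/youruser/private_uploads';

if (isset($_FILES['image']) && $_FILES['image']['error'] === UPLOAD_ERR_OK) {
    $dest = safe_dest($uploadDir, $_FILES['image']['name']);
    // move_uploaded_file() refuses to touch anything that wasn't
    // uploaded through this request, which blocks path trickery.
    move_uploaded_file($_FILES['image']['tmp_name'], $dest);
}
```

Even if a malicious file does end up in that directory, Apache has no way to execute it because it isn't under the DocumentRoot.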

  3. #28
Hibernator YuriKolovsky
The problem with that is: the images in the upload folders are meant for users to browse, so putting them outside the DocumentRoot removes the users' access to the images...

Unless I use PHP to "recreate" the images dynamically and then serve them to the users, but that is way too slow and resource-demanding, especially for large images.
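For what it's worth, serving a file from outside the DocumentRoot doesn't require recreating the image with GD; a plain passthrough script that streams the stored bytes with readfile() is far cheaper. A sketch, with the private directory path again a placeholder and `resolve_image()` a hypothetical helper:

```php
<?php
// serve.php?img=photo.jpg — streams an image stored outside public_html.

// Resolve a client-supplied name to a file inside $dir, or null if the
// name is empty, uses traversal tricks, or the file doesn't exist.
function resolve_image(string $dir, string $name): ?string
{
    $name = basename($name); // strips "../" and friends
    $path = rtrim($dir, '/') . '/' . $name;
    return ($name !== '' && is_file($path)) ? $path : null;
}

$privateDir = '/home/youruser/private_uploads'; // hypothetical path
$path = resolve_image($privateDir, $_GET['img'] ?? '');

if ($path !== null) {
    header('Content-Type: image/jpeg'); // or detect via getimagesize()
    header('Content-Length: ' . (string) filesize($path));
    readfile($path); // streams the bytes; no GD re-encoding involved
} else {
    http_response_code(404);
}
```

It still costs a PHP invocation per image, so it's slower than letting Apache serve a static file, but nowhere near as heavy as rebuilding each image with GD.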

  4. #29
Theoretical Physics Student Jake Arkinstall
Quote Originally Posted by sparek
If you do have to have open directories, I suggest that you place these folders outside of your DocumentRoot (outside your public_html folder). This way, even if someone does write a malicious script into the open folder, they at least won't be able to run it from the web. This is far from ideal, but it does help alleviate some potential problems.
Of course, one needs to ask how they'd know the path to his images folder in the first place, without permission to execute anything?
    Jake Arkinstall
    "Sometimes you don't need to reinvent the wheel;
    Sometimes its enough to make that wheel more rounded"-Molona

  5. #30
    SitePoint Wizard bronze trophy
If you have access to cron jobs (so you can run a script as your own user account), you could do the uploads to some type of temporary directory that you treat as volatile (because this dir is writable by "nobody"). As suggested, this could be outside the web root, or a database could be used, etc...

But you could also make a web-accessible dir that's only writable by your user account, and readable by "nobody", so that the web server can serve files from it. Have a script which runs via cron, which will write stuff into the web-accessible dir. The key thing here is that your script must be able to identify what should, and should not, be copied over. For example, if your criterion is that only images can be copied over, that's pretty easy to enforce reliably. While another user on the system could still get files into your "temp" dir, you at least now have the opportunity to not copy them over to the public web-accessible dir.

This is pretty useless complexity if you can't implement some solid criteria for which files get copied over. It also prevents uploads from being realtime, as at best it will take up to a minute for your cron job to run and copy a file over to the web dir.
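The cron side might look something like the sketch below. Both paths are placeholders, and `is_image_file()` is an illustrative helper; it leans on getimagesize(), which parses the actual bytes and returns false for anything that merely pretends to be an image:

```php
#!/usr/bin/env php
<?php
// Runs from cron as YOUR user (not the webserver's "nobody"), so it can
// write into a dir that "nobody" can only read.

function is_image_file(string $file): bool
{
    // @ suppresses the warning getimagesize() emits on non-image data
    return @getimagesize($file) !== false;
}

$tempDir   = '/home/youruser/volatile_uploads'; // 0777, written by "nobody"
$publicDir = '/home/youruser/public_html/img';  // writable only by you

foreach (glob($tempDir . '/*') ?: [] as $file) {
    if (is_file($file) && is_image_file($file)) {
        copy($file, $publicDir . '/' . basename($file));
    }
    @unlink($file); // the temp dir is volatile either way
}
```

It would be scheduled with a crontab entry along the lines of `* * * * * /usr/bin/php /home/youruser/copy_images.php` (the PHP binary path varies by host).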

  6. #31
Hibernator YuriKolovsky
    crmalibu
    Have a script which runs via cron, which will write stuff into the web accessible dir.
I have no idea how to do this... :S

    should, and should not be copied over. For example, if your criteria is that only images can be copied over,
There's no problem with the PHP script itself: it copies images to a temp folder (checking that they are images), then uses imagecreatefromjpeg() to create new ones, just to double-check that they really are images, and then deletes the originals.

The problem that keeps bothering me at night is that PHP can only upload to a 777 folder (on a shared host), so other users on the server can also copy files there... and that is my main gripe.
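That re-encoding step can be wrapped up in one function. A sketch, assuming the GD extension is available (`reencode_jpeg` is an illustrative name, not anything from the thread):

```php
<?php
// Re-encode an upload so only genuine JPEG pixel data survives.
// Anything GD can't parse as a JPEG is rejected outright.
function reencode_jpeg(string $src, string $dest): bool
{
    $img = @imagecreatefromjpeg($src); // false for non-JPEG input
    if ($img === false) {
        return false;
    }
    $ok = imagejpeg($img, $dest, 90); // fresh file: embedded payloads are gone
    imagedestroy($img);
    return $ok;
}
```

Because the output file is written pixel-by-pixel from GD's decoded image, any script payload hidden in the original file's comment fields or trailing bytes doesn't survive the round trip.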

  7. #32
    SitePoint Wizard bronze trophy
Well, first thing to do is find out if you can run a cron job. It's not always possible with shared hosts, but there's probably an option in your control panel if you can. You would need to be able to execute PHP from the command line, so your cron job can directly execute the CGI or CLI version of PHP, which, hopefully, the host has installed.

The most you should really expect out of this is that you will be able to filter which files get copied over. Someone can still manually write to the temp dir, but unless it's an image it won't be copied over (due to you recreating it with an imagecreatefrom*() function). It sets up a reasonable barrier: to write arbitrary files directly to the final destination dir, they would need to escalate their privileges to your user, or to root, which is pretty much a catastrophe for the entire server anyway if that happened.

  8. #33
Hibernator YuriKolovsky
    Have a script which runs via cron, which will write stuff into the web accessible dir.
After looking at that phrase for an hour, only now do I get it.
This is a very interesting idea, and I will definitely consider it in the future.

    crmalibu
    Well, first thing to do is find out if you can run a cron job.
No, I can't.
I realize that there is "no way" in the case of this webhost...
unless there is some magical "trick" that someone somewhere might know.

This hosting was recommended and set up for me ages ago, when I was just starting out with web stuff,
so it's pretty "functionless".
It's cheap and has unlimited traffic, though.

    crmalibu
so your cron job can directly execute the cgi or cli version of php, which hopefully, the host has installed.
I was told that I have to install it myself...

    The most you should really expect out of this is that you will be able to filter which files get copied over.
That is not very motivating : )

    which is pretty much catastrophe anyway for the entire server if that happened.
So the only way I can have PHP upload images to the upload folder is by giving that folder 777 permissions (on shared hosting)?
And this is OK?

  9. #34
    SitePoint Wizard bronze trophy
No, 0777 permissions and similar levels are not something to feel warm and fuzzy about, but they're a common reality with shared hosting, especially the more budget-oriented hosts.

There's a chance you may be able to do it securely through FTP (PHP has FTP functions). The "upload" script could do an FTP upload after logging in as your FTP user. This would hopefully create the files as your FTP user, and you can set the dir to be writable only by that FTP user and readable by all (so the webserver can later read/serve the files).

This could be slow, because you have to wait for the FTP upload before your PHP script finishes. The file basically gets uploaded twice (once by the user, from the form, and then again via FTP by your script). But I think it would work.
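A sketch of that FTP loop-back, assuming the FTP extension is available. The host, credentials, remote directory, and the `remote_path()` helper are all placeholders for illustration:

```php
<?php
// Loop the upload back through FTP so the file is created as YOUR ftp
// user rather than as the webserver's "nobody".

// Build the remote path, stripping directory tricks from the name.
function remote_path(string $dir, string $name): string
{
    return rtrim($dir, '/') . '/' . basename($name);
}

if (isset($_FILES['image']) && $_FILES['image']['error'] === UPLOAD_ERR_OK) {
    $conn = ftp_connect('localhost'); // same box, so a loopback connection
    if ($conn && ftp_login($conn, 'youruser', 'yourpassword')) {
        ftp_pasv($conn, true); // passive mode plays nicer with firewalls
        ftp_put(
            $conn,
            remote_path('public_html/img', $_FILES['image']['name']),
            $_FILES['image']['tmp_name'],
            FTP_BINARY
        );
        ftp_close($conn);
    }
}
```

The target dir can then be chmod 755, owned by the FTP user: the webserver can read and serve from it, but other users' scripts running as "nobody" can no longer write into it.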

  10. #35
Hibernator YuriKolovsky
    crmalibu

There's a chance you may be able to do it securely through FTP (PHP has FTP functions). The "upload" script could do an FTP upload after logging in as your FTP user. This would hopefully create the files as your FTP user, and you can set the dir to be writable only by that FTP user and readable by all (so the webserver can later read/serve the files).

This could be slow, because you have to wait for the FTP upload before your PHP script finishes. The file basically gets uploaded twice (once by the user, from the form, and then again via FTP by your script). But I think it would work.
This is certainly an idea, and a very interesting one as well; I'll look into it.

