-
Apr 17, 2009, 09:21 #26
- Join Date
- Jul 2008
- Posts
- 5,757
- Mentioned
- 0 Post(s)
- Tagged
- 0 Thread(s)
There are ways to make it harder for someone to write to your images directory, but you have much more glaring issues on a shared server when PHP runs as the web server user. For example, you could store the images in the db, but your db isn't really protected either: anyone on the shared server can read your PHP files, grab the db credentials, and then modify the db all they want.
Now, some automated virus-type script that penetrated another user's website on your server is not likely to go this far. So by adding this "hoop" you at least gain some protection against that, but don't give yourself a false sense of security.
-
Apr 17, 2009, 09:22 #27
- Join Date
- Oct 2008
- Posts
- 167
- Mentioned
- 0 Post(s)
- Tagged
- 0 Thread(s)
Without suPHP or an equivalent PHP CGI wrapper, there's no way to avoid using 777 folders for PHP. You could have a directory created that is owned by the Apache user (nobody) and set the permissions to 755, but you're still not gaining anything. Other PHP scripts from other users on the shared server are running as nobody and would still have write access to that folder.
If you do have to have open directories, I suggest that you place these folders outside of your DocumentRoot (outside your public_html folder). This way, even if someone does write a malicious script into the open folder, they at least won't be able to run it from the web. This is far from ideal, but it does help alleviate some potential problems.
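A minimal sketch of that idea, assuming a hypothetical private directory path and form field name ("userfile"); the point is simply that the stored file never lives under public_html, so it can't be requested (or executed) by URL:

<?php
// Accept an upload, but store it outside the DocumentRoot so it can
// never be executed via the web. Paths here are placeholders.
$uploadDir = '/home/youruser/uploads_private'; // outside public_html

if (isset($_FILES['userfile']) && $_FILES['userfile']['error'] === UPLOAD_ERR_OK) {
    // Use a generated name so a user-supplied filename can't traverse paths.
    $target = $uploadDir . '/' . uniqid('upl_', true);

    if (move_uploaded_file($_FILES['userfile']['tmp_name'], $target)) {
        echo 'Stored safely outside the web root.';
    } else {
        echo 'Could not move the uploaded file.';
    }
}
?>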
-
Apr 17, 2009, 09:50 #28
- Join Date
- Nov 2007
- Location
- Malaga, Spain
- Posts
- 1,072
- Mentioned
- 4 Post(s)
- Tagged
- 0 Thread(s)
The problem with that is:
the images in the upload folders are meant for users to browse,
so putting them outside the DocumentRoot removes the users' access to the images...
unless I use PHP to "recreate" the images dynamically and hand them to the users, but that is way too slow and resource-demanding, especially for large images.
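For reference, a rough sketch of such a pass-through script; the directory and the "img" parameter are assumptions. Note that streaming the stored bytes with readfile() is considerably cheaper than truly "recreating" the image through GD, which may soften the performance objection somewhat, though it still costs one PHP request per image:

<?php
// Serve an image that lives outside the DocumentRoot. Paths are placeholders.
$imageDir = '/home/youruser/uploads_private';

// Only allow a plain filename, never a path, to prevent directory traversal.
$name = isset($_GET['img']) ? basename($_GET['img']) : '';
$path = $imageDir . '/' . $name;

$info = @getimagesize($path); // also confirms the file really is an image
if ($name === '' || !is_file($path) || $info === false) {
    header('HTTP/1.0 404 Not Found');
    exit;
}

header('Content-Type: ' . $info['mime']);
header('Content-Length: ' . filesize($path));
readfile($path); // stream the bytes; no GD decoding needed
?>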
-
Apr 17, 2009, 10:26 #30
- Join Date
- Jul 2008
- Posts
- 5,757
- Mentioned
- 0 Post(s)
- Tagged
- 0 Thread(s)
If you have access to cron jobs (so you can run a script as your own user account), you could do the uploads to some kind of temporary directory that you treat as volatile (because that dir is writable by "nobody"). As suggested, this could be outside the web root, or a database could be used, etc.
But you could make a web-accessible dir that's only writable by your user account, and readable by "nobody" so the web server can serve files from it. Have a script which runs via cron and copies things into the web-accessible dir. The key thing here is that your script must be able to identify what should, and should not, be copied over. For example, if your criterion is that only images can be copied over, that's pretty easy to enforce reliably. While another user on the system could still get files into your "temp" dir, you at least now have the opportunity to not copy them over to the public web-accessible dir.
This is pretty useless complexity if you can't implement some solid criteria for which files get copied over. It also prevents uploads from being realtime: even at best, it can take up to a minute for your cron job to run and copy a file over to the web dir.
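A rough sketch of that cron script, assuming made-up directory names; it runs as your own user (not "nobody") and only promotes files that actually verify as images:

<?php
// Promote verified images from the volatile temp dir to the public dir.
// Example crontab entry (every minute):
//   * * * * * /usr/bin/php /home/youruser/bin/promote_uploads.php
$tempDir   = '/home/youruser/uploads_temp';    // 0777, writable by "nobody"
$publicDir = '/home/youruser/public_html/img'; // writable only by your user

$files = glob($tempDir . '/*');
if (!is_array($files)) {
    exit;
}

foreach ($files as $file) {
    if (!is_file($file)) {
        continue;
    }
    // getimagesize() fails on anything that isn't a real image,
    // so scripts dropped into the temp dir never get promoted.
    if (getimagesize($file) !== false) {
        copy($file, $publicDir . '/' . basename($file));
    }
    unlink($file); // treat the temp dir as volatile either way
}
?>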
-
Apr 17, 2009, 10:42 #31
- Join Date
- Nov 2007
- Location
- Malaga, Spain
- Posts
- 1,072
- Mentioned
- 4 Post(s)
- Tagged
- 0 Thread(s)
crmalibu
Have a script which runs via cron and copies things into the web-accessible dir.
...what should, and should not, be copied over. For example, if your criterion is that only images can be copied over...
The problem that keeps bothering me at night is that PHP can only upload to a 777 folder (on a shared host), so other users on the server can also write there... and that is my main gripe.
-
Apr 17, 2009, 11:06 #32
- Join Date
- Jul 2008
- Posts
- 5,757
- Mentioned
- 0 Post(s)
- Tagged
- 0 Thread(s)
Well, the first thing to do is find out if you can run a cron job. It's not always possible with shared hosts, but if you can, there's probably an option in your control panel. You would need to be able to execute PHP from the command line, so your cron job can directly run the CGI or CLI version of PHP, which hopefully the host has installed.
The most you should really expect out of this is that you will be able to filter which files get copied over. Someone can still manually write to the temp dir, but unless it's an image, it won't be copied over (due to you recreating it with an imagecreatefrom*() function). It sets up a reasonable barrier: to write arbitrary files directly to the final destination dir, they would need to escalate their privileges to your user, or the root user, which is pretty much a catastrophe for the entire server anyway if that happened.
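To illustrate the imagecreatefrom*() recreation mentioned above, a minimal JPEG-only sketch with placeholder paths; decoding and re-saving the image discards any payload that was smuggled inside the original byte stream:

<?php
// Re-encode an upload instead of copying its raw bytes. Paths are placeholders.
$src  = '/home/youruser/uploads_temp/photo.jpg';
$dest = '/home/youruser/public_html/img/photo.jpg';

$img = @imagecreatefromjpeg($src); // returns false if it isn't a valid JPEG
if ($img !== false) {
    imagejpeg($img, $dest, 90); // write a fresh, clean copy at quality 90
    imagedestroy($img);
}
?>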
-
Apr 17, 2009, 12:44 #33
- Join Date
- Nov 2007
- Location
- Malaga, Spain
- Posts
- 1,072
- Mentioned
- 4 Post(s)
- Tagged
- 0 Thread(s)
crmalibu
Have a script which runs via cron and copies things into the web-accessible dir.
This is a very interesting idea, and I will definitely consider it in the future.
crmalibu
Well, the first thing to do is find out if you can run a cron job.
I realize that there is "no way" in the case of this webhost...
unless there is some magical "trick" that someone somewhere might know.
This hosting was recommended and set up for me ages ago, when I was just starting out with web stuff,
so it's pretty "functionless".
It's cheap and has unlimited traffic, though.
crmalibu
...so your cron job can directly run the CGI or CLI version of PHP, which hopefully the host has installed.
The most you should really expect out of this is that you will be able to filter which files get copied over.
...which is pretty much a catastrophe for the entire server anyway if that happened.
And this is OK?
-
Apr 17, 2009, 23:03 #34
- Join Date
- Jul 2008
- Posts
- 5,757
- Mentioned
- 0 Post(s)
- Tagged
- 0 Thread(s)
No, 0777 permissions and similar levels are not something to feel warm and fuzzy about, but this is a common reality with shared hosting, especially the more budget-oriented hosts.
There's a chance you may be able to do it securely through FTP (PHP has FTP functions). The "upload" script could upload via FTP after logging in as your FTP user. That would hopefully create the files as your FTP user, and you can set the dir to be writable only by that FTP user, and readable by all (so the web server can later read/serve the files).
This could be slow, because you have to wait for the FTP upload before your PHP script finishes. The file basically gets uploaded twice (once by the user, from the form, and then again via FTP by your script). But I think it would work.
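A sketch of that FTP loopback, with placeholder host, credentials, and paths; the PHP tmp file from the form upload is re-sent through FTP so the resulting file is owned by your FTP user rather than "nobody":

<?php
// Re-upload the just-received file through FTP as your own account.
$localTmp = $_FILES['userfile']['tmp_name'];
$remote   = 'public_html/img/' . uniqid('upl_', true) . '.jpg';

$conn = ftp_connect('localhost'); // the shared host itself
if ($conn && ftp_login($conn, 'your_ftp_user', 'your_ftp_pass')) {
    // FTP_BINARY matters for images; ASCII mode would corrupt them.
    if (ftp_put($conn, $remote, $localTmp, FTP_BINARY)) {
        echo 'File now owned by your FTP user.';
    }
    ftp_close($conn);
}
?>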
-
Apr 18, 2009, 01:24 #35
- Join Date
- Nov 2007
- Location
- Malaga, Spain
- Posts
- 1,072
- Mentioned
- 4 Post(s)
- Tagged
- 0 Thread(s)
crmalibu
There's a chance you may be able to do it securely through FTP (PHP has FTP functions). The "upload" script could upload via FTP after logging in as your FTP user. That would hopefully create the files as your FTP user, and you can set the dir to be writable only by that FTP user, and readable by all (so the web server can later read/serve the files).
This could be slow, because you have to wait for the FTP upload before your PHP script finishes. The file basically gets uploaded twice (once by the user, from the form, and then again via FTP by your script). But I think it would work.