By default Apache’s timeout is 300 seconds.
I want to allow uploading of large files over HTTP; the problem is that users will likely hit this limit. Raising it would help, but it's a bit of a security issue (most advice says to lower it to 60 seconds).
Is there any way to extend it for specific scripts and/or directories?
I’m not sure, but it looks like you can set a different timeout per virtual host, and possibly per directory.
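Something like this, maybe — a sketch assuming Apache 2.4, with the hostname and path made up. One caveat: as far as I know, `Timeout` is only valid at server or virtual-host level, not inside `<Directory>`, so a dedicated vhost for the upload script may be the way to go.

```apache
# Sketch only: hostname and paths are illustrative, tune the value to taste.
<VirtualHost *:80>
    ServerName uploads.example.com
    DocumentRoot /var/www/uploads

    # Raise the I/O timeout for this vhost only (seconds);
    # the rest of the server keeps the global Timeout.
    Timeout 600
</VirtualHost>
```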
But if it can’t be extended, is there a way to make it behave like wget? I remember getting cut off while downloading a file with wget, and it resumed where it left off as soon as it could connect again. I don’t know if the same thing works the other way around, with uploads.
I’ve run into this too, but found that it’s a PHP timeout rather than an Apache one, so I’m moving this to the PHP board.
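For anyone who lands here with the same issue: if it is the PHP side, these are the usual `php.ini` knobs (the values below are just illustrative).

```ini
; php.ini sketch -- values are illustrative, adjust to your needs
max_execution_time  = 600    ; script runtime limit (seconds)
max_input_time      = 600    ; time allowed to receive/parse the upload
upload_max_filesize = 512M   ; largest single uploaded file
post_max_size       = 520M   ; whole POST body; must exceed upload_max_filesize
```

You can also call `set_time_limit(600);` inside the upload script itself to raise the runtime limit for that one script rather than globally.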