Nov 19, 2008, 11:20 #1
Avoid a fatal error from file_get_contents?
I have a script that uses file_get_contents. However, sometimes the script times out and I get a fatal error. Can I limit file_get_contents execution time and redirect to another page to avoid the fatal error? Please Help. Basically, if the file isn't retrieved within say 10 seconds, I want to redirect.
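One way to sketch this (the URL and redirect target below are placeholders, not from your script): `file_get_contents()` accepts a stream context, and the HTTP context options include a `timeout` in seconds. If the fetch doesn't finish in time it returns `false` rather than hanging until PHP's own `max_execution_time` kills the script, so you can check for `false` and redirect.

```php
<?php
// Sketch: fetch a URL but give up after a fixed number of seconds,
// returning false on failure instead of hanging. Placeholder URL below.
function fetch_with_timeout($url, $seconds = 10)
{
    $context = stream_context_create([
        'http' => ['timeout' => $seconds], // read timeout in seconds
    ]);
    // @ suppresses the warning on failure; failure is signalled by === false.
    return @file_get_contents($url, false, $context);
}

// Usage sketch (placeholder paths):
// if (fetch_with_timeout('http://example.com/data.txt') === false) {
//     header('Location: /too-slow.php'); // redirect instead of fatal erroring
//     exit;
// }
```

Note that the redirect with `header()` has to happen before any output is sent, and the script itself still needs enough `max_execution_time` left to reach the check.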
Last edited by stuffedbuggy; Nov 19, 2008 at 12:25.
Nov 19, 2008, 11:35 #2
Is that file on your own server?
If it is not, and you have persistent network problems, why not use cURL to fetch some metadata about the file first (curl_getinfo, I think it is called) before going to the trouble of downloading the whole thing?
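Something along these lines (a sketch only; the URL is a placeholder, and it assumes the cURL extension is installed): make a HEAD-style request with hard timeouts, then look at what `curl_getinfo()` reports before committing to the full download.

```php
<?php
// Sketch: probe a remote file with cURL before downloading it.
// CURLOPT_NOBODY turns the request into a HEAD request (headers only).
$ch = curl_init('http://example.com/bigfile.zip'); // placeholder URL
curl_setopt($ch, CURLOPT_NOBODY, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5); // give up connecting after 5s
curl_setopt($ch, CURLOPT_TIMEOUT, 10);       // give up entirely after 10s
curl_exec($ch);

$info = curl_getinfo($ch); // transfer metadata as an array
curl_close($ch);

// Useful keys include $info['http_code'], $info['download_content_length'],
// $info['connect_time'] and $info['total_time'].
```

If the probe itself times out or the reported size looks wrong, you can redirect straight away instead of attempting the real fetch.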
Nov 19, 2008, 12:01 #3
How can I predetermine how long it's going to take to get the contents?
I've never had much experience with cURL.
Connect_Time_Real?
Nov 19, 2008, 12:09 #4
Nov 19, 2008, 12:12 #5
Interesting. Do you have an example?
Nov 19, 2008, 22:06 #6
After searching all day, I found this. It may be what I need, but I don't quite understand how to use it. If anyone knows, I'd appreciate it. Here's the link:
http://www.planetmysql.org/entry.php?id=12850
Nov 19, 2008, 22:37 #7