Okay, here’s my problem: I use the file_get_contents function to make a call to another server and return the response. However, their server isn’t all that reliable. Every once in a while I’ll receive this:
Warning: file_get_contents('url'): failed to open stream: HTTP request failed! in 'path info'
Is there any way to set a time limit and stop that warning from being displayed to the user? In other words, I want to call a function if the call fails, without any warnings.
I’m not sure what the return will be if it fails.
Perhaps first you should check if the file exists, then there will be no problems with getting the content.
It’s not that the file doesn’t exist; it’s an ASP script I’m calling (passing variables via GET). If there is a problem with their script, I don’t want my script to time out.
In theory there is nothing you can do because PHP is a single-threaded language, but I have a workaround. First, though, you need to decide what outcome you want if you don’t manage to get those contents.
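The intermediate file itself wasn’t shown in the post; here is a minimal sketch of what it might look like. The URL, filename, and the plain 'ERROR' marker are all placeholders of my own, and the stream-context timeout is one way to get the 10-second cutoff described below:

```php
<?php
// file.ext -- the local intermediate script, fetched over HTTP by the
// main script. It attempts the remote call with its own 10-second
// timeout and echoes either the remote response or an 'ERROR' marker
// that the caller can test for.

function fetch_with_timeout($url, $seconds = 10) {
    // The 'timeout' option in an HTTP stream context caps how long
    // file_get_contents() will wait on the remote server.
    $context = stream_context_create(array('http' => array('timeout' => $seconds)));

    // @ suppresses the warning so nothing leaks to the user on failure.
    $data = @file_get_contents($url, false, $context);

    // No answer (or any failure): send back the error marker instead of data.
    return ($data === false) ? 'ERROR' : $data;
}

// Usage inside the intermediate file:
// echo fetch_with_timeout('http://remote-server.example/post.aspx');
```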
Now that file will do the call, and if there’s no answer it will throw an error after 10 seconds.
Now in your present file do this:
$data = file_get_contents("http://your_domain.tld/file.ext");
Note the http.
So here’s what happens: your file has a 30-second timeout. Your file calls the other local file via HTTP. That local file tries to connect to the remote server. If it fails after ten seconds, it sends an error message back to your script rather than the data. Now all you need is some error checking to see what that local file sent back to your script.
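That error check in the calling script could be as simple as the following sketch. It assumes the intermediate file echoes a plain 'ERROR' string on failure (a marker I made up; any agreed-on sentinel works), and the fallback behaviour is up to you:

```php
<?php
// Inspect whatever the intermediate file sent back.
// 'ERROR' is the assumed failure marker echoed by the intermediate script.
function handle_response($data) {
    if ($data === false || $data === 'ERROR') {
        return null; // the call failed or timed out -- run fallback logic
    }
    return $data;    // real response from the remote server
}

// Usage in the main script:
// $data = @file_get_contents('http://your_domain.tld/file.ext');
// if (handle_response($data) === null) { /* fallback: cached data, message, etc. */ }
```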
The problem with that, bokehman, is that it sets the timeout for the entire script. The reason 30 seconds is the default is that sometimes a script needs that long to run. Ideally it doesn’t, but you don’t want to cut your legs off, do you?
No, it does not. The call is being made over the HTTP protocol, so it’s totally separate. That’s the whole point: the rules of one script don’t affect the other.
Is there a limit to the size of the URL passed to file_get_contents?
I ask because now my script is timing out. I set the limit to 90 seconds (the script should take half that time at the very most).
The URL goes as follows: http://server.com/folder/post.aspx?src={xml string}
The XML string is approximately 1,700 to 3,000 characters, but I receive this:
Warning: file_get_contents(http://server.com/folder/post.aspx?src=(first 416 characters) in /some/path/post.php on line 251
Fatal error: Maximum execution time of 90 seconds exceeded in c:\\wamp\\www\\all-acura\\modules\\Dealership\\post.php on line 251
The URL string always stops at the same point in the XML string.
If you have a better way of posting the XML string and retrieving the response, I’m open to suggestions.
Oh, I’m using the urlencode function to encode the XML string.
P.S. Thank you for your help, I really do appreciate it!