file_get_contents timeout?

Okay, here's my problem: I use the file_get_contents function to make a call to another server and return the response. However, their server isn't all that reliable. Every once in a while I'll receive this:

Warning: file_get_contents('url'): failed to open stream: HTTP request failed! in 'path info'

Is there any way to set a time limit and stop that warning from being displayed to the user? In other words, I want to call a function if the call fails, without any warnings.

Thanks for your help


<?php
@file_get_contents('url');
?>

should do it.

Putting @ in front of the problematic line should hide all warnings arising from its use.
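
If you also want to run a fallback when the call fails (which is what you described), it could look roughly like this; handle_failed_call() is just a made-up name for whatever you want to run instead:

<?php
// @ keeps the warning from being shown to the user
$response = @file_get_contents('url');

if (!$response) {
    handle_failed_call();   // hypothetical fallback function
}
?>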

Thanks guys, if it isn’t able to open a connection what will

$result = @file_get_contents('url');

return?

I think it has to be like this:

@$result = file_get_contents('url');

I’m not sure what the return will be if it fails.
Perhaps first you should check if the file exists, then there will be no problems with getting the content.

You're not telling us enough about the error. Is it just a warning, or is the script timing out?

The @ sign needs to be before the function, not the variable. But wouldn’t this be easiest?

if (!@file_get_contents('url')) {
    echo 'There was an error with file_get_contents';
    exit;
}
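
For the record, file_get_contents() returns false when it can't open the stream, so a strict comparison also keeps an empty (but successful) response from being treated as a failure. A variation on the above:

$result = @file_get_contents('url');   // 'url' is still a placeholder

if ($result === false) {
    // the request failed; run your fallback or logging here
    echo 'There was an error with file_get_contents';
    exit;
}
// otherwise $result holds the response body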

It's not a matter of the file not existing; it's an ASP script that I'm calling (GET protocol for the variables). If there is a problem with their script, I don't want my script to time out.

If there is a problem this is what is returned:
Warning: file_get_contents(http://www.site.com/PingTier/PingTier.aspx?ID=0002&year=2005&make=Ford&model=Mustang&zip=32817) [function.file-get-contents]: failed to open stream: HTTP request failed! in c:\wamp\www\modules\Dealership\step_2.php on line 51

Fatal error: Maximum execution time of 60 seconds exceeded in c:\wamp\www\modules\Dealership\step_2.php on line 51

Will this make two calls or just one? Is there a better way to ping their server?


if (!@file_get_contents('url')) {
    echo 'There was an error with file_get_contents';
    exit;
} else {
    $result = @file_get_contents($pingstr);
}
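
Or would it be better to fetch once and reuse the result, something like this?

$result = @file_get_contents($pingstr);

if ($result === false) {
    echo 'There was an error with file_get_contents';
    exit;
}
// only one call is made; $result holds the response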

In theory there is nothing you can do because PHP is a single-threaded language, but I have a workaround. First, though, you need to decide what outcome you want if you don't manage to get those contents.

What kind of output does the ASP script produce?

All I'd want to do is log that the call failed into a MySQL db.

When the call is properly formatted, it returns an XML document.
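
Something along these lines is all the logging I'd need (the table and column names are just placeholders, and it assumes an open mysql_* connection):

$result = @file_get_contents($pingstr);

if ($result === false) {
    // record the failed call; ping_log is a hypothetical table
    mysql_query("INSERT INTO ping_log (url, failed_at) VALUES ('"
        . mysql_real_escape_string($pingstr) . "', NOW())");
}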

There is a way to do this but it's a bit flaky! At the minute you are using file_get_contents("http://remote.tld/file.ext").

What you need to do is have a second file on your server that does this check.

In that file you would put this code:

<?php
// give this helper file its own, shorter time limit for the remote call
set_time_limit(10);
print file_get_contents("http://remote.tld/file.ext");
?>

Now that file will do the call, and if there is no answer it will throw an error after 10 seconds.

Now in your present file do this:
$data = file_get_contents("http://your_domain.tld/file.ext");
Note the http.

So here's what happens. Your file has a 30-second timeout. Your file calls the other local file over HTTP. That local file tries to connect to the remote server. If it fails after ten seconds, it will send an error message to your script rather than the data. Now all you need is some error checking to see what that local file sent back to your script.
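
The check in your present file could be as simple as this (what the helper file prints on failure depends on your error settings, so the 'Warning' test here is only an assumption):

$data = @file_get_contents("http://your_domain.tld/file.ext");

if ($data === false || strpos($data, 'Warning') !== false) {
    // the local file either failed outright or relayed an error instead of XML
    // log the failure or run your fallback here
} else {
    // $data should be the XML from the remote ASP script
}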

The problem with that, bokehman, is that it sets the timeout for the entire script. The reason why 30 seconds is the default is that sometimes a script needs that long to run. Ideally it doesn't, but you don't want to cut your legs off, do you?

No, it does not. The call is being made over HTTP, so it's totally separate. That's the whole point. The rules of one script don't affect the other.

The other way, of course, is to retrieve the remote file using a socket connection and set a timeout on the stream.
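
Something along these lines, for example; the host and path are placeholders for the real ASP script:

<?php
// connect with a 5-second limit instead of hanging
$fp = @fsockopen('www.site.com', 80, $errno, $errstr, 5);
if (!$fp) {
    exit("Connection failed: $errstr ($errno)");
}

// give up on reads that take longer than 10 seconds
stream_set_timeout($fp, 10);

fwrite($fp, "GET /PingTier/PingTier.aspx?ID=0002 HTTP/1.0\r\n"
    . "Host: www.site.com\r\nConnection: Close\r\n\r\n");

$response = '';
while (!feof($fp)) {
    $response .= fgets($fp, 4096);
    $meta = stream_get_meta_data($fp);
    if ($meta['timed_out']) {
        // the server stopped responding mid-read
        break;
    }
}
fclose($fp);
// $response contains the raw HTTP headers plus the body
?>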

I've been told that since I'm retrieving the entire file, file_get_contents is more efficient than opening a stream.

What is more inefficient: a script that crashes, or one that takes an extra couple of milliseconds to process but has good error control?

good point!

Is there a limit to the size of the URL called in file_get_contents?
Because now my script is timing out. I set it to 90 seconds (the script should take half that time at the very most).
The URL goes as follows: http://server.com/folder/post.aspx?src={xml string}
The XML string is approximately 1,700 - 3,000 characters, but I receive this:


Warning: file_get_contents(http://server.com/folder/post.aspx?src=(first 416 characters) in /some/path/post.php on line 251

Fatal error: Maximum execution time of 90 seconds exceeded in c:\wamp\www\all-acura\modules\Dealership\post.php on line 251

The URL string always stops at the same point in the XML string.
If you have a better way of posting the XML string and retrieving the response, I'm open to suggestions.

Oh, I'm using the urlencode function to encode the XML string.
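
Roughly, the call is being built like this (the variable names are made up):

// $xml_string holds the raw XML document (1,700 - 3,000 characters)
$url = 'http://server.com/folder/post.aspx?src=' . urlencode($xml_string);
$result = @file_get_contents($url);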

PS. Thank you for your help, I really do appreciate it!

It shouldn't matter how long the file is.

  • Are you doing everything correctly when reading the file?

  • Have you tested it on your local test server?

  • Have you tested your script with a normal XML file? (A quick way to do that is sketched below.)
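
For that last point, a quick local sanity check could look like this (the file name is made up):

// put a known-good XML document next to the script and run it through
// the same code that normally handles the remote response
$xml = file_get_contents('sample-response.xml');
if ($xml === false) {
    exit('Could not read the local test file');
}
// ...feed $xml into the existing parsing/logging code here...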