So I’ve got a bit of code that saves some modifications to a file:
$fh = fopen('index.php', 'w');
if ($fh) {
    // Write the modified script back out, line by line.
    for ($line = 0; $line < count($index); $line++) {
        fwrite($fh, $index[$line]);
    }
    fclose($fh);
    $messages[] = array("success", "The script was modified to enable $include.");
    return TRUE;
}
else {
    // Could not save the script -- probably a permissions error.
    // Tell the user what the file should look like.
    $message = "Could not automatically enable $include. Modify <b>index.php</b> to look like this:\n\n"
        . showCode($index, $start);
    $messages[] = array("error", $message);
    return FALSE;
}
I’m not getting an error on fopen or fwrite, but it’s clear the file isn’t being written out. It seems to be a server issue, since the script works on another server; however, I don’t know what to look for.
What I guess I’m asking is, “is there a runtime configuration or other directive that would keep the script from writing the file out on this particular server?”
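For what it’s worth, fopen() succeeding doesn’t guarantee the writes succeed, so it’s worth checking every return value before blaming the server. A minimal diagnostic sketch, with the checks and die() calls purely illustrative:

$file = 'index.php';

// The web server user must be able to write the file.
if (!is_writable($file)) {
    die("$file is not writable by the web server user.");
}

$fh = fopen($file, 'w');
if ($fh === FALSE) {
    die('fopen failed: ' . print_r(error_get_last(), TRUE));
}

foreach ($index as $line) {
    // fwrite returns the number of bytes written, or FALSE on failure.
    $bytes = fwrite($fh, $line);
    if ($bytes === FALSE || $bytes < strlen($line)) {
        die('fwrite fell short: ' . var_export($bytes, TRUE));
    }
}

fclose($fh);
clearstatcache();  // stat results are cached per request
echo "$file is now " . filesize($file) . " bytes";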
!&*@$~!$!!! This issue showed up last month with a different piece of software and I didn’t correlate the two. For some !@$# reason both Safari and Firefox are caching pages from this server – or the server itself is caching the pages.
In today’s issue, either the browser wasn’t requesting the script again or the server wasn’t running it again. I want to blame the server, since this problem doesn’t occur with other servers.
I’m not familiar with cache control and PHP – how can I force the server (or browser) to run the script again?
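For reference, the standard way to do this from PHP itself is to send the cache headers at the top of the script, before any output; this is the usual belt-and-suspenders combination (the Expires value is just an arbitrary date in the past):

// Must run before any output, or header() will warn and do nothing.
header('Cache-Control: no-cache, no-store, must-revalidate');  // HTTP/1.1 caches
header('Pragma: no-cache');                                    // HTTP/1.0 caches
header('Expires: Thu, 01 Jan 1970 00:00:00 GMT');              // old proxies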
Didn’t get a lot of help from my web host. It seems the server my account was recently moved to is sending “Cache-Control: max-age=1209600” with every page response. Using:
Header set Cache-Control "no-cache"
in my .htaccess file overrides that, but they can’t tell me why the server is behaving badly.
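If anyone needs to verify the same thing, the response headers can be checked from PHP directly rather than trusting the browser; a small sketch (the URL is a placeholder):

// Fetch only the response headers from the suspect server.
$headers = get_headers('http://example.com/index.php', 1);
echo isset($headers['Cache-Control'])
    ? 'Cache-Control: ' . print_r($headers['Cache-Control'], TRUE)
    : '(no Cache-Control header sent)';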