Changing PHP settings with ini_set

I am following this link

I am reproducing the code snippet here →

// Make sure the script can handle large folders/files
ini_set('max_execution_time', 600);
ini_set('memory_limit','1024M');

If we go to the official PHP page →
https://www.php.net/manual/en/info.configuration.php#ini.max-execution-time

It says →

You can not change this setting with ini_set() when running in safe mode. The only workaround is to turn off safe mode or by changing the time limit in the php.ini.

But if we browse further through their internal links, we find that safe mode is deprecated →
https://www.php.net/manual/en/ini.sect.safe-mode.php#ini.safe-mode

My Question:
So if we are trying to build something in PHP, we should not be worried about “You can not change this setting with ini_set() when running in [safe mode]”, as long as we are sure that a recent version of PHP is installed on the server?

Going Further,

Your web server can have other timeout configurations that may also interrupt PHP execution. Apache has a Timeout directive and IIS has a CGI timeout function. Both default to 300 seconds. See your web server documentation for specific details.

What does that mean?

// Make sure the script can handle large folders/files
ini_set('max_execution_time', 600);
ini_set('memory_limit','1024M');

Does that mean that hosting companies have their own ways to impose restrictions on top of what we are doing in the code above?

This recent Topic may be of interest:

Hey there @John_Betong, thanks for sharing, but can we discuss a few points that I raised in my OP?

Some parameters changed in php.ini require a systemctl restart apache2 to take effect.

This topic is more appropriate - please run the script and notice the results:

No, safe mode is removed. Therefore:

You can’t ever be in safe mode in PHP >=5.4, so this statement is moot when your environment is 5.4 or later.
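A minimal sketch of that check, assuming nothing beyond the built-in PHP_VERSION_ID constant (available on anything remotely recent):

// Safe mode was removed in PHP 5.4, so on any current install ini_set()
// cannot be blocked by it. Only on very old installs is the manual's
// warning still relevant.
if (PHP_VERSION_ID >= 50400) {
    ini_set('max_execution_time', 600);   // safe mode cannot interfere here
} else {
    // Ancient PHP: safe mode might still be enabled and block the change.
    trigger_error('PHP < 5.4 detected; safe mode restrictions may apply', E_USER_WARNING);
}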

Basically: yes

PHP does not fully handle the HTTP request; a web server does (the one most used with shared hosting is Apache). Any HTTP request hits Apache first, and if Apache decides after 30 seconds that that’s enough, then it doesn’t matter what the timeout in your PHP is: the process will be stopped.
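As an illustration, a small sketch of what can and cannot be inspected from inside PHP (only real built-ins are used here):

// PHP's own limits are readable via ini_get()...
var_dump(ini_get('max_execution_time'));   // e.g. "30", or "600" after ini_set()
var_dump(ini_get('memory_limit'));         // e.g. "128M"

// ...but Apache's Timeout directive (or IIS's CGI timeout) is not a PHP
// ini setting, so it cannot be read here at all. If that web-server limit
// is lower than PHP's, it wins, exactly as described above.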

I believe/assume that in the case of dedicated hosting that limitation won’t be there?

Additionally, large sites are not (actually) hosted on shared hosting. The marketing gimmick is that everything is unlimited, but as storage and traffic increase they show that there is no free lunch (= no unlimited, dude).

I will wait for your input.

You always need a web server; PHP cannot serve HTTP requests by itself.
So you either have Apache, or Nginx, or something similar in front.

If you increase the timeout in PHP, you need to increase the timeout there too.
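A rough sketch of the two places involved, assuming Apache is the web server in front (the exact config file and restart command vary by distribution):

// 1) PHP side - raise PHP's own limits (in the script or in php.ini):
ini_set('max_execution_time', 600);
ini_set('memory_limit', '1024M');

// 2) Web server side - Apache's core Timeout directive, e.g. in httpd.conf
//    or the relevant virtual host (300 seconds by default according to the
//    manual excerpt quoted earlier), followed by a reload/restart of Apache:
//
//        Timeout 600
//
//    Without this second step, the request can still be cut off by Apache
//    before PHP's 600-second limit is ever reached.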

Why do you want to increase the timeout anyway?

For the purpose of learning, I was creating a backup of the folder and database of a WordPress installation.

I would highly recommend doing that on the command line instead 🙂
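For example, a minimal sketch of that approach, assuming shell access is available (shell_exec() is not disabled) and that mysqldump and tar are installed; the credentials and paths below are placeholders, and the same two commands can just as well be run directly from a shell or a cron job:

// Hypothetical credentials/paths - adjust to the actual WordPress install.
$dbUser    = 'wp_user';
$dbPass    = 'secret';
$dbName    = 'wordpress';
$siteDir   = '/var/www/example.com';
$backupDir = '/var/backups';

// Dump the database with mysqldump (the heavy lifting happens outside PHP).
shell_exec(sprintf(
    'mysqldump -u %s -p%s %s > %s',
    escapeshellarg($dbUser),
    escapeshellarg($dbPass),
    escapeshellarg($dbName),
    escapeshellarg($backupDir . '/db.sql')
));

// Archive the WordPress files with tar.
shell_exec(sprintf(
    'tar -czf %s -C %s .',
    escapeshellarg($backupDir . '/files.tar.gz'),
    escapeshellarg($siteDir)
));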

Hi there @rpkamp, are command lines also available on shared hosting?

Depends on your hoster. Some offer it, some don’t. You should ask them.

Unsurprisingly, most web hosts don’t want you to be able to tell your script to never time out and run forever, consuming their server’s CPU and RAM resources indefinitely to try and serve your webpage.
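For reference, “never time out” corresponds to a zero limit on the PHP side; a quick sketch of what hosts are typically guarding against:

// 0 means "no execution time limit at all" - i.e. run until finished.
set_time_limit(0);
// The same setting, expressed via ini_set():
ini_set('max_execution_time', 0);

// On shared hosting these calls are commonly restricted or simply ignored,
// for exactly the reason given above: an unbounded script can tie up the
// server's CPU and RAM indefinitely.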

But there are various backup plugins that can back up not just GBs or TBs but even PBs of data, including the database.

I sincerely doubt anyone is backing up Petabytes of data via a PHP script. Maybe an actual program that gets STARTED by a PHP script; but that said, anyone who is backing up terabytes/petabytes of data would own the server that they are using, and thus would have the ability to enable an infinite script because they own the server.

Hi there, they claim to take backups of up to 1 TB, even on most shared hosting.

Saying you have a max size of 1 PB is like me saying my house can fit 1,000 people, or that a car has a top speed of 300 mph.

Is it true? Sure. Is anyone using it?

It may be a gimmick, and it may not be possible unless someone is on dedicated hosting or has full access to the server. I am naive and may lack full understanding; I just shared it in case we can brainstorm any such possibility.

Nice analogy. You nailed it and put it in simple words.

Exactly.

If you know of a shared hosting site that will 1) let you have a Petabyte of storage space, and 2) then make a backup copy of that entire Petabyte (so now you’re using up at least 1.4 PB) using a PHP script that 3) doesn’t mind you using their server’s resources for hours on end…

True, and shared hosting providers will actually never allow such a huge website on shared hosting. Unlimited is a trap: exceed a certain number and they will say get out of our hosting, or move to a dedicated cloud, etc.

There is no such thing as a free lunch, and unlimited is comparable to a free lunch.

Even if you just exceed a certain amount of traffic, an email will come saying your website is abusing our servers and is a danger to stability and scalability.