My hosting is shared, and their rule caps set_time_limit at 30 seconds. I have already tried changing it in several ways, in cPanel and in .htaccess, but I have many lines in different files to save.
Currently I am splitting the contents into several smaller files so as not to exceed the time limit:
$lines = file(get_template_directory() . '/lines1.csv', FILE_IGNORE_NEW_LINES); // filesystem path; get_template_directory_uri() returns a URL, which is not what file() should be given here
foreach ($lines as $line_num => $line) {
    // here is some code to save each line's content
}
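The splitting can be sketched as fixed-size batches instead of separate physical files, so each web request stays under the limit. This is only a sketch: save_line() is a hypothetical placeholder for the real per-line save code, and in a theme the path would normally come from get_template_directory() (a filesystem path), not get_template_directory_uri() (a URL).

```php
<?php
// Sketch: process one large CSV in fixed-size batches so each web request
// stays under the 30-second limit. save_line() is a placeholder for the
// real per-line save logic.

function save_line(array $fields): void
{
    // placeholder: replace with the actual save code
}

function import_batch(string $file, int $offset, int $batch): int
{
    $lines = file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    foreach (array_slice($lines, $offset, $batch) as $line) {
        save_line(str_getcsv($line)); // split one CSV row into its fields
    }
    $next = $offset + $batch;
    return $next < count($lines) ? $next : -1; // -1 means the file is done
}
```

Each request processes one batch and hands the returned offset to the next request, so no single run has to finish the whole file.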
But someone told me to use this code instead:
exec("php csv_import.php > /dev/null &");
That would process a single .csv file in the background instead of several files, without hitting the time limit.
This is the first time I have seen shell commands used with PHP, and I am not sure how it works.
Do I just create a csv_import.php file with normal PHP code? And how do I run it in my server's shell?
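For reference, a minimal sketch of what such a csv_import.php could look like. It is an assumption about the setup: save_line() and the CSV path are placeholders for the real import logic.

```php
<?php
// csv_import.php — sketch of a standalone script meant to be run from the
// command line, where PHP's max_execution_time does not apply.
// save_line() and the CSV path are placeholders for the real import.

function save_line(array $fields): void
{
    // placeholder: replace with the actual save code
}

function run_import(string $file): int
{
    set_time_limit(0);            // explicit; the CLI has no limit anyway
    $handle = @fopen($file, 'r');
    if ($handle === false) {
        return -1;                // file could not be opened
    }
    $saved = 0;
    while (($fields = fgetcsv($handle)) !== false) {
        save_line($fields);       // one parsed CSV row at a time
        $saved++;
    }
    fclose($handle);
    return $saved;                // number of rows processed
}

// Invoked as: php csv_import.php /path/to/lines.csv
if (PHP_SAPI === 'cli' && isset($argv[1])) {
    echo run_import($argv[1]) . " lines imported\n";
}
```

The web-facing page then only needs exec('php ' . __DIR__ . '/csv_import.php > /dev/null 2>&1 &'); to start the import in the background and return immediately. This assumes the host has not disabled exec(); many shared hosts do.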
Thank you very much, that was very enlightening.
But I was already splitting the CSV file into several files, and I had asked in another forum how I could run it as a single file. I received this answer:
Call it in the background; command-line scripts have no time limit: exec("php csv_import.php > /dev/null &");
True, the script running on the command line itself doesn't have a time limit, but the script that calls it is not on the command line, so it is still subject to the limit.
Do you have the ability to install cron jobs on your hosting?
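If so, a hypothetical crontab entry could look like this; the php binary path, script path, and schedule are all examples and must be adjusted to the account.

```shell
# Hypothetical crontab entry (installed with `crontab -e`, or through
# cPanel's "Cron Jobs" page): run the import every day at 03:00.
0 3 * * * /usr/bin/php /home/youruser/public_html/csv_import.php > /dev/null 2>&1
```

Cron jobs run outside the web server, so the 30-second web limit does not apply to them either.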