Hi all,
I have a 17 MB database with 40,000+ rows that I process with a script called update.php. The script uses 108 MB of memory just pulling in all the rows, and 111 MB for the entire run, so I recently had to raise memory_limit to 150 MB.
Can you look at this code and tell me if there is a way to optimize it to decrease memory usage? The following function alone accounts for 108 MB.
function PullDatabase($database) {
    // $database is used directly as the table name in the query
    $result = mysql_query("SELECT * FROM $database") or die(mysql_error());
    $data = array(); // initialize so an empty result set doesn't return an undefined variable
    while ($row = mysql_fetch_assoc($result)) {
        // quote the key: a bare recorded_at is treated as an undefined constant
        $row['recorded_at'] = UnixTime($row['recorded_at']);
        $data[] = $row;
    }
    return $data;
}

$data = PullDatabase('dbname');
Is there any way to optimize this function while keeping the same output?
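One direction I've been wondering about, if it helps to show it: processing each row as it streams in rather than buffering the whole table into $data. This is only a sketch and assumes the later processing can be refactored to work row-by-row; ProcessRow() is a made-up stand-in for that processing.

```php
// Sketch: stream rows instead of building a 108 MB array.
// mysql_unbuffered_query() leaves the result set on the server and
// fetches one row at a time, so PHP only holds the current row.
function ProcessDatabase($database) {
    $result = mysql_unbuffered_query("SELECT * FROM $database") or die(mysql_error());
    while ($row = mysql_fetch_assoc($result)) {
        $row['recorded_at'] = UnixTime($row['recorded_at']);
        ProcessRow($row); // hypothetical per-row handler replacing the array-based functions
    }
    mysql_free_result($result);
}
```

One caveat I'm aware of: with an unbuffered query you can't issue another query on the same connection until all rows have been fetched or the result is freed.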
Once the data is pulled into my script, I run it through about 15 functions that each extract something from it. I figured it would be less intensive to pull all rows/columns once and reuse the $data array in every function, even when a function doesn't need all of the selected rows.
Would it be better to query the database individually for each function, selecting only the required columns?
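For example, would something like this be the better pattern? (The table and column names here are invented; the point is pushing the work into SQL so only one value crosses the wire instead of 40,000 rows.)

```php
// Sketch: each function runs its own narrow query instead of
// scanning the shared $data array. Aggregation happens in MySQL,
// so PHP never holds the rows at all.
function AverageReading($table) {
    $sql = "SELECT AVG(reading) FROM $table"; // 'reading' is a hypothetical column
    $result = mysql_query($sql) or die(mysql_error());
    return mysql_result($result, 0); // single scalar back from the server
}
```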
Thanks
Brandon