Hi,
I'm inserting a 4000-line text file of end-of-day stock data. The file is ~170KB.
The code below inserts the lines fine (albeit taking around 5 minutes), but it always inserts 3 or 4 times as many rows as the file contains (meaning every row ends up duplicated 3 or 4 times). So it inserts 12,000 rows instead of 4,000, for example.
Besides inserting too many rows, it almost always returns a blank page. The server is Zeus, I think, running PHP 5.2.14 and an outdated MySQL.
Any ideas?
<?php
function insert($file) {
    // Increase the allowed execution time
    ini_set('max_execution_time', 300);

    if (($handle = fopen($file, "r")) !== FALSE) {
        // Build the VALUES list for one large multi-row INSERT
        $values = "";
        while (($pieces = fgetcsv($handle, 100, ",")) !== FALSE) {
            $values .= "('" . example($pieces[0]) . "', '" . $pieces[1] . "', '" . $pieces[2] . "', '" . $pieces[3] . "', '" . $pieces[4] . "', '" . $pieces[5] . "'),";
        }

        $string = "INSERT INTO `table` (value1, value2, value3, value4, value5, value6) VALUES ";
        // Remove the trailing comma from the VALUES list
        $values = substr_replace($values, '', -1, 1);
        // Combine the strings into the final query
        $query = $string . $values;

        $insert = mysql_query($query);
        if (!$insert) {
            return FALSE;
        } else {
            return TRUE;
        }
    }
}

if (insert("example.txt") == TRUE) {
    echo "Success";
} else {
    echo "Failure";
}
?>
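
In case it helps anyone spot the problem, this is a stripped-down debug version I'm planning to try next, with error display turned on and mysql_error() echoed so I can at least see what MySQL says instead of getting a blank page. It assumes a mysql_connect()/mysql_select_db() call has already been made elsewhere (as with the code above), that every CSV line has exactly six fields, and it leaves out my example() helper; the table and column names are the same placeholders as above.

<?php
// Debug variant: same batched INSERT, but with errors made visible.
// Assumes mysql_connect() and mysql_select_db() have already run, as in the script above.
error_reporting(E_ALL);
ini_set('display_errors', '1');
ini_set('max_execution_time', 300);

function insert_debug($file) {
    if (($handle = fopen($file, "r")) === FALSE) {
        echo "Could not open $file";
        return FALSE;
    }

    $rows = array();
    while (($pieces = fgetcsv($handle, 100, ",")) !== FALSE) {
        // Escape every field before it is placed in the query string
        $pieces = array_map('mysql_real_escape_string', $pieces);
        // Each CSV line is assumed to contain exactly six fields
        $rows[] = "('" . implode("', '", $pieces) . "')";
    }
    fclose($handle);

    if (empty($rows)) {
        echo "No rows parsed from $file";
        return FALSE;
    }

    $query = "INSERT INTO `table` (value1, value2, value3, value4, value5, value6) VALUES "
           . implode(",", $rows);

    if (!mysql_query($query)) {
        // Show the actual MySQL error instead of a blank page
        echo "MySQL error: " . mysql_error();
        return FALSE;
    }

    echo "Inserted " . mysql_affected_rows() . " rows";
    return TRUE;
}

insert_debug("example.txt");
?>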