Inserting 3 times the rows and blank page return


I'm inserting a 4000-line text file with end-of-day stock data. The file is ~170KB.

The code I run inserts the lines fine (albeit taking around 5 minutes), but it always inserts 3 or 4 times the rows (meaning each row appears 3 or 4 times), so 12000 rows instead of 4000, for example.

Besides inserting too many rows, it almost always returns a blank page. The server is Zeus, I think, running PHP 5.2.14 and an outdated MySQL.

Any ideas?

function insert($file) {

    //Increase the allowed execution time
    ini_set('max_execution_time', 300);

    if (($handle = fopen($file, "r")) !== FALSE) {

        //Accumulate one parenthesized value set per CSV line
        $values = "";

        while (($pieces = fgetcsv($handle, 100, ",")) !== FALSE) {
            //Adds new values into one large insert
            $values .= "('". example($pieces[0]) ."', '". $pieces[1] ."', '". $pieces[2] ."', '". $pieces[3] ."', '". $pieces[4] ."', '". $pieces[5] ."'),";
        }
        fclose($handle);

        $string = "INSERT INTO `table` (value1, value2, value3, value4, value5, value6) VALUES ";
        //Deleting the last comma in values
        $values = substr_replace($values, '', -1, 1);
        //Combine the strings and run a single query
        $query = $string . $values;
        $insert = mysql_query($query);
        if (!$insert) {
            return FALSE;
        } else {
            return TRUE;
        }
    }
    return FALSE;
}

if (insert("example.txt") == TRUE) {
    echo "Success";
} else {
    echo "Failure";
}

Do an

echo $query; die();

just above where you run the query and you should see the problem with your query.

And it’ll be easier to debug if you use a smaller input file with, say, just 10–20 rows of data.
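As an aside, a single 4000-row INSERT can also run into MySQL's `max_allowed_packet` limit and fail silently, which would explain a blank page. A sketch of flushing the rows in batches instead (table and column names are taken from the question; the batch size of 500 is an arbitrary choice, and I've escaped every field with `mysql_real_escape_string()` rather than the question's `example()` helper — adjust as needed):

```php
<?php
// Sketch: flush the accumulated VALUES every $batchSize rows instead of
// building one giant INSERT. Batch size is an assumption; tune it.
function insert_batched($file, $batchSize = 500) {
    if (($handle = fopen($file, "r")) === FALSE) {
        return FALSE;
    }
    $prefix = "INSERT INTO `table` (value1, value2, value3, value4, value5, value6) VALUES ";
    $rows = array();
    while (($pieces = fgetcsv($handle, 100, ",")) !== FALSE) {
        // Escape each field before quoting it into the statement
        $rows[] = "('" . implode("', '", array_map('mysql_real_escape_string', $pieces)) . "')";
        if (count($rows) >= $batchSize) {
            if (!mysql_query($prefix . implode(",", $rows))) {
                fclose($handle);
                return FALSE;
            }
            $rows = array();
        }
    }
    fclose($handle);
    // Flush whatever is left over after the last full batch
    if (count($rows) > 0 && !mysql_query($prefix . implode(",", $rows))) {
        return FALSE;
    }
    return TRUE;
}
```

Smaller statements also mean that if one batch fails, `mysql_error()` points you at a 500-row window instead of the whole file.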

No errors displayed. It’s down to just one duplicate of each entry now.

When I test on WAMP with a smaller file (400 rows) it works perfectly.

I added:

ini_set('auto_detect_line_endings', true);

thinking it was failing to read the end of the file, but to no avail.
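One thing to check: `auto_detect_line_endings` only affects streams opened after it is set, so the `ini_set()` call has to come before `fopen()` — inside `insert()`, not after it. A minimal sketch (file name taken from the question):

```php
<?php
// auto_detect_line_endings is consulted when the stream is read by
// fgets/fgetcsv, so it must be enabled before the file is opened.
ini_set('auto_detect_line_endings', true);

$handle = fopen("example.txt", "r");
while (($pieces = fgetcsv($handle, 100, ",")) !== FALSE) {
    // process $pieces ...
}
fclose($handle);
```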

Anyone else?