I am trying to store some large files inside the database (they are emails, with attachments). I know this isn't necessarily a great idea, but there are a number of reasons why I need to do this.

Now, inserting a large file works fine on its own. The problem is that I am trying to take account of a low max_allowed_packet setting.
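
For reference, I read the current limit with:

SHOW VARIABLES LIKE 'max_allowed_packet';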

So what I am doing is taking the first 90% of max_allowed_packet worth of the data and inserting that to create the row. Then I split the remainder into chunks of the same size and, for each chunk, run:

UPDATE email SET source = CONCAT(source, $variable) WHERE id = $id
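
In PHP terms, what my code does is roughly this (a simplified sketch, not my exact code; $data holds the full message source and $maxAllowedPacket comes from the SHOW VARIABLES query above):

$chunkSize = (int) ($maxAllowedPacket * 0.9); // stay safely under the packet limit

// create the row with the first chunk
$chunk = mysql_real_escape_string(substr($data, 0, $chunkSize));
mysql_query("INSERT INTO email (source) VALUES ('$chunk')");
$id = mysql_insert_id();

// append the remaining chunks, one UPDATE per chunk
for ($pos = $chunkSize; $pos < strlen($data); $pos += $chunkSize) {
    $chunk = mysql_real_escape_string(substr($data, $pos, $chunkSize));
    mysql_query("UPDATE email SET source = CONCAT(source, '$chunk') WHERE id = $id");
}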

This should work simply enough, but I am getting very weird behaviour.

It's hard to replicate exactly what is going on, but I often end up with either nothing in the source column, or only a small amount of data (possibly from the last query).

I tried running the generated queries directly through phpMyAdmin. The first query works fine. The second query, which appends another 900 KB of data, appears to do very little: the reported data size increases, but I also get a very big increase in "overhead". If I then optimize the table, I often lose all the data.

I have the column type set to TEXT. I have also tried LONGTEXT, BLOB and LONGBLOB.
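
Each time I switched types with something like:

ALTER TABLE email MODIFY source LONGBLOB;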

BLOB is even weirder: it seems to store only 64 KB of data, and running the UPDATE/CONCAT queries still leaves me with only 64 KB. (I assume that is the 65,535-byte limit of the BLOB type itself.)
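
To see how much actually made it in, I check with a query like:

SELECT id, LENGTH(source) FROM email WHERE id = $id;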

This is on MySQL 4.0.21.

Is there a bug I don't know about (I couldn't find one reported for MySQL), or am I doing something wrong?