This is my script:
UPDATE du_raw2, cris3 SET du_raw2.sn2 = cris3.sn WHERE du_raw2.key1 LIKE CONCAT('% ', cris3.sn, ' %')
cris3.sn is varchar(30) and has been indexed
du_raw2.key1 is varchar(30)
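One thing worth checking: because the LIKE pattern is built per row from cris3.sn (and starts with a wildcard), MySQL can't use an index lookup for the comparison, so every key1 value is tested against every street name. A quick way to confirm is to look at the plan for the equivalent SELECT (same tables and columns as above; the exact output depends on your server version):

EXPLAIN
SELECT du_raw2.key1, cris3.sn
FROM du_raw2, cris3
WHERE du_raw2.key1 LIKE CONCAT('% ', cris3.sn, ' %');

If the plan shows full scans of both tables, the work grows as (rows in du_raw2) × (rows in cris3), which would match the slowdown as the row count climbs.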
It successfully updates 1000 rows in a second.
It successfully updates 2000 rows in three seconds.
It times out somewhere between 2000 and 3000 rows.
I isolated rows 2000-3000 and successfully updated them to make sure there wasn’t an issue with the input.
There are no errors except for the time out.
What do you think the cause might be?
I think I’m hitting a limit that looks like a time limit, but it’s probably something else. What, I don’t know.
My script isn’t in a loop (yet). I have 100,000 rows to process. I’m just peeling it back to figure out where it chokes. It’s not the data; it’s the number of rows and/or the way I’m applying the code.
I’m using it from PHP, but I doubt PHP is the problem; I’ve already played with set_time_limit().
The query log says the connections are opened, but never closed.
My code parses mailing addresses. It targets the key1 column in the table du_raw2. The key1 column contains addresses (all the address components on a single line).
cris3 contains all the street name possibilities for a specific city. There are 2000 of them.
The script scans each address in the target table (when it has fewer than 2000 rows) and returns the street name if it matches one of the 2000 street-name possibilities.
When the target table has 3,000 rows, it times out. I suppose I could work around the problem with a while loop; it would only need to run 50 times.
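A sketch of that batching idea, assuming du_raw2 has an auto-increment id column (that column is my assumption; substitute whatever unique key the table actually has):

-- One pass of the loop: update 2,000 rows at a time.
-- The BETWEEN bounds would advance by 2000 on each iteration.
UPDATE du_raw2, cris3
SET du_raw2.sn2 = cris3.sn
WHERE du_raw2.id BETWEEN 1 AND 2000   -- hypothetical id column
  AND du_raw2.key1 LIKE CONCAT('% ', cris3.sn, ' %');

Each pass stays under the timeout, but note the total work is unchanged; the batching only spreads it across statements.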