Tackling a table with duplicates using LIMIT
The logins table has over 400,000 rows, and the query below was timing out before it could ever finish. So I added the LIMIT 100 to see if that would help. It didn't, and it also didn't help when I moved the LIMIT 100 after the last closing parenthesis. By the way, this same type of query worked fine on some other tables. Any thoughts?
#Create the temporary table to store the IDs of the elements to erase.
CREATE TEMPORARY TABLE ids (loginID int);
#Find the IDs of the elements to erase:
INSERT INTO ids (loginID)
SELECT loginID FROM logins AS t
WHERE 1 != (SELECT COUNT(*) FROM logins
            WHERE logins.loginID <= t.loginID
              AND logins.uID = t.uID
              AND logins.tempDate = t.tempDate LIMIT 100);
#Delete the elements of the table with the previously selected indexes
DELETE logins FROM logins JOIN ids ON logins.loginID = ids.loginID;
#Remove the temporary table
DROP TABLE ids;
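For what it's worth, the duplicate-detection logic itself (without the LIMIT) does behave as intended: a row counts as a duplicate when it is not the first row (smallest loginID) for its (uID, tempDate) pair. Here is a small sketch of that logic on an in-memory SQLite table; the sample rows are made up for illustration:

```python
import sqlite3

# Miniature stand-in for the logins table (sample data is invented).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE logins (loginID INTEGER, uID INTEGER, tempDate TEXT);
INSERT INTO logins VALUES
  (1, 10, '2020-01-01'),
  (2, 10, '2020-01-01'),  -- duplicate of loginID 1
  (3, 11, '2020-01-02'),
  (4, 10, '2020-01-01');  -- another duplicate of loginID 1
""")

# Same correlated-subquery shape as in the question: a row is kept only
# when the count of rows with a smaller-or-equal loginID in its
# (uID, tempDate) group is exactly 1, i.e. it is the group's first row.
conn.execute("""
CREATE TEMPORARY TABLE ids AS
SELECT loginID FROM logins AS t
WHERE 1 != (SELECT COUNT(*) FROM logins
            WHERE logins.loginID <= t.loginID
              AND logins.uID = t.uID
              AND logins.tempDate = t.tempDate)
""")

# SQLite has no multi-table DELETE, so an IN subquery stands in for the
# MySQL "DELETE logins FROM logins, ids ..." form.
conn.execute("DELETE FROM logins WHERE loginID IN (SELECT loginID FROM ids)")

remaining = [r[0] for r in conn.execute("SELECT loginID FROM logins ORDER BY loginID")]
print(remaining)  # [1, 3] -- the duplicates (2 and 4) are gone
```

Note that the correlated COUNT(*) subquery is re-evaluated per row, which is roughly O(n^2) over 400,000 rows; that, rather than the result size, is the likely reason the LIMIT makes no difference.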