Tackling a table with duplicates using LIMIT

The logins table has over 400,000 rows. The query below was timing out before it could ever finish, so I added the LIMIT 100 to see if that would help. It didn't, and it also didn't help when I moved the LIMIT 100 after the last closing parenthesis. The same type of query works fine on some other tables. Any thoughts?

Thanks!

#Create the temporary table to store the IDs of the elements to erase.
CREATE TEMPORARY TABLE ids (loginID int);

#Find the IDs of the elements to erase:
INSERT INTO ids(loginID) SELECT loginID FROM logins AS t
WHERE 1 != (SELECT COUNT(*) FROM logins
WHERE logins.loginID <= t.loginID
AND logins.uID = t.uID
AND logins.tempDate = t.tempDate LIMIT 100);

#Delete the elements of the table with the previously selected indexes
DELETE logins FROM logins,ids WHERE logins.loginID = ids.loginID;

#Remove the temporary table
DROP TABLE ids;

End of the day and I’m slipping… 400k records shouldn't be enough to cause a freeze unless something's being written wrong or an index isn't being hit. What constitutes a dupe? uID and tempDate?
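If it's the index, a composite index covering the columns the correlated subquery filters on should do it. A minimal sketch, assuming dupes are keyed on uID and tempDate (the index name here is made up):

#Index so the correlated COUNT(*) can be resolved from the index instead of scanning the table
ALTER TABLE logins ADD INDEX idx_uid_tempdate_loginid (uID, tempDate, loginID);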

I had forgotten the index; with it added, all is working fine now. Thank you.
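A quick sanity check after the delete (just a sketch, using the same column names as the query above) is to confirm that no (uID, tempDate) pair still appears more than once:

#Sanity check: should return no rows once the duplicates are gone
SELECT uID, tempDate, COUNT(*) AS cnt
FROM logins
GROUP BY uID, tempDate
HAVING cnt > 1;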