I run a connection testing site, and when my users run a test, various aspects of that test are logged. This is a very new feature on my site, and I am not an expert in SQL, but I am getting better every day. The new DB has already grown to 1.5 million rows, but with the indexes I have, most routine queries run in well under 0.3 sec, and the more complex ones (or scripts that run multiple queries) take about 0.5-1.1 sec -- all thanks to our buddy INDEXING. BUT the following query is still taking too much time.

This query takes 5-6 seconds to complete, even though I have a very fast dedicated server (dual Xeon 2.8 GHz) and User_Host is indexed.
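For reference, the index on User_Host is just a single-column one, something like this (the name idx_user_host is only a placeholder for whatever I actually called it):

-- single-column index on the grouping column (index name is approximate)
CREATE INDEX idx_user_host ON `Master` (`User_Host`);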


SELECT `User_Host`,
       ROUND(AVG(`Test_Score`)) AS Score,
       COUNT(`User_Host`) AS Count
FROM `Master`
WHERE `Date` >= '$date_0'
  AND `Test_Type` = 'download'
GROUP BY `User_Host`
HAVING Count >= 1000
ORDER BY Score DESC
LIMIT 10
The end result is a list of ISPs ordered by speed (based upon tests performed at my site).

Example output:

Top Ten Providers (ordered by Download Speed)

Rank  Provider        Average Download Speed   ~30 Day Test Count
 1    optonline.net   4505 Kbps or 550 KB/s     1699
 2    cox.net         3253 Kbps or 397 KB/s    10480
 3    insightBB.com   3108 Kbps or 379 KB/s     1666
 4    mindspring.com  3034 Kbps or 370 KB/s     1760
 5    attbi.com       2667 Kbps or 326 KB/s     3703
 6    comcast.net     2621 Kbps or 320 KB/s    17880
 7    adelphia.net    2550 Kbps or 311 KB/s    10208
 8    rr.com          2477 Kbps or 302 KB/s     8633
 9    rogers.com      2290 Kbps or 280 KB/s     1885
10    charter.com     2265 Kbps or 276 KB/s     2810

Where am I going wrong with this query? I have read and re-read the MySQL documentation and am now pulling out my hair.
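For what it's worth, here is what I was thinking of trying: a composite index covering the two columns in the WHERE clause, and then EXPLAIN to see whether the optimizer actually uses it (the index name idx_type_date is just my guess at a sensible name). Am I on the right track?

-- composite index: equality column (Test_Type) first, range column (Date) second
-- (idx_type_date is a placeholder name)
CREATE INDEX idx_type_date ON `Master` (`Test_Type`, `Date`);

-- check which index the optimizer picks for the slow query
EXPLAIN SELECT `User_Host`,
       ROUND(AVG(`Test_Score`)) AS Score,
       COUNT(`User_Host`) AS Count
FROM `Master`
WHERE `Date` >= '$date_0'
  AND `Test_Type` = 'download'
GROUP BY `User_Host`
HAVING Count >= 1000
ORDER BY Score DESC
LIMIT 10;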


ALSO, what the hell am I going to do about the size of my database? The row size is only about 150 bytes, but with 1.5 million entries and growth of many hundreds of thousands of rows per month, isn't it going to get out of hand and become slow? Is just flushing old data the best thing to do in my case?
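If flushing old data is the way to go, is something like the following the usual approach? (Master_Archive is a hypothetical archive table with the same columns as Master; I would run this monthly and keep only the last 30 days in the live table, since that is all the Top Ten report needs.)

-- copy rows older than 30 days into an archive table
-- (assumes Master_Archive already exists with the same structure -- hypothetical name)
INSERT INTO `Master_Archive`
SELECT * FROM `Master`
WHERE `Date` < DATE_SUB(CURDATE(), INTERVAL 30 DAY);

-- then remove those rows from the live table
DELETE FROM `Master`
WHERE `Date` < DATE_SUB(CURDATE(), INTERVAL 30 DAY);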

Sorry about the long post; I just have so many unanswered questions.