May 23, 2009, 12:53 #1
How to optimize a 100 million record table?
I have a MyISAM table with 100 million records.
The table has a PRIMARY KEY and a non-unique INDEX on a BIGINT column. My question is: how can I query on this indexed column and find out which PRIMARY KEYs have the most matches? Right now, this is about as far as I can get it:
WHERE indexed_column = some_numeric_value
I know there are small ways to improve this, but I'm looking to go from several seconds down to 100-200 ms. Any ideas? I'm interested in hearing any algorithms, tuning tips, or even a database specifically designed for this, even if it isn't MySQL.
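For the "which values have the most matches" part, one common approach is a GROUP BY with COUNT(*) ordered descending. A minimal sketch of that idea, using Python with SQLite just so it's self-contained and runnable (the table name `events` and columns `id`/`indexed_column` are made-up stand-ins for your schema; on MySQL the same GROUP BY query applies, though performance on 100M MyISAM rows will depend on the index and key buffer):

```python
import sqlite3

# In-memory SQLite stands in for MySQL here; schema names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, indexed_column INTEGER)")
cur.execute("CREATE INDEX idx_col ON events (indexed_column)")

# A few sample rows: value 42 appears three times, 7 twice, 99 once.
rows = [(1, 42), (2, 42), (3, 7), (4, 42), (5, 7), (6, 99)]
cur.executemany("INSERT INTO events VALUES (?, ?)", rows)

# Count matches per indexed value and rank them; because the query only
# touches the indexed column, it can be satisfied from the index alone.
cur.execute(
    "SELECT indexed_column, COUNT(*) AS matches "
    "FROM events "
    "GROUP BY indexed_column "
    "ORDER BY matches DESC "
    "LIMIT 3"
)
top = cur.fetchall()
print(top)  # → [(42, 3), (7, 2), (99, 1)]
```

If the ranking is queried often, precomputing the counts into a summary table (refreshed periodically or maintained by the application) is a common way to get into the 100-200 ms range without scanning the big table at query time.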