How to optimise a large database
I have a database which contains approximately 80,000 records and receives about 2,000 new records every day. The records are queried regularly, and the database is about 120 MB in size.
My server is using a lot of RAM and I suspect the database may be the cause... is there any way to optimise the database to reduce the RAM usage? I have heard of cases where a table is split up into several smaller tables which each hold a range of records, for example table_1 -> records 1-5,000, table_2 -> records 5,001-10,000, and so on....
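To make it concrete, here is a rough sketch of the kind of splitting I mean. This is just an illustration in Python with SQLite (not my actual setup), where each record id is routed to a `table_N` bucket of 5,000 rows:

```python
import sqlite3

BUCKET_SIZE = 5000  # records per partition, matching the 1-5,000 / 5,001-10,000 split above

def partition_name(record_id):
    # Map a record id to its bucket table, e.g. id 7042 -> "table_2".
    return f"table_{(record_id - 1) // BUCKET_SIZE + 1}"

conn = sqlite3.connect(":memory:")  # in-memory DB purely for illustration

def insert(record_id, payload):
    # Create the bucket table on demand, then write the row into it.
    table = partition_name(record_id)
    conn.execute(
        f"CREATE TABLE IF NOT EXISTS {table} (id INTEGER PRIMARY KEY, payload TEXT)"
    )
    conn.execute(f"INSERT INTO {table} (id, payload) VALUES (?, ?)", (record_id, payload))

def fetch(record_id):
    # Look up the row in whichever bucket its id falls into.
    table = partition_name(record_id)
    row = conn.execute(
        f"SELECT payload FROM {table} WHERE id = ?", (record_id,)
    ).fetchone()
    return row[0] if row else None

insert(42, "first bucket")
insert(7042, "second bucket")
```

So instead of one 80,000-row table, queries would only ever touch one 5,000-row bucket at a time. I don't know whether that actually translates into lower RAM use, which is what I'm asking.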
Would this help reduce memory use? Or are there any other techniques?