I am working on a project in which an admin uploads 4000-5000 records per day into a table through CSV files (about 500 records per CSV upload). The database is MySQL.
Users on the main website search and fetch data from this large table.
The table already has nearly half a million records, and the search and query functionality on the main website has become very slow. I have added a few indexes on this table and on all the related tables (linked through foreign keys), but the searches are still slow.
I could add some more indexes on this table, but I am afraid that would slow down the inserts/updates performed by the CSV uploads, since every index adds overhead to writes.
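To make the trade-off concrete, here is a minimal sketch of the situation (using SQLite as a stand-in for MySQL, with an invented schema and column names): each CSV batch is inserted in a single transaction, and only the column that users actually search on carries a secondary index, so index maintenance happens once per batch rather than once per row.

```python
import csv
import io
import sqlite3

# Hypothetical stand-in for the real MySQL table; schema is invented.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE records (id INTEGER PRIMARY KEY, name TEXT, city TEXT)"
)

# Simulate one CSV upload of ~500 records.
csv_text = "\n".join(f"row{i},city{i % 10}" for i in range(500))
rows = list(csv.reader(io.StringIO(csv_text)))

# Insert the whole batch in one transaction: one commit instead of 500,
# so any secondary indexes are updated in a single pass.
with conn:
    conn.executemany("INSERT INTO records (name, city) VALUES (?, ?)", rows)

# Index only the column the website searches on, keeping write overhead low.
conn.execute("CREATE INDEX idx_records_city ON records (city)")

count = conn.execute(
    "SELECT COUNT(*) FROM records WHERE city = 'city3'"
).fetchone()[0]
print(count)  # 50 matching rows out of the 500 inserted
```

In MySQL itself the batch insert would typically be done with `LOAD DATA INFILE` or a multi-row `INSERT`, but the principle is the same: batch the writes, and index selectively.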
Do you have any ideas on how to manage a database with such a large table while keeping both processes fast: the bulk inserts/updates and the searches?
I look forward to your replies. Thanks.