Hi,
After some reading I've moved away from individual tables, and I'm now importing my end-of-day stock data (some 4,000 rows) into a single table. Currently my database structure looks like this:
CREATE TABLE IF NOT EXISTS `shares` (
`id` mediumint(7) NOT NULL auto_increment,
`name` char(8) NOT NULL,
`date` date NOT NULL,
`open` decimal(9,3) unsigned NOT NULL,
`high` decimal(9,3) unsigned NOT NULL,
`low` decimal(9,3) unsigned NOT NULL,
`close` decimal(9,3) unsigned NOT NULL,
`volume` mediumint(7) unsigned NOT NULL,
PRIMARY KEY (`id`),
KEY `date` (`date`),
KEY `name` (`name`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
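For reference, a bulk load of a daily file into that table would look something like this in MySQL (the file name and CSV layout here are placeholders, not my actual feed):
-- Sketch of a daily bulk load; 'eod.csv' and its column order are placeholders
LOAD DATA LOCAL INFILE 'eod.csv'
INTO TABLE `shares`
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(`name`, `date`, `open`, `high`, `low`, `close`, `volume`);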
The table is fast approaching 500,000 rows, which has me thinking about how it will perform at tens of millions of rows.
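A quick way to sanity-check whether lookups will still use the indexes as the table grows is EXPLAIN; for example, with a made-up symbol:
-- Confirm the optimizer picks the `name` index for a single-stock lookup
EXPLAIN SELECT `date`, `close`
FROM `shares`
WHERE `name` = 'ABC'
ORDER BY `date` DESC
LIMIT 30;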
Should I, at some stage (now or later), move to a structure like this, creating one table per share, i.e. many thousands of tables:
CREATE TABLE IF NOT EXISTS `name_of_share` (
`id` mediumint(7) NOT NULL auto_increment,
`date` date NOT NULL,
`open` decimal(9,3) unsigned NOT NULL,
`high` decimal(9,3) unsigned NOT NULL,
`low` decimal(9,3) unsigned NOT NULL,
`close` decimal(9,3) unsigned NOT NULL,
`volume` mediumint(7) unsigned NOT NULL,
PRIMARY KEY (`id`),
KEY `date` (`date`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
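Queries against a per-share table would simply drop the symbol filter, e.g. (the dates are just an example):
-- Date-range pull from one per-share table; no `name` column involved
SELECT `date`, `open`, `high`, `low`, `close`, `volume`
FROM `name_of_share`
WHERE `date` BETWEEN '2010-01-01' AND '2010-03-31'
ORDER BY `date`;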
It has taken some three months to accumulate half a million rows, and I would like (hopefully) to stay in this business for 5 to 10 years. At that rate, roughly 2 million rows a year, the table would end up somewhere in the 10-20 million row range.
When I query, I only ever look at one stock at a time; I never touch two stocks in the same query. With that access pattern in mind, would it eventually be quicker to query individual tables?
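For reference, a typical lookup against the current `shares` table runs along these lines (symbol and date range are made up):
-- Typical single-stock, date-range lookup on the combined table
SELECT `date`, `open`, `high`, `low`, `close`, `volume`
FROM `shares`
WHERE `name` = 'ABC'
  AND `date` BETWEEN '2010-01-01' AND '2010-03-31'
ORDER BY `date`;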