I'm currently working on a project that structures content using a versioning system. Two database tables make up this structure:

wr_items - This holds the core information about an individual item, such as its latest status, its creation date and last-edited timestamp, its unique ID, etc.

wr_item_version - This table has an entry for each version of an item that is saved. It holds the core information for each version, such as the version's title, timestamp, status, meta information, etc.
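For reference, here is a simplified sketch of the two tables. It only includes the columns mentioned above; the column types (and the exact names of the date columns) are my shorthand rather than the actual definitions:

```sql
-- Simplified sketch only: types and date-column names are approximations.
CREATE TABLE wr_items (
    id      INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    gstatus VARCHAR(20)  NOT NULL,  -- latest status of the item, e.g. 'New'
    created DATETIME     NOT NULL,  -- date created
    edited  TIMESTAMP    NOT NULL   -- last edited timestamp
);

CREATE TABLE wr_item_version (
    id      INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    item    INT UNSIGNED NOT NULL,  -- references wr_items.id
    title   VARCHAR(255) NOT NULL,
    ts      TIMESTAMP    NOT NULL,  -- timestamp of this version
    status  VARCHAR(20)  NOT NULL,
    type    VARCHAR(20)  NOT NULL,  -- e.g. 'holiday'
    trashed CHAR(1)      NOT NULL DEFAULT '0'
);
```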

As content has been entered into the system (there are almost 2,000 items, with around 8,000 versions associated with them), fetching lists of items from the database in PHP has become quite intensive and can take 10-15 seconds on database queries alone. I have already looked into optimising the queries, but thought I'd post here to see if I can get any more advice on them.

The main query that is taking the time is the following. On some pages multiple "types" are required, so it can end up being run 4 or 5 times in one page load, with a different `type` value in the WHERE clause each time:

    SELECT a.*
    FROM wr_item_version a
    JOIN wr_items b ON a.item = b.id
    JOIN (
        SELECT item, MAX(ts) AS max_ts
        FROM wr_item_version
        WHERE `status` != 'New'
        GROUP BY item
    ) c ON a.item = c.item AND a.ts = c.max_ts
    WHERE a.`type` = 'holiday'
      AND b.`gstatus` != 'New'
      AND a.`trashed` != '1'
    ORDER BY a.ts DESC
    LIMIT 20, 20
There are single-column indexes on the status, type, timestamp, gstatus, trashed and item fields of the tables. This query replaced my previous approach, which used a lot of PHP loops to run many small queries for the same information and took even longer to execute.
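To be concrete about the current indexing, the indexes were created roughly like this (the index names here are just illustrative; each column has its own separate index, and there are no composite indexes yet):

```sql
-- Illustrative names; one single-column index per field.
CREATE INDEX idx_version_status  ON wr_item_version (`status`);
CREATE INDEX idx_version_type    ON wr_item_version (`type`);
CREATE INDEX idx_version_ts      ON wr_item_version (ts);
CREATE INDEX idx_version_item    ON wr_item_version (item);
CREATE INDEX idx_version_trashed ON wr_item_version (trashed);
CREATE INDEX idx_items_gstatus   ON wr_items (gstatus);
```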

I'm happy to post more information about the tables if required; any advice would be greatly appreciated.