Help understanding slow query log

Hey everyone, I enabled slow query logging on my site for any queries that take longer than 2s. In the space of 15 minutes I had a log file 1 MB in size (it’s a fairly active website).

That said, since I can’t seem to get mysqldumpslow to work in PuTTY, I just opened up the log file and skimmed through it. I saw stuff like this:


# Query_time: 40  Lock_time: 0  Rows_sent: 10  Rows_examined: 39026
SELECT DISTINCT t.*, p.* FROM ibf_portals_news n LEFT JOIN ibf_topics t ON (t.tid=n.news_topic_id) LEFT JOIN ibf_posts p ON (p.topic_id=t.tid) INNER JOIN ibf_forums f ON (t.forum_id = f.id) WHERE (t.content_type = 1 OR t.content_type = 2 OR t.content_type = 3 OR t.content_type = 4 OR t.content_type = 5 OR t.content_type = 6 OR t.content_type = 13 OR t.content_type = 14) AND (t.description !="screenshot_page" AND t.description !="game_info" AND t.content_type != "7" AND t.content_type != "9" AND t.content_type != "999") AND t.pinned=0 AND p.new_topic=1 AND t.approved = 1 AND t.is_index=0 AND f.f_type = 10 ORDER BY n.news_topic_id DESC limit 12;

Now, query aside, am I correct in understanding that Query_time: 40 means 40 seconds and not 40 ms?

EDIT:

Also I have a bunch like this:

# Query_time: 47  Lock_time: 35  Rows_sent: 323  Rows_examined: 323
SELECT * FROM ibf_topic_markers WHERE marker_member_id=83889;

Yep, it is in seconds.

Do you happen to know of any slow query log analyzers, or why I get the following error when I run mysqldumpslow in PuTTY?

Can't determine basedir from 'my_print_defaults mysqld' output: --datadir=/var/lib/mysql

In my.cnf, I have the following line:

basedir=/var/lib

but it’s under [mysql.server] instead of [mysqld].

is that the issue?
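For reference, here's roughly how the relevant sections of my.cnf are laid out. (The paths and values below are the common Linux defaults, not necessarily what's on your server — the point is just which section basedir sits under.)

```ini
[mysqld]
datadir=/var/lib/mysql
socket=/var/lib/mysql/mysql.sock
# basedir is the MySQL installation root (not the data directory);
# /usr is the usual value for Linux packages
basedir=/usr

[mysql.server]
# a basedir set here only affects the mysql.server startup script,
# not the mysqld server or client tools like mysqldumpslow
```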

Well, in the first example,

[LIST]
[*]The LEFT JOINs will slow down the query quite significantly, and in the case of this query they look to be inappropriate.
[*]Do you need each and every field from both the ibf_topics and the ibf_posts tables? Change the query to return only the fields it needs.
[*]Is content_type a character or a numeric? In the first check it’s treated as numeric and in the second as character, so one of those comparisons is doing a conversion for each record checked.
[*]You can use IN and NOT IN for a number of those checks; they would be a little more efficient than the individual comparisons. Probably not a huge difference, but it might be noticeable.
[/LIST]
I would probably rewrite this query to something like this:


SELECT DISTINCT t.field01
     , t.field02
     , t.field03
     , t.field04
     , t.field05
     , p.field06
     , p.field07
     , p.field08
     , p.field09
     , p.field10
  FROM ibf_portals_news n
 INNER JOIN ibf_topics t ON (t.tid = n.news_topic_id)
 INNER JOIN ibf_posts p ON (p.topic_id = t.tid)
 INNER JOIN ibf_forums f ON (t.forum_id = f.id)
 WHERE t.content_type IN (1, 2, 3, 4, 5, 6, 13, 14)
   AND t.description NOT IN ('screenshot_page', 'game_info')
   AND t.content_type NOT IN (7, 9, 999)
   AND t.pinned = 0
   AND p.new_topic = 1
   AND t.approved = 1
   AND t.is_index = 0
   AND f.f_type = 10
 ORDER BY n.news_topic_id DESC
 LIMIT 12

For the second, I would also ask if you really need all the fields from the ibf_topic_markers table. The more fields you return, the more traffic you have going back and forth from the database server.

For both queries, I would also check that there are indexes on the fields being searched on.
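For example, the second query filters only on marker_member_id, so (assuming that column isn’t already indexed) something along these lines would help. The table and column names come from the query above; the index name is my own invention:

```sql
-- speeds up: SELECT * FROM ibf_topic_markers WHERE marker_member_id = 83889
ALTER TABLE ibf_topic_markers
  ADD INDEX idx_marker_member_id (marker_member_id);

-- compare the optimizer's plan before and after with EXPLAIN;
-- you want to see the new index in the "key" column instead of a full scan
EXPLAIN SELECT * FROM ibf_topic_markers WHERE marker_member_id = 83889;
```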

Thanks for the advice Dave. To be honest, I’m not our site’s technical guy when it comes to queries and things so I will be of little help when actually optimizing them :stuck_out_tongue:

Unfortunately the people that are experts at that are volunteer and don’t have as much time to study site performance issues as I have.

What I’m mainly interested in is simply parsing/analyzing the logs so I can then pass along potentially slow queries for them to tweak.

Do you know of any good, possibly free web-based log analyzers I could use?

You can specify the log file, e.g.:

mysqldumpslow --help

mysqldumpslow /path/to/slow_query.log

If you send your tech guys the slow_query.log itself that should be enough for them to go on.
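A couple of mysqldumpslow options are worth knowing too, since it groups similar queries together (adjust the log path to wherever yours lives):

```shell
# show the 10 slowest query patterns, sorted by query time (-s t)
mysqldumpslow -s t -t 10 /path/to/slow_query.log

# sort by occurrence count (-s c) instead, to find the most frequent offenders
mysqldumpslow -s c -t 10 /path/to/slow_query.log
```

Sorting by count as well as by time is useful: a 2-second query that runs thousands of times can hurt more than one 40-second query.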

Thanks jurn, that did it.

Fixed a few slow queries already. Looks like my coders have some indexing to do :stuck_out_tongue: