One-Hour SEO Audit

If you’re a programmer, a user interface designer, a content manager, or a marketer in a small to medium-sized business, chances are one of your tasks is to “make sure the pages on our site are search engine optimized”.

In many cases it’s the programmer who gets tasked with checking the website against SEO best practices. Or it could be the designer who is “lucky” enough to inherit the job.

I would like to help the many who struggle to fit regular SEO audits into a busy schedule. Below are straightforward tips on which tools to use and what to look for, so you can perform an SEO audit as quickly as possible. Keep in mind, though, that once you find a problem (or an opportunity) and want to dig deeper, it takes a lot more than one hour.

There are several tools of the trade that can be used for this purpose:

- SEOmoz Tools (free and paid) – the basic tools are free, but you will need access to the paid version to be able to set up a crawl.

- Xenu Link Sleuth (free) – this used to be my preferred crawler, and while it’s very fast at finding broken links on a site, it lacks SEO reporting and analysis.

- Screaming Frog (free and paid) – an SEO crawler that does more than Xenu, but the free version is limited to 500 URLs, which nowadays could be a WordPress blog with a few posts and images. Lots of SEOs are very happy with it, but that’s because they are not aware of the next item on the list.

- IIS SEO Toolkit from Microsoft (free) – a real hidden gem that very few SEOs know about or talk about. It is a desktop crawler and SEO analyzer that scans sites from small to large (up to 50k URLs if your system can handle it). Many probably skip it because they think it requires an ASP-built website, but that is not true: you can run this tool on any URL out there. I highly recommend it for anyone involved with SEO. You can even install it on Windows 8.

I like to start my analysis with a 10,000-foot view of the website’s SEO visibility, using Search Metrics, to check for any big organic visibility issues. The SEO visibility metric combines search volume with the positions of ranking keywords, so it is not the same as site traffic. It is, however, a good way to identify trends and penalties.

[Image: 10kfeet-view]

Penguin pecked hard on this site.

Identifying onsite SEO issues

The first thing I look for is the robots.txt file, to check:

- Whether any pages are blocked (sometimes mistakes block entire sites or directories from being crawled and indexed).

- That every block directive is correct and intentional.

- That the syntax is valid, using a robots.txt validation tool.
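These robots.txt checks can also be scripted with Python’s standard library. The rules and URLs below are hypothetical placeholders; for a live audit you would point the parser at your site’s actual robots.txt instead.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules -- for a live site you would instead call
# parser.set_url("https://example.com/robots.txt") followed by parser.read().
sample_rules = """\
User-agent: *
Disallow: /admin/
Disallow: /checkout/
"""

parser = RobotFileParser()
parser.parse(sample_rules.splitlines())

# Spot-check that important pages are NOT accidentally blocked...
for url in ["https://example.com/", "https://example.com/products/widget"]:
    allowed = parser.can_fetch("*", url)
    print(f"{url} -> {'crawlable' if allowed else 'BLOCKED'}")

# ...and that the block directives actually work as intended.
assert not parser.can_fetch("*", "https://example.com/admin/users")
```

Feeding your most important landing pages through a loop like this catches the classic mistake of a stray `Disallow: /` taking the whole site out of the index.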

The second thing to look for is the sitemap.xml file. If you’re doing this for your own site, you know where to find the file. If you’re peeping at your competitors’, you might find the path in their robots.txt file, or just add “sitemap.xml” after the domain name’s trailing slash. Run the file through a sitemap validator to see if there are any errors.
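A quick sanity check on a sitemap is to parse it and confirm it actually yields URLs. The following is a minimal sketch using Python’s standard library; the inline sitemap and the example.com URLs are made up for illustration, and a dedicated validator will catch far more than this.

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap fragment -- in practice you would fetch the body of
# https://yoursite.com/sitemap.xml and pass it in instead.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>
"""

# Sitemaps live in this XML namespace; forgetting it is a common parsing bug.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Parse a sitemap and return its <loc> URLs, raising on malformed XML."""
    root = ET.fromstring(xml_text)  # raises ET.ParseError on broken XML
    urls = [loc.text.strip() for loc in root.findall("sm:url/sm:loc", NS)]
    if not urls:
        raise ValueError("no <loc> entries found -- wrong namespace or empty sitemap?")
    return urls

print(sitemap_urls(sitemap_xml))
```

Malformed XML raises immediately, and an empty URL list usually means the file is either empty or missing the sitemap namespace.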

Third, I check the header response for the home page separately, using http://web-sniffer.net/.

[Image: response-header]

A 404 page for a home page is not good at all.

In most cases the response code should be either 200 OK or 301 Moved Permanently. If it isn’t, you need to start asking questions.
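If you’d rather check status codes from a script than a web form, the raw (unredirected) response code can be fetched with Python’s standard library. This is a rough sketch; example.com is a placeholder, and web-sniffer.net will show you the full headers as well.

```python
import http.client
from urllib.parse import urlsplit

def home_page_status(url: str, timeout: float = 10.0) -> int:
    """Return the raw HTTP status code for a URL, without following redirects."""
    parts = urlsplit(url)
    cls = (http.client.HTTPSConnection if parts.scheme == "https"
           else http.client.HTTPConnection)
    conn = cls(parts.netloc, timeout=timeout)
    try:
        conn.request("HEAD", parts.path or "/")
        return conn.getresponse().status
    finally:
        conn.close()

def status_is_healthy(code: int) -> bool:
    """For a home page, only 200 OK and 301 Moved Permanently are expected."""
    return code in (200, 301)

# Live usage (hits the network): status_is_healthy(home_page_status("https://example.com/"))
print(status_is_healthy(200), status_is_healthy(404))
```

Using a HEAD request and a plain `http.client` connection avoids `urllib`’s automatic redirect following, so a 301 shows up as a 301 instead of being silently resolved.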

Large scale issues

For me, the fastest way to identify SEO issues across an entire website is to crawl it, on demand, from my local machine using the IIS SEO Toolkit. Once you’ve installed it (and any dependencies), set up a new crawl of up to 10k pages on your site. When the crawl is done, wait a minute for the analysis to run, then start fixing the errors.

[Image: SEO-violations]

That many errors means a lot of work.

Take a look at the following reports:

- Content: Status Code Summary (you want to see 200s and 301s)

- Content: Pages with Broken Links (broken links mean broken PageRank flow)

- Content: Duplicate Titles (once you find duplicate titles, run the affected URLs through the Similarity Analyzer tool at http://tool.motoricerca.info/similarity-analyzer.phtml)

- Content: Directory Summary (interesting for assessing your site’s taxonomy)

- Links: Link Depth (more than four levels of depth is not recommended)
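The link-depth rule in the last report can be reasoned about as a simple breadth-first search over your internal-link graph: a page’s depth is the minimum number of clicks needed to reach it from the home page. A minimal sketch, with a made-up link graph standing in for real crawl data:

```python
from collections import deque

# Hypothetical internal-link graph: page -> pages it links to.
links = {
    "/": ["/products", "/blog"],
    "/products": ["/products/widget"],
    "/blog": ["/blog/post-1"],
    "/products/widget": ["/products/widget/specs"],
    "/blog/post-1": [],
    "/products/widget/specs": ["/products/widget/specs/pdf"],
    "/products/widget/specs/pdf": ["/deep-page"],
    "/deep-page": [],
}

def link_depths(graph, root="/"):
    """BFS from the home page; depth = minimum clicks needed to reach a page."""
    depths = {root: 0}
    queue = deque([root])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in depths:          # first visit = shortest path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

depths = link_depths(links)
too_deep = [page for page, d in depths.items() if d > 4]
print(too_deep)  # pages buried deeper than four clicks
```

Pages that surface in `too_deep` are candidates for better internal linking, e.g. from category pages or the home page itself.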

Additionally, it’s important to look at the anchor text used to interlink pages on your site. To get this, run a New Link Query from the Dashboard: for Field Name use “Link Text”, for Operator choose Contains, and leave the Value field empty. Then Group By Link Text.

[Image: anchor-text-query]

20,000 empty anchors is not a good thing for internal linking.
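If you want to spot empty anchors without the toolkit, a rough equivalent can be scripted: parse each crawled page and count the <a> tags that carry no link text. A minimal sketch using Python’s standard library, with a made-up HTML fragment in place of a real page:

```python
from html.parser import HTMLParser

class EmptyAnchorCounter(HTMLParser):
    """Count <a> tags whose link text is empty -- wasted internal anchors."""

    def __init__(self):
        super().__init__()
        self.in_anchor = False
        self.anchor_text = ""
        self.empty = 0
        self.total = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_anchor = True
            self.anchor_text = ""

    def handle_data(self, data):
        if self.in_anchor:
            self.anchor_text += data

    def handle_endtag(self, tag):
        if tag == "a":
            self.total += 1
            if not self.anchor_text.strip():  # image-only or empty anchor
                self.empty += 1
            self.in_anchor = False

# Hypothetical page fragment -- in a real audit, feed each crawled page in.
sample = '<a href="/a">Widgets</a> <a href="/b"><img src="x.png"></a> <a href="/c"></a>'
counter = EmptyAnchorCounter()
counter.feed(sample)
print(f"{counter.empty} of {counter.total} anchors have empty link text")
```

Note that image-only links count as empty here, which matches how a crawler reports missing anchor text; for those, the image’s alt attribute should carry the description.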

That’s how you do a one-hour audit of any website. But once you’ve identified any problems, it may take considerably longer to address them.

Once you’ve done the above, let me know if you were surprised by your findings.

See you at the top!


  • http://smallbusinessmarketingsucks.com/ John Tabita

    Nice article, Traian. I’ve read a dozen articles like this over the past month and yours is the only one to mention some of the tools you have.

    If you’re looking for article suggestions, how about one on how to set up a canonical URL: tag vs. 301 redirect vs . htaccess … in plain English for a non-technical SEO/non-programmer like myself?

    Thanks again for a very helpful article.

    • http://www.pitstopmedia.com/ Traian Neacsu

      I’m glad you found something useful in the article and yes, I kind of agree, many times SEOs give away great info but they don’t really mention the tools they use or don’t provide Excel sample files that everyone can use.

      Regarding your suggestion, the canonical attribute is by definition quite technical and usually the implementation goes to a programmer. If you’re referring to how to actually choose between the 3 options, that could be done. Can’t promise anything but thanks for the suggestion.

  • http://www.pricklypearmedia.com Angelos

    Thanks, just what I was looking for!

    • http://www.pitstopmedia.com/ Traian Neacsu

      Glad I could help!

  • http://www.floryabekoservisi.com jannyy

thank you.

  • http://www.mattearly.com Matt Early

    How often do you recommend doing this?

    • http://www.pitstopmedia.com/ Traian Neacsu

      Hi Matt,

It depends on how big your site is, how many issues you find on the first run of the audit, and how often you add new pages. But generally, at least once a month is good enough.

  • http://georglangley.ca George Langley

    Thanks for the list – will be checking them out on a few of my sites.
    I do have to wonder though – a Microsoft tool? For SEO? Given that Bing can’t even catalogue an iFrame-based site properly, the way Google can and does, I’d have to take it as a Microsoft-slanted view on your web site. Which is possibly a good thing if IE still figures into your viewers’ stats. But I would suggest use all of the tools and see if/where IIS SEO Toolkit may be missing out.

  • http://www.christianrochford.me Christian Rochford

    This is really useful, thanks for the article :)

  • http://parkersorensen.com Parker

    Thanks for the great article. I am just starting to do SEO for a startup company, so free and useful software is very helpful. Thanks for spreading the word, and giving some tips to help me get started.