One-Hour SEO Audit
If you’re a programmer, a user interface designer, a content manager or a marketer in a small to medium-sized business, chances are that one of your tasks is to “make sure that web pages on our site are search engine optimized”.
In many cases it’s the programmers who are tasked with “checking” the website for compliance with SEO best practices. Or it could be the designer who is “lucky” enough to take care of this task.
I would like to help those who struggle to find time for regular SEO audits. I will provide straightforward tips on which tools to use and what to look for, so you can perform SEO audits as quickly as possible. Keep in mind, though, that if you want to dig deeper once you find a problem (or an opportunity), it takes a lot more than one hour.
There are several tools of the trade that can be used for this purpose:
– SEOmoz Tools (paid) – basic versions are free, but you will need access to the paid version to be able to set up a crawl.
– Xenu Link Sleuth (free) – this used to be my preferred tool for crawling, but while it’s very fast at finding broken links on a site, it lacks SEO reporting and analysis.
– Screaming Frog (free and paid) – this is an SEO crawler that does more than Xenu, but the free version is limited to 500 URLs, which nowadays could be a WordPress blog with a few posts and images. Lots of SEOs are very happy with it, but that’s because they are not aware of the next item on the list.
– IIS SEO Toolkit from Microsoft (free) – this is a real hidden gem that very few SEOs know about or talk about. It is a desktop crawler and SEO analyzer that scans sites, from small to large (up to 50k URLs if your system can handle it). Many probably do not use it because they think they need an ASP-built website, but that is not true. You can run this tool on any URL out there. I highly recommend it for anyone involved with SEO. You can even install it on Windows 8.
I like to start my analysis with a 10,000-foot view of the SEO visibility of the website, using Search Metrics. I do this to check whether there are any big organic visibility issues. The SEO visibility metric is composed of search volume and the positions of ranking keywords, and it is not the same as site traffic. It is, however, a good way to identify trends and penalties.
Penguin pecked hard on this site.
Identifying onsite SEO issues
The first thing I look for is the robots.txt file (a quick scripted spot-check follows this list), to check:
– If there are any pages that are blocked (sometimes mistakes block entire sites or directories from being crawled and indexed).
– That every block directive is correct and intentional.
– The syntax, using a robots.txt validation tool.
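If you want a scripted way to verify that important URLs are not accidentally blocked, a minimal sketch along these lines works with Python’s standard library. The domain, paths and user agent below are placeholders, and this only checks effective blocking, not syntax:

```python
# Minimal robots.txt spot-check using Python's standard library.
# Replace SITE and PATHS_TO_CHECK with your own domain and a handful
# of URLs you definitely want crawlers to reach (placeholders here).
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"                      # placeholder domain
PATHS_TO_CHECK = ["/", "/blog/", "/products/widget"]  # placeholder paths

parser = RobotFileParser()
parser.set_url(f"{SITE}/robots.txt")
parser.read()  # fetches and parses the live robots.txt

for path in PATHS_TO_CHECK:
    allowed = parser.can_fetch("Googlebot", f"{SITE}{path}")
    print(f"{path:30} -> {'allowed' if allowed else 'BLOCKED'}")
```

If “/” comes back as BLOCKED, that is exactly the “entire site blocked by mistake” scenario you are looking for.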
The second thing to look for is the sitemap.xml file. If you are doing this for your own site, you know where to find the file. If you’re peeping at your competitors’, you might find the path to it in their robots.txt file, or just add “sitemap.xml” after the domain name’s trailing slash. Run the file through an XML sitemap validator to see if there are any errors.
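If you would rather sanity-check the file from the command line, here is a minimal sketch assuming Python with the requests library installed and a sitemap living at /sitemap.xml (both assumptions, as is the example domain). It only confirms the file is reachable and parses as XML; it is not a replacement for a proper validator:

```python
# Rough sitemap.xml sanity check: fetch it, make sure it parses as XML,
# and count the <loc> entries. SITE is a placeholder domain.
import xml.etree.ElementTree as ET
import requests

SITE = "https://www.example.com"  # placeholder domain

resp = requests.get(f"{SITE}/sitemap.xml", timeout=10)
print("HTTP status:", resp.status_code)  # should be 200

root = ET.fromstring(resp.content)  # raises ParseError if the XML is broken
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Works for a plain urlset; a sitemap index lists <sitemap> children instead.
urls = [loc.text for loc in root.findall(".//sm:loc", ns)]
print(f"{len(urls)} <loc> entries found")
print("First few:", urls[:3])
```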
Third, I check the header response for the home page separately, using http://web-sniffer.net/.
A 404 response for the home page is not good at all.
In most cases the response code should be either 200 OK or 301 Moved Permanently. If not, then you need to start asking questions.
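The same check is easy to script if you prefer not to use a web tool; this is a minimal sketch assuming Python with requests installed, with the domain as a placeholder:

```python
# Quick header check for the home page, similar in spirit to web-sniffer.net.
# allow_redirects=False shows the actual first response (e.g. a 301)
# instead of silently following it. SITE is a placeholder domain.
import requests

SITE = "https://www.example.com"  # placeholder domain

resp = requests.get(SITE, allow_redirects=False, timeout=10)
print("Status code:", resp.status_code)  # expect 200 or 301
if resp.status_code in (301, 302):
    print("Redirects to:", resp.headers.get("Location"))
```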
Large-scale issues
For me, the fastest way to identify SEO issues across an entire website is to crawl it, on demand, from my local machine, using the IIS SEO Toolkit. Once you’ve installed it (and any dependencies), set up a new crawl of up to 10k pages on your site. When the crawl is done, wait a minute for the analysis to be performed, and then start fixing the errors.
That many errors mean a lot of work.
Take a look at the following reports (a rough do-it-yourself crawl sketch follows the list):
– Content: Status Code Summary (you want to see 200s and 301s)
– Content: Pages with Broken Links (broken links mean broken PageRank flow)
– Content: Duplicate Titles (once you find URLs with duplicate titles, use the Similarity Analyzer tool at http://tool.motoricerca.info/similarity-analyzer.phtml)
– Content: Directory Summary (this is useful for assessing the taxonomy of your site)
– Links: Link Depth (more than four levels deep is not recommended)
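If you cannot run the IIS SEO Toolkit (for example, you are not on Windows), a scaled-down approximation of the first three reports can be scripted. This is only a sketch, not the toolkit: it assumes Python with requests and beautifulsoup4 installed, uses a placeholder domain, and caps itself at a couple of hundred pages:

```python
# A very small stand-in for a desktop SEO crawler: breadth-first crawl of one
# domain, flagging non-200 responses and duplicate <title> tags.
# Assumes `requests` and `beautifulsoup4` are installed; SITE is a placeholder.
from collections import defaultdict, deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

SITE = "https://www.example.com"  # placeholder domain
MAX_PAGES = 200                   # keep the spot-check small

queue = deque([SITE])
seen = {SITE}
titles = defaultdict(list)  # title text -> URLs using it
bad_status = {}             # URL -> status code or error

crawled = 0
while queue and crawled < MAX_PAGES:
    url = queue.popleft()
    crawled += 1
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException as exc:
        bad_status[url] = str(exc)
        continue
    if resp.status_code != 200:
        bad_status[url] = resp.status_code
        continue
    soup = BeautifulSoup(resp.text, "html.parser")
    if soup.title and soup.title.string:
        titles[soup.title.string.strip()].append(url)
    for a in soup.find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == urlparse(SITE).netloc and link not in seen:
            seen.add(link)
            queue.append(link)

print("Non-200 responses:", bad_status)
for title, urls in titles.items():
    if len(urls) > 1:
        print(f"Duplicate title {title!r} on {len(urls)} pages")
```

Broken internal links show up in bad_status as 404s, and duplicate titles are grouped at the end; the toolkit does all of this (plus link depth and directory summaries) far more thoroughly.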
Additionally, it’s important to take a look at the anchor text used to interlink pages on your site. To get this, run a New Link Query from the Dashboard. For the Field Name use “Link Text”, for the Operator choose “Contains”, and leave the Value field empty. Then group the results by Link Text.
20,000 empty anchors is not a good thing for internal linking.
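If you do not have the toolkit at hand, the same idea, grouping internal links by their anchor text and flagging empty ones, can be approximated with a short script. This is a sketch under the same assumptions as above (Python, requests, beautifulsoup4); the page list is a placeholder sample rather than a full crawl:

```python
# Rough equivalent of "group internal links by link text": fetch a few pages,
# collect the anchor text of same-domain links, and count how many are empty.
# PAGES is a placeholder sample of URLs to inspect.
from collections import Counter
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

PAGES = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
]  # placeholder URLs

anchor_counts = Counter()
for page in PAGES:
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
    for a in soup.find_all("a", href=True):
        target = urljoin(page, a["href"])
        if urlparse(target).netloc == urlparse(page).netloc:
            text = a.get_text(strip=True)
            anchor_counts[text or "(empty anchor)"] += 1

for text, count in anchor_counts.most_common(20):
    print(f"{count:5}  {text}")
```

A large count next to “(empty anchor)” is the scripted equivalent of the screenshot above.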
That’s how you do a one-hour audit of any website. But once you’ve identified any problems, it may take considerably longer to address them.
Once you do the above, let me know if you were surprised by your findings.
See you at the top!