9-Point Technical SEO Checklist for Developers

By Sam Gooch


This article is part of an SEO series from WooRank. Thank you for supporting the partners who make SitePoint possible.

Organic search traffic is vital to any commercial website: Almost half of online shoppers begin their shopping process with a search engine, and a third of e-commerce traffic comes from search results. Even if you’re a brick and mortar shop, you’re likely reliant on organic traffic since half of mobile searches result in a store visit within a day of the search. All that adds up to one fact: if you’re trying to make money with your website, you need organic traffic. And how do you bring in that traffic? SEO.

SEO is typically viewed as the realm of marketers and writers. However, developers have a large role to play as well. If sites aren’t built correctly, search engines could struggle, or even fail entirely, to find and index pages. One false move with your robots.txt file, for example, could prevent your entire site from showing up in Google search results.

That’s why we’ve put together this 9-point checklist to help developers build sites in a way that’s optimized to rank highly in search results.

Crawling and Indexing

Since the purpose of SEO is to appear in search results for your target audience, one of the most important considerations when creating a site is getting crawled and indexed. The easiest way to get indexed is to submit your site directly to Google and Bing. You can submit your URL to Google without a Google Search Console account, but if you do have one, you can use the Fetch as Google tool in the Crawl section. After Googlebot successfully fetches your site, click the “Submit to index” button.

Fetch as Google in Google Search Console account

Submitting your site to Bing requires a Bing Webmaster Tools account.

XML Sitemaps

At its most basic, an XML sitemap is a list of every URL on your site, stored as a text file in your site’s root directory. In reality, there’s a little bit more to it than that. Yes, it lists every URL on your site (or at least the URL for every page you want crawled and indexed), but it also lists extra information about each page and serves an important SEO function. Search engines use the information in sitemaps to crawl sites more intelligently and efficiently, so they won’t waste their crawl budget on unimportant or unchanged content. Done correctly, a basic sitemap looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xhtml="http://www.w3.org/1999/xhtml">
    <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2016-08-01</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.9</priority>
        <xhtml:link rel="alternate" hreflang="fr" href="https://www.example.com/fr/"/>
    </url>
</urlset>

What does all that mean? Here’s a synopsis:

  • <urlset>: The opening and closing tags that wrap the entire sitemap, telling crawlers where it starts and ends.
  • <url>: Denotes the beginning and end of each URL entry in the sitemap.
  • <loc>: This defines the URL of the page. While the rest of the tags nested inside <url> are optional, <loc> is required.
  • <lastmod>: The date, in YYYY-MM-DD format, the page was updated or modified.
  • <changefreq>: This indicates how frequently you update the page, which helps search engines decide how often to crawl it to make sure they’re indexing the freshest content. You might be tempted to lie to increase your crawl frequency, but don’t. If search engines see that <changefreq> doesn’t jibe with the actual change frequency, they’ll just ignore this parameter.
  • <priority>: Sets the priority of the page in relation to the rest of the site. Valid values range from 0.0 to 1.0, from least to most important. Use this tag to help search engines crawl your site more intelligently. Note that this only tells crawlers how important your pages are compared to your other pages. It does not affect how your pages are compared to other sites.
  • <xhtml:link>: This tag points to alternate versions of the page. In this example it indicates the French version of https://www.example.com.

Sitemaps aren’t a ranking signal, but they help search engines find all your pages and content, which makes it easier for you to rank well.

If you don’t want to write your own sitemap, there are plenty of tools out there that can help you create one. Once you have your XML sitemap, validate and submit it using Google Search Console. You can also submit your sitemap to Bing via Webmaster Tools. Make sure you fix any errors so you don’t wind up impeding your site’s indexing.
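
If you'd rather script it yourself, generating a sitemap can be as simple as looping over a list of URLs and writing out the XML. Here's a minimal PHP sketch; the URL list, lastmod value and output path are placeholders you'd replace with your own data:

<?php
// Minimal sketch: write sitemap.xml from a hard-coded list of URLs.
// In a real site you'd pull these URLs from your CMS or database.
$urls = [
    'https://www.example.com/',
    'https://www.example.com/clothing/mens/shirts/fancy-white-dress-shirt',
];

$xml  = '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
$xml .= '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";

foreach ($urls as $url) {
    $xml .= "    <url>\n";
    $xml .= '        <loc>' . htmlspecialchars($url) . "</loc>\n";
    $xml .= '        <lastmod>' . date('Y-m-d') . "</lastmod>\n"; // placeholder: today's date
    $xml .= "        <changefreq>monthly</changefreq>\n";
    $xml .= "    </url>\n";
}

$xml .= "</urlset>\n";

file_put_contents('sitemap.xml', $xml); // save to your site's root directory
?>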

Robots.txt

Like XML sitemaps, robots.txt files are plain text files stored in the root directory of your site that help crawlers navigate it. The file contains lines of code specifying which user agents have access to which files, file types or folders. The code is broken up into blocks, each beginning with a User-agent line. Basic robots.txt code looks like this:

User-agent: *
Disallow:

User-agent: Googlebot
Disallow: /*.ppt$

The asterisk (*) is used as a wildcard. In the User-agent line, the wildcard represents all bots. In a Disallow line, it matches any sequence of characters in the URL. In the example above, the robots.txt disallows Googlebot from crawling pages that end with a PowerPoint file extension; the $ denotes the end of the URL.

You can block bots from crawling your entire site by using a slash in the disallow line like this:

User-agent: *
Disallow: /

It’s good practice to disallow all robots from accessing the entire server when you’re building, redesigning or migrating your site. However, you have to be sure to restore access once you’re done, or your shiny new site won’t get indexed.

Use Google Search Console to test your robots.txt file for syntax errors or other problems.

Google Search Console robots.txt Tester

Meta Robots Tag

One problem with the robots.txt file is that it won’t stop search engines from following external links to your site, so disallowed pages could still wind up indexed. Add an extra layer of protection to individual pages using the robots meta tag:

<meta name="robots” content=”noindex”>

Or, in the context of getting your site indexed, make sure you don’t have the robots meta tag on the pages you want to get indexed.

URL Optimization

URLs have an impact on both your site’s user experience and SEO. Both humans and bots expect them to provide at least a basic description of what the page is about, and where that page sits in the site’s hierarchy. Optimize your URLs by including the page’s target keyword as well as the page’s folder and subfolders. Take a look at these two URLs:

  • https://www.example.com/clothing/mens/shirts/fancy-white-dress-shirt
  • https://www.example.com/product?cid=12345&pid=67890

Search engines crawling the first URL will be able to tell not only that the page is about fancy white dress shirts, but also that it’s topically related to men’s clothing. The second URL, unfortunately, doesn’t really tell you anything about what you’ll find on that page, except maybe that it’s a product sold by example.com. Which do you think will appear more relevant to a search for “men’s fancy white dress shirts”?

When creating URLs, follow these best practices:

  • Concise: Your URLs should be descriptive and contain keywords, but they should also be concise. Generally speaking, your URLs should be 100 characters or less.
  • Clean: When possible, avoid using URL parameters like session IDs and sorting/filtering. They lower usability and run the risk of creating duplicate content problems.
  • Hyphens: When using multiple words in your URL, separate them using hyphens, not underscores. Search engines use hyphens as word separators but don’t recognize underscores, so url_keyword looks the same to them as urlkeyword. Since humans use spaces in searches, hyphens in your URL will look more relevant.

Canonical URLs

URL optimization isn’t just about how you use keywords, though. It’s also about preventing duplicate content and consolidating ranking signals like link juice. It’s easy to inadvertently host duplicate content on a few pages thanks to URL parameters and syndicated content. This is bad not just for the duplicate pages: the Panda “penalty” affects the whole site. If you wind up with duplicate content thanks to your content management system or e-commerce platform, just use rel="canonical" to point search engines toward the original version.

When using canonical URLs, first implement the WWW resolve. To do this, set a preferred domain in Google Search Console under Site Settings. Google takes preferred domains into account when crawling the web and displaying search results. So if you set your preferred domain to www.example.com, all links to example.com will send link juice to www.example.com, which is the URL Google will display in SERPs. Next, add the canonical tag to the <head> of HTML pages or the HTTP header of non-HTML pages:

  • HTML: <link rel="canonical" href="https://www.example.com"/>
  • HTTP: Link: <https://www.example.com>; rel="canonical"

When adding the canonical tag, make absolutely sure the URLs you use match your canonical URLs exactly. Google sees http://www.example.com, https://www.example.com and example.com as three different pages. Google will simply ignore the canonical link if you use more than one on a page or if it points to a page that returns a 404 error.
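
If your site runs on Apache, one common way to enforce the WWW resolve at the server level is a 301 redirect in .htaccess. Here's a minimal sketch, assuming mod_rewrite is enabled and www.example.com is your preferred domain:

# Permanently redirect bare-domain requests to the preferred www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]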

Page Speed

Page load time is a crucial aspect of site usability and SEO. Google wants to give its users the best experience, so it doesn’t want to send people to slow websites. When you audit your site using WooRank, check the Usability section to see how fast your page loads and how that compares to your competitors.

WooRank Load Time criterion

If your site is slow, optimize these elements to improve your page speed:

  • Images: Images are one of the biggest culprits of slow page speed. Don’t rely on HTML to reduce the file size of an image — it can only change its dimensions. Use image editing software like Photoshop to reduce the file size. Consider using other image optimization tools to further compress your images.
  • Dependencies: Certain plugins and scripts, like social share buttons and tracking software, are necessary to get the most out of your website. Whenever possible, use plugins made by your CMS provider and stick to just one tracking system at a time. Keep your CMS software up to date, but test each update in a staging environment in case the upgrade breaks something on your site.
  • Caching: Use Expires headers to control how long your site can be cached, and to tell browsers that they can cache images, stylesheets, scripts and Flash. This reduces the number of HTTP requests, improving your page speed (see the .htaccess sketch after this list).
  • Gzip Encoding: Use gzip compression to compress large files on your page and save bandwidth and download time.
  • Redirects: Some redirects are unavoidable. However, remember that every redirect is a new HTTP request and adds milliseconds on to your load time.
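
On Apache, for example, both caching and compression can be configured in .htaccess. Here's a minimal sketch, assuming the mod_expires and mod_deflate modules are available; adjust the content types and lifetimes to fit your site:

# Let browsers cache static assets for a month (mod_expires)
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType text/css "access plus 1 month"
    ExpiresByType application/javascript "access plus 1 month"
</IfModule>

# Compress text-based responses before sending them (mod_deflate)
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>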

If all else fails, use browser developer consoles to find files that are bottlenecking your page load.

Mobile Friendliness

Mobile Page Speed

Mobile friendliness is directly tied to site speed, as load time is a major factor in mobile search rankings. It’s arguably even more important for mobile pages to be fast than it is for desktop sites. The numbers back this up: 40% of mobile users will leave a page after waiting three seconds for it to load, and Google’s criterion for a mobile-friendly page is that above-the-fold content loads in one second or less.

You can optimize your mobile speed the same way you do your desktop speed: reduce image sizes, rely on caching, cut dependencies and minimize redirects. Or, you can create an Accelerated Mobile Page (AMP). AMP is an open source initiative to create mobile pages that load fast and provide an enhanced user experience. There are three main parts to AMP:

  • HTML: HTML for AMP pages is basically normal HTML, with a few custom variations and restrictions on resources like images, videos and iframes (see the skeleton sketch after this list).
  • JavaScript: AMP pages use a custom JavaScript library that loads asynchronously. You’re also required to declare element sizes in the HTML, so the browser knows how the page will be laid out before resources load and the page doesn’t jump around as they arrive.
  • Cache: Google maintains a dedicated cache for AMP pages and serves them from it in search results. When Google loads a page saved in the AMP cache, everything comes from the same location, which means better efficiency.
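
To give a feel for what this looks like in practice, here's a bare-bones AMP page sketch. The canonical URL and image are placeholders, and the mandatory amp-boilerplate <style> block is omitted for brevity:

<!doctype html>
<html ⚡>
  <head>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <link rel="canonical" href="https://www.example.com/fancy-white-dress-shirt">
    <!-- The required amp-boilerplate <style> block goes here (omitted for brevity) -->
    <script async src="https://cdn.ampproject.org/v0.js"></script>
  </head>
  <body>
    <!-- Sizes are declared up front so the layout is known before the image loads -->
    <amp-img src="shirt.jpg" width="600" height="400" layout="responsive"></amp-img>
  </body>
</html>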

Mobile Site Structure

There are three main options you have when creating the mobile version of your site:

  • Mobile Subdomain: This is the most labor- and time-intensive option of the three, since it requires building an entirely separate mobile website, hosted on a subdomain (normally something like mobile.example.com or m.example.com). Google won’t be able to tell from the subdomain alone that the site is just for mobile users, so you’ll have to use rel="canonical" tags on any duplicate pages. This method requires a lot of resources, more than the other two, and generally isn’t recommended.
  • Dynamic Design: This method detects the user agent and serves different HTML to mobile and desktop browsers. Use the Vary: User-Agent HTTP header to tell search engines that you serve different code based on user agent. Add this code if you’re working in PHP:
    <?php
    header<"Vary: User-Agent, Accept”);
    ?>
    

To do this in Apache, add the following code to your .htaccess:

    Header append Vary User-Agent

Add this code in functions.php if you’re working with WordPress:

    function add_vary_header($headers) {
        $headers['Vary'] = 'User-Agent';
        return $headers;
    }
    add_filter('wp_headers', 'add_vary_header');
  • Responsive Design: The simplest and easiest way to create a mobile version of your site, responsive design is Google’s recommended method. It just requires you to set the viewport meta tag. The viewport tells browsers what dimensions to use when displaying a page. Set the viewport to scale to the device to make your pages mobile friendly:
    <meta name="viewport" content=”width-device-width, initial-scale=1.0”/>
    

Structured Data Markup

Structured data markup gives meaning to the content on your page so search engines can understand it. You can use Schema.org markup on your About page, for example, to tell search engines where to find your address, opening hours and phone number. Add it to product pages so search engines can easily find reviews and ratings of your products. If you’ve got a personal brand, add Schema markup to denote education, family and professional information.
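
For example, a minimal Schema.org microdata sketch for marking up a business's address, opening hours and phone number might look like the following; all of the values here are placeholders:

<div itemscope itemtype="https://schema.org/LocalBusiness">
  <span itemprop="name">Example Clothing Co.</span>
  <div itemprop="address" itemscope itemtype="https://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Example Street</span>,
    <span itemprop="addressLocality">Springfield</span>
  </div>
  Phone: <span itemprop="telephone">+1-555-0123</span>
  <meta itemprop="openingHours" content="Mo-Fr 09:00-18:00">
</div>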

Schema.org markup won’t necessarily make you outrank a page that doesn’t use it. But it’s a big help for SEO because it’s used in Google’s rich snippets. The easiest way to see rich snippets live is to search for a recipe. In the search results you’ll see the normal search snippet (title, URL and description) along with a picture and star rating. Those last two are thanks to semantic markup.

So while semantic markup isn’t a ranking signal, it can help improve your search ranking. The better Google understands what’s on your page, the more likely you are to have better rankings. Plus, semantic markup also helps assistive applications like screen readers, improving your site’s user experience.

Wrapping It Up

There’s more to technical SEO than just the 9 points listed here (check out our on-page checklist for a few more), but these are the basics of getting your site found and indexed by Google, and setting it up to rank well in search results. And that’s really the point of technical SEO: to make sure that Google can find, access and interpret your on-page content, and to provide a strong user experience.
