We’ve had a website for several years and recently turned it into an online store, using a third-party site for the ecommerce. So far I’ve used both URLs, and the site looks the same across both. But I’ve decided it’s best just to move the whole site over to the other URL, which is on the third-party ecommerce server, so that everything is seamless, runs more smoothly, is easier to update, etc.
So my question is: what’s the best way to deal with all the pages from the old URL so that we don’t lose the search rankings of those pages and people can still find them? I’m sure there are a lot of links around the web pointing to those pages, and I don’t want people to be unable to find us just because I decided to move everything.
One thought I had was to leave up all the old pages and just make all their links point to the corresponding pages on the new URL. So if you come to one of those old pages from somewhere else, you can still click through, but the links take you to the new pages without you really noticing you’ve gone from one to the other. But if I do that, I don’t want to get “dinged” by the search engines for spamming or duplicate content by having mirrored pages.
Does anyone have best practices, or advice on the best way to handle this without losing search rankings or serving 404 errors?
After reading through the Google article it sounded a little bit like I should leave up the pages on Domain 1 (old site) and just put a redirect on each page to the proper page on the new domain. Is that right? Of course also doing the 301 redirect. I think I’ll also add a custom 404 error page just in case and put links on that to the new domain.
If you put a 301 redirect on a page, it doesn’t matter whether the page is there or not. Any user-agent, including search spiders, that tries the URL will be given the redirection, and there is no way (over HTTP) of seeing the file that lives at that URL.
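To make that concrete: on Apache, a page-level 301 can be a single directive in the old site’s .htaccess. This is a minimal sketch with hypothetical paths and a hypothetical domain, assuming Apache with mod_alias available:

```apache
# Hypothetical example: anyone (browser or spider) requesting the old page
# is answered with a 301 and the new location -- the old file is never
# served, whether or not it still exists on disk.
Redirect 301 /old-page.html http://www.xyz.com/new-page.html
```

A quick way to verify it is working is to request the old URL and check that the response status is 301 and the Location header points at the new page.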
Off Topic:
When you’re using dummy URLs to explain the point, please don’t allow them to become clickable links. If they start with http:// or www. then you need to un-tick the “Automatically parse links in text” box. If you leave it ticked, we end up with a load of outbound links to dummy sites.
I don’t want to make any assumptions about your technical ability or development platform, so I am not going to go into the technical nitty-gritty…
I was recently responsible for migrating several hundred thousand (!!!) URLs to their new versions (not just changing domains).
We did this successfully using 301 redirections: pattern matching where possible, and one-to-one mappings where needed. You need to keep a close eye on your logs (for 404s etc.) to see what falls through the gaps.
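As a sketch of what that combination looks like on Apache (hypothetical URLs and domain; assumes mod_rewrite is enabled — adapt to your own structure):

```apache
RewriteEngine On

# Pattern-matched: send every /products/... URL to the same path
# on the new domain in one rule.
RewriteRule ^products/(.*)$ http://www.xyz.com/products/$1 [R=301,L]

# One-to-one: individual pages whose names changed and can't be
# pattern-matched.
Redirect 301 /about-us.html http://www.xyz.com/company/about.html
```

The pattern rule handles the bulk of the URLs; the one-to-one lines mop up the exceptions, and the 404s in your logs tell you which exceptions you missed.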
Permanent redirections (301s) transfer almost all of the search equity from one page to another. There will be a small dip in search traffic while the search engines re-evaluate what’s taken place, but you should be smiling within weeks.
More info on your platform (Apache etc.) and your URL structures would be handy. Are you just changing domains, or URL structures too?
Thanks for your feedback. I think your first point regarding punctuation is important to usability, and as a result it affects SEO. SEO is about getting people to click on your links, and if users can’t read the link they might not click through to your site. Therefore, not following these guidelines will affect your SEO efforts.
With regards to the depth of the directory structure, it is entirely up to the individual how they organise the site, and some large sites may require several levels. There is a lot of talk about this, but the general consensus is not to build in any more sub-directories than needed. This keeps the site architecture simpler and the URLs shorter, which aids usability and hopefully encourages more clicks.
Thanks again for your reply - SEO definitely isn’t an exact science and it’s always good to get another opinion!
Thanks for sharing your guide with us. Most of it looks pretty sound but there are a few things in there that I might quibble with:
It is also important to use punctuation in URL’s to avoid long strings of characters that are difficult for humans to read.
First quibble - while that is true from a usability point of view, it doesn’t hold up so well for SEO. Googlebot is far more capable of splitting severalwordsruntogether into their constituent words than people are, and as long as you don’t have a situation where the text could be split in different ways, it’s unlikely to make much difference for SEO. That’s not to say you shouldn’t do it, but you need to be clear about the reason for it.
Second quibble - kill the grocer’s’ apo’strophe!
The other very important thing when considering URL structure is to keep everything as flat as possible and do not bury content away in unnecessary directories. Search engines do not crawl pages that are more than 2/3 levels deep as frequently as higher level pages. This means that content may not be indexed and may not show up in their results pages.
Sorry, but I don’t believe that’s true. I have seen no evidence that Google treats pages that are buried deep in directories any differently to those in the root. What will affect it is how many clicks it takes to get to each page - so what really matters here is that your navigation structure is good.
Sure, the example you gave of bad practice does seem to go over the top with subfolder levels and classifications, but that doesn’t mean you shouldn’t use as many levels of subfolders as you need to, to get a good hierarchical structure. This will help both users and spiders.
Awesome, thanks for all those links. I’ve started reading through them, except the first link didn’t work.
About my platform and URL structure… I’m actually not sure. I think we have IIS, but I’m not sure which version as I don’t take care of that end of things. And I’m not sure what you mean by the URL structure. I’m basically just changing domains, for example from [noparse]www.abc.com[/noparse] to [noparse]www.xyz.com[/noparse].
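For reference, if the old site does turn out to be on IIS 7 or later, a whole-domain 301 can be a small web.config fragment like this (hypothetical destination domain; older IIS versions set this up through the IIS Manager GUI rather than web.config):

```xml
<!-- Sketch only: redirects every request on the old domain to the same
     path on the new one, with a permanent (301) status. -->
<configuration>
  <system.webServer>
    <httpRedirect enabled="true"
                  destination="http://www.xyz.com"
                  httpResponseStatus="Permanent"
                  exactDestination="false" />
  </system.webServer>
</configuration>
```

With exactDestination set to false, the requested path is appended to the destination, so /some-page.html on the old domain lands on /some-page.html on the new one.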
I gotta finish reading the other articles you sent links to…
It’s exactly the same process - all I meant was that the regular expression you use to pattern-match the URL can become more complex. You’d probably be better off asking in the Apache forum, where there live strange beings who are really good at writing regular expressions.
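To illustrate what “more complex” can mean, here is a sketch of a rule that changes both the domain and the URL structure at once. The URL scheme here is entirely hypothetical (a query-string shop page becoming a clean path), assuming Apache with mod_rewrite:

```apache
RewriteEngine On

# Hypothetical reshaping: /shop.php?cat=12&item=345 -> /store/12/345
# %1 and %2 are back-references captured by the RewriteCond; the trailing
# "?" on the substitution drops the old query string from the target URL.
RewriteCond %{QUERY_STRING} ^cat=([0-9]+)&item=([0-9]+)$
RewriteRule ^shop\.php$ http://www.xyz.com/store/%1/%2? [R=301,L]
```

Rules like this are where the Apache forum regulars earn their keep: query strings are not matched by RewriteRule itself, so they have to be captured in a RewriteCond first.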