How To Beat Google’s ‘Brandy’ Update
The ‘Brandy’ update seems to have incorporated some pre-‘Florida’ results (another major update that occurred at the end of 2003), mixed with numerous new factors. Google stores its index on a number of data centres around the world. Since ‘Florida’, some of the old data centres were taken offline, and pundits believe that Google has kept the old SERPs (Search Engine Results Pages) in a preserved state for the last few months.
Indeed, Google brought these data centres back at the same time that Yahoo! broke from Google, in favour of its new Inktomi-based results. Consequently, I don’t think this is the last of the major changes we’ll see in Google, but it does seem that Google is getting closer to what it aims to achieve.
Sergey Brin, one of Google’s founders, recently said:
Google has made five significant changes to its algorithmic formulas in the last two weeks.
(Associated Press (AP), Feb 17th 2004)
While we can only guess at what those changes were, the following are probably good bets.
- Increase in Index Size
Google’s spider, Googlebot, has had a busy few weeks — at the time of the update, Google announced that it had massively increased the size of its index.
This move was probably made to ensure Google made headlines at the same time as Yahoo! (for example, in this report in the BBC News, Feb 18th 2004). However, in order to increase the index size, Google may have had to re-include some of the pre-Florida results that had previously been dropped.
- Latent Semantic Indexing (LSI)
This is a very significant new technology that Google has always been interested in, and the incorporation of LSI has been on the cards for some time. If you are an insomniac, then Yu et al.’s paper is quite helpful in explaining the concept, but, in short, LSI is about using close semantic matches to put your page into the correct topical context.
It’s all about synonyms. LSI may see Google effectively remove all instances of the search keyword when analysing your page, in favour of a close analysis of other words. For example, consider the search term ‘travel insurance’. LSI-based algorithms will look for words and links that pertain to related topics, such as skiing, holidays, medical, backpacking, and airports.
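A toy sketch can make the idea concrete. This is not Google’s algorithm, and it is far simpler than real LSI (which applies singular value decomposition to a term-document matrix); the related-terms list is invented purely for illustration. The point it demonstrates is the one above: the page is scored on topically related words, with the search keyword itself stripped out.

```python
import re

# Invented for illustration: terms semantically related to 'travel insurance'.
RELATED_TERMS = {"skiing", "holidays", "medical", "backpacking", "airports"}

def topical_score(page_text, keyword="travel insurance"):
    # Remove all instances of the search keyword, as described above,
    # then score the page on the related vocabulary that remains.
    text = page_text.lower().replace(keyword, " ")
    words = set(re.findall(r"[a-z]+", text))
    return len(words & RELATED_TERMS)

page = ("Our travel insurance covers skiing holidays, "
        "backpacking trips and medical emergencies abroad.")
print(topical_score(page))  # 4 related terms found, keyword ignored
```

A page stuffed with the keyword but lacking these contextual words would score zero here, which is exactly the behaviour an LSI-style analysis is meant to produce.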
- Links and Anchor Text
Links have always been the essence of Google, but the engine is altering its focus. The importance of PageRank (PR), Google’s unique ranking system, is being steadily downgraded in favour of the nature, quality, and quantity of inbound and outbound link anchor text. If PR is downgraded, and the wording of inbound links is boosted, this may explain, to a large degree, the position in which many sites currently find themselves.
For example, most people will link to a site’s homepage. In the past, due to internal linking structures, PR was spread and other pages benefited. Now, it is more important for Webmasters to attract links that point directly to the relevant pages of their sites using anchor text that’s relevant to the specific pages.
Furthermore, Google seems to be using outbound links to determine how useful and authoritative a site is. For example, the directories that are doing well are those that link directly to the sites, rather than use dynamic URLs.
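A quick way to audit this on your own pages is to list each outbound link’s anchor text and flag dynamic URLs. The sketch below uses only the Python standard library; the sample HTML and the simple ‘contains a question mark’ heuristic for dynamic URLs are assumptions for illustration.

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collect (anchor text, href, is_dynamic) for every link on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None   # href of the <a> tag currently open, if any
        self._text = []     # anchor text fragments collected so far

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            anchor = "".join(self._text).strip()
            # Heuristic: a '?' in the URL usually marks a dynamic link.
            self.links.append((anchor, self._href, "?" in self._href))
            self._href = None

html = ('<a href="http://example.com/ski-insurance">ski insurance</a> '
        '<a href="http://directory.example/go?id=42">travel cover</a>')
audit = LinkAudit()
audit.feed(html)
for anchor, href, dynamic in audit.links:
    print(anchor, "->", href, "(dynamic)" if dynamic else "(direct)")
```

Run against a directory page, this shows at a glance whether it links straight to listed sites or routes everything through redirect scripts.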
Now, more than ever, the question of who’s linking to your site is critical. Links must come from sites on related topics (the higher their PR, the better); those links are seen to define your ‘neighbourhood’.
If we again consider the example of travel insurance, big insurance companies might buy links on holiday-related sites in order to boost their ranking. These businesses will actively invest in gaining targeted inbound links from a broad mix of sites. Consequently, their neighbourhoods appear tightly focused to Google.
- Downgrading of Traditional Tag-Based Optimisation
Clever use of the title, h1, h2, bold, and italics tags, and CSS, is no longer as important to a site’s ranking as it once was. It is very interesting to listen to Sergey (co-founder of Google) talk about this, because he’s the one usually quoted about the ways in which people manipulate his index. Google has taken big steps to downgrade standard SEO techniques in favour of LSI and linking, which are far less manipulable by the masses.
The Impact of Brandy
These changes make for sobering reading if you’re a Webmaster — to optimise your site successfully for Google has become a lot more difficult. Nevertheless, there are a number of practical steps that can be taken to promote your ranking in the short and long term.
As LSI appears to be so significant, it is important to start looking carefully at the information architecture of each major section of your site, and to increase the use of related words. It is also important to re-examine the title tags to include this concept; good title tags have synonyms and avoid repetition of the key phrase.
- Outbound Links
Link to authority sites on your subject. In the travel insurance example, these authority sites could include places like the State Department, major skiing directories, etc. Not only will this help with LSI, it also allows Google to define the neighbourhood more easily. Furthermore, you could engage in link swaps with other companies so that you gain the benefit of an on-topic, LSI-friendly link.
- Inbound Links and Link to Us Pages
Based on what we have just said, sites need to formulate a link development strategy. A budget needs to be set aside to buy links and develop mini-sites. Look to set up links with university sites (.edu or ac.uk), as these seem to be valuable given Google’s informational bias.
Each section of a site should have its own link-to-us page. For example, HotScripts, the major computer script directory, has a great link-to-us page.
By providing people with creatives and cut-and-paste HTML, you can vastly improve your chances of attracting reciprocal links to your site. You’ll need to have a separate page for each section, to maximise on-topic inbound links.
It is important to develop separate mini-sites (also known as satellite sites) for each key subject of your Website. This is a useful tactic that improves your chances of appearing in the SERPs for your keywords. Furthermore, as the last three Google updates have shaken things up so much, having more than one site reduces the likelihood that your business will be disrupted by the engine’s updates. However, Google is likely to view satellite sites as spam, so you must take some steps to reduce the chances of your being blacklisted on this basis.
First, make it as hard as possible for Google to detect host affiliation between your main site and its mini-sites. Google may deem sites to be owned by the same person if the first 3 octets of the sites’ IP addresses are the same (e.g. 123.123.123.xxx). Therefore, if you’re going to run mini-sites, put them on different Web hosts.
Secondly, use different domain names for your mini-sites, rather than sub-domains of your main site. In the past, Google has not penalised sub-domains, but the early results from the Brandy update show a considerable reduction in the presence of sub-domains in the SERPs.
Finally, be very careful with the linking strategy you use between mini-sites — Google will look at the linking structure very critically. Don’t plaster each of your sites with links to the others, and don’t reciprocate links between the sites.
Mini-sites make it easier to create on-topic neighbourhoods and experiment with LSI techniques. Creating a large network can be a means to boost your main site’s rank, but make sure you’re well aware of the risks involved with creating these mini-sites before you embark.
Use Brandy to your Advantage!
Google optimisation is now a lot harder than it used to be. However, the index is still manipulable. Success involves hard work, and potentially the expenditure of funds to develop a good mini-site network and buy links on relevant pages.