Are Traditional Search Engines Failing?

Craig Buckler

Is Google doomed? According to interviews with Stefan Weitz, traditional search engines are failing users. Indexing keywords and checking backlink quality is a flawed, 12-year-old concept that no longer works on the modern web.

So who is Stefan Weitz? He’s Bing’s director.

The more cynical among you will conclude this is obvious marketing spin, but Weitz raises a few interesting points:

  1. Search engines have difficulty understanding keyword context. When a user enters “jaguars” are they interested in the big cat, the car manufacturer, or the Jacksonville football team?
  2. Search engines are relatively slow to index dynamic real-time data such as tweets or Facebook comments.
  3. Search engines index web pages but people often want the underlying data, e.g. stock prices, flight times, weather reports, etc.

In an attempt to make results more relevant, Bing has integrated Facebook recommendations. If a trusted friend of yours ‘likes’ a specific restaurant, its position will be raised when you use an associated search term.
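Bing hasn’t published how the blending works, but the underlying idea is simple re-ranking. Here’s a toy sketch of it; the boost weight, scores and URLs are invented for illustration, not Bing’s actual formula:

```python
# Hypothetical sketch: boost a result's relevance score when a trusted
# friend has 'liked' it. The FRIEND_BOOST weight is an assumption made
# for this example -- the real blending formula is not public.

FRIEND_BOOST = 0.25  # assumed bonus per friend-liked result

def rerank(results, friend_likes):
    """results: list of (url, base_score); friend_likes: set of liked URLs."""
    def score(item):
        url, base = item
        return base + (FRIEND_BOOST if url in friend_likes else 0.0)
    return sorted(results, key=score, reverse=True)

results = [("bistro.example", 0.70), ("diner.example", 0.80)]
liked = {"bistro.example"}  # a friend liked the bistro

# The liked bistro (0.70 + 0.25 = 0.95) now outranks the diner (0.80)
print(rerank(results, liked)[0][0])  # bistro.example
```

A result your friends endorse can therefore leapfrog a result with a stronger keyword match — which is exactly the behavior Weitz is promoting, and the behavior the objections below pick apart.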

Backlink Breakdowns

Google and Bing analyze backlinks to rate the importance of a page. If someone links to your article, it’s a ‘vote’ for that content. The vote is given more weight if the link originates from a site which receives many backlinks itself.
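That recursive “votes from well-voted sites count more” idea is the essence of PageRank-style scoring. Here’s a simplified sketch of one such iteration — the link graph and damping factor are invented for the example, and real engines use far more signals than this:

```python
# Toy illustration of link 'voting': a simplified PageRank-style loop.
# Each page splits its score among the pages it links to, so pages
# linked from high-scoring pages accumulate higher scores themselves.

def rank_pages(links, iterations=20, damping=0.85):
    """links maps each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    score = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            share = damping * score[page] / len(targets)
            for target in targets:
                new[target] += share  # pass a share of this page's 'vote'
        score = new
    return score

links = {
    "a": ["b", "c"],  # page a votes for b and c
    "b": ["c"],       # b votes only for c
    "c": ["a"],       # c votes only for a
}
scores = rank_pages(links)
print(max(scores, key=scores.get))  # c -- it collects votes from both a and b
```

It’s precisely because this metric is mechanical and public knowledge that it can be gamed, as the next paragraph describes.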

The Search Engine Optimization industry attempts to manipulate this metric. Techniques range from link baiting — creating good content which encourages backlinks — to more dubious methods. Most of us have experienced blog spam comments or useless automatically-generated link farm pages in search results.

Google, Bing and other search engines work hard to restrict black hat SEO practices with varying levels of success.

A Social Network Solution?

Microsoft owns a small share of Facebook, so it’s not surprising the company wants to utilize that data. Google has implemented its own experimental “+1” feature which allows users to rate the relevancy of a search result.

While I’m sure it will help some queries, the technologies raise a new set of problems:

Sign-in shenanigans
To receive rated results you must have a Bing account, a Facebook account and be signed into both. This slashes the potential audience — especially when many corporations ban Facebook in the workplace.

Most people are data consumers
The majority of people rarely publish content. Only a small proportion ever click a ‘Like’ button or send a tweet because it takes effort. In most cases, only those who have a strong positive or negative opinion will rate a product or write a short review.

It may be possible to collate some information by analyzing a user’s actions, but assessing their emotional attachment is far more difficult.

People are fickle
Facebook may be the most popular social network today but you would have said the same about MySpace a few years ago. Bing’s data is only as relevant as the users who remain on that system.

Not all content is rateable
While it’s easy to rate a website, page, product or service, how can you assess the quality of the underlying data? You might ‘Like’ a sunny weather report, but that doesn’t stop it being wholly inaccurate when snow starts to fall.

Social networks can be manipulated too
Search engine positions are directly related to revenue, so it’s only a matter of time before SEO experts discover new ways to exploit the social networks. Any number of psychological or technical tricks can be employed to make people ‘Like’ specific content. It happens already, so you can expect more companies to adopt these techniques.

It’s early days for social networking search integration, so time will tell whether it improves results. But sorry, Mr Weitz: I remain skeptical about any claims that it will supersede backlink metrics.

Is traditional search failing? Will the social networks save us? Have you experienced new SEO exploitation practices? Comments welcome…