By Craig Buckler

Are Traditional Search Engines Failing?

Is Google doomed? According to interviews with Stefan Weitz, traditional search engines are failing users. Indexing keywords and checking backlink quality is a flawed, 12-year-old concept which no longer works on the modern web.

So who is Stefan Weitz? He’s Bing’s director.

The more cynical among you will conclude this is obvious marketing spin, but Weitz raises a few interesting points:

  1. Search engines have difficulty understanding keyword context. When a user enters “jaguars”, are they interested in the big cat, the car manufacturer, or the Jacksonville football team?
  2. Search engines are relatively slow to index dynamic real-time data such as tweets or Facebook comments.
  3. Search engines index web pages but people often want the underlying data, e.g. stock prices, flight times, weather reports, etc.
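The ambiguity problem in point 1 can be illustrated with a toy sketch. The sense table and `resolve` helper below are invented for illustration; real engines infer context from search history, locale and many other signals:

```python
# Illustrative sketch only: a toy lookup showing why a bare keyword
# like "jaguars" is ambiguous. The SENSES table is invented for this
# example and is not how any real search engine stores data.

SENSES = {
    "jaguars": {
        "animal": "Panthera onca, the big cat",
        "cars": "Jaguar, the car manufacturer",
        "football": "the Jacksonville Jaguars NFL team",
    },
}

def resolve(query, qualifier=None):
    """Return the senses matching a query, narrowed by an optional qualifier."""
    senses = SENSES.get(query, {})
    if qualifier in senses:
        return [senses[qualifier]]   # context narrows the query to one sense
    return list(senses.values())     # no context: every sense is a candidate

# resolve("jaguars") returns all three senses;
# resolve("jaguars", "cars") returns only the car manufacturer.
```

Without a qualifier the engine can only guess, which is exactly the gap that personalization and social signals try to fill.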

In an attempt to make results more relevant, Bing has integrated Facebook recommendations. If a trusted friend of yours ‘likes’ a specific restaurant, its position will be raised when you use an associated search term.

Backlink Breakdowns

Google and Bing analyze backlinks to rate the importance of a page. If someone links to your article, it’s a ‘vote’ for that content. The vote is given more weight if the link originates from a site which itself receives many backlinks.
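The weighting scheme described above is essentially the idea behind PageRank. Here is a minimal sketch using power iteration, assuming a toy link graph and the commonly cited 0.85 damping factor; it is an illustration of the concept, not Google’s or Bing’s production algorithm:

```python
# A minimal PageRank-style sketch of the backlink "voting" idea.
# The link graph and damping factor are illustrative assumptions.

def page_rank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # every page keeps a small baseline share of rank
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            # a link is a 'vote': its weight depends on the linker's own rank
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Toy graph: pages b, c and d all link to a, so a should rank highest.
graph = {"a": ["b"], "b": ["a"], "c": ["a"], "d": ["a"]}
ranks = page_rank(graph)
```

In this toy graph, `a` outranks `b` (three inbound votes versus one), and `b` outranks `c` and `d` because its single vote comes from the highly ranked `a`: exactly the “votes from well-linked sites count more” behaviour the paragraph describes.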

The Search Engine Optimization industry attempts to manipulate this metric. Techniques range from link baiting — creating good content which encourages backlinks — to more dubious methods. Most of us have experienced blog spam comments or useless automatically-generated link farm pages in search results.

Google, Bing and other search engines work hard to restrict black hat SEO practices with varying levels of success.

A Social Network Solution?

Microsoft owns a small share of Facebook so it’s not surprising they want to utilize that data. Google has implemented its own experimental “+1” feature which allows users to rate the relevancy of a search result.

While I’m sure it will help some queries, the technologies raise a new set of problems:

Sign-in shenanigans
To receive rated results you must have a Bing account, a Facebook account and be signed into both. This slashes the potential audience — especially when many corporations ban Facebook in the workplace.

Most people are data consumers
The majority of people rarely publish content. Only a small proportion ever click a ‘Like’ button or send a tweet because it takes effort. In most cases, only those who have a strong positive or negative opinion will rate a product or write a short review.

It may be possible to collate some information by analyzing a user’s actions, but assessing their emotional attachment is far more difficult.

People are fickle
Facebook may be the most popular social network today but you would have said the same about MySpace a few years ago. Bing’s data is only as relevant as the users who remain on that system.

Not all content is rateable
While it’s easy to rate a website, page, product or service, how can you assess the quality of the underlying data? You might ‘Like’ a sunny weather report, but that doesn’t stop it being wholly inaccurate when snow starts to fall.

Social networks can be manipulated too
Search engine positions are directly related to revenue, so it’s only a matter of time before SEO experts discover new ways to exploit the social networks. Any number of psychological or technical tricks can be employed to make people ‘Like’ specific content. This happens already, so you can expect more companies to take advantage of the techniques.

It’s early days for social networking search integration so time will tell whether it improves results. But sorry Mr Weitz, I remain skeptical about any claims it will supersede backlink metrics.

Is traditional search failing? Will the social networks save us? Have you experienced new SEO exploitation practices? Comments welcome…

  • Balázs Slemmer

    I think traditional searching and social networks should be kept separate to some extent.

    Because, for example, just because someone is my relative, friend, colleague or otherwise related to me, it does not mean that we ‘like’ the same things or have the same taste across a wide variety of topics. And even if that is the case, I would probably like to see content which has a different (or even opposing) point of view than mine or that of my friends. Traditional search results should be based on a much larger sample set. It could be nice, though, to see whether some of my friends ‘like’ a given piece of content, but I don’t think it should influence how search engines rank content. Let me decide whether I want to see much-liked content or something else.

    This TED talk might be relevant to this topic:

  • IT Mitică

    “checking backlink quality is a flawed 12 year-old concept” is the equivalent for “a trusted friend of yours ‘likes’ a specific restaurant”.

    The only change Bing proposes is to change the reference system, not the actual “algorithm”. It tries to convince us that Facebook or other social network gossip is a better way in establishing ranking over the web.

    That may or may not be correct, but how do you solve the same problem Google has now: different taste. Some people may agree, some may not; that’s the way it will always be. And Google, by gathering data and by presenting SERPs based on your locale and surfing habits, does exactly that.

    In the end, it boils down to the same fact: a group of people, and their effort to promote, based on a “voting” system. Using backlinks or Facebook “like”s is still subjective. As it should be.

  • Alessandro Muraro

    I think there is some truth to it… I mean, seeing companies offering thousands of “high quality” backlinks makes you think about the whole search engine thing…

  • David Sheppard

    “Only a small proportion ever click a ‘Like’ button or send a tweet because it takes effort”. This is true. I’m wondering what would happen if Facebook opened up a user goals system similar to Microsoft’s Xbox. The Xbox goal system was greatly underestimated, and very much unused. People, especially social media users, have a great need to be accepted and stacked up against others. An incentive for users to like everything would push search results in the right direction.

  • The Schaef

    The two concepts aren’t really all that different in their essence, anyway. A backlink is just a social network vote by people who maintain a web page. Blogs have furiously cross-linked each other for years, essentially becoming a self-serve aggregate system. The “liking” aspect only further democratizes the effort, while ironically reducing the context for the connection.

  • seriocomic

    There’s a hint to the problem in your own wording of the article. 95% of SEO efforts are inclined to be either “manipulation” or “exploitation” of search engine indexing and ranking algorithms. This is a futile and short-sighted exercise.

    Successful people engaged in SEO try to “facilitate” good indexation and ranking through highlighting to the search engines the most authoritative and relevant material and content.

    This is why social is important – as it adds a relevancy layer to the search results. It can be inferred that if your friends are “liking” or discussing something in the social media space, then the search engines could use that data to correlate your search query to those results which are relevant to the social discussion.

    The hidden contextual intent to a search query can only be unravelled so far using algorithms (proximity, frequency, geospatial etc).

    Therefore, if you’re looking to enhance your performance in the search space in the age of social, then you need to think more about what your visitors are searching, how they are searching, and leverage social dynamics (integration of social clues – the like button for instance) and engagement – to signal to the search engines the topical relevancy that would normally be missing by mere keyword and content analysis.

  • purplechap

    In general I like the idea of including social in my search results, but I would like these search engines to identify if social was included in calculating the results. Maybe even have an option to turn off including social if I feel it’s affecting the results negatively.

    • I think the idea of allowing the user to see and/or decide if ‘social’ based results are used in the search results is a good one. This might allow a more relevant results set, and one that you can control to a limited degree.

  • Martin Rugfelt (expertmaker)

    Hi Craig.
    Thanks for the article.
    Whether likes or backlinks are more relevant to the result probably varies between searches.

    The interesting bit is that popularity ranking is 12 years old and we have moved on significantly since then in computing power and other technology but even more so in the volume of data online. The growth has been massive while search technology has been busy tweaking algorithms with minuscule improvements to results.

    The popularity ranking was an answer to the problem of the increasing volume of pages online. Basically, keywords were not enough to separate the results from each other, and popularity was a good answer to the problem.

    With the volume and growth of data online today, keywords plus a popularity ranking are just not enough to separate the results space, with or without SEO activities. Just imagine a search for “travel” and see how many perfectly good sites are out there that could be relevant to the user. There is no way that popularity alone defines the results. There is a need for a lot more data, and particularly metadata (contextual, semantic and sentiment), to create precision results. There are a lot of issues to be handled before a search based on that kind of information is widely possible, but it is possible.

    We have a commercial implementation in Scandinavia for mobile search that uses a lot more metadata for searches, with greatly increased precision. We have used our AI platform for the solution.

  • Ron Bigus

    Everyone said MySpace would change everything… then it was virtual worlds: buy and sell things that only exist in your computer.
    Now Facebook and LinkedIn are dominating how things are looked at.

    Two years from now it will be something else.

    Give us a search that we can truly configure based on what we know to be relevant. I don’t belong to Facebook or LinkedIn, so I don’t want them biasing my searches. I would like to have the most recent results take a much higher ranking, without having to go to advanced search every day and specify how far back to go for results. And allow me to remove some sites from results permanently, since they are just garbage but know how to get ranked. Also allow me to give some ++ weight to sites that have consistently given me good info. When we can really configure our preferences and save them, then we can get consistently valid results.

  • Stomme poes

    If I type in “jaguars”, I *should* get results about cars, cats, and football teams. To deny me one or two of those based on my past searching habits, my friends, my friends’ friends, and what I had for dinner yesterday is the path to darkness.
    Any of you who didn’t check out that link from Balázs Slemmer, please do. We have to fight more than ever to stop information from being funneled into only our little portion of the world.

    If I’m interested in Jaguar cars, I’ll be sure to add “cars” and “dealerships” to my search. Or there are search engines who will ask you: when you typed in “milk”, did you mean the drink, the band, the film, or something else? Which offers you the option of refining your search right away if it didn’t occur to you to do so beforehand.

    Bing’s “like” button is the same as Google’s “+1” button: they are not just using it to make sure you only get the content you want, but they’re also using it to datamine you. You’re logged in to something while using those buttons.

    • Michael

      Thanks for this bit of common sense! I can’t stand Bing’s slogan “Bing & decide”… when they are stripping out options and narrowing your ability to actually make an informed decision. Granted, Google may be doing the same thing, but it’s not as glaring and it isn’t weeding out content and making decisions for me. The smart UX solution for these search engine companies is to offer clear, quick refinements like you mentioned (“Did you mean the football team, car company, large cat or something else?”).
      Let’s hope people are smart enough to resist Microsoft’s attempts to dumb us down and restrict our ability to choose how we want to use the vast amount of info on the Internet, as well as how we access and search for it.

  • Patrick Samphire

    “Search engines are relatively slow to index dynamic real-time data such as tweets or Facebook comments.”

    All I can say is, ‘good’. I really, really don’t want google (or bing) searches returning a whole bunch of tweets and comments. If I search, it’s because I’m looking for high-quality, established, reliable information, not some idiot’s immediately forgotten and inane remark. Thank you.

  • monroe

    I would think that there is a fair bit of truth here. I say this because, for a particular search term, I see a site that only has an ‘under construction’ page appear within the top 5 search result entries. I have seen this for the past 8 months. The site has some backlinks which were perhaps created 5 years back. So once a site has some good-quality backlinks, it would continue to take top spots even if the content is completely removed.

  • Chaz Scholton

    1. Most people typing “jaguars” into Google will quickly add additional keywords to refine their search, such as “jaguars car” or “jaguars sports” or “jaguars animal”. What I find amazing is the number of keyword combinations being used to access the pages on a number of sites I’m responsible for. Very seldom do I see one-word phrases being used. If anything, it’s 3-6 keywords people are entering into the search engines.

    2. Who cares about how fast facebook comments and tweets are being indexed by Google. Most of it is worthless information anyways. People posting comments about sitting at home alone, or going on dates or celebrating a family members birthday (How much value is this information to the public at large anyways, seriously?)

    3. Most people that want to use the underlying data end up downloading some application that accesses this information. Dare I say it, they go to websites that specifically deal with providing this data along with a service related to that data.

    I’m sincerely amused by all three points made by Stefan Weitz.

  • eTek Studio

    Hi Craig,
    Very well written.. I enjoyed reading it..
    Couldn’t agree more with the three points :)

  • Stefan Weitz

    My point is not that social will save the day but it is an important signal that can now be used. And to be clear, you don’t actually have to have a LiveID or Bing account to use the FB integration. Finally, a more recent interview that allowed for more long form explanation does a better job explaining the future of search from our vantage point.

    and to be clear, I’m not a marketer :)

    Thanks for reading!
    Stefan Weitz

    • Thanks Stefan — we’ll keep an eye on Bing’s social network integration progress.

      Everyone’s a marketer to some extent!

  • Vishal

    Well, I guess traditional SEO techniques are failing; building links, directory submission and blog commenting (spam, Blogspot comments) are not working.
    Social media is in great demand, but there are ways to tweak it too. The only reason traditional techniques are failing is that they have been highly misused by spammers, and now search engines are becoming more and more strict, which is good for the guys who are in real business.

  • In traditional print advertising or other similar adverts, companies pay for the number of people who are supposed to see the advert, or at least that is what is supposed to happen. With search engine marketing, companies opting for pay-per-click programs actually pay the search engines each time a person clicks on their adverts. This system appears to be closer to a performance-based ad program, since companies only pay to get eyeballs, and hopefully shoppers, on their websites. However, there is a limitation to this idea: visitors who visit the website only to browse and not buy anything can dry up an advert’s budget. This has been a growing problem with internet marketing, and savvy search engine marketers are coming up with new ideas to lessen this effect and make businesses grow. To successfully attract customers to your site it is important to follow some basic rules which are simple and make a huge difference.

  • David frankk

    It’s amazing how Google has evolved while others have failed to hold on to the changing trends.
