July 22, 2007

Search Engines and Favoritism

While most search engines say they return unbiased results, it's interesting to compare how well a company's services perform in that company's own search engine and in its competitors' search engines. I chose the top three search engines: Google, Yahoo and Microsoft's Windows Live Search, and 10 general queries directly related to products or services developed by all three companies.

It's interesting to notice that Google ranked its own sites as #1 in 7 cases out of 10, and Yahoo in 6 cases out of 10. Microsoft's services have poor rankings in most search engines, including Windows Live Search. Only a single Google service was the top result in Yahoo Search (Google News), and only one Yahoo service was #1 in Google Search (Yahoo Mail). You'll also find it surprising that for "desktop search", each company placed its own desktop search software as the top result. But then again, this is just an empirical test and everything might just be a coincidence.

The tables show the ranking for each product (for example, the second row of the next table shows how well Gmail, Yahoo Mail and Hotmail performed in Google Search for the query "mail"). You can use a site like yahoogooglemsn.com to compare the results in the same window.
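
To make it clear what "ranking" means in the tables below, here is a minimal Python sketch (not part of the original test) that finds the position of the first result hosted on each company's mail domain within a list of result URLs. The result_urls list is a hypothetical placeholder for the top results you would collect from one engine, by hand or with a tool of your choice; the domain choices are my own assumptions for the "mail" query.

```python
from urllib.parse import urlparse

# Domains assumed to identify each company's service for the "mail" query.
SERVICES = {
    "Google": "mail.google.com",
    "Yahoo": "mail.yahoo.com",
    "Microsoft": "hotmail.com",
}

def rank_of(domain, result_urls):
    """Return the 1-based position of the first result hosted on `domain`,
    or None if it does not appear in the list at all."""
    for position, url in enumerate(result_urls, start=1):
        host = urlparse(url).netloc.lower()
        if host == domain or host.endswith("." + domain):
            return position
    return None

# Hypothetical input: the top URLs returned by one engine for "mail",
# in the order the engine listed them.
result_urls = [
    "http://mail.yahoo.com/",
    "http://mail.google.com/mail/",
    "http://www.usps.com/",
]

for company, domain in SERVICES.items():
    print(company, rank_of(domain, result_urls))
```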

Mail                 | Google | Yahoo | Microsoft
Google Search        |      2 |     1 |        11
Yahoo Search         |      8 |     1 |         2
Windows Live Search  |      2 |     1 |        11


Calendar             | Google | Yahoo | Microsoft
Google Search        |      3 |     4 |        -*
Yahoo Search         |     15 |     2 |         -
Windows Live Search  |      3 |     6 |         -


Groups               | Google | Yahoo | Microsoft
Google Search        |      1 |     2 |         7
Yahoo Search         |      3 |     1 |         2
Windows Live Search  |      1 |     3 |         4


Toolbar              | Google | Yahoo | Microsoft
Google Search        |      1 |     3 |         5
Yahoo Search         |      2 |     1 |         6
Windows Live Search  |     26 |     1 |         3


Maps                 | Google | Yahoo | Microsoft
Google Search        |      1 |     4 |        10
Yahoo Search         |      3 |     1 |        13
Windows Live Search  |      3 |     4 |        17


Desktop search       | Google | Yahoo | Microsoft
Google Search        |      1 |     5 |         3
Yahoo Search         |      3 |     1 |         4
Windows Live Search  |      4 |     3 |         1


Image search         | Google | Yahoo | Microsoft
Google Search        |      1 |     3 |        10
Yahoo Search         |      2 |     6 |         9
Windows Live Search  |      1 |     3 |        28


Video search         | Google | Yahoo | Microsoft
Google Search        |      1 |     2 |        19
Yahoo Search         |      5 |     2 |         -
Windows Live Search  |      3 |     1 |         -


News search          | Google | Yahoo | Microsoft
Google Search        |      1 |     3 |         -
Yahoo Search         |      1 |     2 |         -
Windows Live Search  |      1 |     3 |         -


Search               | Google | Yahoo | Microsoft
Google Search        |      8 |     5 |         1
Yahoo Search         |      2 |     1 |        16
Windows Live Search  |      3 |     2 |         8

* no web page in the top 30 results
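
For what it's worth, the tallies quoted above (Google's own site at #1 for 7 of the 10 queries, Yahoo's for 6 of 10, Microsoft's for 1 of 10) can be rechecked with a short script. This is only a sketch that hardcodes the numbers from the tables; None stands for the dashes (no page in the top 30 results).

```python
# Rank of each company's service in each engine, copied from the tables above.
# Tuple order: (Google service, Yahoo service, Microsoft service).
QUERIES = {
    "mail":           {"Google Search": (2, 1, 11),   "Yahoo Search": (8, 1, 2),     "Windows Live Search": (2, 1, 11)},
    "calendar":       {"Google Search": (3, 4, None), "Yahoo Search": (15, 2, None), "Windows Live Search": (3, 6, None)},
    "groups":         {"Google Search": (1, 2, 7),    "Yahoo Search": (3, 1, 2),     "Windows Live Search": (1, 3, 4)},
    "toolbar":        {"Google Search": (1, 3, 5),    "Yahoo Search": (2, 1, 6),     "Windows Live Search": (26, 1, 3)},
    "maps":           {"Google Search": (1, 4, 10),   "Yahoo Search": (3, 1, 13),    "Windows Live Search": (3, 4, 17)},
    "desktop search": {"Google Search": (1, 5, 3),    "Yahoo Search": (3, 1, 4),     "Windows Live Search": (4, 3, 1)},
    "image search":   {"Google Search": (1, 3, 10),   "Yahoo Search": (2, 6, 9),     "Windows Live Search": (1, 3, 28)},
    "video search":   {"Google Search": (1, 2, 19),   "Yahoo Search": (5, 2, None),  "Windows Live Search": (3, 1, None)},
    "news search":    {"Google Search": (1, 3, None), "Yahoo Search": (1, 2, None),  "Windows Live Search": (1, 3, None)},
    "search":         {"Google Search": (8, 5, 1),    "Yahoo Search": (2, 1, 16),    "Windows Live Search": (3, 2, 8)},
}

# Which tuple position belongs to the engine's own company.
OWN_COLUMN = {"Google Search": 0, "Yahoo Search": 1, "Windows Live Search": 2}

for engine, column in OWN_COLUMN.items():
    own_first = sum(1 for ranks in QUERIES.values() if ranks[engine][column] == 1)
    print(f"{engine}: own service ranked #1 for {own_first} of {len(QUERIES)} queries")
```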

19 comments:

  1. Uhh.. Why is Google Search labelled twice for each one? Mislabeled rows?

  2. I'm not so quick to think it's all bias; there's also context. If I'm searching with Yahoo, I probably also want Yahoo's toolbar, not MSN's. I'm sure that people who use a company's search are more likely to use their other online services because there's a level of integration.

    Another innocent possibility is that the people who manage a service are better at taking advantage of the search algorithms of their company's search engine. E.g., I bet the people at Google Video know more about getting a good Google rank than a good Yahoo rank. Maybe the search departments are giving others in the company some special tricks that the public isn't privy to.

  3. I think all three companies publicly state that they do not modify their algorithms to bias themselves, though I wouldn't be surprised if they do anyway.

    I suppose each site being better SEO'd for its respective search engine could account for some 'bias'. However, I can't imagine the three engines are sufficiently different that optimizing for one hurts the ranks in the others.

  4. They all use different algorithms for ranking, and they are quite proprietary about them. So it is quite likely that these results are not devious: they actually do rank themselves higher than the others. And ephili is on the right track that they design their sites using some inside knowledge of their own ranking algorithms, probably knowledge they just pick up around the water cooler.

  5. If you're talking about the algorithmic results, I can affirm that Google does not give any kind of boost to Google's web pages.

    Part of the differences may be how the companies choose to name and brand their products. For example, MSFT went with Hotmail rather than just mail, and they're #1 for "hotmail." Yahoo chose the name "local" for one of their services, and local.yahoo.com ranks #1 in Google for "local." Google went with "Gmail" while Yahoo went with "Yahoo Mail," so the #1 result for the search "mail" on Google is mail.yahoo.com.

    I can't speak to other search engines' results, but I can say that Google doesn't give any kind of boost to Google's web pages in our algorithmic rankings.

  6. As they designed their own ranking algorithms, I think they can optimize their pages to rank high in their own search engines.

  7. Shouldn't the "mail" query be changed to "email"? That would better reflect what you are actually talking about. And this query actually provides the results one would expect (at least on Google).

  8. > I can't speak to other search
    > engines' results, but I can say
    > that Google doesn't give any kind
    > of boost to Google's web pages
    > in our algorithmic rankings.

    The algorithms are human-made, and if I'm not mistaken they're influenced by quality reviews done by Google's result raters (because if a new algorithm returns bad ratings, it won't go live, will it?). So the question is: in what ways does Google Inc ensure those result raters are neutral? (E.g. I know of some raters who are friends with Google employees.)

  9. Philipp, do you remember in 2005 when a 20+ page training guide from Google leaked? It talked about how to rate the relevance of results, and it certainly didn't say anything like "Oh yeah, and make sure to rate Google results well." ;)

  10. You bring up a good point, although it's probably reasonable to assume that each search engine company knows how to optimize its services for the best ranking in its own algorithms. If you can't optimize your own code to rank well in algorithms that you created, it doesn't say too much for you!

  11. I think Matt held back a bit in his comment. Not only does Google appear not to boost its own sites, but Google appears to respond well to keyword optimization.

    Instead of "mail" you should have searched for "free email" (a more realistic user query where the user is looking for Gmail, Yahoo Mail, etc.). Yahoo Mail ranks #1 everywhere (check its promotional pages), while Gmail doesn't even keyword-promote the "free" aspect except for free storage. Look for "free online storage" and see Xdrive, Yahoo Briefcase, and other well-SEO'd or persevering online storage sites.

    It seems you picked the wrong SERPs to critique Google... she's clean here yo.

  12. Well, I believe they're quite biased toward their own services in an unobvious way.

  13. Frankly, from these it looks like Google and MSN agree on most results, so they seem more likely to be correct. Yahoo frequently contradicts both, so it could be suggested that it is slightly biased. Also, Google gets the top result for almost all of its products by placing them as an ad at the top, in the same place where ads from external companies go.

  14. > Philipp, do remember in 2005 when
    > a 20+ page training guide from
    > Google leaked? It talked about how
    > to rate the relevance of results,
    > and it certainly didn't say
    > anything like "Oh yeah, and make
    > sure to rate Google results well." ;)

    Yes, that was the document that broke the news! So how does Google select raters? It's not enough that the training manual doesn't actively ask for skewed results -- skewed results most often happen due to a non-representative user base (like the Alexa stats, for instance).

  15. I don't think the raters influence the order of results. They find problematic queries that need some algorithm tweaks.

  16. > I don't think the raters
    > influence the order of
    > results. They find problematic
    > queries that need some
    > algorithm tweaks.

    No, AFAIK they can also give feedback on the overall quality of algo A vs algo B etc. So if Google needs to pick either one -- they're testing a new, tweaked rankings approach -- a rater group preferring X would create rankings preferring X. That's why, for us to judge this, it would be important to hear how raters are selected, and I haven't heard Matt comment on this so far. *If* it turns out that raters are more likely than average to be friends or family of Google employees, then yeah, there's a potentially pretty huge skew.

    But then again, this skew may not be *bad* per se. Just imagine it's Google's philosophy that "a relevant site should have X". This philosophy can have two effects:
    1. Google creates their own sites making sure they have "X"
    2. Google creates rankings hoping to push up results that have "X"

    It's easy to see how, when you put 1 and 2 together, Google ends up ranking its own sites better (I don't know if that happens, by the way; I think we need a larger sample, using more words) even though they wouldn't see the bias... and if they did see it, they wouldn't think it's bad, and indeed it may not be.
    And Google's results of course are biased in an even simpler way: the algos are created by those working at Google. So the "a relevant site should have X" factor can come into play even when raters are picked to represent a random sample of the population (which we don't know yet).

  17. Philipp, with the large number of raters (and, if I remember correctly, there was some type of job posting with certain requirements for raters), it is most likely that there is a mixed representation. I highly doubt that most of the 10,000 would be friends and relatives of Google employees.

    Otherwise it's true that if Google is setting the quality criteria, it may be intrinsically biased toward its own properties, since they would follow the same quality guidelines. But judging whether or not Google shows favoritism by that argument would be an unfair metric. Should they not have quality guidelines? Should they not follow their own guidelines? If one of their pages is below quality and still ranks high, then that's a different story.

  18. The notion of 'bias' is quite subtle, and all the commenters are right in some way.

    I assume there's no isGoogleProperty variable in Google's ranking algorithm, so in that sense there's no deliberate bias at Google.

    But bias can creep in from many sources. Google's 10,000 raters may represent a fairly broad population, but I'm still sure they have different biases than Microsoft's 10,000 raters. Unless Google and Microsoft have decided to randomly sample from the same pool of qualified candidates, differences between the two groups' biases are unavoidable.

    Or take away the raters: let's say the search engines have ranking algorithms that learn from users' clickthroughs. It would make sense that people searching for 'mail' on Yahoo tend to click on Yahoo Mail, and so its algorithm boosts the ranking of that result. Is this bias bad? It's just statistical learning of the desires of the different user bases.

    And there are other kinds of biases too. I've written a recent blog post on this: see "Content creators and consumers have different biases".

