Here's the official word from market leader Google:
As Larry said long ago, we want to give you back “exactly what you want.” When Google was founded, one key innovation was PageRank, a technology that determined the “importance” of a webpage by looking at what other pages link to it, as well as other data. Today we use more than 200 signals, including PageRank, to order websites, and we update these algorithms on a weekly basis. For example, we offer personalized search results based on your web history and location.

As for rival Bing, one need look no further than their aggressive marketing campaign to see similar logic at work. Search is no longer a functional lookup tool a la grep; it is a personalized experience where piles of clever algorithms sift through data, not just about the internet but about the searcher, in order to deliver custom results.
You may not see anything wrong with this right away - indeed, it is statistically designed to work in the vast majority of cases. The personalization, after all, isn't truly personal, but rather an attempt to classify people into buckets and show them things that are known to interest most people in that bucket. Searches will suggest and complete in the way that most people want - for example, the next word after "music" is apparently most often "videos."
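The bucketing behavior described above can be sketched in a few lines. This is only an illustration of the idea, not how any real search engine works; the bucket names, click log, and `suggest` function are all hypothetical:

```python
from collections import Counter

# Hypothetical click log: (user_bucket, query, clicked_completion).
# A real system would aggregate billions of these.
CLICK_LOG = [
    ("teen", "music", "music videos"),
    ("teen", "music", "music videos"),
    ("teen", "music", "music lyrics"),
    ("adult", "music", "music theory"),
]

def suggest(bucket, query):
    """Complete a query with whatever most users in the same
    demographic bucket clicked, ignoring the individual entirely."""
    counts = Counter(clicked for b, q, clicked in CLICK_LOG
                     if b == bucket and q == query)
    if not counts:
        return query  # no data for this bucket: fall back to the raw query
    return counts.most_common(1)[0][0]

print(suggest("teen", "music"))   # the bucket's majority choice
print(suggest("adult", "music"))  # a different bucket, different answer
```

The point of the sketch is that nothing about *you* appears anywhere in `suggest` - only which bucket you landed in, which is exactly why it irritates anyone whose preferences diverge from their bucket's majority.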
I often find this behavior irritating, as I suppose my true preferences are still quite a bit different from those of the folks who may be in my "bucket." But it turns out that the dangers of personalization go far beyond the occasional poor search result. Eli Pariser gave a compelling 10-minute talk at TED 2011 in which he argues that excessive personalization results in "filter bubbles," where we end up consuming only "information junk food."
Silly labels aside, the video is well worth the watch, and the argument seems to be a sound one. By always showing the most "relevant" (i.e. most likely to be clicked or consumed) results to a certain demographic, we can end up systematically ignoring whole topics that are often of political or cultural importance. I don't think things are that awful quite yet, but they are getting there.
For a non-political example, think about searching for the word "weather." It used to be that this would give you encyclopedic information about weather - links to NOAA, Wikipedia, and other well-linked resources on the topic. If you wanted to find the current weather conditions in your area then you would add a city name or zip code.
Now if you search for "weather," search engines will assume that you want the latter case - current weather conditions in your area. You'll get the immediate conditions, as well as numerous results for other weather services covering your area. Most users are satisfied most of the time.
But think of a 5th grader who's trying to write a paper about weather. It may seem like a contrived example, but the point is that "general research" and "canonical results" no longer exist. Search engines (and other websites, such as social networks) make assumptions and tailor results based on them, and generally speaking the user has no recourse if the assumptions are in error.
Eli's talk focuses on political ramifications, and ends with the fanciful notion that algorithms must be programmed to be concerned with "ethics" and "importance" in addition to relevance. Speaking as a computer programmer, I find this rather silly - computers can have no more "ethics" than their programmer, and attempts to codify such things will inevitably have biases and oversights.
I would much rather have the simpler solution of allowing users to disable all the extra signals at their discretion. If I want to search for something and I don't care about where I am or what else I've searched for recently or anything else, I should have that option. If somebody halfway across the world does the same thing for the same search terms, they should get the same results as me.
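That "same query, same results, for anyone, anywhere" property is exactly how grep behaves, and it can be made concrete with a toy ranker. This is a deliberately naive sketch (the document list and `pure_search` function are invented for illustration), where term frequency is the only signal:

```python
def pure_search(query, documents):
    """Return documents containing every query term, ranked only by
    term frequency - no location, no history, no other signals."""
    terms = query.lower().split()
    hits = []
    for doc in documents:
        text = doc.lower()
        if all(t in text for t in terms):
            hits.append((sum(text.count(t) for t in terms), doc))
    # Deterministic: identical queries yield identical rankings,
    # whoever runs them and wherever they are.
    return [doc for score, doc in sorted(hits, reverse=True)]

docs = ["Weather is the state of the atmosphere",
        "Buy pizza near you",
        "NOAA publishes weather data and weather forecasts"]
print(pure_search("weather", docs))
```

A garbage query into `pure_search` gives garbage results - there is no helpful second-guessing - which is precisely the trade-off being asked for.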
And this takes me to the title of this entry. While this is admittedly a workaround, most personalization and other signals currently seem not to be as heavily enabled on some of the more exotic localizations of Google (and perhaps other sites, though I haven't checked). Of particular note is www.google.as - Google American Samoa.
The site is still in English, but if you search for "weather" you'll find NOAA and Wikipedia in the top 5 results (Wikipedia doesn't show up until page 3 for me on regular Google). "Pizza" similarly tells you about pizza, rather than assuming that you really want to buy one near you. I haven't tested politically oriented queries, but I imagine they should fare better as well (as long as you're logged out or incognito).
And so I at least have a temporary search tool for these use cases, and I hope others find it helpful as well. I imagine this workaround will go away in time, and I can only hope that search providers allow users to opt out of any and all signals, giving a "pure" search where the only signal is the query. That means if you screw up your search you'll get screwy results, same as grep - and in some cases, that's a really good thing.