Google’s “Editorial Input” from Personalized Search Could Kill Objectivity

According to an article in The Register, Google’s Vice President of Search Product and User Experience, Marissa Mayer, observed that editorial judgments may play a key role in Google searches in the near future. Mayer made her comments at the Le Web conference in Paris on December 10th. If this change does materialize, the editorial input would come from Google’s “personalized search” as represented by its new Search Wiki. It would turn the search industry upside-down, and would likely compromise Google’s search objectivity.

First, let’s examine what Search Wiki is. Search Wiki is a Google service that appears next to your search results if you are logged in. Let me emphasize this: you must be logged in to a Google account in order for this service to appear.

Search Wiki is represented by three little icons – an arrow pointing up, an “X”, and a comment symbol that looks like a dialog balloon from a cartoon. The arrow allows you to promote a search result to the top of a set of search results. The “X” allows you to remove that search result from future searches (presumably with that same set of search terms), and the balloon allows you to create a note and attach it to a given result.

Currently, the editorial decisions that you make are visible only to you when logged in to Google (“signed-in searches”), and they do not affect other users’ searches. Not yet, anyway.

The idea behind it is to maintain the state of your search. In other words, wouldn’t it be nice if you could pick up where you left off when your research is an ongoing effort spanning hours, days, or weeks? Say you started a search and thought that the second result was the best, the fifth was complete garbage, and the eighth led to a page with some partially relevant information that you wanted to remember. Search Wiki provides tools for all three purposes.

The problem is that Google is considering aggregating everybody’s personalized searches and using that information to help determine search rankings. This represents a major change in how Google ranks pages.

When there is a particularly strong movement in rankings, Google refers to it as a “signal.” The theory is that if a topic generates a strong signal, it deserves a higher ranking. Politics and holidays are two examples: every four years, topics such as “Republican” and “Democrat” send a stronger signal, while “Christmas” and “Halloween” spike annually. In this scheme, searches have seasons.

But when you feed that signal back into the underlying data (say, by skewing the search results, which in turn hides information from view), the signal amplifies itself, and the results become distorted.
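That amplification argument can be sketched in a toy simulation. To be clear, nothing below reflects Google’s actual ranking code: the function name, the 70% position-bias figure, and the click counts are all invented for illustration. The sketch assumes only that the top-ranked result attracts a disproportionate share of clicks, and that those clicks are fed straight back into the ranking signal.

```python
def simulate(scores, rounds=10, clicks_per_round=100, top_bias=0.7):
    """Toy feedback loop: each round, the currently top-ranked result
    attracts a disproportionate share of clicks (position bias), and
    those clicks are added straight back into the ranking scores."""
    scores = list(scores)
    for _ in range(rounds):
        top = scores.index(max(scores))
        for i in range(len(scores)):
            if i == top:
                share = top_bias  # top result grabs most of the clicks
            else:
                share = (1 - top_bias) / (len(scores) - 1)
            scores[i] += clicks_per_round * share
    total = sum(scores)
    return [s / total for s in scores]

# Two results that start in a near tie: 51% vs. 49% of the initial signal.
print(simulate([51, 49]))
```

After ten rounds the slim 51/49 lead balloons to roughly a 2:1 share, even though nothing about the underlying pages changed. That is the distortion: the feedback loop manufactures a consensus that the original data never contained.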

Consider a political topic, something that generates much more heat than light. Extreme points of view would trounce, and possibly eliminate, the reasoned middle of opinion, erasing or severely demoting its search results. It’s like expecting to find facts on a subject in the political spotlight and finding only what the rabid supporters of the DailyKos have decided is important. It’s a new form of Googlewashing: gaming the system to skew search results.

Google’s possible use of aggregated personalized search data represents not the democratization of search, but the polarization, mob mentality and statistical distortion of search. How long before a competitor rolls out “True-Search” or “Objective Search”?
