Regulating the Information Gatekeepers
Concerns about biased manipulation of search results may require intervention involving government regulation.
By Patrick Vogl and Michael Barrett
In 2003, 2bigfeet, an Internet business specializing in the sale of oversize shoes, ranked among the top results in Google searches for its products. Its prime location on the virtual equivalent of New York's high-end shopping mecca, Fifth Avenue, brought a steady stream of clicks and revenue. But success was fleeting: that November, Google's engineers modified their search engine's algorithms, an update later dubbed "Florida" by the search-engine community. 2bigfeet's rankings dropped abruptly just before the Christmas selling season, and this Internet success story was suddenly on the brink of bankruptcy.2

Search engines have established themselves as critical gatekeepers of information. Yet despite an increasingly monopolistic Internet search market, they and the implicit filtering process embodied in their rankings remain largely beyond public scrutiny and control. This has inspired us to explore an increasingly topical question: Should search-engine ranking be regulated?

Search engines generally work by indexing the Web through so-called crawler programs. When a user types in a request, search algorithms determine the most relevant results in the index. Although the precise workings of these algorithms are kept at least as secret as Coca-Cola's formula, they are usually based on two main functions: keyword analysis (evaluating pages along such dimensions as the frequency of specific words) and link analysis (based on the number of times a page is linked to from other sites and the rank of those other sites) (see Figure 1).
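The two functions can be illustrated with a toy ranker. The sketch below is purely hypothetical: the miniature page index, the scoring weights, and the simplified PageRank-style link analysis are our own illustrative assumptions, not a description of any real engine's algorithm, whose actual workings, as noted, remain secret.

```python
# Illustrative sketch only. The pages, words, links, and equal weighting
# below are hypothetical; real search engines combine many more signals.

# A tiny "crawled" index: page -> words on the page and outgoing links.
PAGES = {
    "shoes-a": {"words": ["oversize", "shoes", "sale", "shoes"], "links": ["shoes-b"]},
    "shoes-b": {"words": ["shoes", "reviews"], "links": ["shoes-a", "news-c"]},
    "news-c":  {"words": ["news", "today"], "links": ["shoes-a"]},
}

def keyword_score(page, query_terms):
    """Keyword analysis: frequency of query terms among the page's words."""
    words = PAGES[page]["words"]
    return sum(words.count(t) for t in query_terms) / len(words)

def link_scores(damping=0.85, iterations=50):
    """Link analysis: a simplified PageRank over the toy link graph."""
    n = len(PAGES)
    ranks = {p: 1.0 / n for p in PAGES}
    for _ in range(iterations):
        new = {}
        for p in PAGES:
            # A page's rank flows from the pages linking to it,
            # divided by how many links each of those pages casts.
            incoming = sum(
                ranks[q] / len(PAGES[q]["links"])
                for q in PAGES if p in PAGES[q]["links"]
            )
            new[p] = (1 - damping) / n + damping * incoming
        ranks = new
    return ranks

def rank(query):
    """Combine both signals (equal weights here, purely for illustration)."""
    terms = query.lower().split()
    links = link_scores()
    scored = {p: keyword_score(p, terms) + links[p] for p in PAGES}
    return sorted(scored, key=scored.get, reverse=True)

print(rank("oversize shoes"))  # "shoes-a" ranks first in this toy index
```

Even in this caricature, the 2bigfeet episode is easy to reproduce: a small change to the weighting or the link graph can abruptly reorder the results, which is precisely what makes opaque algorithm updates so consequential for businesses downstream.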
With search engines guiding access to critical Web-based information flows, many users are increasingly concerned about possible targeted manipulation of search results. With markets and