Regulating the Information Gatekeepers
Concerns about biased manipulation of search results may require intervention involving government regulation. By Patrick Vogl and Michael Barrett
In 2003, 2bigfeet, an Internet business specializing in the sale of oversize shoes, ranked among the top results in Google searches for its products. Its prime location on the virtual equivalent of New York's high-end shopping mecca Fifth Avenue brought a steady stream of clicks and revenue. But success was fleeting: That November, Google's engineers modified their search engine's algorithms, an update later dubbed "Florida" by the search-engine community. 2bigfeet's rankings dropped abruptly just before the Christmas selling season, and this Internet success story was suddenly on the brink of bankruptcy.2

Search engines have established themselves as critical gatekeepers of information. However, despite an increasingly monopolistic Internet search market, they and the implicit filtering process in their rankings remain largely beyond public scrutiny and control. This has inspired us to explore an increasingly topical question: Should search-engine ranking be regulated?

Search engines generally work by indexing the Web through so-called crawler programs. When a user types in a request, search algorithms determine the most relevant results in the index. Although the precise workings of these algorithms are kept at least as secret as Coca-Cola's formula, they are usually based on two main functions: keyword analysis (for evaluating pages along such dimensions as frequency of specific words) and link analysis (based on the number of times a page is linked to from other sites and the rank of these other sites) (see Figure 1).
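The two-part ranking scheme just described, keyword analysis combined with link analysis, can be sketched in a few lines of Python. This is a deliberately simplified illustration, not the actual algorithm of Google or any other search engine; the page data, function names, and the way the two scores are combined are all invented for the example.

```python
def keyword_score(page_text, query_terms):
    """Keyword analysis (toy version): score a page by how often
    the query terms appear relative to its total word count."""
    words = page_text.lower().split()
    if not words:
        return 0.0
    hits = sum(words.count(t.lower()) for t in query_terms)
    return hits / len(words)

def link_scores(links, damping=0.85, iterations=50):
    """Link analysis (PageRank-style): a page's score depends on how
    many pages link to it and on those linking pages' own scores."""
    pages = set(links)
    for targets in links.values():
        pages.update(targets)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for src, targets in links.items():
            if not targets:
                continue  # dangling page: distributes nothing (toy model)
            share = damping * rank[src] / len(targets)
            for t in targets:
                new_rank[t] += share
        rank = new_rank
    return rank

def search(pages, links, query_terms):
    """Combine the two analyses into a single ranking (the weighting
    here is arbitrary; real engines tune many such signals)."""
    lr = link_scores(links)
    scored = {p: keyword_score(text, query_terms) * (1.0 + lr.get(p, 0.0))
              for p, text in pages.items()}
    return sorted(scored, key=scored.get, reverse=True)

pages = {
    "2bigfeet.com": "oversize shoes big feet shoes",
    "shoeblog.com": "review of oversize shoes",
    "irrelevant.com": "news about cars",
}
links = {
    "shoeblog.com": ["2bigfeet.com"],
    "2bigfeet.com": [],
    "irrelevant.com": [],
}
print(search(pages, links, ["oversize", "shoes"]))
# → ['2bigfeet.com', 'shoeblog.com', 'irrelevant.com']
```

Even in this toy model, a small change to the combination of signals can reorder the results dramatically, which hints at why an algorithm update such as "Florida" could upend a site's ranking overnight.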
With search engines guiding access to critical Web-based information flows, many users are increasingly concerned about possible targeted manipulation of search results. With markets and technology both unlikely to ensure unbiased results, regulation may be the only alternative. One promising way forward is clearer guidelines for search-engine optimization through self-regulation.
November 2010 | Vol. 53 | No. 11 | Communications of the ACM
Appreciating the value of top rankings, Webmasters have learned to optimize their pages so big search engines rank them more highly. This has spawned a worldwide industry of search-engine-optimization consultants, whose techniques are grouped into two categories: white-hat techniques, which ensure that search engines can easily analyze a site and are accepted by the search engines themselves; and black-hat techniques, such as hidden text (white text on a white background), which are considered illicit by most search engines and, upon discovery, are generally punished with degraded ranking.

Search engines clearly have a legitimate interest in fighting inappropriate third-party optimization techniques to ensure their search-result quality; for instance, sites with no purpose other than linking to specific sites to increase their rank (link farms) are black-hat and must be dealt with accordingly. Punishment can be problematic for multiple reasons, however. First, sudden ranking demotion and the resulting diminished inflow of visitors have major effects on businesses, as illustrated by the cases of Skyfacet (which reportedly lost $500,000 of revenue in 2006) and MySolitaire (which reportedly lost $250,000 the same year14). Only a few cases of dropped rankings received notable media attention, among them the lawsuits over page rankings brought by the companies SearchKing18 and Kinderstart11 and the case of German car manufacturer BMW. Many more have been condemned to virtual silence, among them search-engine optimizer Bigmouthmedia. Second, though market leader Google has published guidelines on creating "Google-friendly" pages, the line between permitted and illicit practices is blurry at best.20...

[Figure 1. How ranking works.]
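The hidden-text trick mentioned above, white text on a white background, can be illustrated with a toy detector. The regular expression and function below are hypothetical constructions for this example; real search engines rely on far more sophisticated, undisclosed signals, and this sketch only catches the simplest case of matching colors in a single inline style.

```python
import re

# Toy heuristic for one classic black-hat trick: inline styles where the
# text color equals the background color. This is an illustration only;
# it misses external stylesheets, reversed property order, near-matching
# colors, off-screen positioning, and every other real-world variant.
HIDDEN_STYLE = re.compile(
    r'color\s*:\s*(#?\w+)[^"]*background(?:-color)?\s*:\s*(#?\w+)',
    re.IGNORECASE)

def has_hidden_text(html):
    """Flag markup whose inline style sets text and background
    to the same color, e.g. white text on a white background."""
    for fg, bg in HIDDEN_STYLE.findall(html):
        if fg.lower() == bg.lower():
            return True
    return False

print(has_hidden_text(
    '<p style="color:#fff; background-color:#fff">buy shoes</p>'))
# → True
print(has_hidden_text(
    '<p style="color:black; background:white">hello</p>'))
# → False
```

Even this trivial heuristic shows why the line between permitted and illicit optimization is hard to draw mechanically: legitimate designs can trip such rules, and determined optimizers can route around them.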