Studies show that 93% of online experiences start with a search engine query, which adds up to more than 100 billion global searches every month. There are roughly 4.5 billion web pages in existence, and when an Internet user is looking for one particular result, a search engine like Google will use a complex algorithm to scan those billions of pages.
Websites, search engines, and the algorithms that link the two are all created by people, and unfortunately, they can inherit their creators’ cultural biases. New evidence shows that search engines tend to reinforce social stereotypes.
In one example, an MBA student named Bonnie Kamona reported that upon searching for “unprofessional hair for work,” Google produced a set of images that was almost exclusively made up of women of color. Her search for “professional hair” provided images of nearly all white women.
Another Internet user shared that her image search for “three black teenagers” revealed countless mug shots, while a search for “three white teenagers” resulted in pictures of young white people smiling and enjoying themselves.
These two experiences are consistent with professional research on search engine bias. In one such study, researchers conducted Google image searches for specific professions and compared the gender distributions in the pictures with those documented in U.S. labor statistics.
For example, more men than women were shown in image results depicting “doctors.” The ratio of male to female doctors in real life was clearly not represented in the Google search results. The same inaccuracy could be found in a search for “nurses.” Put simply, the researchers discovered that image search engines were exaggerating gender stereotypes.
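The comparison the researchers made can be sketched in a few lines of code. This is an illustrative sketch only: the function name and every number below are made-up placeholders, not figures from the study, from Google, or from U.S. labor statistics.

```python
# Compare the share of women in image search results with the share of
# women in the actual workforce, per profession (all numbers hypothetical).

def representation_gap(results_pct_women, labor_pct_women):
    """Gap, in percentage points, between the share of women shown in
    image results and the share of women in the real workforce."""
    return results_pct_women - labor_pct_women

# Hypothetical placeholder data (assumption, not real statistics):
professions = {
    "doctor": {"results_pct_women": 30.0, "labor_pct_women": 40.0},
    "nurse":  {"results_pct_women": 95.0, "labor_pct_women": 88.0},
}

for job, pct in professions.items():
    gap = representation_gap(pct["results_pct_women"], pct["labor_pct_women"])
    direction = "under" if gap < 0 else "over"
    print(f"{job}: women {direction}-represented by {abs(gap):.1f} points")
```

A negative gap means the results show fewer women than actually work in the field; a positive gap means they show more. Either direction, when it consistently tracks the stereotype rather than the statistics, is the exaggeration the researchers measured.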
Because a search engine is powered by an algorithm rather than a free-thinking human being, establishing a solution isn’t as simple as telling Google to present fewer “stereotypical” results. The problem is more deeply rooted than that.
Jahna Otterbacher, an assistant professor of social information systems at the Open University in Nicosia, Cyprus, suggested that search engine developers must begin to “think critically about each stage of the engineering process and how and why biases could enter them.”
Otterbacher also suggested that developers create automated bias detection methods in order to determine if bias is present in a set of results.
“Like us, algorithms can be biased,” she said. “The reasons can be complicated – bad or biased training data, reliance on proxies, biased user feedback – and the ways to address them may look little like the ways we address the biases in ourselves.”
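An automated check of the kind Otterbacher describes might, in its simplest form, compare the demographic makeup of a result set against a known baseline and flag large deviations. The sketch below is a minimal illustration under stated assumptions: it presumes each result has already been labeled (by human raters or a classifier), and the function name, baseline, and tolerance are hypothetical, not part of any real detection system.

```python
# Minimal bias-detection sketch: flag a result set whose observed share
# of a demographic group deviates from a baseline by more than a
# tolerance. All names and numbers here are illustrative assumptions.
from collections import Counter

def flag_bias(labels, group, baseline_share, tolerance=0.10):
    """Return True if the share of `group` among the labeled results
    differs from `baseline_share` by more than `tolerance` (absolute)."""
    counts = Counter(labels)
    observed = counts[group] / len(labels)
    return abs(observed - baseline_share) > tolerance

# Hypothetical result set: 9 of 10 images labeled "men", against a
# baseline in which 60% of the profession are men.
labels = ["men"] * 9 + ["women"] * 1
print(flag_bias(labels, "men", baseline_share=0.60))  # True: 0.90 vs 0.60
```

A production system would need far more care, such as statistical tests instead of a fixed threshold and reliable labeling, but even this simple shape shows how a deviation from ground truth can be surfaced automatically rather than discovered one viral screenshot at a time.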