26 May 2016

How does the Google Image search algorithm work – and is it racist?

Google Images, introduced in July 2001, is a search service owned by Google that allows users to search the Web for image content. Prior to this addition to Google’s repertoire, Google Search results were limited to simple pages of text with links.

But after Jennifer Lopez broke the internet at the 42nd Grammy Awards ceremony in 2000, according to Eric Schmidt, Google’s former CEO, developers realised that they needed to offer Google users more than bland text. Pictures of Lopez in her green Versace dress with a plunging neckline (if we can call a plunge to her navel a neckline) were highly sought after by Google users, becoming the most popular search query Google had seen at the time.

Google uses a search algorithm to sort results from a vast number of websites and deliver the most relevant ones to the end user. For a typical query there are thousands, if not millions, of webpages with helpful information. Algorithms are the processes and formulas that take a search query and turn it into an answer.

Today, Google’s algorithms rely on more than 200 unique signals or “clues” that make it possible to guess what you might really be looking for. These signals include things like the terms on websites, the freshness of content, your region and PageRank. The algorithm gathers as many clues as it can to better understand what the user is looking for. It is at work when, for example, the search box starts finishing the query you are typing with suggestions.
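PageRank, one of the signals named above, is at least publicly documented: a page counts for more when other well-regarded pages link to it. The Python sketch below is only an illustration of that idea, using a made-up three-page link graph and a commonly cited damping factor of 0.85; it is nothing like Google’s production code, and says nothing about the other 200-odd signals.

```python
# Toy PageRank sketch: a page's score depends on the scores of pages linking to it.
# The link graph and iteration count here are invented for illustration only.
links = {
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "c.com": ["a.com"],
}
damping = 0.85
rank = {page: 1.0 / len(links) for page in links}  # start with equal scores

for _ in range(50):  # repeat until the scores settle
    new_rank = {page: (1 - damping) / len(links) for page in links}
    for page, outgoing in links.items():
        share = damping * rank[page] / len(outgoing)  # split score among linked pages
        for target in outgoing:
            new_rank[target] += share
    rank = new_rank

# Pages with more (and better-ranked) incoming links float to the top.
print(sorted(rank.items(), key=lambda kv: kv[1], reverse=True))
```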

Google Images is now an integral tool for internet-based work and play. And all with the simple act of producing a gallery of images based on words or phrases and suggesting what we may be looking for.

But, there’s a catch. Because Google Images figures out who or what is shown in an image by judging the text and captions that frame it, certain biases may arise in the search results. 
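To see why that surrounding text matters so much, here is a toy, hypothetical illustration in Python. The page data is invented and this is nothing like Google’s real pipeline; the point is simply that an image can only be found under the words that captions and nearby text attach to it.

```python
# Toy illustration: index each image under the words that appear near it,
# then answer a query by intersecting those word-to-image lists.
from collections import defaultdict

# Hypothetical pages: (image URL, caption / nearby text)
pages = [
    ("img1.jpg", "beautiful woman in a red dress at the beach"),
    ("img2.jpg", "unprofessional hairstyles for women at work"),
    ("img3.jpg", "the most beautiful woman of the year"),
]

index = defaultdict(set)
for image, text in pages:
    for word in text.lower().split():
        index[word].add(image)

def search(query):
    """Return images whose surrounding text contains every query word."""
    terms = query.lower().split()
    results = index[terms[0]].copy()
    for term in terms[1:]:
        results &= index[term]
    return results

print(search("beautiful woman"))  # {'img1.jpg', 'img3.jpg'} (in some order)
```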

Try searching “Beautiful woman” or “Man” on Google Search Images. Then search “Unprofessional hairstyles for women”. The search term “Beautiful woman” serves up many images of women, but with little by way of variety – the majority of them are white. Similarly, “Man” retrieves results of white men of varying ages while “unprofessional hairstyles for women” churns out pictures of black women.  

Why does this happen? 

On a rudimentary level, Google’s image search algorithm starts as an idea in an engineer’s mind. If the conversations around beautiful women online are primarily centred on white women, then that is what a search will return.
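A crude way to picture this echo effect, with an entirely made-up distribution of captions, is the toy count below: a frequency-based ranker can only surface what is most common in the content it crawls.

```python
from collections import Counter

# Hypothetical toy corpus: the demographic of the person in each image whose
# surrounding text contains "beautiful woman". The numbers are invented.
caption_tags = ["white"] * 80 + ["black"] * 10 + ["asian"] * 6 + ["latina"] * 4

# Ranking by sheer frequency reproduces whatever skew the corpus already has.
print(Counter(caption_tags).most_common())
# [('white', 80), ('black', 10), ('asian', 6), ('latina', 4)]
```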

Ultimately, what these results show is an echo of our online conversations. Of course the algorithm was not developed with the intention of being racist, sexist, ableist or transphobic. The algorithm does what it’s designed to do: reflect the content that’s available.  

The internet is limited, biased and contradictory; it is not fully accessible or readily available to all of its users, developers and designers. We should all know by now that the internet is not a neutral digital Jeeves ushering in an egalitarian utopia. White supremacy, racism, dudebro-sexism and worse exist in the dark (and even not so dark) recesses of the internet.

While Google does have some control over the algorithm that produces such biased results, the search engine has little to no control over the netizens who produce biased content on their websites, blogs or forums.

In light of this, re-framing the question “Is Google racist?” is necessary. Instead, we should ask what Google and other organisations are doing about it. Do Google employees represent the diversity of ideas, cultures and backgrounds sought in its search results? And then we must ask: do we represent this diversity in the content we put online?  

This will help us find answers to how we, and by extension Google’s search results, can unlearn the prejudices and biases that have caused simple image searches to perpetuate and amplify the bigotry of those who use them.