Technology is not racist, but it still needs to change

William has wide, expressive eyes and a kind face. He’s 56 years old but his smooth skin and soft afro make him look 10 years younger. He lives in Los Angeles and preaches kindness to the local kids. And a few weeks ago a computer algorithm decided that he was an “animal” and an “ape”.

The algorithm in question was introduced by Flickr, a photo sharing site, at the beginning of May. It automatically scans each image uploaded to the site and categorises or “tags” it according to objects it recognises.

When it works properly it is like magic. A picture of your cat lying on your couch is automatically tagged “cat”, and the same applies to sunsets, bridges, buildings and countless other things.

But Yahoo, which owns Flickr, did not anticipate that its new algorithm might confuse species and do so in such an offensive way. William, whose picture was uploaded to the site in 2014, is the most prominent example discovered, but the same algorithm also tagged a picture of a white woman as “animal” and “ape”.

It’s important to note that this algorithm is not overtly racist. No closet bigot at Yahoo secretly programmed it to mislabel photos. Instead it uses a technique called “machine learning” to scan through the billions of photos in Flickr’s archives and literally learn to recognise the contents of images through sheer repetition. 

Only a few people bother to tag each image they upload to the site, but the algorithm can use those tags that do exist to extrapolate to all of the other images with similar content. 
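The extrapolation idea can be sketched very loosely in code. Everything here is made up for illustration: the feature vectors are toy numbers, and a nearest-neighbour lookup stands in for the far more sophisticated machine learning Flickr actually uses. The point is only the mechanism, that an untagged photo inherits the tag of the tagged photo it most resembles.

```python
import math

# Hypothetical example: a few user-tagged photos, each reduced to a
# tiny feature vector (real systems use thousands of learned features).
tagged_photos = {
    (0.9, 0.1, 0.2): "cat",
    (0.1, 0.8, 0.9): "sunset",
    (0.2, 0.9, 0.1): "bridge",
}

def predict_tag(features):
    """Copy the tag of the most similar tagged photo onto a new one."""
    nearest = min(tagged_photos, key=lambda v: math.dist(v, features))
    return tagged_photos[nearest]

# An untagged photo whose features sit close to the "cat" example
# gets tagged "cat", whether or not that is actually correct.
print(predict_tag((0.85, 0.15, 0.25)))
```

This is also where the offensive mistakes come from: the system has no idea what its tags mean, so two things that merely look alike in feature space get the same label.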

In that context it can be forgiven for misidentifying a few humans as apes, since we are part of the same family of primates, the Hominidae. Just like other hominids, we have large heads with forward-facing eyes and protruding noses positioned above our mouths.

Given how relatively crude our machine learning tools still are, these kinds of anomalies were bound to crop up. Computers are still far too limited and stupid to understand how offensive these kinds of mistakes can be. The same algorithm blithely tagged pictures of Dachau, the Nazi concentration camp, with “jungle gym” and “sport”.

This is not the first time a piece of technology has been accused of inherent bias. In 2009 a pair of co-workers, one black and one white, uploaded a video to YouTube using a new HP laptop. The laptop boasted a camera with “face tracking” technology, intended to keep it focused on your head as you moved around in front of your computer. 

The camera could easily track the face of the white person, but when the black person entered the frame, it immediately stopped tracking and fell back to a neutral position. The evidence is compelling – there’s no mistaking the camera’s total inability to cope with darker skin. 

But skin colour isn’t the only racial characteristic that algorithms have trouble parsing. In 2010 a Taiwanese-American blogger found that her new digital camera kept warning that someone in her family photos had blinked. She quickly realised that the camera was misinterpreting the epicanthic folds of their eyes as blinking.

No one is seriously accusing any of these companies of overt bigotry. There’s no coven of white male geeks with a racist axe to grind. But in many ways this is a product of something much worse: obliviousness. 

At the moment the vast majority of consumer technology is designed by white men, usually from privileged backgrounds. Asian men also feature quite strongly in the sector, with white women coming in a distant third. 

When these white men test their products they do so for their main markets – the people most likely to buy their products. That means affluent North Americans and Europeans. The majority of those people are also white. The net result is technology designed by, tested by and marketed by white people.

In most cases this has no obvious negative effects. An iPhone or a Fitbit doesn’t work better if you’re white or malfunction if you’re Asian. But it does speak to a narrowness of perspective in the people who control the technology sector. If these machines cannot understand our faces, how will they understand our credit scores or our medical records?

But the tide of history is already shifting and technology will be dragged with it. In a few short decades the majority of middle class people on the planet will no longer be white. As new powers emerge in Asia, South America and Africa, technology will be forced to broaden its perspective and become more inclusive.

And the same thing will happen to the people making the technology: they will start to look more like most people on the planet, and less like the cast of Friends. Silicon Valley will not be the centre of the technology world forever. But until then, we need to keep reminding the geeks that our planet is a lot bigger and more varied than they realise.
