Although we are living in an age of infinite information, existing in a comfortable bubble has never been easier, writes Alistair Fairweather.
Whenever mankind makes a great technological leap forward, we expect utopia will soon follow. But even the greatest invention in history – the internet – is subject to the gravitational forces of reality.
When the internet entered the mainstream in the 1990s it coincided with a period of unusual optimism. The Berlin Wall fell, South Africa emerged from apartheid and economic reforms in China and India began lifting millions of people out of poverty.
The internet, futurists told us, would sweep away the remaining ills of the past. Universal education, access to finance, efficient global markets and a dozen other ideals were now within our grasp. Ignorance would be banished and corruption vanquished, all by simply connecting humanity to itself.
And in many ways they were right. Platforms like the Khan Academy and Coursera, built around massive open online courses, are educating millions of people around the world for the cost of a basic internet connection.
Microlenders like Kiva provide the tiny amounts of capital that micro-entrepreneurs in emerging markets need to set up self-sustaining businesses. And peer-to-peer markets like Airbnb, Lyft and Lending Club are unlocking value and empowering ordinary people around the planet while also reducing waste.
But, like any tool, the internet has its limits and its abuses. I’m not talking about the obvious horrors – crimes like money laundering, child pornography and global terrorism; all of which have thrived on the internet. I’m talking about something much more subtle: prejudice.
I don’t mean racism – although that is certainly part of it – I mean the unfortunate human trait of justifying and defending our own biases. We tend to believe that our own ideas and values are intrinsically good and correct, and that competing points of view are wrong and bad.
Sadly the internet has slowly adjusted to cater to our worst instincts. Google and other search engines learn your preferences over time and prioritise the results you prefer. Facebook filters everything you see, trying to show you only what pleases you. And if it gets anything “wrong”, one click sets it straight. You need never see that unpleasant thing again.
This kind of self-reinforcing filtering isn’t a problem when it comes to music or shoe brands. But when it comes to issues like public health, politics or economics, it can be incredibly dangerous.
Take the debate around vaccination. Millions of parents are now delaying, or forgoing altogether, vaccinating their children against diseases like polio, measles and rubella. They believe there is a clear link between these vaccines and the onset of developmental disorders like autism.
Unfortunately this choice is based on discredited science and is dangerous for society as a whole. Measles was declared “eliminated” from the USA in 2000; the only cases recorded since then were “imported”. Last year there were 175 cases, triple the average number of imported cases.
The Centers for Disease Control and Prevention is quite clear about why this is happening: vaccination coverage is falling for the first time in decades. Chillingly, it notes that “unvaccinated children tend to cluster geographically and socially, increasing the risk for outbreaks. Increases in the proportion of persons declining vaccination for themselves or their children might lead to large-scale and sustained outbreaks … ”
It’s only a matter of time before measles begins to kill children in countries from which it was once banished. The same applies to polio, which crippled and killed generations of children. Well-meaning parents worried about autism are collectively risking a global pandemic.
So why don’t these parents believe the same science that I do? Because for every article disproving the link between vaccines and autism, there are three “proving” it. Google “vaccination causes autism” and you will see what I mean.
Facebook is even more persuasive than Google. If you’re a young parent, worried about inflicting an awful disorder on your child, you’re going to find millions of other young parents with the same fears and the same opinions.
So while the internet can magnify truths, it can also magnify falsehoods. This effect is even more pronounced when it comes to murky fields like politics. The bombardment and invasion of Gaza by Israeli troops is a case in point.
Throughout the siege, my Facebook timeline was awash with diatribes and polemics from both sides. Because Facebook equates activity with popularity, these mini debates constantly rose to the surface. And because I refused to “like” or comment on any of them, Facebook had no way to decide which side of the debate I “preferred” to see.
Had I begun “liking” the pro-Palestinian posts or the pro-Israeli ones, the others would soon have faded into the background. Facebook is not interested in fairness, open-mindedness or debate; its interest is in pleasing me, in keeping me comfortable and entertained.
This is not Facebook’s fault. It’s just doing what it’s designed to do: keeping people “engaged” and happy by connecting them with other like-minded people. Unfortunately the net result is that, though we are living in an age of infinite information, existing in a comfortable bubble has never been easier.
It’s tempting to shoot the messenger here, and people often do. The internet is blamed for decreasing tolerance in everything from the US Congress to the Islamic Jihad. But really the internet is just adjusting to our basest human instincts: the fear of the unknown and the need for safety. We are the problem. We have always been the problem.
We should not have imagined that the internet – or any other tool – would solve all our ills. It gave us the means to make millions of lives better in many small ways, and it is doing so every day. But it has not made us perfect. Only when we stop blaming our tools and each other, only when we stop talking and start really listening, only then will things begin to change. No technology can do that for us.