12 March 2015

Share First, Verify Later: When Breaking News Breaks The Rules

There was something rather sweet and innocent about the idea of user-generated content when it was first adopted by major news companies.

It was about opening the doors to audiences and making them feel valued. Those who sent in images or information from the scene were mostly trying to be helpful, enhance the understanding of a situation or simply have their moment of fame.

Before long, this on-the-ground footage was leading the news bulletins and going viral within minutes. In the case of the Arab Spring, it was even instrumental in overthrowing entrenched regimes.

Then, alongside this increasing power, we started to see – particularly in the past year – the darker, more manipulative side of user-generated content. From the Islamic State’s propaganda to the information war between Russia and Ukraine, it had become an industry, not a by-product.

“There are hoaxers now, trying to trip you up,” says Chris Hamilton, the social media editor for BBC News, which has faced increasing challenges in verifying content and deciding what to publish.

“Some use it as a propaganda tool. And also, with the Sydney siege and Woolwich murder [of British soldier Lee Rigby by two Islamic extremists], we’ve seen events that are staged with online media coverage in mind.” In both cases, passers-by or hostages were asked to film perpetrators’ statements to post on the web.

Trick or tip-off?
The BBC’s user-generated content department was launched in July 2005 as a pilot project. One week later, the 7/7 bombings hit London and the new department leapt into action, collating witness accounts. The value was noted, the project went permanent and the team now works 24/7 in the heart of the main newsroom.

On an average day, 3 000 pieces of user content are received – including images, comments, emails, SMSs and tweets. This swells to 10 000 a day during bad weather – the British having a particular compulsion for sending in photos of their garden furniture in snow.

Much of this content is uncontroversial, but the images and videos sent from war zones or disaster scenes need attention. Verification involves looking at the core details – including when, where and how content was uploaded – and then assuming a quasi-detective role. Shadows can be a giveaway. So can accents. Is this old footage? Is it from another conflict? Has the soundtrack been dubbed to add extra gunfire? Are those fighter jets, or has one jet been mirrored to look like a squadron?
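For readers curious about what checking those core details can look like in practice, here is a minimal sketch – not the BBC's actual tooling, and the file name and function are purely illustrative – that uses Python's Pillow imaging library to read a photo's embedded capture time and GPS tags. Many platforms strip this metadata on upload, so an empty result proves nothing on its own.

# A minimal sketch of a "core details" check: read a submitted photo's embedded
# capture time and GPS coordinates, if the file still carries them.
# Requires the Pillow library (pip install Pillow). Note that many social
# platforms strip EXIF metadata on upload, so an empty result proves nothing.
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS


def extract_capture_details(path):
    """Return the capture timestamp and raw GPS tags embedded in a photo, if any."""
    image = Image.open(path)
    exif = image._getexif()  # returns None when the file has no EXIF block
    if not exif:
        return {}

    details = {}
    for tag_id, value in exif.items():
        tag = TAGS.get(tag_id, tag_id)
        if tag == "DateTimeOriginal":
            details["captured_at"] = value  # e.g. "2015:03:12 14:02:31"
        elif tag == "GPSInfo":
            # Map numeric GPS sub-tags to readable names (latitude, longitude, etc.)
            details["gps"] = {GPSTAGS.get(k, k): v for k, v in value.items()}
    return details


if __name__ == "__main__":
    # "submitted_photo.jpg" is a placeholder file name for illustration.
    print(extract_capture_details("submitted_photo.jpg"))

Even when such metadata survives, it only narrows the questions: a plausible timestamp and location still have to be squared with the shadows, accents and other details the team looks for.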

Alex Murray, a verification specialist within the BBC’s team, remembers realising that a small white square in the corner of a video from Libya was an expensive portable satellite dish, and therefore the footage could not have been sent in by a casual bystander. High-end equipment can indicate that a video has been funded by states, by rebels, even by nongovernmental organisations.

“Videos are becoming increasingly professional,” he says. “In the early days, you’d be lucky if people remembered to turn their device around to shoot in landscape format [most people instinctively shoot portrait, which doesn’t fit a television screen]. Now you have cameras that create broadcast-quality footage.”

We, the audience, hate to be duped, yet breaking stories are often embraced with a share-first, verify-later mentality. And if the footage fits in with the narrative we want to believe, we might never ask these questions.

One of the biggest debunks of recent months was the “Syrian hero boy” video in November, in which a young boy appeared to risk his own life to save a girl from gunfire. A Norwegian filmmaker later admitted he had staged it – in Malta – to “spur debate” about children in war zones.

The BBC, sensing trickery, never posted it, but it had already been viewed millions of times. And, as Hamilton notes: “Many more people saw the hoax than the follow-up report, so many still believe it to be true.”

The BBC may be doing its best to be responsible, but the platforms to which users upload content directly have their own standards. The Sydney siege videos and Islamic State beheadings were quickly censored by YouTube, but had already been copied and shared elsewhere. Less mainstream sites – such as LiveLeak – are quick to step in and publish the most gruesome videos.

JustPaste.it, a Polish-owned site, inadvertently became jihadists’ platform of choice last year because it allows users to post material without registering, and its simple design loads quickly on cellphones without needing a strong internet connection.

The Islamic State has also found a new trick to enhance the credibility of its content: it is uploading its images to Archive.org, a nonprofit organisation that aims to collect a historic record of the web and won’t delete footage.

“It’s a symbolic act,” says the BBC’s Murray. “They are saying: ‘We are doing this, so our content will be protected by the First Amendment; it’s part of a catalogue of our times.’ People look on these sites for events of historic significance.”

“Everyone that shares a video is in some way an activist,” adds Murray – whether they are hoping to help as a citizen reporter or have more sinister motivations. “They are partial; they are all sharing their experience from a personal standpoint.”

Feeding our hunger for drama
Some people don’t even realise they are becoming players in a much bigger story when they first hit “upload”. Take the bystander who posted footage of the final moments of Ahmed Merabet, the policeman shot dead on the pavement outside Charlie Hebdo’s offices in Paris.

In a state of shock, the user made the split-second decision to post what he’d just seen on Facebook. When he regretted it and took it down 15 minutes later, it was too late; it had gone worldwide. “I take a photo – a cat – and I put it on Facebook. It was the same stupid reflex,” he said.

But this is how news stories happen now: split-second decisions, a hunger for drama, a desire to share opinions and witness accounts instantly to be part of the action. But those behind the scenes are getting increasingly savvy, from those producing the copycat propaganda to the small teams jumping through hoops to verify it.

There are still those innocently sharing photos of their patio furniture in the snow, but the other side is fast-growing and powerful. More scepticism and patience would serve us well but, in the hungry 24-hour news cycle, it’s not something we are likely to see.

Vicky Baker is deputy editor of Index on Censorship magazine. The latest issue, with a report on user-generated content, is out next week. Follow the magazine on @index_magazine


How the BBC embraced user content

2004: The Asian tsunami in December saw footage filmed by tourists leading broadcasts.

2005: The BBC launched its user-generated content department a week before the July bombings on London transport; journalists relied heavily on witness images and videos. In December, a major fire at the Buncefield oil depot in Hertfordshire, United Kingdom, prompted viewers from a wide geographical area to send in photos and videos so the BBC could show the blaze’s scale.

2007: User content, particularly from tourists, became crucial in covering an uprising in Burma because journalists were not free to report.

2009: The Hudson River plane crash in New York in January was, says the BBC’s Chris Hamilton, “when Twitter exploded into the consciousness of most newsrooms”. Someone tweeted the first image of the plane from a rescue boat. “Previously, he would have had to come home, connect his digital camera to his computer and then strike a deal with a news group.”

In June, the Green Revolution in Iran saw mass protests against the officially declared election victory of Mahmoud Ahmadinejad. “Again, news organisations had limited access. We relied on content from liberal Iranians,” says Hamilton.

2011: This was the watershed year. Hamilton explains: “There were the Norway attacks [by gunman Anders Breivik], the London riots, the Japanese tsunami and, the really big one, the Arab Spring. You saw the rise of hardware: smartphones were more prevalent and much higher quality. Events appeared on Twitter, Facebook and YouTube in real time, and using these sites for news gathering was no longer an esoteric pursuit.”

2014: “We saw the rise of digital jihads,” says Hamilton. User content also played a huge role in covering the Gaza conflict from July to August, the Ferguson protests in the United States in August, and the Sydney coffee shop siege in December.