Facebook said on Monday that it blocked some 30 accounts on its platform and 85 more on Instagram after US law enforcement warned they might be linked to “foreign entities” trying to interfere in the United States midterm elections.
The announcement came shortly after US law enforcement and intelligence agencies said that Americans should be wary of Russian attempts to spread fake news. The election is Tuesday.
A study published last week found that misinformation on social media was spreading at a greater rate than during the run-up to the 2016 presidential vote, which Russia is accused of manipulating through a vast propaganda campaign in favour of Donald Trump, the eventual winner.
“On Sunday evening, US law enforcement contacted us about online activity that they recently discovered and which they believe may be linked to foreign entities,” Facebook head of cybersecurity policy Nathaniel Gleicher said in a blog post.
“We immediately blocked these accounts and are now investigating them in more detail.”
The investigation has so far identified around 30 Facebook accounts and 85 Instagram accounts that appeared to be engaged in “coordinated inauthentic behaviour,” Gleicher said.
He added that all the Facebook pages associated with the accounts appeared to be in French or Russian.
The Instagram accounts were mostly in English, with some “focused on celebrities, others political debate.”
“Typically, we would be further along with our analysis before announcing anything publicly,” Gleicher said.
“But given that we are only one day away from important elections in the US, we wanted to let people know about the action we’ve taken and the facts as we know them today.”
‘Junk News’
Despite an aggressive crackdown by social media firms, so-called “junk news” is spreading on social media at a greater rate than in 2016 ahead of Tuesday’s US congressional elections, Oxford Internet Institute researchers said in a study published Thursday.
Twitter said Saturday it deleted a “series of accounts” that attempted to share disinformation. It gave no number.
Facebook last month said it took down accounts linked to an Iranian effort to influence US and British politics with messages about charged topics such as immigration and race relations.
The social network identified 82 pages, groups and accounts that originated in Iran and violated policy on coordinated “inauthentic” behaviour.
Gleicher said at the time there was overlap with accounts taken down earlier this year and linked to Iranian state media, but the identity of the culprits has yet to be determined.
Posts on the accounts or pages, which included some hosted by Facebook-owned Instagram, focused mostly on “sowing discord” via strongly divisive issues rather than on particular candidates or campaigns.
Sample posts shared included inflammatory commentary about US President Donald Trump, British Prime Minister Theresa May and the controversy around newly appointed US Supreme Court Justice Brett Kavanaugh.
War room
Major online social platforms have been under intense pressure to avoid being used by “bad actors” out to sway outcomes by publishing misinformation and enraging voters.
Facebook weeks ago opened a “war room” at its Menlo Park headquarters in California to serve as a nerve centre in the fight against misinformation and against manipulation of the largest social network by foreign actors trying to influence elections in the United States and elsewhere.
The shutdown of thousands of Russian-controlled accounts by Twitter and Facebook, along with the indictment of 13 people from Russia’s notorious troll farm, the Internet Research Agency, has blunted but by no means halted Moscow’s efforts to influence US politics.
Facebook, which has been blamed for doing too little to prevent misinformation efforts by Russia and others in the 2016 US election, now wants the world to know it is taking aggressive steps with initiatives like the war room.
The war room is part of stepped-up security measures announced by Facebook, which will be adding some 20 000 employees.
© Agence France-Presse