
Facebook shuts down thousands of UK accounts in clampdown on fake news

May 8, 2017 6:40 AM

The site has also introduced its fake news tools ahead of the general election

In April, Facebook announced initiatives designed to tackle the ever-growing problem of fake news. Now, it has revealed how it plans to curb the problem specifically in the UK.

This includes the removal of "tens of thousands" of fake accounts, a so-called Informed Sharing initiative to stop fake news spreading across the social network, and adverts on the site and in the UK press.

“People want to see accurate information on Facebook and so do we,” said Simon Milner, Facebook's director of policy for the UK. “That is why we are doing everything we can to tackle the problem of false news. We can't solve this problem alone so we are supporting third party fact-checkers during the election in their work with news organisations, so they can independently assess facts and stories.”

These third-party organisations include Full Fact and First Draft and, under this partnership, Facebook said it will work with journalists to address rumours and misinformation spreading online during the UK general election. Google has also signed up to this partnership, and more details will be announced “in due course.”

From today, a notice will be shown across Facebook news feeds in the UK with ten tips on how to spot false news. These build on the tips unveiled in April, which include checking dates, URLs and sources. All ten tips can be found in Full Fact's toolkit.

Facebook claims that since April it has removed “tens of thousands of fake accounts”, but would not be drawn on an exact number. The site said it's made improvements to recognise fake accounts faster and more easily by identifying patterns of activity, without assessing the content itself. For example, Facebook said its systems may detect repeated posting of the same content or an increase in messages sent.
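Facebook has not published the detection logic, but the two behavioural signals it names – repeated posting of identical content and a sudden jump in message volume – can be illustrated with a deliberately simple Python heuristic. The thresholds and function below are hypothetical, not Facebook's actual system:

    from collections import Counter

    # Purely illustrative heuristic: flag an account whose recent posts are
    # mostly duplicates, or whose daily message volume has spiked well above
    # its historical average.
    def looks_inauthentic(posts, daily_message_counts,
                          duplicate_ratio=0.5, spike_factor=5.0):
        flagged = False

        if posts:
            most_common_count = Counter(posts).most_common(1)[0][1]
            if most_common_count / len(posts) >= duplicate_ratio:
                flagged = True  # same content posted over and over

        if len(daily_message_counts) >= 2:
            *history, today = daily_message_counts
            baseline = sum(history) / len(history)
            if baseline > 0 and today / baseline >= spike_factor:
                flagged = True  # sudden surge in messages sent

        return flagged

    # Example: 8 of 10 recent posts identical, messages jump from ~20/day to 150
    print(looks_inauthentic(["Buy now!"] * 8 + ["hello", "hi"],
                            [18, 22, 20, 150]))  # True

The point of the example is that both signals rely only on patterns of activity, not on reading the content of the posts themselves, which matches Facebook's description.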

“With these changes, we expect we will also reduce the spread of material generated through inauthentic activity, including spam, misinformation, or other deceptive content that is often shared by creators of fake accounts,” the site said.

Facebook is changing the way it ranks stories through what it has dubbed Informed Sharing. “We’ve found that if reading an article makes people significantly less likely to share it, that may be a sign that a story has misled people in some way,” the site said. For example, people sharing an article without clicking on it could be a sign that it's a false story (or that the headline is misleading). Alternatively, if people are clicking on the link and then choosing not to share it, a red flag is raised.
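Facebook has not said how the Informed Sharing signal is calculated, but the comparison it describes – how often people share an article after reading it versus how often they share it on the headline alone – could be sketched along these lines. The counter names and the example figures are assumptions for illustration only:

    # Illustrative sketch only; Facebook has not disclosed the real formula.
    # If the share rate among people who actually read an article is much lower
    # than the share rate among people who only saw the headline, treat that as
    # a possible misinformation signal.
    def informed_sharing_score(shares_without_click, impressions_without_click,
                               shares_after_click, clicks):
        if impressions_without_click == 0 or clicks == 0:
            return None  # not enough data to compare
        headline_only_rate = shares_without_click / impressions_without_click
        after_reading_rate = shares_after_click / clicks
        if headline_only_rate == 0:
            return None
        # Values well below 1.0 mean reading the article discourages sharing.
        return after_reading_rate / headline_only_rate

    # Example: 8% share on the headline alone vs 1% share after reading -> 0.125
    print(informed_sharing_score(80, 1000, 5, 500))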

Facebook did not go into detail about how this would work, but it is likely to expand on its clickbait ranking update, introduced in August. Under that scheme, articles with headlines that don't tell the reader the full story – those leaving a 'curiosity gap' – are penalised by the system that ranks posts by Pages. Publishers use such headlines to entice readers to click through to the article, registering the page view for advertising revenue.

According to Facebook, examples of clickbait headlines include: 'When She Looked Under Her Couch Cushions And Saw THIS… I Was SHOCKED!' and 'He Put Garlic In His Shoes Before Going To Bed And What Happens Next Is Hard To Believe'.
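The real ranking update relies on signals Facebook has not disclosed, but headlines like these share obvious surface tells – withholding phrases and shouty capitalisation – that even a naive filter can catch. The phrase list and thresholds below are invented for illustration:

    import re

    # A naive, purely illustrative clickbait filter, not Facebook's ranking model.
    WITHHOLDING_PHRASES = [
        "you won't believe", "what happens next", "hard to believe",
        "this is why", "will shock you",
    ]

    def looks_like_clickbait(headline):
        lowered = headline.lower()
        has_withholding = any(phrase in lowered for phrase in WITHHOLDING_PHRASES)
        # Shouty all-caps words ("THIS", "SHOCKED") are another common tell.
        caps_words = re.findall(r"\b[A-Z]{3,}\b", headline)
        return has_withholding or len(caps_words) >= 2

    print(looks_like_clickbait(
        "He Put Garlic In His Shoes Before Going To Bed And "
        "What Happens Next Is Hard To Believe"))  # True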

Facebook began testing the latest "fake news" signal in its rankings in December, specifically targeting outlier articles that readers are significantly less likely to share after reading, and is now expanding this to Great Britain.

These UK-focused changes will join online initiatives including the Facebook Journalism Project, set up to offer "deeper collaboration" with news organisations.

The project includes e-learning and training courses, in nine languages, to help journalists better use Facebook's products, tools and services.

The courses come with a certificate curriculum. Facebook also offers training for local newsrooms with the Knight Foundation, Detroit Journalism Cooperative, Institute for NonProfit News, Local Independent News Online (LION) and Institute for Journalism in New Media.

Facebook has been on the defensive in recent months. Following the furore surrounding Donald Trump's presidency, the site was accused of being complicit in spreading fake news about the US presidential election. More recently, it has faced scorn over the way Facebook Live has been used to broadcast murders and suicides. As a result, the site announced plans to hire 3,000 extra moderators to track offensive content, bringing the team total to 7,500.

“Over the last few weeks, we've seen people hurting themselves and others on Facebook – either live or in video posted later,” Mark Zuckerberg wrote on his Facebook page. “It's heartbreaking, and I've been reflecting on how we can do better for our community.”

The team looks at reports of hate speech and child exploitation, two issues the social network has been under pressure to dramatically improve upon. Zuckerberg added that Facebook needs to respond to reports faster, and is building new ways to make reporting simpler and the review process quicker.

Elsewhere, Jimmy Wales, the founder of Wikipedia, recently announced a community-driven online news service in response to the widespread distribution of deliberately misleading information masquerading as news.

The project, Wikitribune, is a hybrid model in which paid journalists work with a broad network of contributors. “We want to bring some of that fact-based, fact-checking mentality that we know from Wikipedia to news,” Wales told WIRED.

Wikitribune will be financed through a crowdfunding campaign that will determine the size of the initial team. “Humans haven’t fundamentally changed from the way we were 100 years ago or 500 years ago,” Wales continues. “People have a thirst for quality information.” He describes the core editorial mission of the platform as “facts matter” and intends that the site will be able to support “original reporting and investigative journalism”.


Source: wired.co.uk

