Facebook whistleblower claims social network profits from hate speech

Facebook whistleblower Frances Haugen talks with CBS’ Scott Pelley on “60 Minutes” about the internal workings of Facebook. (Credits: AP)

A former Facebook employee has claimed the world’s biggest social network profits from hate speech and chooses engagement over the safety of its users.

Frances Haugen worked as a product manager at the company until she left in 2021.

Yesterday, she appeared on 60 Minutes in the US to speak out against her former employer.

‘It’s paying for its profits with our safety,’ she told host Scott Pelley.

Haugen explained that the company prioritised engagement above all else, seeking to keep users on Facebook for as long as possible.

As a result, she alleges, the company let hate speech remain on Facebook because it drove engagement.

‘The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook,’ she told the programme. ‘And Facebook, over and over again, chose to optimize for its own interests, like making more money.’

The 37-year-old data scientist copied internal documents and leaked them to the US Securities and Exchange Commission in the hope of forcing through regulation.

Facebook’s policy on hate speech has been called into question (Getty Images)

One of the internal documents states: ‘We estimate that we may action as little as 3-5% of hate and ~0.6% of V&I [Violence and Incitement] on Facebook despite being the best in the world at it.’

This stands in stark contrast to Facebook’s oft-repeated claims it is working to remove hate speech from all its platforms, which include Instagram and WhatsApp.

The company relies on algorithms to catch the vast amount of bile shared every day. Last year the company said it saw hate speech rise during the pandemic, but that almost 95 per cent of these posts were detected automatically before any users reported them.

Facebook put these algorithms in place in 2018, when it reworked what content the News Feed would show to its billions of users. The company said it would promote meaningful interactions between users rather than spread information from businesses, brands and media.

‘We feel a responsibility to make sure our services aren’t just fun to use, but also good for people’s well-being,’ CEO Mark Zuckerberg said in a Facebook post at the time.

‘So we’ve studied this trend carefully by looking at the academic research and doing our own research with leading experts at universities.’

Mark Zuckerberg says Facebook feels a responsibility to offer a safe environment for its users (Getty)

But Haugen said it was a different mentality on the inside.

‘Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money,’ she said.

For its part, Facebook swiftly countered many of the points Haugen made over the weekend.

‘Every day our teams have to balance protecting the ability of billions of people to express themselves openly with the need to keep our platform a safe and positive place,’ said Lena Pietsch, Director of Policy Communications at Facebook. ‘We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.’

Pietsch added: ‘The growth of people or advertisers using Facebook means nothing if our services aren’t being used in ways that bring people closer together — that’s why we are investing so much in security that it impacts our bottom line. 

‘Protecting our community is more important than maximizing our profits. To say we turn a blind eye to feedback ignores these investments, including the 40,000 people working on safety and security at Facebook and our investment of $13 billion since 2016.’




from News – Metro https://ift.tt/3oxLC50
