A whistleblower has claimed Instagram is unsafe for children and is turning 10-year-olds into social media addicts.
Armed with thousands of pages of confidential documents secretly copied before she left the company, ex-Facebook employee Frances Haugen has given a blistering account of the company’s inner workings.
The American algorithm specialist spoke out after leaving her senior role with the firm, lifting the lid on the notoriously opaque tech giant.
Ms Haugen gave evidence in front of a parliamentary select committee today where she claimed the company was aware of its platform’s addictive qualities when used by teenagers.
She claimed Facebook is ‘very good at dancing with data’ and alleged the firm’s own research showed Instagram is more dangerous for teenagers than other social media platforms such as TikTok and Snapchat.
Ms Haugen said she ‘sincerely doubted’ that it was possible for Instagram to be made safe for children and that the platform promoted an ‘addict’s narrative’.
She added: ‘Children don’t have as good self-regulation as adults do, that’s why they’re not allowed to buy cigarettes.
‘When kids describe their usage of Instagram, Facebook’s own research describes it as ‘an addict’s narrative’.
‘The kids say ‘this makes me unhappy, I don’t have the ability to control my usage of it, and I feel if I left it would make me ostracised’.’
She continued: ‘I am deeply worried that it may not be possible to make Instagram safe for a 14-year-old and I sincerely doubt that it is possible to make it safe for a 10-year-old.’
Ms Haugen said Facebook could make a ‘huge dent’ in the problem if it wanted to, but fails to do so because ‘young users are the future of the platform and the earlier they get them the more likely they’ll get them hooked’.
She added: ‘Facebook’s own research says now the bullying follows children home, it goes into their bedrooms.’
Andy Burrows, head of child safety online policy at the NSPCC, said the evidence highlighted the ‘scale of the challenge needed to make the company’s services safe for children after years of putting profit and growth first’.
A Facebook spokesperson said that while the firm has rules against harmful content, it agrees regulation for the whole industry is needed.
‘Contrary to what was discussed at the hearing, we’ve always had the commercial incentive to remove harmful content from our sites.
‘People don’t want to see it when they use our apps and advertisers don’t want their ads next to it.’
The testimony comes as the government prepares to bring its online harms bill back before the Commons, which is expected to create new regulation for social media companies.
There have been renewed calls to clamp down on hateful material being shared online following the killing of Sir David Amess.