London, United Kingdom
Frances Haugen, a Facebook data scientist turned whistleblower, appeared Monday before a UK Parliamentary committee looking at online safety legislation.
She told lawmakers that Facebook will fuel more violent unrest around the world because its algorithms are designed to promote divisive content.
In her view, the social network treated safety as a cost centre and glorified a start-up culture in which cutting corners was acceptable. It was "unquestionably" making hate worse, she added.
"The events we're seeing around the world, things like Myanmar and Ethiopia, those are the opening chapters because engagement-based ranking does two things: one, it prioritises and amplifies divisive and polarising extreme content and, two, it concentrates it," she said.
Haugen said she had come forward "because now is the most critical time to act".
She explained how Facebook Groups amplify online hate, saying algorithms that prioritize engagement take people with mainstream interests and push them to the extremes.
"One of the things that happens in aggregate is the algorithms take people who have very mainstream interests and they push them towards extreme interests. You can be someone who is centre left and you'll be pushed to radical left; you can be centre right and you'll be pushed to radical right."
She suggested the company could add moderators to prevent groups from being used to spread extremist views.
The committee is proposing that companies that fail to limit and remove harmful online material should face heavy fines and other significant penalties.
"It pushes you to the extremes and it fans hate," she told lawmakers, adding: "Anger and hate is the easiest way to grow on Facebook... bad actors have an incentive to play the algorithm, and they figure out all the ways to optimise Facebook."
Social media companies could face fines of up to 10% of their revenue if they fail to remove or limit the spread of illegal content, such as child sexual abuse.
The government has also called on platforms such as Facebook to do more to protect children from grooming, bullying, and pornography.
(With inputs from Agencies)