Facebook Reality Labs (Photograph: AFP)
Facebook mistakenly labelled Black men as primates through its AI-powered automatic recommendation software.
In recent days, Facebook users who watched a British tabloid video featuring Black men were automatically shown a prompt asking if they would like to keep seeing videos about primates.
The video in question was the Daily Mail's "White man calls cops on black men at marina", originally posted on Facebook in June 2020. What makes the prompt so troubling is that there is no mention of any monkey or chimpanzee anywhere in the video.
Recently, the issue was raised by Darci Groves, a former content design manager at Facebook, who took to Twitter to share a screengrab of the prompt.
"This 'keep seeing' prompt is unacceptable, @Facebook," Groves tweeted. "And despite the video being more than a year old, a friend got this prompt yesterday. Friends at FB, please escalate. This is egregious."
Groves posted the tweet on September 2, 2021.
As soon as the issue was raised, Facebook disabled the AI software and apologised for the racist error. "We apologize to anyone who may have seen these offensive recommendations," the company said in a statement quoted by the news agency AFP.
Facebook went a step further and disabled the AI-controlled topic recommendation feature entirely. "We disabled the entire topic recommendation feature as soon as we realized this was happening so we could investigate the cause and prevent this from happening again," the statement read.
This is not the first time Facebook has faced criticism over problematic content on its platform. Since 2020, several human rights activists and groups have urged Facebook to take concrete steps against hate speech and the spread of misinformation, especially after the death of George Floyd.