The case caused an uproar on social media: a Facebook recommendation algorithm asked users whether they wanted to keep seeing "videos about primates" beneath a British tabloid video showing Black men, The New York Times revealed Friday.

A Facebook spokesperson responded to the findings, saying, "We apologize to anyone who saw these insulting recommendations," and confirmed that the California-based company had disabled the recommendation feature on the topic "as soon as we noticed this was happening, in order to investigate the cause of the issue and prevent it from happening again."

"Keep seeing videos about primates?"

The Daily Mail video, posted more than a year ago, is titled "White Man Calls Cops Against Black Men in Marina". It shows only people, not monkeys. Beneath it, the question "Keep seeing videos about primates?" appeared with "Yes/Dismiss" options on some users' screens, according to a screenshot posted on Twitter by Darci Groves, a former designer at the social networking giant.

"It's outrageous," she commented, calling on her former colleagues at Facebook to escalate the matter.

Our AI systems are not perfect.

"Although we have improved our AI systems, we know they are not perfect and we have more progress to make," the Facebook spokesperson added.

The case highlights the limits of the AI technologies that the platform regularly touts in its efforts to build a personalized feed for each of its nearly three billion monthly users. Facebook also relies heavily on AI to moderate content, identifying and blocking problematic messages and photos before they are even seen.


But Facebook, like its competitors, is regularly accused of failing to fight racism and other forms of hate and discrimination. The topic is all the more tense as several civil society organizations accuse social networks and their algorithms of contributing to the division of American society, against the backdrop of the Black Lives Matter protests.

