Facebook CEO says ‘Kenosha Guard’ page was left up by ‘operational mistake’

Washington: Facebook Chief Executive Officer Mark Zuckerberg said the company was too slow to remove a page that violated its policy on dangerous organisations in the hours leading up to a deadly shooting in Kenosha, Wisconsin, this week.

The social-media company eventually took down the page, titled "Kenosha Guard," for violating Facebook policies around militia organisations, according to a spokeswoman.


The page's owners created an event promoting a "call to arms" and encouraging users to "Protect our Lives and Property," The Verge reported.

Multiple users reported the group to Facebook, but it was not removed until after a shooting on Tuesday in which two people died.

"The Page violated our policies," Zuckerberg said Friday in a video post, adding that Facebook reviewers didn't identify it as a violation in part because the rules against militia groups are new.

"It was largely an operational mistake," the CEO said.

Facebook has removed the alleged shooter's Facebook and Instagram accounts, but says it has not found any direct link between this person and the page or the event created by the page's owners.

The incident is the latest in a long succession of failures by Facebook to effectively police its platforms, including the photo-sharing site Instagram, for posts that incite, organise and glorify violence.

Last week, the company announced initiatives to intensify its efforts against militia groups and potentially dangerous conspiracy theories, such as QAnon, but Zuckerberg agreed with critics that the company failed to act appropriately against the Kenosha Guard page and the event listing.

Zuckerberg has been the subject of intense criticism from both inside and outside Facebook for not doing more to keep the company he has led since its creation in 2004 from becoming a conduit for violent ideologies.

Facebook has long had ties to serious violence in the real world.

The company acknowledged in 2018 that it had been used to "foment division and incite offline violence" during ethnic cleansing of the Rohingya minority group in Myanmar.

Last year, Australian gunman Brenton Tarrant live-streamed on Facebook his attack on Muslims at a New Zealand mosque, which left dozens dead.

Bloomberg, Washington Post
