Facebook says it will monitor posts in wake of Derek Chauvin verdict. Here’s why
Facebook says it will be monitoring content related to the Derek Chauvin trial and will remove content that calls for violence in Minneapolis
Facebook will be monitoring and limiting content on its platform that it determines “could lead to civil unrest or violence” in anticipation of the Derek Chauvin trial verdict, the social media giant said in a blog post Monday.
“This includes identifying and removing calls to bring arms to areas in Minneapolis, which we have temporarily deemed to be a high-risk location,” Facebook’s vice president of content policy Monika Bickert said in the post. “We will continue to monitor events on the ground to determine if additional locations will be deemed as temporary, high-risk locations.”
Chauvin is a former Minneapolis police officer who was fired and arrested after George Floyd died while in police custody late last May. A viral video of Floyd’s death showed Chauvin, a white man, kneeling on the Black man’s neck and torso.
Nationwide protests and riots broke out after several police killings of people of color last year — including Floyd’s death — and protesters were often met on America’s streets by armed counter-protesters.
“We know this trial has been difficult for many people. But we also realize that being able to discuss what is happening and what it means with friends and loved ones is important,” Bickert said in the blog.
How will Facebook moderate posts?
In the blog post Monday, Facebook said it will be “preventing online content from being linked to offline harm and doing our part to keep our community safe.” But how exactly will the social media company regulate its platform? Here are the steps Facebook will be taking:
According to Facebook, it will remove content that goes against its already established “community standards.”
- This includes removing “hate speech, bullying and harassment, graphic violence, and violence and incitement,” Facebook said.
- Facebook “may also limit the spread of content that our systems predict is likely to violate our community standards” — something it said it has done in previous “emergency situations.”
- “We will remove pages, groups, events and Instagram accounts that violate our violence and incitement policy and we will remove events organized in temporary, high-risk locations that contain calls to bring arms,” according to the blog.
Facebook also determined that Chauvin is a “public figure” and that Floyd is an “involuntary public figure.”
- This means, according to Facebook, it will remove “severe” attacks against Chauvin and that a “higher level of protection” will be applied to content about Floyd’s death.
Facebook has also said it will be “limiting misinformation and graphic content.”
- The social media platform will be “using several tools” to flag posts that potentially include misinformation for “third-party fact-checking partners.”
- Graphic content will be marked “disturbing or sensitive.”
In the blog, Bickert said the social media company will “remain in close contact with local, state and federal law enforcement” and “will respond to valid legal requests and support any investigations that are in line with our policies.”