Will Facebook policing content chill freedom of expression?

According to this BBC news article, https://www.bbc.com/news/business-51184323, Facebook intends to create 1,000 new jobs in the UK with the aim of policing Facebook’s platforms (Facebook, Messenger, Instagram, WhatsApp).

Although I think it is noble that Facebook wants to remove problematic content for the safety of its users, it also makes me wary at the thought of a private company dictating what is “problematic” and what information users may and may not view. Can we really trust a private company to act in our best interest, or are they unnecessarily suppressing the marketplace of ideas?

2 responses to “Will Facebook policing content chill freedom of expression?”

  1. gabybs

    While I do agree that there are concerns with respect to the over-removal of content from the Internet, I do believe that Internet intermediaries, like Facebook, bear some responsibility when their users post harmful content on their platforms. For one thing, once harmful content is posted online, it is very difficult to undo its effects. Damage can be caused on a large scale with relatively little effort. Such harmful content is persistent, can be reproduced almost instantaneously, and can be extremely difficult to remove from the digital environment. For another, many Facebook users are children and teens, and are inherently vulnerable. Harmful content can have enormous impacts on these youth as their cognitive and emotional maturity is undergoing constant development. In my opinion, Internet intermediaries, like Facebook, should remove harmful content from their platforms.

    That being said, what constitutes “harmful” or “problematic” content should be clearly defined to restrict the potential for over-removal (or the “unnecessary suppressing of the marketplace of ideas”). Facebook does, in fact, define “harmful content” in its community standards here: https://www.facebook.com/communitystandards/. In my view, this allows users to hold Facebook accountable when they believe that the company has wrongly removed content from its platform.

  2. Sancho

    Kate Klonick has an interesting article about how platforms develop their internal moderation rules and the weight that they place on free expression: https://harvardlawreview.org/2018/04/the-new-governors-the-people-rules-and-processes-governing-online-speech/
