Facebook censorship of online content is necessary, within reason

April 29, 2019 — by Selena Liu

The broadcast of the New Zealand mass shooting raises questions about what kind of censorship is acceptable on online platforms.

When the suspect in the March 15 New Zealand Christchurch mass shooting live-broadcast the attack on Facebook using a body camera, Facebook, along with Instagram and YouTube, quickly sought to eradicate the videos from their platforms. Going a step further, these tech companies then moved to remove all white supremacist content from their platforms.

Censorship of an entire section of the internet raises controversial questions: Should online platforms be given this kind of power to censor their content? And if so, where should they draw the line?

On the one hand, it’s obvious Facebook should delete videos depicting gruesome massacres, terrorism and any content hatefully targeting an individual or a group, including white supremacy. But this can, of course, go too far. Some on the right say this leads to the suppression of merely controversial ideas, as opposed to hateful ones.

Facebook is already doing a good job of censoring content that depicts violence or targets hate toward certain groups. White supremacy is just one example, and Facebook’s new ban on all forms of white supremacy, including white nationalism and separatism, is a step toward making the online community a much safer place.

But Facebook must be vigilant not to go too far. If Facebook deletes controversial content, such as certain political or personal opinions, it will become a biased platform that limits the free speech of its users. Large tech companies like Google and Facebook must not decide what the public is allowed to publish online.

While Facebook’s ban on white supremacy is justified, the rest of Facebook’s censorship policies must be scrutinized more carefully. We must judge whether Facebook’s policies are truly for the benefit of the online community or merely reflect the company’s own generally liberal biases.

This is not to say that every political opinion should go uncensored; if political posts and commentary on Facebook target hate toward any group, that content should be censored. However, the distinction between targeted hate and controversial opinion is often a grey area and must be evaluated case by case.

What Facebook should censor raises an essential question: how strict a regulator should Facebook be? As a social media platform, Facebook cannot take on the same regulating role as a publisher. However, Facebook cannot be too loose in its censorship either, allowing malicious videos and hate speech to spread on its platform. Facebook’s past and future censorship decisions will pave the way for other social media companies’ policies, and will help carry old laws and rights into this new generation of technology.