Facebook recently came under fire and faced an advertising boycott from many major companies. The protest was driven by the volume of hate speech circulating in the Facebook community with no real consequences. In response, the tech company accelerated its efforts to detect and remove harmful and hateful content from the platform in the second half of 2020.
Facebook defines hate speech as "violent or dehumanizing speech, statements of inferiority, calls for exclusion or segregation based on protected characteristics, or slurs. These characteristics include race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity, and serious disability or disease."
Facebook reportedly took action on and removed about 22.5 million pieces of hateful content from its platform between April and June. This is a sharp increase over the first quarter of 2020, when roughly 9.6 million posts were removed.
Also see: Facebook under fire for hate speech
Facebook will further revise its Community Standards policy to make Facebook and Instagram safer places for users to browse. It will also publish its Community Standards Enforcement Report on a quarterly basis rather than twice a year.
Infographic by: statista.com