Facebook Inc. said it removed 7 million posts related to the coronavirus in the second quarter, in which people had shared false information about the virus, incorrect preventative measures and exaggerated cures.
The data was released as part of Facebook's sixth Community Standards Enforcement Report. The report was introduced in 2018 after the social media company faced backlash over its lax approach to policing content on its platform.
Facebook said it would invite proposals from experts this week to audit the metrics in the report, a commitment it made in July during an advertising boycott over its hate speech practices. The company removed about 22.5 million posts containing hate speech on its flagship app in the second quarter, a sharp increase from 9.6 million in the first quarter. According to Facebook, the jump was the result of improved detection technology.
The company also removed about 8.7 million posts connected to terrorist organizations, up from the 6.3 million posts it deleted in the first quarter.
Facebook said it had relied more heavily on automation to review content starting in April, as fewer human reviewers were available at its offices due to the coronavirus pandemic. As a result, it took less action against content related to self-harm and child sexual exploitation.
Facebook further said it would expand its hate speech policy to cover content related to racism and stereotypes about Jews controlling the world.