Less content promoting suicide or depicting child nudity and exploitation was removed after staff were sent home
Facebook has admitted it struggled to remove content that promoted suicide or exploited children after global lockdowns forced it to rely more heavily on automatic moderation.
Facebook sent many of its content reviewers home in March and shifted its focus to AI-driven moderation. In its first quarterly report on its moderation practices since the coronavirus crisis took hold, the company set out some of the successes and failures of that approach.