Facebook Inc. this year has introduced a flurry of new rules designed to improve the discourse on its platforms. When users report content that breaks those rules, a test by The Wall Street Journal found, the company often fails to enforce them.
Facebook allows all users to flag content for review if they think it doesn’t belong on the platform. When the Journal reported more than 150 pieces of content that Facebook later confirmed violated its rules, the company’s review system allowed the material—some depicting or praising grisly violence—to stand more than three-quarters of the time.
Facebook’s errors in handling the content reported in the Journal’s test don’t reflect the overall accuracy of its content-moderation system, said Sarah Pollack, a company spokeswoman. To moderate the more than 100 billion pieces of content posted each day to the Facebook platform, the company both reviews user reports and actively screens content using automated tools, Ms. Pollack said.
“Our priority is removing content based on severity and the potential for it going viral,” she said.