A primary responsibility for tech companies is to monitor content on their platforms for child sexual abuse material (CSAM), and if any is found, they are legally required to report it to the National Center for Missing and Exploited Children (NCMEC). Many companies employ content moderators who review content flagged as potential CSAM and determine whether it should be reported to NCMEC.
However, Facebook has a policy that could mean it is underreporting child sexual abuse content, according to a new report from The New York Times. A Facebook training document instructs content moderators to “err on the side of an adult” when they don’t know the age of a person in a photo or video suspected of being CSAM, the report said.
The policy was created for Facebook content moderators working at Accenture and is discussed in a California Law Review article from August:
Respondents also described a policy, which each of them personally disagreed with, that applies when a content moderator cannot readily determine whether the subject in a suspected CSAM photo is a minor (“B”) or an adult (“C”). In such situations, content moderators are instructed to assume the person is an adult, with the result that more images go unreported to NCMEC.
Here’s the company’s rationale for the policy, via The New York Times:
Antigone Davis, head of safety for Meta, confirmed the policy in an interview and said it stems from privacy concerns for those who post sexual imagery of adults. “The sexual abuse of children online is abhorrent,” Ms. Davis said, emphasizing that Meta employs a multilayered, rigorous review process that flags far more images than any other tech company. She said the consequences of erroneously flagging child sexual abuse could be “life-changing” for users.
When reached for comment, Facebook (which is now under the Meta corporate umbrella) pointed to Davis’s quotes in the NYT. Accenture did not immediately respond to a request for comment; it declined to comment to The New York Times.

Update March 31st, 9:09PM ET: Facebook pointed to Davis’s quotes in the NYT.
Source: “Facebook moderators ‘err on the side of an adult’ when they don’t know the age of possible abuse photos,” The Verge — https://www.theverge.com/2022/3/31/23005576/facebook-content-moderators-child-sexual-abuse-material-csam-policy