The company reports millions of photos and videos of suspected child sexual abuse each year. But when ages are unclear, young people are treated as adults and the images are not reported to the authorities.
Facebook is a leader among tech companies in detecting child sexual abuse content, which has exploded on social media and across the internet in recent years. But concerns about mistakenly accusing people of posting illegal imagery have resulted in a policy that could allow photos and videos of abuse to go unreported.
Meta, the parent company of Facebook, Instagram, Messenger and WhatsApp, has instructed content moderators for its platforms to “err on the side of an adult” when they are uncertain about the age of a person in a photo or video, according to a corporate training document.
Antigone Davis, head of safety for Meta, confirmed the policy in an interview and said it stemmed from privacy concerns for those who post sexual imagery of adults. “The sexual abuse of children online is abhorrent,” Ms. Davis said, emphasizing that Meta employs a multilayered, rigorous review process that flags far more images than any other tech company. She said the consequences of erroneously flagging child sexual abuse could be “life-changing” for users.
While it is impossible to quantify the number of images that might be misclassified, child safety experts said the company was undoubtedly missing some minors. Studies have found that children are physically developing earlier than they did in the past. Children of certain races and ethnicities also tend to enter puberty at younger ages; some Black and Hispanic children, for example, do so earlier than Caucasian children.
“We’re seeing a whole population of youth that is not being protected,” said Lianna McDonald, executive director of the Canadian Center for Child Protection, an organization that tracks the imagery globally.
Each day, moderators review millions of photos and videos from around the world to determine whether they violate Meta’s rules of conduct or are illegal. Last year, the company made nearly 27 million reports of suspected child abuse to a national clearinghouse in Washington that then decides whether to refer them to law enforcement. The company accounts for more than 90 percent of the reports made to the clearinghouse.
The training document, obtained by The New York Times, was created for moderators working for Accenture, a consulting firm that has a contract to sort through Facebook’s noxious content and remove it from the site. The age policy was first disclosed in the California Law Review by a law student, Anirudh Krishna, who wrote last year that some moderators at Accenture disagreed with the practice, which they referred to as “bumping up” adolescents to young adults.
Accenture declined to comment on the practice.
Technology companies are legally required to report “apparent” child sexual abuse material, but “apparent” is not defined by the law. The Stored Communications Act, a privacy law, shields companies from liability when making the reports, but Ms. Davis said it was unclear whether the law would protect Meta if it erroneously reported an image. She said lawmakers in Washington needed to establish a “clear and consistent standard” for everyone to follow.
Legal and tech policy experts said that social media companies had a difficult path to navigate. If they fail to report suspected illicit imagery, they can be pursued by the authorities; if they report legal imagery as child sexual abuse material, they can be sued and accused of acting recklessly.
“I could find no courts coming close to answering the question of how to strike this balance,” said Paul Ohm, a former prosecutor in the Justice Department’s computer crime division who is now a professor at Georgetown Law. “I don’t think it’s unreasonable for lawyers in this situation to put the thumb on the scale of the privacy interests.”
Charlotte Willner, who leads an association for online safety professionals and previously worked on safety issues at Facebook and Pinterest, said the privacy concerns meant that companies “aren’t incentivized to take risks.”
But Ms. McDonald, of the Canadian center, said the rules should err on the side of “protecting children,” just as they do in commerce. She cited the example of cigarette and alcohol vendors, who are trained to ask for identification if they have doubts about a customer’s age.
Representatives for Apple; Snap, the owner of Snapchat; and TikTok said their companies took the opposite approach of Meta, reporting any sexual image in which a person’s age was in question. Some other companies that scan their services for illegal imagery, including Dropbox, Google, Microsoft and Twitter, declined to comment on their practices.
In interviews, four former content moderators contracted by Meta said they encountered sexual images every day that were subject to the age policy. The moderators said they could face negative performance reviews if they made too many reports that were deemed out of policy. They spoke on the condition of anonymity because of nondisclosure agreements and concerns about future employment.
“They were letting so many things slide that we eventually just didn’t bring things up anymore,” said one of the former moderators, who described detecting images of oral sexual abuse and other explicit acts during his recent two-year tenure at Accenture. “They would have some crazy, extravagant excuse like, ‘That blurry portion could be pubic hairs, so we have to err on the side of it being a young adult.’”
The number of reports of suspected child sexual abuse has grown exponentially in recent years. The high volume, up from roughly 100,000 in 2009, has overwhelmed both the national clearinghouse and law enforcement officials. A 2019 investigation by The Times found that the Federal Bureau of Investigation could manage its caseload from the clearinghouse only by limiting its focus to infants and toddlers.
Ms. Davis said a policy that resulted in more reports could worsen the bottleneck. “If the system is too filled with things that are not useful,” she said, “then this creates a real burden.”
But some current and former investigators said the decision should be made by law enforcement.
“No one should decide not to report a possible crime, especially a crime against a child, because they believe that the police are too busy,” said Chuck Cohen, who led a child exploitation task force in Indiana for 14 years.
Dana Miller, the commander of a similar task force in Wisconsin, said tech companies could not know whether a report might be useful in furthering an existing investigation. “Even though everyone is overwhelmed, we’re not comfortable on our side making a blanket statement that we don’t want to see those reports,” she said.
Yiota Souras, general counsel at the National Center for Missing and Exploited Children, the national clearinghouse for the reports, said the center’s caseload “can’t be at play here.” She said imagery should always be reported if it might involve a child.
How Facebook makes its age determinations is also a point of contention. According to the training document and interviews, Facebook instructs its moderators to incorporate so-called Tanner stages when assessing age. Initially developed in the late 1960s by Dr. James M. Tanner, a British pediatrician, the tool outlines the progressive phases of puberty. But it was not designed to determine someone’s age.
In a 1998 letter to the journal Pediatrics, Dr. Tanner said that using the stages to measure “chronologic age” when analyzing child sexual abuse imagery was “wholly illegitimate.” Dr. Tanner died in 2010. The co-author of the letter, Dr. Arlan L. Rosenbloom, now a retired pediatric endocrinologist, said in an interview that a child at 13 or 14 could be “fully developed” under the Tanner stages. He also characterized Meta’s approach as “a total misuse” of the scale.
Ms. Davis said the scale was widely used in the tech industry and was just one factor in estimating age. She acknowledged its limitations and said the company was exploring alternatives.
"right" - Google News
April 01, 2022 at 04:58AM
https://ift.tt/RatpBMc
Adults or Sexually Abused Minors? Getting It Right Vexes Facebook - The New York Times
"right" - Google News
https://ift.tt/JkAueiv
Bagikan Berita Ini
0 Response to "Adults or Sexually Abused Minors? Getting It Right Vexes Facebook - The New York Times"
Post a Comment