Meta documents show that 100,000 children are exposed to sexual harassment daily on its platforms


Meta estimates that about 100,000 children using Facebook and Instagram receive online sexual harassment every day, including “images of adult genitals,” according to internal company documents made public late Wednesday.

The unsealed legal filing includes several allegations against the company based on information the New Mexico Attorney General’s Office obtained from presentations by Meta employees and communications between employees. The documents describe an incident in 2020 in which the 12-year-old daughter of an Apple executive was solicited via IG Direct, Instagram’s messaging product.

“This is the kind of thing that angers Apple to the point of threatening to remove us from the App Store,” one Meta employee said, according to the documents. A senior Meta employee testified before the US Congress late last year about how his own daughter had been solicited via Instagram. He said his efforts to fix the problem were ignored.

The filing is the latest in a lawsuit filed by the New Mexico Attorney General’s Office on December 5, which alleges that Meta’s social networks have become marketplaces for child predators. Attorney General Raúl Torrez accused Meta of enabling adults to find, message and groom children. The company denied the lawsuit’s allegations, saying it “mischaracterizes our business by using selective quotes and cherry-picked documents.”

Meta released a statement in response to Wednesday’s filing: “We want teens to have safe, age-appropriate experiences online, and we have more than 30 tools to support them and their parents. We’ve spent a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online.”

A 2021 internal presentation on child safety was also referenced in the suit. According to the lawsuit, one slide stated that Meta “underinvests in addressing the sexualization of minors on IG, particularly sexualized comments on content posted by minors. This is not only a terrible experience for creators and bystanders, but it is also a way for bad actors to identify and connect with one another.”

The complaint also highlights Meta employees’ concerns about children’s safety. In an internal Meta chat in July 2020, an employee asked: “What specifically are we doing about child grooming (something I just heard is happening a lot on TikTok)?” According to the complaint, he received the response: “somewhere between zero and negligible.”

Meta’s statement also said the company has taken “important steps to prevent teens from experiencing unwanted contact, especially from adults.”

The New Mexico lawsuit follows an April Guardian investigation that revealed how Meta failed to report or detect the use of its platforms for child trafficking. The investigation also revealed how Messenger, Facebook’s private messaging service, was used as a platform for human traffickers to communicate to buy and sell children.

Meta employees discussed using Messenger to “coordinate trafficking activities” and facilitate “every human exploitation phase (recruitment, coordination, exploitation) represented on our platform,” according to documents included in the lawsuit.

However, a 2017 internal email describes executive opposition to scanning Facebook Messenger for “harmful content” because it would put the service “at a competitive disadvantage versus other apps that may provide more privacy,” the lawsuit states.

In December, Meta received widespread criticism for rolling out end-to-end encryption for messages sent on Facebook and through Messenger. Encryption hides the contents of a message from anyone except the sender and intended recipient by converting text and images into unreadable code that is only decrypted upon receipt. Child safety experts, policymakers and law enforcement have argued that encryption hampers efforts to rescue child sex trafficking victims and prosecute predators. Privacy advocates praised the decision because it protects users from surveillance by governments and law enforcement.
