Whistleblowers Claim Meta Is Hiding Research on Children’s Safety in Virtual Reality

A group of current and former Meta employees is accusing the company of hiding its own research on child safety in virtual reality. According to two current and two former employees, Meta lawyers review, edit, and veto internal research on child safety in VR to minimize the risk of negative press, lawsuits, and government regulation.

In support of the allegations, the group presented a trove of internal documents to members of the U.S. Senate Judiciary Committee ahead of a hearing on the matter on Tuesday. The documents, first obtained by The Washington Post, include thousands of pages of internal communications, presentations, and memos that the group says detail a years-long strategy, led by Meta’s legal team, to shape research on “sensitive topics.”

Meta denies the allegations. In a statement to The Washington Post, company spokesperson Dani Lever called the allegations “a handful of examples… stitched together to fit a predetermined and false narrative; in fact, since the beginning of 2022, Meta has approved nearly 180 Reality Labs studies focusing on social issues, including youth safety and well-being.”

Meta and child safety

Meta has apparently been aware of issues surrounding child safety and virtual reality for some time. An internal message board post from 2017 included in the collection is titled, “We have a child issue and it might be time to talk about it.” In it, an unnamed Meta employee writes, “These kids are clearly under our minimum age of 13…” and goes on to estimate that 80 to 90 percent of users in some virtual reality spaces were minors.

After a leak of Meta’s internal research led to congressional hearings in 2021, the company strongly reiterated the importance of transparency, with CEO Mark Zuckerberg writing: “If we wanted to hide our results, why would we set an industry-leading standard for transparency and reporting about what we do?”

Behind the scenes, however, Meta’s legal team began reviewing, editing, and even vetoing youth safety research to “create plausible deniability,” according to whistleblowers, while detailing potential strategies to “reduce the risk” of conducting sensitive research. In a November 2021 presentation, Meta’s lawyers proposed that researchers “conduct highly sensitive research under attorney-client privilege,” and require that all highly sensitive research be reviewed by lawyers and released only on a “need-to-know” basis.

Another strategy, outlined in the slide, suggests that researchers “be thoughtful” about how research is framed, avoiding using terms like “illegal” or “non-compliant,” and not saying something violates a specific law, preferring to leave the legal findings to lawyers.

The documents include an example of Meta’s policy in action: conversations between Meta researchers and a mother in Germany. The unnamed mother reported that she had not allowed her sons to interact with strangers in Meta’s virtual reality platform, but her teenage son interrupted, claiming that adults had repeatedly sexually assaulted his brother, who was under 10.

Meta management ordered the teen’s comments to be deleted and left out of the company’s report, according to one of the researchers, Jason Sattizahn, then one of Meta’s experts on children and technology. Sattizahn says he was eventually fired from Meta after arguing with management about research restrictions.

How many kids are actually using Meta’s virtual reality?

Unfortunately, it’s impossible to know exactly how many kids are actively using Meta’s VR platforms. I’ve spent enough time in VR to believe that there are plenty of people under the age of 13 in almost every VR space, including (and especially) Meta’s own Horizon Worlds. I can’t say for sure that the people behind the avatars are kids, but the documents in the archive suggest many of them are. One internal report found that when users were asked to re-enter their birthdate, only 41% gave the same date they had provided before. “These results suggest that many users may not want to share their actual birthdate with us,” the analysis says.

Maintaining this “gray area” of not knowing (or publicly acknowledging) the age of its users may be in Meta’s best interests. According to a document included in the archive, one of Meta’s lawyers wrote: “Overall, the context is that we should avoid collecting research data indicating the presence of U13 users in VR or VR applications (or U18 currently in the context of Horizon) due to regulatory concerns.”

Taken together, the documents and Meta’s response suggest the company is walking a fine line: publicly promising transparency and safety, while privately managing its research process to limit fallout in the form of liability and regulatory scrutiny. Whether Meta is hiding damaging information or exercising understandable legal caution is an open question, but hopefully these congressional hearings, messy as they are, will help move us closer to the real goal: protecting children in immersive spaces.
