Facebook isn’t telling the whole story about its mental health research

Ever since The Wall Street Journal published internal Facebook research that found Instagram harmed the well-being of teenage girls, the company’s defense has been to minimize and dismiss its own findings, saying the documents were relevant only for internal product development. That’s nonsense, social science researchers say.

Though Facebook’s work is limited on its own, it fits into a larger body of evidence, including studies from researchers outside the company, suggesting that social media can harm mental health. And even without that context, Facebook’s findings alone are troubling enough to warrant concern.

The Wall Street Journal’s reporting included internal slides discussing data that showed Instagram was linked with anxiety, depression, suicidal thoughts, and body image issues. Facebook immediately went on the defensive, saying that the data was taken out of context, that it was subjective, and that it couldn’t prove anything about Instagram. The company’s efforts to obfuscate the research and smear the whistleblower who leaked it appear to be straight out of Big Tobacco’s playbook.

Experts The Verge contacted think that, while Facebook’s statements on its research may be technically correct, they’re somewhat misleading.

“It’s completely disingenuous,” says Melissa Hunt, a psychologist and associate director of clinical training at the University of Pennsylvania.

Facebook put up its own version of the leaked slides — complete with annotations that it said “give more context” on the research. Many of those annotations stress that the data is “based on the subjective perceptions of the research participants,” and that it wasn’t designed to assess if or how Instagram caused any positive or negative effects.

The annotations also repeatedly note that the research is qualitative. It relied on subjective information collected through questionnaires and conversations with Instagram users, and it didn’t measure how frequently users experienced things like depression or body image issues. Facebook is arguing, then, that the information shows only that some users say they feel that way, and that it isn’t enough to draw a line between Instagram and the mental health of teen girls more broadly.

Facebook said in a statement to The Verge that the studies were designed to help its product teams understand how users feel about the products, “not to provide measures of prevalence, statistical estimates for the correlation between Instagram and mental health or to evaluate causal claims between Instagram and health/well-being.” That changes the inferences people can make about the data, a spokesperson said in the statement.

On the surface, that’s not an unreasonable response, says Kaveri Subrahmanyam, a developmental psychologist at California State University, Los Angeles, and associate director of the Children’s Digital Media Center, Los Angeles. The research was based only on survey data, and it wasn’t designed to measure if or how Instagram causes changes in people’s mental health. That’s a problem with a lot of research around social media and mental health, she says: such cross-sectional studies ask people how they feel at a single point in time, which can’t establish cause and effect. “That doesn’t tell you much,” Subrahmanyam says.

In that sense, Facebook is right: there’s not much people can infer about the impact of a social media platform from that type of data, Hunt says. In a vacuum, research based on subjective survey responses may not be particularly compelling.

But the data from the study is not in a vacuum, Hunt says. Instead, it came out into a world where independent researchers have also been studying mental health and social media, and where some have been studying it with the type of careful research design that can figure out if social media causes changes in mental health.

Hunt ran a study, for example, that randomly assigned one group of undergraduate students to continue their typical use of Instagram, Snapchat, and Facebook, and another to limit their use to 10 minutes per platform per day. After three weeks, the group that limited its use reported fewer feelings of loneliness and depression than the group that kept using social media as usual.
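To make the logic of that design concrete, here is a minimal sketch, in Python, of how a two-group randomized comparison like Hunt’s is typically evaluated. All of the numbers, group sizes, and the scale itself are hypothetical, invented purely for illustration; this is not Hunt’s actual analysis or data.

```python
# Illustrative sketch only: simulated numbers, not data from Hunt's study.
# It mirrors the two-group randomized design described above: compare
# self-reported symptom scores between a limited-use group and a
# usual-use group after the intervention period.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)

# Hypothetical depression-scale scores (higher = more symptoms).
usual_use = rng.normal(loc=14.0, scale=4.0, size=70)    # unrestricted use
limited_use = rng.normal(loc=12.0, scale=4.0, size=70)  # 10 min/platform/day

# Welch's independent-samples t-test (does not assume equal variances).
t_stat, p_value = stats.ttest_ind(limited_use, usual_use, equal_var=False)

print(f"mean (usual use): {usual_use.mean():.1f}")
print(f"mean (limited use): {limited_use.mean():.1f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

A real study would use validated clinical scales and report effect sizes rather than a single p-value, but the core comparison, outcomes in a randomly assigned treatment group versus a control group, is what lets experiments like this speak to causation in a way one-time surveys cannot.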

“We have been finding these exact same things,” Hunt says. That consistency means researchers can take Facebook’s internal findings more seriously, despite the limitations, she says. “What this has conveniently done is provided us with nice illustrative content that simply echoes and mirrors and exemplifies exactly what we keep finding in experimental studies.”

Even without that context, and despite the limitations of the survey data, the findings are concerning enough that they should prompt Facebook and outside experts to start asking more questions, Hunt says. “It would still be deeply alarming and should instantly lead to more rigorous work,” she says.

Facebook could start doing that sort of work if it wanted to. Since the initial leak of the mental health work, whistleblower Frances Haugen has distributed a mountain of documents detailing the company’s internal operations. They show just how much Facebook already knows about the impact of its platform on users — like how algorithmic changes made conversations on the platform angrier and more sensationalized, and how the platform can push users toward extremism.

It probably already has the data it needs for more extensive analysis on Instagram and teen mental health, Subrahmanyam says. “I’m pretty sure they do have data that speaks to the real question of the impact.” In 2012, Facebook and Cornell University researchers were able to manipulate users’ moods by changing the content of their news feeds. The research was ethically dubious — technically, it was legal, but the team didn’t get informed consent from users, triggering waves of criticism after it was published in 2014. That incident showed just how much information the company can and does collect on its users, Subrahmanyam says.

If the company wants to argue that the findings from the internal study aren’t that bad, it should make the underlying information public, including detailed breakdowns of how people engage with the platforms, she says. “Why are they not releasing the data that they have that shows the clicks and other behavior? I think they should be inviting researchers who have that expertise, and giving them that data and letting them do that analysis,” Subrahmanyam says. “Because they’re not open about that data, I can’t help but be skeptical.”

There are parallels between Facebook’s approach to these issues and tobacco companies’ efforts to minimize the harm caused by cigarettes, Hunt says. Both rely on people coming back to their products over and over again, even if it’s not healthy for them. Social media can benefit teens and young adults if they stick to some guidelines — follow only people they know and like in real life, and don’t use it for more than around an hour a day, Hunt says. “But that runs directly counter to the business model these companies are built on,” she says — the model depends on people looking at content from as many people as possible, whom they may not know, for as many hours a day as possible.

Tobacco companies had a similar model. “They knew perfectly well that their products were both highly addicting — in fact, they had been engineered to be as addictive as possible — and that they were harmful. And they suppressed those findings,” Hunt says. Big Tobacco also tried to discredit whistleblowers, much as Facebook has responded to Haugen.

Facebook executives, for their part, say that the tobacco analogies don’t make sense. “I don’t think it’s remotely like tobacco,” Nick Clegg, vice president of global affairs and communication, said on CNN. “I mean, social media apps, they’re apps. People download them on their phones, and why do they do that? I mean, there has to be a reason why a third of the world’s population enjoys using these apps.” For what it’s worth, in the 1960s, a tobacco executive took a similar position before Congress, saying: “millions of persons throughout the world derive pleasure and enjoyment from smoking.”

Mark Zuckerberg said in his note to Facebook staffers that the company was committed to doing more research, and that it wasn’t suppressing data. “If we wanted to hide our results, why would we have established an industry-leading standard for transparency and reporting on what we’re doing?” he wrote. “If we wanted to ignore research, why would we create an industry-leading research program to understand these important issues in the first place?”

But so far, the company hasn’t released the type of information third-party researchers want to see to actually understand the questions around social media and mental health. “These are really important questions, given how important social media has become,” Subrahmanyam says. “If it’s really not that bad, why not share it?”