Instagram chief says he does not believe people can get clinically
addicted to social media
February 12, 2026 | By KAITLYN HUAMANI and BARBARA ORTUTAY
LOS ANGELES (AP) — Adam Mosseri, the head of Meta's Instagram, testified
Wednesday during a landmark social media trial in Los Angeles that he
disagrees with the idea that people can be clinically addicted to social
media platforms.
The question of addiction is a key pillar of the case, where plaintiffs
seek to hold social media companies responsible for harms to children
who use their platforms. Meta Platforms and Google's YouTube are the two
remaining defendants in the case; TikTok and Snap have already settled.
At the core of the Los Angeles case is a 20-year-old identified only by
the initials “KGM,” whose lawsuit could determine how thousands of
similar lawsuits against social media companies would play out. She and
two other plaintiffs have been selected for bellwether trials,
essentially test cases that let both sides see how their arguments fare
before a jury.
Mosseri, who has headed Instagram since 2018, said it’s important to
differentiate between clinical addiction and what he called problematic
use. The plaintiff's lawyer, however, presented quotes from a podcast
interview a few years ago in which Mosseri used the term addiction in
relation to social media use. Mosseri said he was probably using the
term “too casually,” as people tend to do.
When questioned about his qualifications to comment on the legitimacy
of social media addiction, Mosseri said he was not claiming to be a
medical expert, but added that someone “very close” to him has
experienced serious clinical addiction, which he said is why he was
“being careful with my words.”

He said he and his colleagues use the term “problematic use” to refer to
“someone spending more time on Instagram than they feel good about, and
that definitely happens.”
It’s “not good for the company, over the long run, to make decisions
that profit for us but are poor for people’s well-being,” Mosseri said.
Mosseri and the plaintiff's lawyer, Mark Lanier, engaged in a lengthy
back-and-forth about cosmetic filters on Instagram that changed people’s
appearance in a way that seemed to promote plastic surgery.
“We are trying to be as safe as possible but also censor as little as
possible,” Mosseri said.
In the courtroom, bereaved parents of children who struggled with
social media appeared visibly upset during a discussion of body
dysmorphia and cosmetic filters. Meta shut down all third-party
augmented reality filters in January 2025. After the displays of
emotion, the judge reminded members of the public not to indicate
agreement or disagreement with testimony, saying it would be “improper
to indicate some position.”

Adam Mosseri, CEO of Instagram, arrives in court to testify in a
landmark social media case that seeks to hold tech companies
responsible for harms to children, Wednesday, Feb. 11, 2026, in Los
Angeles. (AP Photo/Damian Dovarganes)
During cross-examination, Mosseri and Meta lawyer Phyllis Jones sought
to push back on an idea Lanier had raised in his questioning: that the
company is looking to profit off of teens specifically.
Mosseri said Instagram makes “less money from teens than from any
other demographic on the app,” noting that teens don’t tend to click
on ads and many don’t have disposable income to spend on advertised
products. When given a second opportunity to question Mosseri, Lanier
was quick to point to research showing that people who join social
media platforms at a young age are more likely to stay on them longer,
which he said makes teen users a prime source of meaningful long-term
profit.
“Often people try to frame things as you either prioritize safety or
you prioritize revenue,” Mosseri said. “It’s really hard to imagine
any instance where prioritizing safety isn’t good for revenue.”
Meta CEO Mark Zuckerberg is expected to take the stand next week.
In recent years, Instagram has added a slew of features and tools it
says have made the platform safer for young people. But these measures
do not always work as intended. A report last year, for instance,
found that teen accounts created by researchers were recommended
age-inappropriate sexual content, including “graphic sexual
descriptions, the use of cartoons to describe demeaning sexual acts,
and brief displays of nudity.”
Instagram also recommended a “range of self-harm, self-injury, and
body image content” on the teen accounts that the report said “would
be reasonably likely to result in adverse impacts for young people,
including teenagers experiencing poor mental health, or self-harm and
suicidal ideation and behaviors.” Meta called the report “misleading,
dangerously speculative” and said it misrepresented its efforts on
teen safety.
Meta is also facing a separate trial in New Mexico that began this
week.
All contents © copyright 2026 Associated Press. All rights reserved.