Instagram CEO says social media isn’t “clinically addictive” — the internet disagrees
Adam Mosseri defended Instagram in court, claiming the app isn’t addictive, sparking debate about social media’s real impact on youth.
The relationship between digital platforms and mental health is a major subject of public interest and legal scrutiny. While many people express concern about the compulsive nature of modern apps, the companies behind them often maintain a different view of how their technology functions. Adam Mosseri, the head of Instagram, recently spoke on this issue during a court appearance, stating that social media does not meet the criteria for being “clinically addictive.”
What happened
The head of Instagram has defended his platform against claims it caused mental health damage to minors. The plaintiff alleged that Meta intentionally designed features to keep children and adults engaged, with potentially harmful consequences for their lives.

During his testimony in Los Angeles, Mosseri addressed these claims directly. He argued that it is necessary to “differentiate between clinical addiction and problematic use.” He told the court that while some people spend more time on the app than they feel good about, this does not constitute a medical addiction. He gave the example of Netflix, saying, “I’m sure I’ve said that I’ve been addicted to a Netflix show when I binged it really late one night, but I don’t think it’s the same thing as clinical addiction.”
When the lead attorney cited a survey indicating that around 60% of users reported experiencing bullying on Meta’s platforms within a single week, Mosseri said he was unaware of it. The lawyer also argued that Meta prioritizes growth over its social responsibilities, a charge that strikes directly at the company’s reputation.
The trial, which is expected to last six weeks, aims to hold big tech firms accountable for the harm their products cause young people.
Public reactions
The reality outside the courtroom looked quite different from the testimony inside. As Mosseri arrived, he encountered a large crowd of onlookers, protestors, and concerned parents, many of whom weren’t directly involved in the lawsuit but felt strongly about the impact of social media on their children.
One such parent was Mariano Janin from London. He held a photo of his daughter Mia, who tragically took her own life in 2021 at the age of 14. Janin traveled to Los Angeles to attend the trial and advocate for stricter rules on social media use for young people. “If they changed their business model, it would be different,” Janin said. “They should protect kids. They have the technology; they have the funds.” Janin’s comments capture a belief shared by many: these companies could solve the problem, but choose to prioritize profits instead.
Online users also disagreed with Mosseri. On Instagram, one person wrote: “Big Tobacco said the same thing decades ago.” The comparison recalls the era when cigarette companies denied their products caused dependency while knowing otherwise; both industries profit from repeat, habitual use. Another user challenged Mosseri directly, asking, “What if it’s your own kid, though? How about then Mosseri? Still not an addiction?” The question resonates because many tech executives are reported to limit screen time for their own children, which suggests to the public that they understand the risks better than they let on.
On Reddit, users discussed how these apps function. One user wrote: “Hard to take that seriously when the platforms are literally designed to keep people scrolling.” Features like infinite scroll remove the natural stopping points that usually signal our brains to take a break. Another person pointed to the financial motive: “Algorithms by for-profit companies are legally required to be as addictive as possible because of the impetus to provide profit for shareholders, at least in the US. These companies have gotten far too comfortable just bold-faced lying.”
One Reddit user quoted a famous saying: “It is difficult to get a man to understand something when his salary depends on his not understanding it.” The implication is that Mosseri cannot admit the product is addictive without hurting the company’s stock price. Someone else questioned his expertise: “I’d like to know his background in behavioural psychology. Just because you don’t believe it doesn’t make it untrue.” It is a reminder that tech leaders aren’t necessarily experts in mental health, yet they make sweeping claims about how social media affects well-being.
The strongest reactions came from people witnessing the impact of social media on young lives. One person wrote, “Tell that to the teenagers who threaten to kill themselves if they lose access to social media or their phones. It’s widely documented at this point because this young generation likes to film/livestream themselves doing literally anything.” An advisory from the U.S. Surgeon General noted that teenagers who spend more than three hours a day on social media face twice the risk of poor mental health outcomes, including symptoms of depression and anxiety. For many families, the clinical definitions matter far less than the real crises they face every day at home.
Why this matters
Parents face tough choices every day about their children’s screen time, mental health, and the rules around technology use, and they rarely get helpful guidance from the companies behind the apps and devices their kids rely on. When a tech leader insists a product is safe, it makes a parent’s job harder: all the pressure falls on the family to manage a system designed to be difficult to put down. Without transparency from the platforms, families are left to work out their own safeguards, such as being careful about what they share on social media and keeping personal matters private.
Social media plays a significant role in how young people see themselves and how they connect with friends. Even subtle signals from tech leaders about “non-addictiveness” can shape societal norms around self-worth and digital dependency. If the people running these platforms refuse to acknowledge the risks, users have to be more careful on their own. Digital boundaries are not just about time; they are about protecting a person’s sense of reality from an algorithm built to capture ever more of their attention.
