Social Media Addiction: Are Facebook, YouTube, and TikTok Really to Blame?


Social Media Addiction: Are Platforms Entirely Responsible?

The debate over social media addiction has intensified following a recent legal settlement involving Snapchat in Los Angeles, California. The case was brought by a 19-year-old who accused the platform of deliberately designing algorithms and features that fueled compulsive use and contributed to mental health challenges.

According to a report by The New York Times, lawyers representing the teenager argued that social media companies have long obscured information about the potential harms of their products. They pointed to features such as infinite scroll, auto-play videos, and algorithmic recommendations as mechanisms that encourage prolonged use, potentially contributing to depression, eating disorders, and other mental health struggles.

Why Snapchat Settled While Others Didn’t

Snapchat was not the only platform named in the lawsuit. Other defendants included Meta, which owns Facebook and Instagram, as well as TikTok and YouTube.

However, Snap was the only company to settle. Reports suggest this was due to internal evidence provided by Snap employees, dating back nearly nine years, showing that concerns had been raised about the impact of its recommendation algorithms on teenage mental health.

Plaintiffs drew comparisons to the “Big Tobacco” lawsuits of the 1990s, where cigarette manufacturers were accused of concealing the health risks associated with smoking.

Addiction Is More Than a Platform Problem

Psychologists largely agree that addiction, whether to substances or behaviors, is complex. It is shaped by a mix of individual, social, and psychological factors rather than a single cause.

While exposure to social media is a factor, other contributors include peer pressure, trauma, stress, depression, early exposure to digital platforms, poor quality of life, and even financial incentives tied to online popularity. The widespread availability and cultural acceptance of social media have also deepened its grip, making platforms an integral part of daily life worldwide.

This raises a critical question: if addiction is multifaceted, why are social media companies bearing the bulk of the blame? Critics argue that holding platforms solely responsible is similar to blaming breweries for alcohol addiction or tobacco companies for smoking-related dependence.

The Role of Parents, Governments, and Society

Because many of these cases involve teenagers, still legally considered minors, it is understandable that responsibility should not rest on young users alone. Yet, this also highlights the role of other stakeholders tasked with protecting minors, including parents, families, schools, and governments.

Several countries are already taking regulatory steps. In December 2025, Australia became the first nation to ban social media use for children under 16. The ban covers platforms such as TikTok, Alphabet’s Google and YouTube, and Meta’s Facebook and Instagram, with non-compliance penalties reaching up to $33.3 million.

In 2026, Malaysia followed suit, announcing restrictions that would prevent users under 16 from creating social media accounts. France has passed a law requiring parental consent for children under 15, though enforcement has faced technical hurdles. Germany mandates parental consent for users aged 13 to 16, while critics argue safeguards remain insufficient. The United Kingdom is also considering an Australia-style ban, with discussions underway about raising the minimum age further.

What Platforms Say They’re Doing

Social media companies argue they are not ignoring the issue. TikTok, for example, has introduced tools that allow users to filter content, block specific keywords, and manage their overall experience. Its “family pairing” feature enables parents to set screen-time limits, manage interactions, and control who can view or comment on their children’s videos.

These tools are primarily targeted at users aged 13 to 15. Similarly, YouTube operates a separate platform, YouTube Kids, which gives parents extensive control over what children can watch.

The Legal Battle Over Core Features

Despite these measures, prosecutors are focusing on the core design elements of social platforms, such as infinite scroll, push notifications, auto-play, and recommendation algorithms, arguing that they are inherently addictive and should be removed or redesigned.

Platforms counter that these features are comparable to editorial decisions made by newspapers about which stories to highlight. They also argue that such decisions are protected speech under the First Amendment in the United States.

What Comes Next?

So far, no major social media company has lost an addiction lawsuit in court. This track record gives platforms confidence in their legal defenses. However, a loss could be seismic, potentially resulting in billions of dollars in penalties and forcing fundamental changes to how social media products are designed.

Whether responsibility ultimately lies with platforms, parents, governments, or society at large, or with some combination of them all, remains an open question. What is clear is that the global conversation around social media addiction is far from over.
