Blue screens and screams echo from every direction as the mayor of New York City, Eric Adams, stands at a podium in front of thousands, delivering his annual State of the City address on Jan. 24. He explains that New York City will be doing something novel: declaring social media a public health crisis, effective immediately. He will ensure that platforms like TikTok, YouTube, Instagram, Facebook and Snapchat take responsibility for the dangerous and addictive features they have constructed.
“We are the first major city in America to take this step and call out the danger of social media like this,” Adams said in his State of the City address. “Just as the surgeon general did with tobacco and guns, we are treating social media like other public health hazards.”
On Feb. 15, 2024, the city filed a lawsuit against TikTok, YouTube, Instagram, Facebook and Snapchat in California Superior Court. The lawsuit demanded that the platforms be regulated and that the companies cover the cost of mental health services for teens in New York City.
“Online networks are powerful tools to connect with friends, family, classmates, and so much more,” said Anne Williams-Isom, the city’s Deputy Mayor for Health and Human Services. “However, social media can also be a place for unhealthy comparisons to others, a platform for bullying, and lead to negative mental health implications for our children.”
Adams is urging other cities and states to do the same to curb multiple mental health disorders that disproportionately affect young people, including anxiety, depression and body dysmorphia. According to findings published in the Journal of Family Medicine and Primary Care, the widespread use of social media among younger audiences has detrimental effects on mental health. Additionally, a recent study from King’s College shows that body dysmorphia and feelings of physical inadequacy are steadily rising, and the researchers point to social media as the predominant cause.
This kind of content has always existed, so why are we putting this all on social media? Certainly, before social media, there were websites, magazines and newspapers that covered the same kind of content. Why, then, is social media any different? Why is it the main cause of this rise in body image issues? And what makes it so effective at inspiring these feelings?
“This isn’t necessarily something that’s new,” said Kevin Starkey, a physical educator at CHS. “We’ve always had media that has made the ‘ideal body’ the most important thing. Social media is just a newer and more accessible way of doing it.”
Starkey argues that, unlike other forms of media, social media is always with you and actively seeks you out, pinging you every so often to keep you coming back for more and suggesting content it knows you will find engaging. According to Starkey, these platforms can harm people, and teens especially, because they are businesses and therefore need to sell things. A person won’t buy a product unless they truly believe they need it, so the platform creates that need by having beautiful people sell the products.
The people making this kind of content are accountable only to the companies paying them, and those companies will, of course, ask them to say things in support of the fitness products they are selling. The companies can then sell those products to their target audience, people who feel uncomfortable in their bodies, because an algorithm will find those people and put the advertisements right in front of them.
This system is inherently parasitic, and how could it not be? The platforms that design it are companies. Companies have to grow, reach more users, turn a profit and raise their stock prices, and the more a company grows, the more it has to keep growing.
“When we’re dealing with the potential for harm to the bulk majority of users under the age of 18, that’s a problem that needs to be fixed. I mean, they’re kids,” Maveal, an adolescent mental health professional and CHS counselor, said. “They are the most vulnerable people in our society to some degree. These are the people who are getting these messages sent to them and not having the developmental capacity to sift through that, and sift through the messages and decide which are true and untrue. And so from a systems standpoint, it’s mostly the responsibility of these platforms to be monitoring and protecting kids.”
But that’s what the grown-ups say. What do the users think about it? Do they not know they are being exploited, or is something entirely different happening? I turned to the hallways of CHS to help answer this question, eventually ending up in room 315, where Rosie Matish, a ninth grader, chimed in.
“I think websites and Instagram platforms rely on people being insecure and wanting to change their appearance to actually sell their products,” Matish said. “A lot of the people who do this have a basic understanding that they are making body image issues worse. But they don’t care because it’s helping their brand and helping them promote growth.”
She finds it challenging to stop using these systems, even though they may have harmful effects, because of the role they play in society today.
“Well, that’s the thing, right? Not everyone knows that these systems are harmful,” Matish said. “But also, I think there can be a misunderstanding of one’s feelings. Like when you see this kind of content, you feel bad, but you also feel like you’re slowly doing something to take action. And everyone does it. It’s so normalized nowadays that people don’t even think about it anymore; it’s embedded into us.”
Looking through this lens, it is clear why New York is taking such extreme measures: there is a problem within this industry that people are only now beginning to understand and talk about, and many are calling for accountability on the part of these corporations. What form that accountability will take, only time will tell, but one thing is abundantly clear: the time for action has arrived.