We Need to Hold Facebook Accountable for Hiding Data Even though Instagram knew the popular photo-sharing app was harmful to young and vulnerable audiences, the company had fully planned on opening it up to younger children.
By Richard Maize Edited by Amanda Breen
Opinions expressed by Entrepreneur contributors are their own.
Concerns surrounding social media and its effects on mental health have long been a topic of conversation. Perfectly curated feeds, edited photos and the desire to go viral — or at least get a decent amount of likes or views on a post — create a high level of anxiety in users. And while it appears no one is immune to the potential negative impact social media can have on mental health, many lawmakers and journalists demand to know exactly what sort of data has been collected on social media's impact on teen mental health specifically.
Facebook does have this data. But surprise, surprise: It won't share it with the public. Is this suspicious? Yes. Is it dangerous? Well, I sure think so. Facebook, which owns Instagram, recently announced a plan to expand Instagram to an even younger and more vulnerable age group. Yet, a report from The Wall Street Journal found the company is already well aware Instagram poses serious dangers to mental health in teenagers.
An internal research slide from 2019 acknowledged that "We [Instagram] make body image issues worse for one in three teen girls." And despite Facebook's halt on the aforementioned plans to expand, I have to wonder why the company's decision-makers thought this was a good idea to begin with when they already knew the data showed Instagram is harmful to teens.
Related: 7 Marketing Tips to Help Grow Your Brand on Instagram
Facebook won't make its research public or available to academics and lawmakers
Let's be honest here: Facebook did not put its Instagram expansion on hold for any reason other than the backlash it received when it made the announcement. Although Instagram knew the popular photo-sharing app was harmful to young and vulnerable audiences, the company had fully planned on opening it up to younger children.
Facebook has consistently downplayed Instagram's harmful effects, both to the media and in comments to Congress, and it refuses to make its research public or available to academics and lawmakers. Instagram's internal research has collected years of data proving the app is particularly harmful to teenage girls, more so than any other social-media platform.
When we break it down, the reality is Facebook is looking to make a hell of a profit off exploiting young users. Thus, the motivation to expand is entirely profit-based, regardless of how it affects these young and impressionable minds. CEO Mark Zuckerberg and other reps for the company can spin their expansion plans all they want. But, no matter how hard they try to hammer in the innocent claim that they want to give young users a place to connect with each other, they clearly don't have the users' best interests at heart.
Facebook's appeal to teens has dwindled in recent years, as more and more members of Gen Z flock to Instagram and other apps such as TikTok. In order to stay relevant and keep engagement high, Facebook feels it needs to expand Instagram's base of young users. More than 40% of Instagram's user base is 22 years old and younger, and about 22 million teens log onto Instagram in the U.S. each day. So, expanding its base is vital to the company's more than $100 billion in annual revenue.
Here's the kicker: Research shows the features that make Instagram the most harmful to teens appear to be central features of the platform. For example, Instagram's "Explore" page serves users curated posts from accounts sorted by the app's artificial intelligence, which has been proven to push users towards harmful content. In addition, these internal reports found the app's culture of posting only the best moments of one's life is not only toxic, but also addictive. Instagram's artificial intelligence and algorithm can push teens towards eating disorders, leading to the development of a negative self-image and, ultimately, depression.
Related: Mark Zuckerberg Loses $7 Billion After WhatsApp, Facebook and Instagram Crash
Zuckerberg claims social apps can have positive mental health benefits, but there's another side to the coin
But, instead of openly accepting the research and making moves to change the central features proven to cause harm, Zuckerberg did what he does best: deny, deny, deny. When asked about children and mental health at a congressional hearing in March 2021, Zuckerberg said, "The research that we've seen is that using social apps to connect with other people can have positive mental health benefits." Well, I'm sure there is truth to that statement, but we'd all like to know what Zuckerberg is doing to address what the public now knows.
Facebook's way of dealing with potential criticism is to hide its internal reports and deny its knowledge of the darker impacts on its users. We know this by now, so why aren't we holding Facebook accountable for not only hiding this information, but also ignoring it altogether in its plans to expand the platform to an even younger base? Why does Zuckerberg get to gloss over the data and claim his goal is to "connect" children on Instagram?
So many people, myself included, are struggling to wrap their heads around how Facebook and Instagram can get away with hiding such crucial data. Just a few years ago, Zuckerberg famously (or infamously) claimed Facebook had nothing to do with skewing information during election season, but a whistleblower proved otherwise. We know that Facebook just loves to stir the pot with an algorithm designed to cause conflict among members of political parties. The company creates the perfect storm of drama and pushes content to users for the sole purpose of driving more engagement, regardless of how dangerous it can be.
We have absolutely no reason to trust that Zuckerberg and Instagram head Adam Mosseri will do anything to fix the issues at hand. Instead, we can confidently assume they plan on using the controversial algorithm to their advantage by getting new, younger users hooked on the app. At the end of the day, more engagement leads to more dollar signs. And all the research points to money being the ultimate goal here, no matter how much their plans will negatively impact mental health.
Related: Mark Zuckerberg Is So Rich He Got Morgan Freeman to Voice His Virtual Home Assistant