What is the social responsibility of social media companies? Does users’ mental and social safety come first, or do business interests? On this question, Meta has remained silent — and, according to a new lawsuit, has hidden evidence of the harm caused by its platforms.

A lawsuit filed by American school districts has revealed that Meta halted research into the effects of Facebook and Instagram on mental health after the findings showed that its platforms were harming users. According to the Reuters news agency, documents from the case show that experiments conducted by the company itself found that users who stayed away from these platforms for a few days felt calmer and less stressed. The research was stopped before these results could reach the public.

The lawsuit alleges that Meta not only concealed the negative evidence but also delayed further research, internally dismissing the findings as the product of “misleading narratives”. Notably, the company’s top executives were privately told that the research findings were accurate. One researcher even compared the situation to the silence of the tobacco industry, which knew of the harmful effects of its products but did not tell the public the truth.

In addition to Meta, the lawsuit makes serious allegations against other major tech companies: Google, TikTok, and Snapchat. According to the claims, these platforms not only tolerated underage users but, in some cases, deliberately made decisions that prioritized engagement and business interests over the online safety of children and young people.

The documents also revealed that TikTok offered financial support to children’s rights organizations to strengthen its narrative — steps that, in effect, enabled these organizations to make public statements in TikTok’s favor. Similarly, Meta’s internal records showed that the company designed some safety features in ways that limited how often they would be used, and repeatedly halted testing of those features so that users would not reduce their time or engagement on the platform.

Some documents also suggest that the delay in addressing risks to children within the company was not merely a technical issue but a matter of priorities, with Mark Zuckerberg even indicating in a message that building the Metaverse mattered more to him than protecting children online. Meta strongly denies all these allegations. The company argues that the quotes cited in the lawsuit were taken out of context and that the reality is far more complex. According to a spokesperson, Meta has for years taken significant steps to protect young users, implementing new policies and collaborating with global experts to address these threats.

The court proceedings could take this debate to a new level in the coming months, as the question of social media companies’ social responsibility — and whether they truly prioritize the mental and social safety of their users over business interests — is being raised with growing urgency.

Read also: PTA and Meta launch Instagram’s “Teen Accounts” to strengthen online safety for youth in Pakistan
