
TikTok's Dark Secrets: Internal Documents Reveal App's Compulsive Design and Negative Mental Health Effects

A recent lawsuit filed by the Kentucky Attorney General's Office has shed light on the inner workings of TikTok, revealing a company that is aware of its app's addictive nature and negative mental health effects, yet continues to prioritize its own interests over the well-being of its users.


According to internal documents reviewed by NPR, TikTok's executives and employees knew that the app's features were designed to foster compulsive use, leading to a range of negative consequences, including diminished analytical skills, impaired memory formation, and increased anxiety.

The documents reveal that TikTok's own research found a strong correlation between compulsive usage and negative mental health effects, including depression, anxiety, and sleep disturbances. Despite this knowledge, the company has continued to design its app to be highly engaging, with features such as infinite scrolling, algorithm-driven content, and notifications that keep users hooked. This has led to a situation where young users, in particular, are spending excessive amounts of time on the app, often to the detriment of their mental and physical health.

The company's main attempt to address concerns about compulsive use is a time-management tool that sets a default limit of 60 minutes of app use per day. Internal documents reveal, however, that the tool is largely ineffective: teens still spend an average of 107 minutes a day on the app even when it is switched on. TikTok's own research found that minors lack the executive function to control their screen time, making it difficult for them to stick to the recommended limits.

Furthermore, the documents reveal that TikTok is aware of the existence of "filter bubbles," which can lead users down a rabbit hole of negative content, including hate speech, disinformation, and harmful ideologies. In internal studies, employees found that they themselves fell into negative filter bubbles after following certain accounts. The company is also aware of content and accounts promoting disordered eating, self-harm, and other harmful behaviors, yet has failed to take adequate action to remove such material.

The documents also reveal that TikTok's moderation efforts have been inadequate. Underage girls on the app have received "gifts" and "coins" in exchange for stripping on live streams, and higher-ups in the company instructed moderators not to remove users reported to be under 13 years old unless their accounts explicitly stated that they were under 13. The company has also acknowledged that a substantial amount of content violating its rules gets past its moderation systems, including videos that normalize pedophilia and glorify the sexual assault of minors.

In response to the lawsuit, TikTok spokesman Alex Haurek defended the company, claiming that the Kentucky AG's complaint "cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety." Haurek also highlighted the company's "robust safeguards," including the removal of suspected underage users and the launch of safety features such as default screen-time limits and family pairing.

However, the internal documents reviewed by NPR paint a disturbing picture: a company fully aware of its app's addictive nature and negative mental health effects that nonetheless continues to prioritize its own interests over the well-being of its users.

These revelations raise serious concerns about the impact of social media on young people's mental health and well-being. That TikTok understands the harms its design can cause, yet keeps optimizing for time spent on the app, suggests the company is more concerned with its profits than with protecting its users.

In conclusion, as the lawsuit against TikTok continues to unfold, it is clear that the company has a long way to go in addressing its complicity in the mental health crisis affecting young people today.