Do social media apps deliberately promote harmful content to children?


Alexis Spence, a 20-year-old who now suffers from addiction, anxiety, depression, self-harm, eating disorders and suicidal ideation, has filed a lawsuit against Meta Platforms, the parent company of Instagram and Facebook, alleging that Instagram’s design caused these mental and physical injuries, Slate reports. Spence’s lawsuit follows Frances Haugen’s 2021 disclosure of internal Facebook research, known as the Facebook Papers, which showed Meta knew its social media products were highly addictive and harmful to teenage girls.

Haugen detailed how Facebook knew about the potential dangers of its products, especially for tweens and teens. Section 230 of the Communications Decency Act generally shields online platforms from legal liability for transmitting, removing, labeling, or concealing problematic third-party content. Courts have interpreted its immunity broadly, but Section 230 has recently come under increasing criticism from academics, politicians and regulators on both sides of the aisle. Meanwhile, The New York Times reports that the parents of two girls who died after attempting a ‘blackout challenge’ on TikTok are suing the company, claiming its algorithm intentionally served children dangerous content that led to their deaths.
