Attorney General Ashley Moody and 41 other attorneys general are calling on congressional leaders to pass legislation requiring a U.S. Surgeon General warning on all algorithm-driven social media platforms. The letter comes amid growing scrutiny of social media companies for their role in causing generational harm to young people’s mental health.
Attorney General Ashley Moody said, “Studies show that there is a link between youth’s use of social media and psychological harm. We are fighting to protect our youth online by calling on Congress to pass legislation requiring a U.S. Surgeon General warning on these platforms. This warning would not only highlight the inherent risks that social media platforms presently pose for young people, but also complement other efforts to spur attention, research and investment into the oversight of social media platforms.”
Attorney General Moody and the coalition cited a growing body of research linking young people’s use of algorithm-driven social media platforms to psychological harm, including depression, anxiety, and even suicidal thoughts in kids and teens. The attorneys general also note how platforms feature irresistible algorithmic recommendations, infinite scrolling, and a constant stream of notifications designed to keep kids relentlessly engaged on the platforms, even at the expense of taking breaks, engaging in other activities, or sleeping.
States have already taken historic action to hold platforms accountable for the harm caused to young people. Attorney General Moody took legal action against Meta in October 2023. Many states, including Florida, are also either investigating or actively suing TikTok in state court. Despite these efforts to address the harms caused by social media platforms, the attorneys general say the need for federal action is clear.
The attorneys general say more action is necessary because “social media platforms have demonstrated an unwillingness to fix the problem on their own.”
Attorney General Moody is joined in signing the letter by the attorneys general of the following states and territories: Alabama, American Samoa, Arkansas, California, Colorado, Connecticut, Delaware, the District of Columbia, Georgia, Hawaii, Idaho, Illinois, Indiana, Kentucky, Maine, Maryland, Massachusetts, Michigan, Minnesota, Mississippi, Nevada, New Hampshire, New Jersey, New Mexico, New York, North Carolina, North Dakota, Oklahoma, Oregon, Pennsylvania, Rhode Island, South Carolina, South Dakota, Tennessee, the U.S. Virgin Islands, Utah, Vermont, Virginia, Washington, Wisconsin and Wyoming.
----------------
Attorney General Moody is fighting to protect youth online.
In March, Attorney General Moody demanded Meta cease monetizing child exploitation and prohibit child-modeling accounts on Instagram.
In January, Attorney General Moody called on Congress to push social media outlets to protect kids.
In October 2023, Attorney General Moody took legal action against Meta alleging the company knowingly designed and deployed harmful features on Instagram and its other platforms that purposely addict children and teens.
Since then, Meta has made minor changes to enhance youth protections, including a new “nightly nudges” setting that prompts users to put Instagram or Facebook away for the night. However, the setting is not mandatory, and Meta has yet to respond to Florida’s allegations.
Last year, Attorney General Moody, along with 46 other attorneys general, opened a multistate investigation into whether TikTok’s business practices violated consumer protection laws. The investigation seeks to determine whether the company engaged in conduct that harmed the mental health of TikTok users, particularly children and teens.
In March 2022, Attorney General Moody and a bipartisan coalition of 43 attorneys general demanded TikTok and Snapchat give parents the ability to monitor a child’s social media usage to protect children from online threats using parental-control applications and features.