Congressional initiatives aiming to improve children’s online safety in the USA raise concerns

US Congress aims to enhance children’s online safety and privacy by updating laws, restricting data collection from kids, and addressing harmful content. Critics raise concerns about privacy, free speech, and excessive content moderation.


Among other targets, lawmakers are calling out social media algorithms that promote behavioral disorders, self-harm, and substance misuse.

Senators Ed Markey and Bill Cassidy are two of the sponsors of the Children and Teens’ Online Privacy Protection Act (COPPA 2.0), which intends to restrict internet platforms from gathering information from kids without their consent. The bill is said to update a decades-old statute protecting children under the age of 13. Meanwhile, Senators Richard Blumenthal and Marsha Blackburn are championing the Kids Online Safety Act (KOSA), which would establish a duty of care to safeguard minors from bullying, harassment, sexual exploitation, and content encouraging suicide, substance misuse, and eating disorders. In addition, the Senate Judiciary Committee recently approved a set of bills aimed at reducing online child sexual exploitation. Separately, Senators Brian Schatz and Tom Cotton have proposed a bill that would impose an age verification mandate on social media and bar individuals under 13 from such platforms.

Critics have raised concerns about potential violations of privacy and free speech rights, increased data collection, compromised encryption, excessive content moderation, and the blocking of non-harmful content. The international non-profit Electronic Frontier Foundation emphasized that such laws give the government too much power to decide what is safe for children. Cody Venzke, senior policy counsel at the American Civil Liberties Union, stressed that it would be difficult for online platforms to comply with the laws without removing content that is useful to children or adults.

Tech companies, including Meta, have criticized the child safety bills while emphasizing their own efforts to protect children. Meta stressed that it has invested heavily in privacy protections since 2023, supports federal legislation, and has built 30 tools to help young people and families online in collaboration with Google. Snapchat and TikTok have likewise launched parental controls, content screening, and youth advisory programs.