Anavitarte emphasized the importance of empowering parents who might not know how to manage content access for their children. This initiative takes inspiration from Louisiana’s recently passed law, which mandates age verification and parental consent for minors joining social media platforms. Similar laws have been enacted in Arkansas, Texas, and Utah.
Surgeon General Vivek Murthy has expressed concerns about social media safety for young individuals. Meta Platforms, owner of Facebook and Instagram, has been contacted regarding these plans. This move comes in response to the popularity of social media among teenagers, with the Pew Research Center reporting up to 95% of teens aged 13 to 17 use such platforms. Anavitarte also aims to strengthen Georgia’s cyberbullying law.
Why does it matter?
With the surge in underage social media use and the risks that accompany it, such legislative efforts appear well-intentioned in safeguarding minors from online threats, including cyberbullying, inappropriate content, and privacy breaches. However, free speech advocates caution that these measures might lead websites to restrict access to information and create obstacles for adults. Additionally, the new laws might prompt online platforms to require government identification for age verification, as some pornography sites already do, raising concerns about privacy and potential data breaches.
UK law requires parental consent for processing the data of children under 13. Snapchat generally requires users to be 13 or older but has not disclosed how it prevents younger children from using the platform. The Information Commissioner’s Office (ICO) has received complaints and is assessing whether Snap breached the rules. If found in breach, Snap could be fined up to 4% of its annual global turnover.
Similar pressure has fallen on other social media platforms such as TikTok, which was fined for mishandling children’s data. Snapchat blocks sign-ups from users who enter an age under 13, but other platforms take more proactive steps to prevent underage access.
Why does it matter?
As social media platforms increasingly become spaces for cyberbullying, inappropriate content, and other risks that could have lasting psychological and emotional effects on young individuals, governments are contemplating measures to protect these individuals. Recently, these platforms have encountered hurdles as they navigate a complex landscape of US state laws. The laws demand age verification from users and seek enhanced parental control over children’s accounts. The focus is on both ensuring user safety and ensuring that social media companies uphold responsible practices, particularly regarding children’s welfare.
The groups jointly filed a court document asserting that Montana’s ban on TikTok contradicts the fundamental principles of the internet and could lead to a fragmented online experience. TikTok, owned by China’s ByteDance, has been locked in a legal dispute since May, claiming that the state’s ban infringes upon the First Amendment rights of the company and its users.
The tech groups contended that the ban, if implemented, could lead to a fragmented internet where access to information is restricted based on local political preferences, diminishing the overall value of the internet for humanity. A court hearing on TikTok’s request for a preliminary injunction is scheduled for October 12th.
Why does it matter?
Former President Donald Trump attempted to block new downloads of TikTok in 2020, but his efforts were thwarted by a series of court decisions, prompting a cascade of global considerations. Concerns include data privacy and content censorship, particularly regarding potential Chinese access to user data. Austria recently joined the UK and EU states in prohibiting TikTok on government devices, reflecting this global trend. Tech groups argue that allowing states to ban specific online platforms could lead to a fragmented internet experience and curtail users’ access to global networks.
Online advertising is currently overseen by a self-regulatory system run by the Advertising Standards Authority (ASA), which can address harmful advertising by legitimate businesses but lacks the power to tackle illegal harms. The government plans to introduce statutory regulation to tackle illegal paid-for online adverts and enhance child protection. The new regulation will extend responsibilities to major players across the online advertising supply chain, including ad tech intermediary services and promotional posts by social media influencers.
The government will launch a consultation on the specifics of potential legislation, including the preferred choice for a regulator to oversee the new rules. The task force will gather evidence on illegal advertising and collaborate with industry initiatives to protect children and address harmful practices. The proposed regulations aim to strike a balance between internet safety and supporting innovation in online advertising while ensuring transparency, consumer trust, and industry growth.
A former TikTok content moderator in Kenya, James Oyange Odhiambo, has alleged that he developed post-traumatic stress disorder (PTSD) due to his work and was unfairly dismissed for advocating for better working conditions.
The law firm representing Odhiambo has sent a letter to TikTok’s parent company ByteDance and the outsourcing company Majorel threatening a lawsuit if their demands are not met within two weeks. The letter alleges that content moderators were, at times, required to watch between 250 and 350 disturbing and violent videos per hour without adequate mental health support. A TikTok spokesperson declined to comment on the accusations made in the letter. ByteDance did not respond to Time’s request for comment.
TikTok, owned by China’s ByteDance, has filed a lawsuit against the Montana state ban and is seeking a preliminary injunction to block its enforcement before it comes into effect on 1 January. Montana’s law imposes fines of $10,000 on TikTok for each violation but no penalties on individual TikTok users.
TikTok filed suit in May and is now asking the US District Judge to issue a preliminary injunction blocking the first-of-its-kind US state ban on several grounds, claiming that it infringes on the First Amendment rights of both TikTok and its users. The platform also claims that the ban is preempted by federal law and violates the Commerce Clause of the US Constitution.
The company estimates that around 380,000 people in Montana use TikTok out of 150 million users nationwide. The platform also insists that it does not share any data with the Chinese government and warns that the ban would have significant and irreversible effects on its business and brand. Montana, on the other hand, is considering expanding the ban.
The newspaper created a custom-built map, de_voyna, with a secret room hidden underneath it that players can stumble upon. Only ‘killed’ players can wander around the map and open the door to the secret room, where facts, figures, and photographs document the situation in Ukraine. If a player is still active, or ‘alive’, in the game, the door to the secret room remains closed.
The newspaper said that to shed light on press freedom, they have created a custom map of a Slavic city called Voyna, meaning war in Russian. The information provided in the secret room is in Russian and is reported by the newspaper’s war correspondents in Ukraine. The map has been one of the most visited in the game during May.
Since 2016, China has been running an annual campaign to ‘clean up’ the Chinese cyberspace. Reasons for banning or shutdowns vary from promoting unauthorised politically related content and news services to spreading ‘harmful information’ such as superstitions, gambling, prostitution, pornography, and illegal lending. China has tightened its control of the internet also in an attempt to limit Western influence. To date, major platforms such as Bing.com, Baidu, Sina Weibo, Douyu, and Douban have been fined for failing to supervise information posted by users.
The Cyberspace Administration also conducts on-site inspections through provincial and prefecture-level enforcement teams.
The governor of Montana is now looking to broaden the SB 419 bill to include other social media platforms suspected of sharing data with foreign adversaries and hold mobile app stores liable for offering them in the state.
Twitter ended its old verification system on Friday, 20 April, but rather than boosting subscription sales, public data shows that fewer than 500 of 400,000 legacy users purchased the blue tick, which starts at $8 a month. Soon after, the blue mark became a symbol of privilege and inspired the ‘Block the blue’ campaign, prompting users to block those with purchased verification.
The re-verification, combined with the interim loss of credibility attached to the blue tick, has some celebrities questioning whether marking their accounts as ‘paid for’ is an attempt by the platform to gain legitimacy. Others have gone as far as to question the legality of the move and its potential for defamation or a false impression of endorsement.