Regulating digital games

Regulating digital games has come to the forefront recently as gaming technology has advanced. In the 1990s, the US Congress pushed the games industry to set up the Entertainment Software Rating Board (ESRB) to assign age ratings; Europe followed with the Pan European Game Information (PEGI) system in 2003. These rating systems are similar to those used for films.

As online games have become more influential, the question of content regulation has grown in importance. The issue came into focus after the Christchurch shootings in 2019, when users of Roblox, an online gaming platform, began re-enacting the event. Since the incident, Roblox has employed ‘thousands’ of human moderators, alongside artificial intelligence, to check user-submitted games and police chat among its 60 million daily users, whose average age is about 13.

This has prompted debate over how to regulate social media-like conversations in games. Some politicians have argued that in-game chat should enjoy constitutional protection because it resembles one-to-one conversation.

Game makers are also trying to design out bad behaviour before it occurs. For example, when users of Meta’s Horizon Worlds complained of being virtually groped, a minimum distance between avatars, known as a personal boundary, was introduced.
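As a rough illustration of how such a mechanic can be enforced (a minimal sketch, not Meta’s implementation; the boundary radius and all names below are assumptions), the logic amounts to rejecting any movement that would bring one avatar inside another’s boundary:

```python
import math
from dataclasses import dataclass

# Assumed boundary radius in metres; the real value is platform-specific.
PERSONAL_BOUNDARY_M = 1.2

@dataclass
class Position:
    x: float
    y: float
    z: float

def distance(a: Position, b: Position) -> float:
    """Straight-line distance between two avatar positions."""
    return math.sqrt((a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2)

def clamp_move(current: Position, proposed: Position,
               others: list[Position]) -> Position:
    """Accept a proposed move only if it keeps the avatar outside every
    other avatar's personal boundary; otherwise the avatar stays put."""
    for other in others:
        if distance(proposed, other) < PERSONAL_BOUNDARY_M:
            return current  # move rejected: would breach a boundary
    return proposed  # move accepted
```

Real platforms handle this more gracefully, for instance by halting the avatar at the boundary rather than cancelling the move outright, but the design principle is the same: the violation is made impossible by design rather than moderated after the fact.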

As with content moderation on social media, online gaming will trigger new policy and governance challenges.

US state of Utah introduces laws that prohibit social media platforms from allowing access to minors without explicit parental consent

In the USA, the Governor of Utah, Spencer Cox, has signed two laws introducing new measures intended to protect children online. The first law prohibits social media companies from using ‘a practice, design, or feature that […] the social media company knows, or which by the exercise of reasonable care should know, causes a Utah minor account holder to have an addiction to the social media platform’. The second law introduces age requirements for the use of social media platforms: social media companies must verify the age of users in Utah and may allow minors to create accounts only with the express consent of a parent or guardian. The laws also prohibit social media companies from advertising to minors, collecting information about them, or targeting content to them. In addition, companies must enable parents or guardians to access minors’ accounts, and minors must not be allowed to access their accounts between 10:30 pm and 6:30 am.

The laws – set to enter into force in March 2024 – have been criticised by civil liberties groups and tech lobby groups who argue that they are overly broad and could infringe on free speech and privacy rights. Social media companies will likely challenge the new rules.

European Parliament calls for strengthened consumer protection in online video games

The European Parliament adopted a report on consumer protection in online video games, calling for better protection of gamers from addiction and manipulative practices. The report notes the need for harmonised rules that would give parents an overview of and control over the games played by their children. It also highlights the importance of clearer information on the content, in-game purchase policies, and target age group of games.

In the view of the European Parliament, online video games should prioritise data protection, gender balance, and the safety of players and should not discriminate against people with disabilities. Moreover, cancelling game subscriptions must be as easy as subscribing to them. The purchase, return, and refund policies must comply with EU rules.

New report reveals how US adolescents engage with or experience pornography online

According to research by Common Sense Media, 75% of US teens have seen online pornography by the time they are 17, with the average age of first exposure being 12. The report aims to provide a baseline for understanding US teens’ pornography use and the role that internet pornography plays in adolescent life in the USA.

The study was based on a poll of 1,358 Americans aged 13 to 17. More than half of those surveyed reported seeing violent pornography, including depictions of rape, suffocation, or people in pain. A majority of respondents said that pornography depicted stereotypes of Asian, Black, and Latino people, and more than half said they felt bad or ashamed after seeing porn. Meanwhile, 45% of respondents felt that pornography gave them useful information about sex; teenagers who identify as LGBTQ in particular said it helped them learn more about their sexuality.

More time spent online might increase the risk of OCD for children

Preteens are more likely to develop obsessive-compulsive disorder (OCD) the more time they spend playing video games or watching online videos. This is the conclusion of the Adolescent Brain Cognitive Development (ABCD) study, the most extensive long-term investigation of brain development in American children. For every additional hour spent playing video games, preteens had a 13% higher chance of developing OCD within two years; for every additional hour spent watching online videos, the chance increased by 11%. According to the report, schools can play a vital role in ensuring that adolescents form positive digital habits at a crucial juncture in their development.

Epic Games to pay $520 million penalty in USA over privacy violations and ‘dark patterns’ cases

The US Federal Trade Commission and the creator of Fortnite, Epic Games, have reached a settlement which would see the company pay a total of US$ 520 million in penalties over allegations that it had violated the Children’s Online Privacy Protection Act and used dark patterns to trick players into making unintentional purchases.

Epic has agreed to pay a US$ 275 million penalty over allegations that it collected personal information from Fortnite players under the age of 13 without the consent of their parents or caregivers. Furthermore, the FTC determined that Epic’s default settings for its live text and voice communication features, together with its practice of matching children with adult strangers in Fortnite, exposed young players to harassment and abuse. Epic is therefore required to adopt strong privacy default settings for children and teens, ensuring that voice and text communications are turned off by default.
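As an illustration of what such age-based defaults can look like in practice (a minimal sketch under assumed names and thresholds, not Epic’s actual code), the requirement boils down to deriving communication settings from the account holder’s age:

```python
from dataclasses import dataclass

ADULT_AGE = 18  # assumed threshold; platforms and jurisdictions vary

@dataclass
class CommunicationSettings:
    voice_chat_enabled: bool
    text_chat_enabled: bool

def default_settings(age: int) -> CommunicationSettings:
    """Return privacy-protective defaults: voice and text chat are
    switched off for children and teens, who must opt in explicitly."""
    is_minor = age < ADULT_AGE
    return CommunicationSettings(
        voice_chat_enabled=not is_minor,
        text_chat_enabled=not is_minor,
    )
```

The point of such a design is that the protective state is the starting point: a minor can still opt in to chat features, but safety does not depend on anyone remembering to configure it.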

In a second case, the company agreed to pay US$ 245 million to refund users affected by its dark patterns and billing practices.

New report examines TikTok’s algorithm in recommending harmful content to vulnerable teen users

Researchers from the Center for Countering Digital Hate have looked into how the algorithmic recommendations of TikTok’s For You feed respond to teen users who express interest in eating disorders, body image, and mental health.

To examine TikTok’s algorithm, the researchers created brand-new accounts registered as 13-year-old users in the USA, the UK, Australia, and Canada; some of these accounts were given usernames suggesting a preoccupation with body image. On each account, they watched and liked any videos about body image, mental health, or eating disorders, and then recorded the first 30 minutes of algorithmically suggested content on the account’s For You feed. The recordings were then analysed to determine how frequently content on eating disorders, self-harm, and body image was recommended.

The study found that TikTok recommended suicide-related content within 2.6 minutes and content on eating disorders within 8 minutes, and that teen accounts were shown videos on body image and mental health every 39 seconds on average. Self-harm videos were recommended to the vulnerable accounts, those whose usernames signalled body-image concerns, 12 times more frequently than to the standard accounts.

TikTok sued in a US state over security and safety violations

Indiana’s Attorney General has filed a lawsuit against TikTok for violating state consumer protection laws. The lawsuit alleges that the social media company failed to disclose that ByteDance, the Chinese company that owns TikTok, has access to sensitive consumer information. A second complaint claims that the company exposes children to sexual and substance-related content while misleading users with its 12+ age rating on the App Store and Google Play. Indiana is seeking penalties of up to US$ 5,000 per violation and asks the Indiana Superior Court to order the company to stop making false and misleading representations to its users.

New amendments introduced to UK Online Safety Bill

The UK Government has introduced amendments to the Online Safety Bill addressing the removal of online content. The new version of the Bill will not define types of objectionable content; rather, it will offer users a ‘triple shield’ of protection. Online platforms will be required to remove illegal content and content violating their own community guidelines, and to give adult users greater control over the content they see. Online platforms will also be expected to be more transparent about online risks to children and to demonstrate how they enforce age verification measures. Another set of amendments will protect women and girls online, making controlling or coercive behaviour a criminal offence under the Bill and requiring online platforms to be more proactive in tackling such content. The Bill is scheduled to return to the UK Parliament next week, with the first amendments tabled in the Commons for Report Stage on 5 December. Further amendments are expected in the later stages of the legislative process.