Ofcom report highlights challenges in understanding video-sharing platform rules

The study, which examined six major platforms, revealed that the complexity and length of the terms posed significant comprehension challenges, particularly for younger users.


The media regulator Ofcom has found that many users of video-sharing platforms like OnlyFans, Twitch, and Snapchat struggle to comprehend the complex and lengthy terms and conditions of these platforms. Ofcom examined six platforms and found that advanced reading skills were needed to understand the rules, making them too difficult for children to follow.

Jessica Zucker, Ofcom’s online safety policy director, emphasized that unambiguous rules are essential for protecting users, especially children, from harm. OnlyFans had the longest terms, taking over an hour for adult users to read, followed by Twitch, Snapchat, TikTok, Brand New Tube, and BitChute. Ofcom assessed the terms as difficult to read, except for TikTok’s, which were more accessible but still challenging for young users.

Some platforms use “clickwrap” agreements, which let users accept the terms without ever reading them. Users also lack understanding of what content is prohibited and what the consequences of rule violations are. Moderators’ training and resources varied across platforms, affecting their ability to enforce the terms effectively. Some platforms, such as Snapchat, acknowledged the need for clearer guidelines and committed to improvements based on Ofcom’s findings.

Why does it matter?

While these platforms have become integral to modern communication and entertainment, the complexity and length of their rules create significant barriers for users trying to understand their rights and responsibilities. Moreover, methods like “clickwrap” raise crucial questions about users’ informed consent and subsequent accountability. As for enforcing these rules, inconsistent training and resources for moderators can lead to uneven enforcement, potentially allowing harmful or prohibited content to go unchecked. Twitter’s decision last year to scale down its team of contract-based content moderators drew criticism from various organisations, further igniting discussion of the platform’s content governance practices and the essential role moderators play in enforcing its rules.