Digital standards

Let’s say you measure a line with a ruler and find that it is 10 centimetres long. If I measured the same line with my ruler, it would also be 10 centimetres. Rulers marked in centimetres are all the same: manufacturers of rulers mark centimetres – or any other unit of distance – according to a set standard.

Now take any credit card: its size is the same as any other credit card. If they were not the same, banks would have a hard time developing automated teller machines (ATMs) able to read cards of different sizes. Again, there is a standard that defines this size.

Our world is made up of standards, that is, sets of agreed-upon rules that tell us how to do something. Standards exist in every field, including healthcare, aerospace, construction, and measurement, as well as technology and the Internet, where they are called digital standards. For instance, the letters of the alphabet on an English keyboard, including those that pop up on our mobile devices, follow the same layout, called ‘QWERTY’ (next time you are typing, try to identify where these letters are on your keyboard or keypad).

Digital standards include technical standards (related to how the infrastructure of the Internet works), web standards (related to how content is used), and mobile standards (related to how mobile devices communicate). These are explained in more detail below.

Technical standards

Internet technical standards and services form the infrastructure that makes the Internet work, and include the Transmission Control Protocol/Internet Protocol (TCP/IP), the domain name system (DNS), and the secure sockets layer (SSL). Standards ensure that hardware and software developed or manufactured by different entities can work together as seamlessly as possible. Standards therefore guide the technical community, including manufacturers, in developing interoperable hardware and software.
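
To make this interplay concrete, here is a minimal, hedged sketch – using only Python’s standard library, with an illustrative host name – of how these standards cooperate in practice: DNS resolves a name to addresses, TCP/IP carries the bytes, and TLS (the modern successor to SSL) secures the channel.

import socket
import ssl

host = "example.org"   # illustrative host name, not taken from the text above

# DNS: translate the human-readable name into IP addresses.
addresses = socket.getaddrinfo(host, 443, proto=socket.IPPROTO_TCP)
print([entry[4][0] for entry in addresses])

# TCP/IP and TLS: open a reliable connection and wrap it in an encrypted,
# authenticated session negotiated according to the same public standards.
context = ssl.create_default_context()
with socket.create_connection((host, 443)) as raw_socket:
    with context.wrap_socket(raw_socket, server_hostname=host) as secure:
        print(secure.version())   # e.g. 'TLSv1.3'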

Transmission Control Protocol/Internet Protocol (TCP/IP) is the main Internet technical standard. It is based on three principles: packet-switching, end-to-end networking, and robustness. Internet governance related to the TCP/IP has two important characteristics: the introduction of new standards – an aspect that is shared by technical standards in general – and the distribution of IP numbers, which is explained in more detail in the section on IP numbers.
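
As a further hedged illustration of the end-to-end principle, the sketch below – again standard-library Python, with an invented local address and port – shows two endpoints exchanging data over TCP, while the network in between does no more than forward packets.

import socket

HOST, PORT = "127.0.0.1", 9090   # invented local address and port

def run_echo_server():
    # One endpoint: accept a single TCP connection and echo what it receives.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.bind((HOST, PORT))
        server.listen(1)
        conn, _ = server.accept()
        with conn:
            data = conn.recv(1024)   # TCP reassembles packets in order
            conn.sendall(data)       # and delivers the bytes reliably

def run_echo_client():
    # The other endpoint: connect, send a message, and read the echo back.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as client:
        client.connect((HOST, PORT))
        client.sendall(b"hello, internet")
        print(client.recv(1024))

# In practice the two functions run as separate programs; the intelligence
# sits at the endpoints, not in the network (the end-to-end principle).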

Setting technical standards

Technical standards are increasingly being set by private and professional institutions. The Internet Architecture Board (IAB) oversees the technical and engineering development of the Internet, while most standards are set by the Internet Engineering Task Force (IETF) as Requests for Comments (RFC). Both the IAB and the IETF have their institutional home within the Internet Society (ISOC).

Other institutions include: the Institute of Electrical and Electronics Engineers (IEEE), which develops standards such as the WiFi standard (IEEE 802.11b); the Wi-Fi Alliance, which is the certification body for WiFi-compatible equipment; and, the GSM Association (GSMA), which develops standards for mobile networks.

Standards that are open (i.e. open Internet standards) allow developers to set up new services without requiring permission. Examples include the World Wide Web and a range of Internet protocols. The open approach to standards development has been affirmed by a number of institutions. One such affirmation is the OpenStand initiative, endorsed by bodies such as the IEEE, the IETF, the IAB, the World Wide Web Consortium (W3C), and the Internet Society.

Technology, standards, and policy

The relevance of setting or implementing standards in such a fast-developing market gives standard-setting bodies a considerable amount of influence.

Technical standards have far-reaching economic and social consequences, promoting specific interests and altering the balance of power between competing businesses and/or national interests. Standards are essential for the Internet. Through standards and software design, Internet developers can shape how human rights are protected.

Efforts to create formal standards bring private technical decisions made by system builders into the public realm. In this way, battles over standards can bring to light unspoken assumptions and conflicts of interest. The very passion with which stakeholders contest standards decisions should alert us to the deeper meaning beneath the nuts and bolts.

What are the policy issues related to technical standards? Non-technical aspects – such as security, human rights, and competition policy – may not be sufficiently covered during the process of developing technical standards. For instance, most past developments of Internet standards aimed at improving performance or introducing new applications, while security was not a priority. It is now unclear whether the IETF will be able to change standards to provide proper authentication and ultimately reduce the misuse of the Internet (e.g. spam, cybercrime).

Given the controversy surrounding any changes to basic Internet standards, it is likely that security-related improvements in the basic Internet protocol will be gradual and slow. Yet, decisive steps are starting to be implemented in this direction, with the Domain Name System Security Extensions (DNSSEC) being a good illustrative example. Following almost 12 years of research, trials, and debates within the technical community, DNSSEC was first deployed for some country code top-level domains (ccTLDs). Since 2010 it has also been implemented at the root server level. However, further challenges reside in the large-scale adoption of this new security standard down the ladder by domain name registrars, Internet service providers (ISPs), and website owners.
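
For illustration only – this is not part of the deployments described above, and it assumes the third-party dnspython package is installed – the short sketch below fetches the records that a DNSSEC-signed zone publishes; the domain name is merely an example.

import dns.resolver   # third-party package: dnspython

domain = "example.com"   # example domain; any DNSSEC-signed zone would do

# DNSKEY records hold the zone's public signing keys.
for rdata in dns.resolver.resolve(domain, "DNSKEY"):
    print("DNSKEY:", rdata)

# DS records, published in the parent zone, link the chain of trust
# upwards towards the signed root.
for rdata in dns.resolver.resolve(domain, "DS"):
    print("DS:", rdata)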

There also appears to be a gap in the participation of stakeholders in the development of technical standards. Even though participation is open to all stakeholder groups, some submissions to the CSTD Working Group on Enhanced Cooperation (WGEC) have noted the need for more involvement from specific stakeholder groups such as governments.

Web standards

Web standards are a set of formal standards and technical specifications for the World Wide Web. They ensure that content is accessible across devices and configurations, and therefore provide the core rules for developing websites.

The main content and applications standards include: HyperText Markup Language (HTML), a plain text language which makes use of tags to define the structure of documents; eXtensible Markup Language (XML), another type of language used for sharing structured information; Cascading Style Sheets (CSS), a language used in conjunction with HTML to control the presentation of web pages; and, eXtensible HTML (XHTML), an extended version of HTML which uses stricter rules.
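
As a brief, hedged illustration of the first two items – the markup below is invented for the example and parsed with Python’s standard library only – the sketch shows how tags define a document’s structure, which is the idea HTML and XML share; a CSS rule such as p.intro { color: grey; } would then control how the p element is presented.

import xml.etree.ElementTree as ET

# An XHTML-style fragment: well-formed XML in which tags define the structure.
page = """
<html>
  <head><title>Digital standards</title></head>
  <body>
    <h1>Digital standards</h1>
    <p class="intro">Standards are agreed-upon rules.</p>
  </body>
</html>
"""

root = ET.fromstring(page)
# Walk the tree the markup describes: every tag becomes an element.
for element in root.iter():
    print(element.tag, element.attrib)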

Why were web standards required? By the late 1980s, the battle of network standards was over. TCP/IP gradually became the main network protocol, marginalising other standards. While the Internet facilitated communication between a variety of networks via TCP/IP (see Technical standards), the system still lacked common application standards.

A solution was developed by Tim Berners-Lee and his colleagues at the European Organization for Nuclear Research (CERN) in Geneva: a new standard for sharing information over the Internet called HyperText Markup Language (HTML). Content displayed on the web first had to be organised according to the HTML standard. HTML, as the basis of the web, paved the way for the Internet’s exponential growth.

Since its first version, HTML has been constantly upgraded with new features. The growing relevance of the Internet has put the question of the standardisation of HTML into focus. This was particularly relevant during the Browser Wars between Netscape and Microsoft, when each company tried to strengthen its market position by influencing HTML standards. While basic HTML only handled text and photos, newer Internet applications required more sophisticated technologies for managing databases, videos, and animation. Such a variety of applications required considerable standardisation efforts in order to ensure that Internet content could be properly viewed by the majority of Internet browsers.

Application standardisation entered a new phase with the emergence of XML, which provided greater flexibility in the setting of standards for Internet content. New sets of XML standards were also being introduced, such as the standard for the distribution of wireless content called Wireless Markup Language (WML).

Setting web standards

The main web standard-setting institution is the World Wide Web Consortium (W3C), headed by Tim Berners-Lee. Standards are developed through an elaborate process that aims to promote consensus, fairness, public accountability, and quality. At the end of the process, standards are published in the form of recommendations.

When it comes to an open approach to standards development, W3C – in addition to other bodies such as the IEEE, the IETF, the IAB, and the Internet Society – subscribes to the OpenStand initiative, an affirmation of principles that encourages the development of open and global market-driven standards.

W3C standards define an open platform for the development of applications, which enables developers to build rich interactive experiences. W3C states that ‘although the boundaries of the platform continues to evolve, industry leaders speak nearly in unison about how HTML5 will be the cornerstone for this platform’.

It is interesting to note that in spite of its high relevance to the Internet, the W3C has not attracted much attention in the debate on Internet governance thus far.

Another institution involved in standards is Ecma International (formerly the European Computer Manufacturers Association, ECMA), an association of companies whose main role is to develop standards and technical reports.

As with technical standards, the possible gap in the development of web standards relates to the coverage of non-technical aspects (e.g. human rights, competition policy, and security). Web standards have an even stronger impact on these non-technical aspects than technical standards do, since they shape the ways in which the Internet is accessed and used.

Web standards for accessibility of persons with disabilities

‘The power of the Web is in its universality. Access by everyone regardless of disability is an essential aspect. The Web is fundamentally designed to work for all people, whatever their hardware, software, language, location, or ability.’ - Tim Berners-Lee

The emergence of the Internet created a great opportunity for people from all across the globe to connect on a new level. The main idea of the web was to leave no one behind and to include persons with disabilities. In order to do this and create a more inclusive Internet, W3C established the Web Accessibility Initiative (WAI). The W3C WAI regularly works on updating and developing web standards to fulfil this mission. Part of this effort includes creating the Web Content Accessibility Guidelines (WCAG), ‘developed through the W3C process in cooperation with individuals and organizations around the world’. The latest edition of WCAG, WCAG 2.1, has set the standard for the future accessibility of websites and web applications.
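
To make one of these guidelines concrete: WCAG success criterion 1.1.1 (non-text content) asks for text alternatives, which for images in HTML usually means an alt attribute. The sketch below is a simplified illustration using Python’s standard library and made-up file names, not an official WCAG tool; it merely flags images that lack such an attribute.

from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    # Flags <img> tags that have no alt attribute at all.
    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            print("Missing text alternative:", dict(attrs).get("src"))

sample_page = """
<html><body>
  <img src="chart.png" alt="Bar chart of Internet users per region">
  <img src="logo.png">
</body></html>
"""

AltTextChecker().feed(sample_page)   # prints: Missing text alternative: logo.png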

WCAG are also being developed for mobile technology producers. Mobile technology in this sense includes: tablets, household devices, car dashboards, and wearable devices (such as smart watches), among others.

Access to web applications for persons with disabilities is mandatory in some states. The US Department of Justice, Civil Rights Division, has issued guidelines on how public web services should be designed in order to comply with this requirement.

Mobile standards 

Mobile standards are a set of standards developed to ensure reliable communication via mobile devices. They incorporate standards regarding mobile networks and their compatibility with mobile operators. Mobile networks are networks in which the last communication link is wireless.

The Global System for Mobile Communications (GSM) was developed in 1991 by the European Telecommunications Standards Institute (ETSI). The GSM standard set the rules for the use of 2G mobile networks. In 2000, the IS-2000 standard (also known as CDMA2000) – a set of 3G mobile technology standards – was developed. The IS-2000 standard defines requirements and properties for sending and receiving audio and video data on mobile networks. Third-generation cellular standards were further refined with the Universal Mobile Telecommunications System (UMTS) standard, published in 2001 as part of the ITU’s IMT-2000 family. By 2009, a new set of standards – referred to as Long-Term Evolution (LTE) or 4G – aimed to improve the speed and capacity of mobile networks. LTE was intended to support new Internet services, including high-definition video over the Internet, gaming services, video conferencing, and 3D television.

The next step for mobile standards was the development of the fifth generation of cellular network technology: the 5G network. The first implementations of 5G standards started in March 2019. 5G networks were envisaged as networks for massive device connectivity, high-speed Internet, and low communication latency. The development of 5G networks is led by business sector players such as Intel, Huawei, Qualcomm, and Cisco.

Setting mobile standards 

After the development of 2G networks, most efforts regarding mobile standards have been made by the 3rd Generation Partnership Project (3GPP), a consortium of telecommunications standards associations from Asia, Europe, and North America. Members include: ARIB (Japan), ATIS (USA), CCSA (China), ETSI (Europe), TSDSI (India), and TTA (South Korea). The 3GPP consortium has added market representation partners to help with the implementation of 5G. These partners include: the GSM Association, Next Generation Mobile Networks, the Wireless Broadband Alliance, and the IPv6 Forum.

Standardisation bodies that work to harmonise mobile standards with the Internet’s basic technical and web standards include the International Organization for Standardization (ISO), the ITU, the Internet Engineering Task Force (IETF), and the W3C.

 

Mr Arvin Kamberi

Multimedia Coordinator, DiploFoundation

An expert in remote participation, Mr Arvin Kamberi heads up Diplo’s Webinar Team. Based in Belgrade, he has been working on webinars and other web-based remote participation since 2011. Arvin has been part of the remote participation team for many international forums, such as the IGF, EuroDIG, and local IGF events such as IGF Africa. From 2014 to 2015, he was a part of the IGF Working Group on Remote Participation (established in 2008), and involved in the elaboration of IGF remote participation guidelines.

Arvin has a keen interest in cryptocurrency and blockchain developments; first as an avid ‘miner’, then more in terms of regulation and consensus mechanisms surrounding the decentralised systems. His primary focus is on Bitcoin development, but he follows other cryptocurrency developments and the blockchain/distributed ledger technology, too. Vice President of Bitcoin Association of Serbia, Arvin writes extensively about Bitcoin and blockchains. He holds an MA in Film and Video Production from Belgrade University of Art.
