
Chapter 2. Standards Agreements and Normative Collisions in Internet Governance




INTRODUCTION

The development of the Internet has disrupted established ways of conceiving technical standards and of handling the standardization process. It has unsettled the hierarchy of norms for at least two parallel reasons: on the one hand, conflicting sources of normativity and legitimacy have appeared with the growing number of standard-setters; on the other hand, “economic or legal rules have progressively emerged as side-effects of technical rules” [Du Marais, 2002]1.

Various drivers have indeed contributed to structuring a new ecosystem of standardization: since the beginnings of the Internet, bureaucratic machinery was deliberately avoided, owing to the academic context of innovation, the openness of the basic protocols and the need for the quickest possible implementation. The motto has been “rough consensus and running code”. This approach, fully supported by the USA, led to technical self-regulation carried out by various stakeholders in forums such as the Internet Engineering Task Force (IETF) or the World Wide Web Consortium (W3C).

These bodies all have in common that they grew up alongside the traditional intergovernmental organizations, mainly the International Organization for Standardization (ISO) and the International Telecommunication Union (ITU), and as such they stand outside international public law. This was reinforced by the vertical integration of the telecom and content industries and the conflicting business and regulatory models between them. Another challenge arose between proprietary and open standards. Furthermore, the driving role of two main Internet features, end-to-end innovation and knowledge-sharing practices, must be taken into account. This situation creates severe confusion as regards the nature of power in the digital age, with many debates about the respective prominence of technical, legal or societal commitments.

Up until now, the Internet has worked quite well, so that it can appear to some observers as a successful counter-model for establishing standards and norms. Other observers tend to see “the product of the self-regulating activity of cyberspace, embedded in informal norms” [Kablan and Oulaï, 2007] as “a ‘natural’ complement of formal norms”2. Our ten-year experience in studying Internet governance issues inclines us to more perplexity and less optimism.

The present paper follows our previous research on Internet governance normativity [Massit-Folléa, 2008a] with a more specific approach. It draws on some of the results of a workshop organized in Paris on March 31st, 2009 by the Vox Internet research programme and the Orange Regulation Chair, entitled “From Standardization to Behavioural and Societal Norms”. The purpose of this paper is to point out that the two types of standardization – formal and informal – though not necessarily contradictory, are creating conflicts of normativity:

“as the daily practices spread across the networks are at once universalized by technical standards and differentiated according to diverse values and projects, the normativity of the Internet is being built through controversies, owing to multiple practical, commercial, political, intellectual or ethical affordances” [Benghozi & Massit-Folléa, 2009a].

To be perfectly clear, we must emphasize that the English word ‘standards’ applies to technical agreements, be they formal or informal. The French word ‘normes’ applies both to technical standards once made official by national and/or international bodies, and to moral rules of behaviour – whether they derive from religion, law, democratic policy or civic activism is not the point here. The first semantic field refers to the ‘skeleton’ of the Internet – a technical apparatus which interplays with policy choices and access provisions, far from the understanding of ordinary users. The second meaning, prominent in the English-language social sciences, encompasses “the ethical and legal principles which inform human behaviour and by which social action is regulated” [Duff, 2008]. Regarding the Internet, these two meanings are deeply intertwined: let us remember Lessig’s famous formula “code is law” [Lessig, 2000].

Standards and norms always operate at a collective level. But the ways they are produced, codified and applied are different, often contradictory, and increasingly overlapping in the global environment of communication devices and practices. The partial normative orders of technical standards tend “to bump into other normative domains, even to squash them” [Schemeil & Eberwein, 2009]; at the same time we must acknowledge the tremendous – and controversial – efforts of both public policies and informed users to make the fast and continuous development of ICT in general, and the Internet in particular, fit into social rules – whether pre-existing or new ones. This is what we call the ‘normative collisions’ in Internet governance.

The subject is dealt with in two parts. First, we concentrate on the places, processes and builders of technical Internet standards, and we analyse how the diffusion and appropriation of these standards both reveal and generate diverse frameworks, goals and conflicts of power. Second, we introduce the need for principles able to combine the respective benefits of formal and informal Internet regulation. The case of the internationalization of the Domain Name System will be used as an example of the current competition, and the need for agreement, between the sources of standardization and the values they carry: between privatization and public policy, between unicity and diversity.

CHANGE IN STANDARDS-SETTING

Going back to classical definitions, the technical meaning is the following: a “standard” in English – a “norme” in French – is a set of functional rules or technical prescriptions relating to products, activities or their results, produced by a consensus between specialists and established in an official document issued by a national or international organization (AFNOR, the Association française de normalisation; ISO, the International Organization for Standardization). The aim is to provide a common level of security, performance, quality and interoperability for uses in a given context. A “standard” in French, by contrast, is merely a reference framework published by a private entity, based upon accreditations given by sectoral professionals or interest groups to technical specifications that have reached a consensus. It is an ‘open standard’ when freely distributed, whether published by a private company (e.g. Adobe’s PostScript) or by not-for-profit bodies (e.g. W3C’s Recommendations or the IETF’s Requests for Comments)3. Standards may evolve from a de facto to a de jure status. Beyond the highly technical functions of formatting, compressing, transmitting, accessing, securing and displaying information, the design and implementation of standards have significant economic and political implications.

Internet standards at a glance

In her book Protocol Politics, De Nardis describes the specific basis of Internet functioning, the Transmission Control Protocol/Internet Protocol suite. It deserves to be quoted at length to shed light on our analysis:

“The TCP/IP suite traditionally groups protocols into four functional layers: the Link layer, the Internet layer, the Transport layer, and the Application layer. The Link layer refers to protocols defining the interfaces between a computing device and a transmission medium and is closely associated with local area network (LAN) standards such as Ethernet. The Internet layer includes standards for network-layer addressing and for how packets are routed and switched through a network. The most prominent example of a standard operating conceptually at this level is the Internet Protocol, including both IPv4 and IPv6 (for Internet addresses). Two important examples of Transport-layer protocols are TCP and User Datagram Protocol (UDP), standards responsible for ensuring that information has successfully been exchanged between two network nodes. Finally, the Application-layer protocols interact with actual applications running on a computer and include Internet protocols such as HTTP for web communications or FTP for exchanging files” [De Nardis, 2009b].
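The layering described in this quotation can be made concrete with a minimal sketch, given here only as an illustration and not drawn from the cited sources: an application-layer HTTP request rides on a transport-layer TCP connection, while the Internet and Link layers remain invisible to the program. The host name and port are assumptions chosen for the example.

import socket

# Transport layer: open a TCP connection; the Internet layer (IP) routes the
# packets and the Link layer carries them over the local transmission medium.
with socket.create_connection(("example.org", 80)) as conn:
    # Application layer: a plain HTTP/1.0 request is carried on top of TCP.
    conn.sendall(b"GET / HTTP/1.0\r\nHost: example.org\r\n\r\n")
    reply = conn.recv(4096)

print(reply.split(b"\r\n", 1)[0])  # the status line of the server's reply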

Beyond this basic set of standards, computer languages (e.g. HTML, XML), application software (e.g. Office, RealPlayer) and network architecture (for addressing, naming and routing) must also be considered, since they ensure the growing richness and complexity of online information exchanges.

As regards Internet standardization issues, the main bodies already mentioned are informal groups of voluntary experts4: the IETF (Internet Engineering Task Force), which standardizes protocols and interoperability, and ISOC (the Internet Society), devoted to the fair and secure development of the Internet and acting as the umbrella of the IETF; the W3C (World Wide Web Consortium), which standardizes the computer languages used on the web. Other bodies are created in the business sphere, under consortium agreements, in order to regulate the flow of content through purely technical devices, for instance DRM (Digital Rights Management) or PETs (Privacy Enhancing Technologies). Some are more successful than others… The International Telecommunication Union’s Telecommunication Standardization Sector (ITU-T) sets Internet-related standards regarding carriage and user access, and in areas such as voice over the Internet. The IEEE (Institute of Electrical and Electronics Engineers, a non-profit professional organization) has also contributed by establishing various specifications, from the Ethernet standard to the Wi-Fi family of standards.

In this list we deliberately include ICANN (the Internet Corporation for Assigned Names and Numbers), whose mission is the coordination of the so-called ‘Internet critical resources’ by allocating IP addresses and managing the Domain Name System. Here we are leaving the world of purely technical standardization: we face a de facto monopoly over a technical architecture, more or less consensus-based, and we consider this one of the ‘norms’ of Internet functioning – although a fragile and contested one5.
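What these ‘critical resources’ do in practice can be shown with a minimal sketch, assuming only the Python standard library and a working resolver; the host name is chosen purely for illustration. The DNS maps a human-readable name onto IP addresses drawn from the allocated IPv4 and IPv6 pools.

import socket

families = {socket.AF_INET: "IPv4", socket.AF_INET6: "IPv6"}
# The resolver queries the DNS and returns the addresses registered for the name.
for family, _, _, _, sockaddr in socket.getaddrinfo("example.org", None):
    print(families.get(family, family), sockaddr[0])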

Finally, in the context of ‘convergence’ – that is, the blending of computing, telecommunications and audiovisual techniques, products and services – we see quite clearly the interpenetration of the rules, methods and objects of standardization. Professional experts themselves recognize that:

“it has become more and more difficult to agree on common standards: the standard bodies (like ITU) have diminishing influence and are replaced by pre-standard bodies (e.g. the DVB6 project, in the borderland between broadcast and mobile technology) and by private consortia (e.g. the HDMI7 consortium) that can come into conflict between adverse interests” [Vermaele, 2010].

Conflicting models and values in standard-setting

In the traditional models of growth of telecommunication networks, standardization plays a crucial role. Thanks to technical standards defined ex ante, the various actors involved in the development of the global network (hardware and peripheral equipment producers, software developers, service providers…) can take part, and the market can evolve together with the infrastructure. Alternatively, and in the opposite direction, standardization can fit into a process of aggregation and adjustment between autonomous networks or existing infrastructures.

The development of the network is then based on gateways, interfaces and black boxes, which guarantee compatibility and exchanges between the various components of the infrastructure. In this second case, standardization intervenes primarily ex post, whether through agreements between industrial partners anxious to organize their common market, or through subcontractors gradually tied to a standard owner by means of software interfaces. The evolution is no longer led by the initiative of the bodies officially in charge of standardization: it emerges in a sporadic, bursty, cumulative way, under the pressure of professional user groups and service providers. However, the specific dynamics of information and communication technologies (ICTs) widen this traditional dichotomous framework of understanding. First of all, standardization takes a new direction when the contexts of innovation are open and when the traditional separation of technical layers is no longer evident. Then, the multiplication of, and competition between, aggregation platforms seeking to capture the relationship with the consumer tend to make technical standards derive mainly from market standards. Lastly, there is no longer a single actor able to control technical developments in one sector or one geographical area, as the operators of public networks or the computer makers did in the past. The emergence of open standards is fully representative in this context: technical flexibility transfers the capacity for initiative to the market.

Since the 1990s, the pressure of successful alternative models has led ISO to draw up more flexible rules for standard setting. But the outsiders have also experienced problems, due to the increased number of participants involved and the rise of commercial stakes. In a synthesis of the 2009 Vox Internet – Orange Chair workshop, those problems are described as follows:

“The standardization process of the Internet, initiated at the beginning of the ’80s, formalized and clarified in 1992, made it possible to base technical norms upon simple principles, emblematic of the Internet pioneers: voluntary and collaborative working, exchanges by mail much more than face-to-face meetings, consensus-based decisions. But the explosion of the Web damaged the former arrangements: it became difficult to take care of large-scale applications in a for-profit environment and to maintain coordination and engagement between a growing number of stakeholders” [Vallée, 2009].

A consequence of the proliferation of standardization bodies is that only big companies can attend several dozen meetings, mobilize a great number of delegates and invest millions of euros (or dollars) in this activity. It has been noted, for instance, that in establishing the XML standard more than 80 % of the posts (e-mail messages in the working group) came from the computing industry. Moreover, the English language and Western cultural habits are sine qua non conditions for participation. Louis Pouzin8 told us about a study comparing the origin of 1,000 Requests for Comments in the IETF: 79 % of the writers came from the USA, 3.6 % from Sweden (Ericsson people…), 3.2 % from Germany, then came Japan (2.4 %), the UK (2.3 %), France (2.1 %) and “others” (4.4 %).

Not surprisingly, opposite views about Internet standardization bodies could be read on a specialized mailing list on 28-29 May 2008. One expert was highly satisfied by what he called the ‘meritocratic’ character of the IETF’s functioning:

“It has no legal existence, and anyone can register for and attend an IETF meeting. It doesn’t matter where you are from, who you work for, what color, gender you are, or any other attribute of your personal life and/or beliefs that have no bearing on how well you think, contribute or perform. Once you are in one of its sessions, you are judged almost completely by the ideas that you contribute to the discussion.”

The answer he received was more balanced and better referenced:

“The good thing about the IETF is that it values plain and frank speaking (unlike UN organizations or ICANN, which value diplomatic speech). That’s why you can use IETF documents themselves. For instance, the RFC 3774 or other IETF material, such as the tutorial for IETF beginners, also says quite the opposite of the nice legend you described. Sure, IETF is more open than the GAC9, less secretive than the ISO, technically better than ITU, etc. Big deal! In the kingdom of blind men the one-eyed is king.”10

From standards-setting complexity to regulation dilemmas

ICT standards-setting today is “a system under stress”:

“The modern ICT standards infrastructure is a lightweight, highly distributed and only loosely connected system. As such, it is theoretically democratic, reasonably responsive, and economically efficient. But it is also ill-suited to address complex problems, and democratic only for those that find it sufficiently in their self-interest to participate” [Updegrove, 2007a].

Such a contrasted standardization landscape complicates the search for a common pattern of Internet governance to be shared by many private and public bodies beside and outside intergovernmental organizations – so that “a logic of privatization, informal procedures, and the extension of expertise domains increase the distance between the citizens and the political establishment” [Graz, 2004].

Actually, we share the thesis that “protocols are political”, mainly because:

“they control the global flow of information and make decisions that influence access to knowledge, civil liberties online, innovation policy, national economic competitiveness, national security, and which technology companies will succeed” [De Nardis, 2009a].

In Updegrove’s words:

“[we live] in a world where ICT access is becoming a prerequisite to enjoying the full rights and opportunities of society, democracy and the economy. That access is only feasible, however, if standards exist. […] But very different rules must be cobbled together11. … Is the traditional standard setting infrastructure adequate to the task, either technically or democratically? And to the extent that it is not, how, and by whom, and to what result will its shortcomings be addressed? ”.

Normative collisions in Internet Governance

Those questions are at the heart of Internet governance issues, fed both by technical requirements and socio-political frameworks. In De Nardis’s wording:

“many Internet governance examinations inquire within a closed sphere of institutional interactions and their internal technical decision-making processes, (but it) does not necessarily reflect the contextual milieu that shapes decisions or the broader social implications of these decisions”12.

The doctrine produced by the World Summit on the Information Society is that “the international management of the Internet should be multilateral, transparent and democratic, with the full involvement of governments, the private sector, civil society and international organizations”. Since 2005, when this was officially agreed in Tunis, we have not seen any progress in that direction. Whether market-driven or legally binding, invisible or visible, Internet governance enforcement currently draws on diverse sources of inspiration and legitimacy. This is not the place to return to the never-ending controversy about the role of ICANN or the weak follow-up of the UN Summit, but one can consider that a common vision of “what is right” is currently missing.

Following Thévenot’s theoretical approach, it thus seems necessary to reconsider the accepted distinctions between technical norms, social norms and norms of the True. In his view:

“contemporary societies are typified by a huge amount of technical equipment and a mix of humanity, nature and artefacts. Industry was the first order of worth to use standardization for efficiency purposes. Gradually shifting from State rule to voluntary innovation, this movement is modifying the way we consider technical objects in terms of their economic value, the classical figures of the producer and the consumer, and even the political design of the citizen, of his modes of commitment and the good forms of governing” [Thévenot, 1997].

Consequently, he sees standardization as:

“a focal issue in the gestation of a new worth paradigm, the ‘Information City’, including social and civic norms beyond technical ones, distributing power and responsibilities more widely than ever. It has political implications and requires that subjective values and interests not be separated from technical and factual aspects”.

With standardization processes, we are leaving the traditional exclusive frameworks of public or private regulation. The DNS (Domain Name System) case offers a good example. In the early days of the Internet, the domain name was a purely technical identifier, managed by academic computer scientists on a voluntary basis. Now, a generic top-level domain is a trademark stake and a country-code top-level domain is a matter of State sovereignty. For the ccTLDs, the paradox is evident: based on the list of countries – with some marginal exceptions – used by the ITU (an intergovernmental body), they are distributed by ICANN (a not-for-profit Californian association) to governments’ national delegates13. Furthermore, their allocation as a “public resource” lacks sufficient legal backing: last October, the French Conseil constitutionnel judged Article 45 of the Code des postes et des télécommunications partially unconstitutional, and consequently postponed the renewal of the delegation to AFNIC14 of the distribution of .fr – despite the fact that AFNIC is a non-profit association with several public departments represented on its board. It will take several months to find – if possible – a better legal framework clarifying the status of this activity.

This case illustrates a major tendency in international political economy, underlined by Guillaume Devin [Devin, 2008]: the growing power of private companies rests on implicit or explicit arrangements by which States agree to delegate their authority. One consequence is greater legitimacy for private experts, at the expense of public representatives, and this is particularly visible in the field of ICT and the Internet. In the same spirit, Graz and Nolke consider that transnational private governance in general refers to:

“the ability of non-state actors to cooperate across borders in order to establish rules and standards of behaviour accepted as legitimate by agents not involved in their definition. Non-state actors not only formulate norms, but also have a key role in their enforcement. Accordingly, the current privatization of rule-making and enforcement goes much further than traditional lobbying in allowing private actors an active role in regulation itself ” [Graz & Nolke, 2007].

The current situation can be seen as a time of struggle over which authority is the most legitimate to decide the norm. There is a gap between the definitions of standardization as a strategic resource in the marketplace and as a socialized and institutionalized method of coordination and regulation. Devin’s wording helps sum up the first part of this paper:

“In the ICT field, a technological innovation unable to find its place in the world of standards is sentenced to death (e.g. the French Minitel). But technical excellence is not enough… It is important to pay attention to economic factors, but social hierarchies in experts’ communities, and standardizers’ beliefs and openness, are also decisive. Finally, political and institutional considerations are likewise necessary in order to keep in mind the whole range of resources potentially mobilized by the actors” [Devin, 2008].

According to an ITU survey, the total number of Internet users is set to surpass two billion by the end of 2010. Because the infrastructure and role of ICTs are both material and immaterial, some consider that we are entering a wonderful, global and unified ‘Information Society’. Everybody knows that this is far from reality. In the second part of this paper, we would like to go further into the domain of technical versus societal normalization by paying attention to three prominent criteria for standards: effectiveness, openness and legitimacy. Along the way we will see confirmed that the main question about standards is less “what they are useful for” than “what kind of values are involved”. Multilingualism online, more specifically the internationalization of domain names, will be used as an example.

STANDARDS, NORMS AND PRINCIPLES

One path for addressing the ‘normative collisions’ is opened by Duff:

“Norms can be opposed by law, or developed by law. They are internalised, shared rules of action. They are procedural, not substantive, in contrast to values. But they can express a kind of ‘social morality’. In a disoriented, uncertain world, what is required is a new normativity to orientate the social and political framework of the Information Society”. [Ibid.]

Three Criteria for Standards

Effectiveness

Regarding products and services, the effectiveness of standards – their ability to achieve a desired effect – is measured by technological stability and economic success. In Internet issues, contradictory requirements are at stake.

On one side, the Internet’s core protocols “by nature” support interoperability and innovation. They give it the potential of a worldwide device. But the other technical layers (architecture, tools and applications) are built according to a fast-evolving stack or combination of technical standards, together with a complex economic struggle between their promoters. Finally, technical capacities are subject to cultural contexts and political circumstances that orient their achievement: for instance, it is clear that security concerns have come to the forefront since the September 11 attacks in the US, at the expense of concerns about the ‘free flow of information’; in the same way, Intellectual Property Rights holders are desperately trying to counter the opportunities that peer-to-peer software offers for accessing knowledge and entertainment content. In both cases, however, it seems that no restrictive device or legislation can stop technical innovation, because what one side considers undesirable effects, the other considers highly desirable. Agreement or non-agreement on standards is not a purely technical matter!

On the other side, it is rather surprising that the Internet could have developed without a major break from a few computing laboratories into a kind of mass medium. Temporal stability, for its part, is shaped within and by the practices of Internet users, and it can be guaranteed much more by sensible management policies than by technical rules. In the case of the Domain Name System already mentioned, evolutions such as financial speculation and UDRP decisions15 about gTLDs, or State requirements regarding ccTLDs, have barely disrupted the former assignment. As for the deployment of IPv616, market competition fed by geopolitical interests, added to the distributed and decentralized nature of the Internet architecture, means that the transition from IPv4 is still pending – though unavoidable. Is this a fair sign of stability? In fact, all Internet governance debates underline that some players (e.g. the US Department of Commerce and its agent ICANN) feel “historically in charge of” the Internet’s general stability, with the consequence of keeping other players from participating on an equal footing. As a result, US companies dominate business on all Internet layers – as long as we stay within the current framing.
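The scale of that transition can be illustrated by a minimal sketch, assuming only the Python standard library: IPv4 offers 2^32 addresses and IPv6 offers 2^128, which is why the move is presented as unavoidable once the older pool is exhausted. The two sample addresses come from the ranges reserved for documentation.

import ipaddress

ipv4_space = 2 ** 32     # roughly 4.3 billion addresses
ipv6_space = 2 ** 128    # roughly 3.4e38 addresses
print(f"IPv4 addresses: {ipv4_space:,}")
print(f"IPv6 addresses: {ipv6_space:.3e}")

# The same kind of identifier in both notations (documentation addresses only).
print(ipaddress.ip_address("192.0.2.1"))    # IPv4, 32 bits
print(ipaddress.ip_address("2001:db8::1"))  # IPv6, 128 bits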

Instead of asking “can a rapidly evolving technology such as the Internet be assessed by monitoring standards?”, it is more useful to demonstrate how technical standards interact with other kinds of norms in assessing effectiveness.

Openness

Openness is a magic word in the Internet realm. Whether invoked for interoperability requirements, for free software or for knowledge availability and sharing, it always combines technical, economic and socio-political rationales. In that sense De Nardis calls open standards those “providing the broadest possible exchange of information, a borderless playing field for innovation, a comprehensive policy for liberties online. It applies to the different layers of the Internet system” [De Nardis, 2009a]. The contrary of openness here is property, monopoly, or surveillance and censorship.

Economic success is in no way automatic, even in an open marketplace designed on an open technical architecture. In a study of e-commerce websites, Schubert & Selz estimate that “the Internet could be, at least from a technological point of view, the closest approximation of the perfect market and frictionless economy” [Schubert & Selz, 2001]. But consumers’ behaviour expresses preferences that are not only price-based: easy access, habit and trust are in fact key factors, which allow the consolidation of a few prominent firms. And those firms use secrecy, tricks and expensive lobbying in order to consolidate their position.

If we look at the struggle between the big competitors in the Internet economy, we can observe that the actions, interests and values of the players cannot be easily delineated. We will take an example discussed in a recent paper [Quéau, 2010]. For years, Apple and Google fought together against Microsoft, and they succeeded in preventing the PC giant from entering the market for online services and mobile devices. Today Apple and Google are fighting each other for dominance with opposite strategies: the first is trying to make consumers ‘prisoners’ of the endlessly growing number of App Store facilities; the second is developing the open Android platform to offer the most varied access to the greatest number of (its) consumers. At the same time, Google keeps its ranking algorithms, supported by thousands of hidden data servers, a closely guarded secret.

Another challenge lies in the false equivalence between openness and transparency. Obviously the majority of Internet users – and it is generally worse for policy-makers – are quite unaware of the multiple technical and economic arrangements that rule over communication devices. And if they want to get any information, they face many difficulties in opening the door. Who seriously has the expertise necessary to analyse the more than 6,000 RFCs of the IETF’s history? Who is successful enough in lobbying to get involved in ICANN Board discussions? Who really grasps the significance of the more open and accessible documentation of ITU resolutions, standards and other documents that was recently agreed at its last plenipotentiary meeting? Moreover, how many users are dedicated to exploiting and benefiting from the innovative capacities of open standards? A few experts and geeks, opposed to the ever-growing multitude of ordinary web surfers who ingenuously let Google give its shape to knowledge and Facebook watch their relationships and preferences, thus agreeing to the control of private companies over their lives and liberties. At the same time, new State laws are increasing the burden of surveillance and censorship in the name of security and property rights. Despite some citizens’ voices calling for a new deal – both economic and political – for the digital era, the main, disappointing answer seems to be merely ‘opening’ public consultations or favouring a “multistakeholder dialogue” that leaves the choice of decision in the same traditional hands of wealth and power – according to their own, controversial conceptions of “transparency”.

Legitimacy

The main Internet governance dilemma is to decide who is in control: code, business, law-makers, users, or all of them? The two previous criteria of effectiveness and openness are indeed not sufficient on their own. Effectiveness is precarious, due to the conflicting requirements of a global socio-technical system evolving through distributed interests and practices. Openness is not laissez-faire: the GPL licence for open software, the Creative Commons licences for balancing authors’ and consumers’ rights, as well as Wikipedia’s participative regulation, demonstrate that the problem is not so much control itself as the status and motivation of the controllers, expressed in the regulating process. We must remember that no norm can impose itself if it is not perceived as legitimate. Devin makes clear that “technical standards do not escape this rule, though their procedures are mostly based on scientific legitimacy” [Ibid.]. Generally speaking, legitimacy refers to popular explicit or implicit consent to an authority, be it moral, legal or factual. Rosanvallon’s typology is better suited to our purpose, with its threefold definition: procedural legitimacy, legitimacy by impartiality, and substantial legitimacy [Rosanvallon, 2008]. The first could apply to standards building, the second to standards setting, the third to standards acceptance in the light of the common interest. In all cases, legitimacy is associated with procedures of participation on the one hand, accountability on the other, and responsibility for both.

This questioning appears more and more crucial as we face the unlimited expansion and innovation of the Internet. On the Internet Society website, a document entitled “a brief Internet history” points out the challenge of a technology becoming a social artifact:

“The most pressing question for the future of the Internet is not how the technology will change, but how the process of change and evolution itself will be managed. The architecture of the Internet has always been driven by a core group of designers, but the form of that group has changed as the number of interested parties has grown – a proliferation of stakeholders, now with an economic as well as an intellectual investment in the network. […] If the Internet stumbles, it will not be because we lack for technology, vision, or motivation. It will be because we cannot set a direction and march collectively into the future.”17

Actually, the existing framework of Internet governance does not fulfil the democratic requirements of transparency, openness and inclusion that inspired the UN-WSIS Declaration and Agenda for Action. We may consider as good news the rise of civil society concerns and expertise in public debates, the renewal of the Human Rights dimension in technical issues such as multilingualism on the Internet, access to knowledge, personal data-tracking, etc., or the growing interest of policy-makers in introducing more rules into the market jungle and “ensuring that standards development and uptake best serve the public interest” [Updegrove, 2007b]. But the unsolved point is the power balance inside (not only between) formal and informal bodies, and the way they articulate. And the missing element is minimal knowledge and awareness among Internet users regarding these questions. It is thus necessary to go back to some principles able to provide a framework for action. The case of the internationalization of domain names will be used as an illustration.

Unicity versus Diversity

Norms and standards are both facilitators of and constraints on human exchanges. Their ultimate value lies in their capacity to organize human relationships. Using technical devices has continuously affected and transformed human life. The Internet system, mixing global and local, private and public, commercial and not-for-profit levels of conduct, calls, in Thévenot’s words, for “an accurate distinction between what can be generalized and what must be referred to contingency and particularism” (op. cit.).

Internet Principles

At first sight, this distinction implies distributive effects that can be analyzed according to what Mueller et al. consider to be the basic technical principles of the Internet:

“at the core, the global commons principle (open standards developed by an open epistemic community … in contrast to the older model in which standards documents were developed by formally appointed representatives working in closed committees, and often sold as very expensive documents); at the endpoints, the private market principle, based on a decentralized network of networks and an interoperability requirement; between them, and complementing each other, the end-to-end principle: it means that the design of the network provides basic data transport only, leaving applications and other forms of user-specific information processing to the devices attached to the ends of the network” [Mueller et al., 2004].

But the recognition of this separation “between the parts of the system that are subject to private initiative and control, and the parts that are subject to global coordination and non-exclusive access” (ibid.) has to be fully examined if we endorse the conception of the Internet as a ‘global facility’ – to speak in UN terms. It becomes a hot topic when the worldwide spread of the Internet is associated with the promise of a ‘common world’ bridging the gap between cultures. In Quéau’s view, an Internet-based civilization could “limit diversity and reduce potential alterity through compelling codification standards and behavioral norms” (ibid.). Is the current effort towards more linguistic diversity on the Net able to calm this kind of fear?

Multilingualism online

Looking at OECD data, we see that from 2000 to 2009 the number of native English speakers on the Internet increased by 237 %, the number of native French speakers by 500 % and that of Chinese speakers by 1,000 %. Over the same period, most indigenous or local languages remained absent.

In introducing the problem of Internet multilingualism, we come up against a piece of historical evidence:

“Early planners of the Internet were generally American, and were implicitly thinking only about how to facilitate communication in English, so they did not anticipate the problems that might arise when speakers of other languages tried to communicate online. The text-transmission protocol on the Internet is based on the ASCII character set. ASCII, an acronym for ‘American Standard Code for Information Interchange’, was established in the 1960s. This character set is based on the Roman alphabet and the sounds of the English language, thus excluding diacritical marks and other alphabets. Since 2003, the limitations of the ASCII code have been overcome in some respects. Word-processing is available in an increasing number of the world’s languages, and many languages besides English have become usable online” [Danet & Herring, 2003].

To a great extent, this progress is due to the development of Unicode – the Unicode Standard is:

“A commonly used single encoding scheme that provides a unique number for each character, no matter what the platform, no matter what the program, no matter what the language. The Unicode standard contains tables that list the ‘code points’ (unique numbers) for each local character identified. These tables continue to expand as more and more characters are digitalized. It is developed by the Unicode Consortium, a not-for-profit organization working closely with the W3C and ISO. The Consortium members include major computer corporations, software producers, database vendors, government ministries, research institutions, international agencies, various user groups, and interested individuals.” [Ibid.]
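What these ‘code points’ mean in practice can be shown with a minimal sketch; the characters are arbitrary examples, not taken from the Consortium’s tables. Each character, whatever its script, is identified by a single number, which UTF-8 then turns into bytes; only the first of the four belongs to the ASCII set discussed above.

# One code point per character, regardless of script; UTF-8 encodes it as bytes.
for char in ("a", "é", "б", "中"):
    print(f"{char!r}: code point U+{ord(char):04X}, utf-8 {char.encode('utf-8')!r}")
# 'é'.encode('ascii') would raise UnicodeEncodeError: ASCII has no such character.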

Before Unicode implementation, around 2005, word processing was based on ISO, IBM or Apple codes, and in Western Europe ISO codes are still frequently used. Internet users already have keyboards with Cyrillic, Arabic or Chinese characters; software offerings are progressively translated, and web content is more and more available in many languages, thanks to the development of the HTML standard (provided that there are local producers…). There are many reports and research works dealing with multilingualism in Internet content18. Sticking to our interest in Internet architecture, we will focus on the controversy that arose around internationalized domain names (IDN).

Internationalized domain names

ICANN is the manager of the Domain Name System, which translates user-friendly names into network addresses for locating Internet resources. Until recently, for country-code top-level domain names, a URL ended with .eg for Egypt or .cn for China, regardless of the linguistic habits and capacities of the Egyptian and Chinese people. IDNs appeared as a technical solution making it possible to translate names written in native scripts into an ASCII text representation, using a Punycode transcription that fits into the Domain Name System.
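This translation can be illustrated with a minimal sketch using Python’s built-in ‘idna’ codec, an implementation of the earlier IDNA rules (RFC 3490); the Cyrillic label corresponds to the Bulgarian case discussed below and is used here only as an example.

label = "бг"                        # a Cyrillic top-level label
ascii_form = label.encode("idna")   # Punycode plus the 'xn--' ACE prefix
print(ascii_form)                   # b'xn--90ae'
print(ascii_form.decode("idna"))    # back to 'бг'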

The IDN standard was originally proposed in 1996 and implemented in 1998; it faced competing proposals and was finally adopted, then updated by a dedicated IETF working group that produced RFC 5890. During the 2009 Internet Governance Forum, six years after the standard became available, ICANN proudly announced the first IDN ccTLDs for three countries, one being Egypt, host of the meeting. One month later, ICANN concluded an agreement with UNESCO stating that:

“UNESCO will notably call upon its network of linguistic experts to help in the process; inform Member States about the new IDNs; encourage involvement of other relevant United Nations agencies and establish working groups to help developing and least-developed countries participate fully”19.

This agreement may indeed be surprising. Does not the UNESCO mission of protecting and promoting cultural and linguistic diversity challenge the ‘global Internet’?

Facing the risk of Internet “balkanisation”, it appears that the main player intends to keep a single naming space. Some experts argue that other technical solutions could have been chosen in order to escape ICANN’s exclusiveness. A few critical arguments can be presented. Firstly, it is inappropriate to use the word ‘multilingualism’, which denotes the capacity of one person to speak several languages: “IDNs are really solely topical: as long as they have to be translated from UNICODE into ASCII and back, they are not truly multilingual”20. Secondly, the word ‘internationalization’ is no more appropriate, because IDN functions inside the dominant US-led framework. This topic once again illustrates the competition between the standards bodies: a “fast track process” decided by ICANN (a private US not-for-profit organization) allows its contractors to assign internationalized country-code domain names outside the official ISO table, so that, according to the ICANN IDN guidelines, “each IDN ccTLD manager develops the appropriate IDN Tables that specifies the associated variants”. At the same time, ISO technical committee TC 37 continues to work on “the standardization of principles, methods and applications relating to terminology and other language and content resources in the contexts of multilingual communication and cultural diversity.” Finally, ICANN’s openness to languages other than English in the DNS appears to be more a way of keeping its “chasse gardée” over cyberspace naming and the profit it generates: in the case of generic top-level domains, a single entity will be obliged to claim assignment of all possible transcriptions of its name in different alphabets, with or without diacritical signs, etc.

It seems that promoting linguistic diversity “had rather become a matter related to economic development in cyberspace”21 than a will to favour broader access and content enrichment in the Internet landscape – one reason why ICANN goes on making policy beyond its so-called technical functions and beyond legitimate authorities. A recent example follows.

On the blog DomainIncite.com it was reported on 2 December 2010 that a Bulgarian association, Uninet, working since 2007 towards the creation and introduction of an Internet top-level domain written in Cyrillic characters, had filed a Documentary Information Disclosure Policy request, asking ICANN to publish its reasons for turning down the .бг (.bg) application and the criteria it used. The domain .бг, which had the backing of the Bulgarian government and people, was rejected in May on the grounds that it is “confusingly similar to an existing TLD”, believed to be Brazil’s .br. Uninet said that the Bulgarian government plans to challenge the .бг decision if and when ICANN revises its existing IDN ccTLD Fast Track programme to create an appeals process. In the meantime, the Bulgarian government’s IT ministry started encouraging citizens to write to ICANN to demand that the application be re-evaluated.
In August 2010, the newsletter CircleID had already commented on ICANN’s decision, expressing deep concern: “A major point that ICANN is missing in their current evaluation criteria for confusingly similar strings is that they do not review the TLDs, especially IDNs, in the context they will be used in. ICANN staff’s reasoning for declining Bulgaria is that ‘Internet is a world resource and uniqueness is most important’. ICANN wants to open the Internet to languages based on scripts other than Latin in order to make it more accessible, but at the same time it imposes limitations on its openness, thus effectively contradicting itself.”
On a public mailing list where the case was discussed, an answer came from ICANN’s Senior Adviser on IDNs. It states that: “It was a factor in the community development of the Fast Track Process that there should be no specific reconsideration process because it was a limited approach to only those requests where no dispute, questions, or otherwise concerns existed. On the specific requests from Bulgaria, part of the Fast Track process prevents me from discussing that specifically, as staff is not allowed to discuss such details publicly. It is not an ICANN decision to make anything about an IDN ccTLD application public. Contrary, according to the process, we need to keep it confidential. Finally, to note that a long-term policy for IDN ccTLDs (i.e. not the limited initial and more careful approach) is under development in the ccNSO22”.

Maybe the case can be solved later, when the fast track process comes to an end and is replaced by a more robust and transparent rule. But opacity has been a permanent feature of ICANN techno-policy, and moreover it is surprising, even shocking, that a sovereign State should be subject to a technical, private and foreign decision regarding its internationalized ccTLD23.

The IDN issue thus deepens the symbolic value of a country name, linked to a sovereign state or to community membership. Obviously the ‘cyber-linguasphere’ appeals to a double reference: that of immaterial technical standards and that of the real life of institutions and people. Internet governance is de facto “a governance of cultures, of languages as a vector of cultures”24. Some technical progress has been made in countering this particular part of the digital divide, but we are far from the objective of considering cultural and linguistic diversity within the scope of Human Rights, the basis of the 2005 UNESCO Convention.

We could have developed other issues challenged by the Internet, such as access to knowledge, privacy or intellectual property. All of them demonstrate the same confusion and conflicts. The weight of interests over values leads to the lack of a common and transparent agreement on principles, and of a translation of principles into action. But it is time to conclude.

CONCLUSION

From our study of a few standards and technical principles, we have highlighted some of the normative collisions that make the Internet world both confusing and promising. The monitoring and ruling of the Internet actually rest on multi-governance arrangements in which contracts challenge the law, State decisions are not harmonized internationally, and sectoral bodies express opposing interests and hardly cooperate. Normative pluralism is still to be built.

Could it be achieved around the promotion of the Internet into the field of the commons? It is more realistic to admit that it is already a semi-commons. Applying to the Internet this property theory, provided by Henry Smith, James Grimmelmann writes:

“[The Internet] mixes private property in individual computers and network links with a commons in the communications that flow through the network. Because private control and open-to-all-comers common access necessarily coexist on the Internet, it has had to develop distinctive institutions that include the technical features and community norms” [Grimmelmann, 2010].

This mixing is a condition for success, and at the same time it leads to tensions between these very different ways of managing technical resources.

In order to overcome these normative collisions, politics must prevail over technics. Instead of vainly attempting to counter complexity and uncertainty, we must learn how to live with them. Balancing the technical, industrial, cultural and moral imperatives thus requires, first, closely delineating the respective roles and powers of all the parties and working towards a public understanding of their underlying values. It then becomes necessary to take seriously into account the way the Internet is modifying the usual production of rules and communities: more bottom-up than top-down, more ex post than ex ante, more based on reputation and trust than on political representation and delegation – while the older system still exists.

Finally, recognizing normative pluralism not as a cause for confusion but as a prospect for action encourages research into training dialogues, scaling analyses and sharing values under the horizon of Human Rights – both individual and collective. Here is a general principle that could make it possible to articulate formal and informal norms of the Internet and to include the needs and hopes of all stakeholders.

Footnotes

1 Special thanks to Louis Pouzin and Patrick Maigron for their remarks on the paper in progress.

2 All translations from French to English are by the author.

3 To go further, see [Alverstrand et al., 2009].

4 At the beginning, public researchers were the majority. Now, it is completely different: experts belonging to big high-tech companies are the majority.

5 Contestation exists even about the usefulness of ICANN’s technical mission, e.g. IPv6 deployment and DNS exclusiveness. See for instance Bill St. Arnaud’s blog [St Arnaud, 2010].

6 Digital Video Broadcasting.

7 High-Definition Multimedia Interface.

8 A French pioneer of the Internet, inventor of the datagram concept.

9 Governmental Advisory Committee, one part of the ICANN constellation.

10 Exchanges between G. Sadowski and S. Bortzmeyer.

11 Regulating intellectual property rights, privacy, hate speech, etc.

12 op. cit.

13 To go further, see [Mueller, 2002].

14 Association française pour le nommage Internet en coopération.

15 Uniform Domain-Name Dispute-Resolution Policy. Dispute proceedings arising from alleged abusive registrations of domain names (for example, cybersquatting) may be initiated by a holder of trademark rights. Since 1999, ICANN and WIPO have agreed general principles for solving such problems.

16 A new version of the current IPv4 addressing protocol, intended to accommodate the exponentially growing number of connected devices.

17 http://www.isoc.org/Internet/history/brief.shtml

18 For instance, at the 2008 IGF in Rio, the Dynamic Coalition for Linguistic Diversity brought together actors interested in and working on this issue in cyberspace. The Coalition is coordinated by Maaya– the World Network for Linguistic Diversity: http://www.maayajo.org/spip.php?article28&lang=en.

19 Official release, see http://portal.unesco.org/ci/en/ev.php (Dec. 12, 2009).

20 Tidjani Ben Jemaa, workshop of the Dynamic Coalition on Multilingualism, Vilnius IGF, September 2010.

21 Andrew Mark, workshop of the Dynamic Coalition on Multilingualism, Vilnius IGF, September 2010.

22 Country Code Names Supporting Organisation, one of the constituencies in ICANN.

23 It is not a new challenge: see [Châtillon, 2005].

24 Divina Frau-Meigs, at the Vilnius IGF workshop, Sept. 2010.

