
Normative Experience in Internet Politics

Françoise Massit-Folléa, Cécile Méadel, Laurence Monnoyer-Smith

Introduction. From Internet Governance to Internet Politics

Françoise Massit-Folléa, Cécile Méadel and Laurence Monnoyer-Smith


The history of the creation and development of the Internet is undeniably, albeit very unevenly, shared between public organisations and private firms, between application developers and service and content providers, between international infrastructures and national regulations, between collective actors and autonomous individuals. The regulation of this new worldwide ecosystem, lying as it were at the crossroads of technology, economy, law and social practices, was therefore very quickly given the name “governance”.


In suggesting that private actors – particularly major firms or Non-Governmental Organizations (NGOs) – join with States and intergovernmental organisations to regulate collective issues, governance does indeed appear to be “the sum of a myriad – literally millions – of control mechanisms driven by different histories, goals, structures and processes” [Rosenau, 1997]. It is not surprising, then, that many view it as an apolitical or non-legal concept [1].


In terms of the Internet, the primary aim of governance was “critical resource management”, namely the management of the IP addresses and domain names that direct the flow of messages and data in the digital world. Created under the liberal-libertarian practices of the first network pioneers, the IANA-DNS system [2] was largely appropriated by the government and private firms of the United States. Its democratisation and internationalisation were the subject of many heated debates during the World Summit on the Information Society (WSIS, held from 2003 to 2005 under the auspices of the UN), which broke new ground compared with previous summits by opening up to “civil society”.
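To fix ideas about what this “critical resource management” covers in practice, here is a minimal sketch (using an illustrative domain, not an example drawn from this book) of the naming-and-addressing function that the IANA-DNS system coordinates: the resolution of a human-readable domain name into the numeric IP address that actually routes traffic.

```python
import socket

# Resolve a domain name into an IPv4 address through the DNS hierarchy,
# whose root zone coordination is the "critical resource" discussed above.
domain = "example.org"  # illustrative domain
address = socket.gethostbyname(domain)
print(f"{domain} -> {address}")  # e.g. "example.org -> 93.184.216.34"
```

Every such lookup ultimately depends on a single, globally coordinated root zone, which is why control over this apparently mundane function became a political question.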

In the definition produced during the WSIS, the scope of the concept was expanded and at the same time became blurred:

“A working definition of Internet governance is the development and application by governments, the private sector and civil society, in their respective roles, of shared principles, norms, rules, decision-making procedures, and programmes that shape the evolution and use of the Internet.” (Tunis Agenda, art. 34, 2005 [3])

The idiom “Internet governance” now encompasses several meanings that go well beyond the management of IP addresses and domain names: political regulation through the Internet which, from Twitter protests through to “Wikileaks” provocations, defies the rule of traditional governments; the economic regulation of contents, which brings new and more traditional models of production and consumption into confrontation; the regulation of Internet access infrastructures at international, national and local levels, where equity is yet to be established; the regulation of behaviours, which tends to disregard the regulation of technical artefacts; and finally the ethical concerns surrounding an open technology in constant evolution. Each of these levels combines, to varying degrees, technical, legal, cultural and market norms.

The weak descriptive value of the WSIS definition above corresponds to an even weaker prescriptive value, insofar as no principle allows any of the various normative fields to prevail or even to be articulated. This has prevented international agencies from making progress on the question of the priorities, means and tools for deliberation necessary to reach a consensus on the principles of governance. The UN left the discussion in the hands of a new informal agency, the Internet Governance Forum, a forum for non-binding, international, multi-stakeholder dialogue. So far, however, it is between States and major firms, within the framework of the G8 and G20, that the political stakes of the Internet are being formatted, despite attempts by civil society to stay in the game.

And yet the Internet, created barely 40 years ago for use by a handful of individuals, now numbers more than two billion users. Its weight in the economy is constantly growing, both in terms of jobs and of Gross Domestic Product (GDP). Its importance in strictly political terms is widely recognised in traditionally democratic countries as well as in those that aspire to democracy. The more Internet tools and uses have become widespread and diverse, the more the questions concerning its control have entered the fields of public policy and the rule of law. The debate has shifted from issues of technical regulation to issues that combine economic considerations and human rights. This is true of Internet neutrality, intellectual property rights, freedom of expression and the protection of personal data on the various networks.


Three movements distinguish the current context: the stacking up of often repressive laws, in an attempt to “civilise the Internet” country by country; the so far vain quest for a unifying international framework agreed by all stakeholders; and the capture of the public good by private interests (industry players and/or their followers), a movement that seems almost impossible to channel. The theoretical or ideological debates surrounding governance, which in recent years have focussed on tools subject to manipulation and on procedures of low legitimacy, have not enabled the leap from technical governance to the “legal humanisation” of the Internet [4].

WHAT TYPE OF NORMATIVITY FOR THE INTERNET? A RESEARCH QUESTION

This book is the result of the Vox Internet research programme, begun in 2003 [5]. We first established a diagnosis under the heading “Internet Governance: the state of play and the state of law” [6]. In a second phase, we attempted to examine Internet governance from the perspective of a “democratic construction of norms” [7]. In order to determine whether, beyond the definition put forward by the WSIS, it was possible to construct a coherent governance of the Internet system that was efficient and fair, Vox Internet directed its efforts, on the one hand, to the study of the causes and consequences of developments in the Internet and the debates arising therefrom; and on the other hand, to the analysis of four complementary sources of contemporary normativity: the technical architecture of the Internet, the reality of online social networking practices, the evolution in the economic models of what is termed the “digital society”, and the challenge that the Internet poses in terms of national and international law [Lessig, 1999]. In an attempt to elucidate the multifaceted make-up of Internet norms, but also to assign it a democratic aim, two questions framed the process:

  • what are the reference frameworks that already exist and how do they help us understand the process of Internet normalisation, management and appropriation?
  • what general principles could be used as a base for improvements to existing tools or the creation of new ones for Internet governance?

As a medium for globalised communications, the Internet as a “global resource for transporting digital data” is confronted with the contradictory movements of access and appropriation versus exclusion, of openness and sharing versus capture and control. The sphere of its activity is simultaneously international and local, intangible and physical, invasive and ephemeral. This is true of its current form, and even more so of what is expected from the Internet of the Future: the Internet of RFID, “ambient intelligence”, “cloud computing” and the convergence with biotechnologies and nanotechnologies [Benghozi, Bureau & Massit-Folléa, 2009]. But depending on the layer in question (computer protocols, data transport, naming and addressing, applications, uses), the weight and interests of the actors involved are very distinct. Even if we consider the Internet primarily as a new intellectual universe where digital content is transferred between people and/or machines, its media function remains ambiguous, because it combines bits and meaning, public and private messages, symmetrical and asymmetrical, or synchronous and asynchronous exchanges. Its uses, which go beyond the flow of content on the web, bring into confrontation communities and sovereignties, freedom of expression and public order, social skills and cybercrime, creative innovation and the defence of established positions.

The Internet, then, is at the heart of strategies that may appear erratic. At a time of search engines and social networks, the centralisation of “core” resources (the allocation of IP addresses and domain names), which appears paradoxical in a distributed network, has largely lost its criticality for the functioning of online exchanges. The dispersal of regulatory zones and the legal destabilisation that characterise the Internet in fact concern increasingly vast sectors of global economic activity and act as an indicator of certain more general contemporary evolutions [Massit-Folléa & Naves, 2008a]. By means of a dual movement of de-institutionalisation and re-institutionalisation, a multitude of public and private stakeholders are seeking to retain or gain power. The elaboration, or sometimes the imposition, of technical and economic norms that are “alternatives to legislation” [Boraz, 2006] constitutes one of the main characteristics of this situation.

Observation shows that certain technical decisions have major consequences. For example, filtering software or browser settings based on opaque algorithms enable State censorship or customer profiling with little concern for the rights of the individual, even in so-called democratic societies. Decisions are taken but not acted upon (this is the case, among others, of the successive pieces of legislation attempting to fight content “piracy”). In general, the mechanisms by which public powers or supranational bodies impose regulations are opaque, and their effectiveness arbitrary.

The study of such a “kingdom of uncertainty” presupposes going back over the way in which the Internet upsets existing normative references, including the establishment of norms and standards related to technology, to legislation, or to the social contract. We have adopted an approach according to which “this takeover of classical formalism by the recognition of a pluralism of norms reveals a change of perspective rather than a causality switch: in a situation where the coherence of existing rules, the hierarchical functioning of norms and the binary oppositions (private/public, exterior/interior…) are contested, norms can be understood as potential forms of learning in a context of uncertainty” [Berten, 1997].


The purpose of this book is not to initiate yet another Byzantine debate on the concept of governance, which would remove the normative foundations of its possible construction. Neither does it aim to square the circle between sovereignty and globalisation, between cooperation and competition, between freedom and security: we will leave to others the temptation to draft an international constitution of the Internet or a new article of the Universal Declaration of Human Rights. However, no-one can be satisfied with a system that “works technically but not politically” [8]. By deliberately adopting a multi-normative approach, our aim is to clarify the way in which the Internet governs as much as it is governed, and to situate Internet policies as close as possible to the actors that make it up. This being so, we will question the dynamics of regulation at work, their confrontation with more traditional forms of regulation and their capacity to formulate an open concept of Internet politics as the “common management of a semi-common good” [9]. This constitutes both a phase of research and a new hypothesis.

FROM ONE NORMATIVITY TO ANOTHER


Here, as elsewhere [Monnoyer-Smith, 2011], the development of the Internet and its social practices create a strong denaturalising effect, as much on the normative principles that regulate forms of cooperation between the actors as on the technical architectures supporting them. The Internet has therefore helped to unfold [10] and examine the various existing forms of global governance, while at the same time stressing the considerable importance of the historical, cultural and normative contexts in which international institutions have emerged. Now, certain characteristic features of the Internet and its uses undermine the benchmarks that a century of international cooperation helped to build. The principles of territoriality, universality of values and effectiveness that are presupposed in the regulation of actions collide head-on with the fluidity of information flows, the heterogeneity of the values relied on by the actors, and the instability of web productions and architectures. This denaturalising effect brings to light a number of fundamental differences between the Internet as an object of regulation and other domains that have been regulated internationally since World War II, such as telecommunications, air traffic, or even space and the sea.


On the one hand, the Internet today is perceived and claimed by the majority of users as a common good, in the same sense as a natural resource – even though it is made up of technical artefacts. In this respect, its appropriation by major powers appears less legitimate than in other domains where the population considers it “acceptable” for a State (or group of countries) possessing high-performance technologies to claim rights of commercial and scientific exploitation [11]. This appears even less legitimate in an international multi-polar context, which has challenged the traditional forms of domination by the major powers that emerged after the First World War. In this respect, Internet governance forms part of the movement of new, emerging and chaotic forms of governance surrounding the physical common goods of mankind, whose importance has been obvious since the Rio Summit of 1992.

On the other hand, however, the Internet and its related social practices are in fact more complex to deal with than other natural common resources such as water, marine resources or fossil fuels, for at least two reasons. The first is that we are dealing with a technological creation made up of artefacts. In this respect, the Internet must be approached both as a resource whose possession constitutes an issue of power, and as a technical infrastructure whose use constitutes an issue of domination. It is as important to protect it as a resource and give equal access to all, as to ensure that its territorial extension and its technical development do not result in forms of technological domination and neo-colonialism. By adopting a perspective taken from the sociology of science [Akrich, Callon & Latour, 2006], [Latour, 2006], it is possible to demonstrate that any technical architecture indeed carries with it standards, methods and norms that are imposed on the territories where it is deployed, without the latter always having the opportunity to object. Over and above the international competition to establish norms among the various institutions that seek to demonstrate or reinforce their scope of competence (ICANN, ISOC, IETF, WSIS…), it is indeed this normative contradiction between the technical architecture and the places where it is deployed which raises the question of the legitimacy of the norm. This multi-normativity, driven by technology on the one side and territories on the other, cannot be resolved by determining the location of the resource, as is the case for other natural resources.

The second specific characteristic of the Internet resides in its evolving and distributed nature. Unlike mankind’s other common goods, the Internet benefits from the constructive input of a multitude of users and organisations that contribute to developing not only the content that circulates online, but also its overall architecture. The active intervention of States that invest in infrastructure, and that facilitate or on the contrary control the flow of contents, also constantly adds to the evolution of the structure of a network that has become a complex patchwork of technical norms and values emanating from the actors that contribute to it. It is easier, then, to understand the difficulty involved in controlling the particular resource that is the Internet. On the one hand, the efficiency of the network presupposes a minimum level of agreement among actors regarding technical formats, so that they can benefit from the power that is inherent in its global, cooperative and distributed nature; but on the other hand, these actors are reluctant to accept normative models defined by the users and operators of a network to which they feel, quite rightly, entitled, insofar as it constitutes an essential resource for their economic, social and political development.

The very nature of the Internet raises new issues, as Mireille Delmas-Marty clearly demonstrates in the foreword to this book. The multi-normativity of the rules that apply to the Internet results as much from the nature of the object as a resource, as from an axiological position that values the search for agreement according to a principle of justice, while respecting the characteristics of the individual. As with climate change, Internet governance requires cooperation for the good of everyone, in an international framework where habit has led generations of players and policy makers to promote, using force if necessary, their corporate interests above all other considerations. Not only do the traditional forms of domination and competition appear undesirable from an axiological point of view; here they in fact prove to be counter-productive, insofar as they come into conflict with the very logic of the development of the Internet.

CONFLICTING NORMATIVE MODELS

The various chapters in this book underline the difficulty of considering both the diversity of the levels of normativity to be articulated (technical, legal, social and political) and the procedures that States have traditionally employed to undertake this articulation. The result is a diverse and fragmented production of norms that vary enormously depending on whether they concern naming, data flow protocols, network architectures or rules for data protection. However, two contradictory trends seem to be emerging that enable an understanding of what is at stake in the development of the Internet.


The first is a traditional process of political and economic domination. It reflects the politico-economic motivation of the States that pioneered the development of the Internet to ensure that their choice of a knowledge-based economy, which relies heavily on significant investment in information superhighways, properly bears fruit and remains a significant competitive advantage. In this respect, Al Gore’s famous speech of September 15th 1993 comes to mind, which, with a curious mix of geopolitics and geo-economics, prophesied the advent of a global society based on democratic values, made possible by setting in place a network of cultural exchange and global trade. As demonstrated by A. Mattelart [1999], this approach has difficulty in concealing its true economic aims: the intention is indeed to base US competitiveness on the increased production that the Internet can offer and on the new economy promised by network development. Other countries quickly followed suit [12].

Thus, behind the distributed nature of the Internet, and despite the utopian and liberal stance that advocated the export of Western models of democracy, what we are in fact witnessing is a real battle for economic domination being waged between the Internet’s major stakeholders. The emergence of major firms such as Google and Amazon, and the development of paid-for applications and integrated services by Apple and Microsoft, are the result of a reconfiguration of markets that merges telecommunications networks and software industries with the production of content and services. The integration of proprietary technologies (terminals and mobiles) and of the applications specific to them constitutes new modes of economic domination that are imposed on users without their having the possibility of opting out entirely (except by fighting for open-source technologies, which for the moment remain marginal alternatives). As shown in the chapters written respectively by Bernhard Rieder and Dominique Boullier, behind the optimistic claim that the Internet showcases people’s creativity, marketing strategies operate to capture the data provided by Internet users; these data are then used to feed the growth of the major firms that monopolise access to Internet content.

Interoperability between services and applications therefore comes at a price: that of an exaggerated centralisation around major stakeholders who control the actions of users and exploit the information that they leave in their wake.


This form of economic domination is also political and cultural. Political, insofar as States (directly or indirectly, by means of agencies such as ICANN) organise and regulate the technical standards that render possible or facilitate access to the net, as seen above. As clearly described by Françoise Massit-Folléa in her contribution to this book, the plethora of organisations (IETF, ISOC, W3C) which manage standards is in fact made up of high-tech companies and specialists that protect their markets under the benevolent gaze of States, whose aim is to ensure their sustainability. The historical US monopoly over ICANN, which manages the critical resources of the Internet by allocating IP addresses and domain names, demonstrates the reluctance of States to enter into a logic of international confrontation for the fair allocation of resources. But on a deeper level, this domination is also cultural, since along with the technical standards, it is content and techno-semiotic formats that are distributed around the planet. The logic of services is such that “they are all the more interesting, the more users they have that sign up to them” [13]; it is their extension that in turn feeds their monopolistic position. In this way, a neo-participative and neo-democratic digital culture develops that, in quite a contradictory manner, favours a logic of capitalist domination. The linguistic (an English vulgate) and normative (individualism, libertarian and creative values) impositions thus collide head-on with emerging countries which, while claiming access to the resource, are suspicious of the models of society put forward on the network.
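The service logic quoted above is often formalised, though not in this book, by Metcalfe’s law: a standard (if contested) approximation that values a network by its number of potential pairwise connections,

$$V(n) \;\propto\; \binom{n}{2} = \frac{n(n-1)}{2}$$

so that doubling the user base roughly quadruples the number of possible connections. This is the feedback loop through which extension feeds a monopolistic position.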

This logic, which enables us to understand how the various layers of technical standards and communication on the Internet are structured and organised, is confronted with another logic that it has in fact helped to create. As a technical device, the Internet is the result of research produced in a specific environment: “the context of the American counter-culture and the meritocratic spirit of the academic world” [Cardon, 2010, 13]. Thus, whereas the history of the design of the Internet is complex and dependent on numerous civil and military actors that were sometimes reluctant to promote its participatory and democratic dimensions, the fact remains that the pioneering spirit persists. According to Cardon, it presents three main characteristic features:

  • nobody shall be able to appropriate or control it completely (even though we accept that certain States, such as China for example, manage to implement modes of content censorship that are relatively effective);
  • the Internet is based on the valuing of a culture of exchange and cooperation between equals. The new economy, as we have seen, takes advantage of this feature: the interest of services and applications lies in mass contribution;
  • the architecture of the Internet is designed in such a way that its expansion does not evolve from a technical “centre” – as is the case for dial-up telephony – but from its borders. The expansion of the network, its connectivity and its interoperability depend on the capacity of its users to innovate at its periphery. It therefore encourages the bottom-up innovation described by Von Hippel [Von Hippel, 2005].
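The end-to-end property described in this last point can be illustrated by a deliberately trivial sketch (hypothetical port and protocol, not an example from this book): any machine at the edge can deploy a brand-new service over the existing transport layer, without requesting any change to the network’s core.

```python
import socketserver

class EchoHandler(socketserver.StreamRequestHandler):
    """A new 'protocol' invented at the network's edge: the core
    only ever sees ordinary TCP traffic and needs no modification."""
    def handle(self):
        line = self.rfile.readline()
        self.wfile.write(b"echo: " + line)

if __name__ == "__main__":
    # Hypothetical port, chosen for illustration only.
    with socketserver.TCPServer(("0.0.0.0", 9999), EchoHandler) as server:
        server.serve_forever()
```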

The rapid massification of the Internet and its takeover by business models, which as we have seen favoured forms of economic domination during the last decade of the 20th century, have not completely obliterated the ideals of decentralised cooperation and the logic of individual expression that are written into its architecture and its “code”. Indeed, the global geo-political context combines with the flow of democratic paradigms emanating from John Rawls’s principles of justice and from Habermas’ dialogic democracy to reactivate the spirit endorsed by the Internet pioneers. Amartya Sen [Sen, 1999] summarises well this school of thought, which makes exchange and discussion the essence of the democratic spirit, a common good for all the civilisations of the planet. In this respect, the principles of sharing and cooperation that are an integral part of the architecture of the Internet are claimed by all those who, subject to economic domination by the former world powers that helped shape the Internet, now wish to take advantage of its potential for development without, however, submitting to their cultural diktat.

These principles of justice, which develop via the proceduralisation of discussion and normative exchange, can therefore be found at the heart of many practices, as much in the normalisation of Internet standards as in the uses encouraged by communities of practice, in open-source software for example. In a way that sometimes appears somewhat erratic, incomplete and performative, Internet governance procedures seek to put into practice this ideal of discussion that favours the inclusion of new actors from civil society, in order to ensure a network development that is more equal and fair. Even if it remains a normative horizon, the ideal deliberation upheld by Rawls and Habermas lies nonetheless at the heart of many Internet governance organisations. The contributions from Claudia Padovani and from Romain Badouard, Francesca Musiani, Cécile Méadel and Laurence Monnoyer-Smith, while recognising its limitations, nevertheless analyse the positive role of this ideal in the design of procedures for adopting standards within the IGF and other institutions that regulate the Internet. Observation shows that their statutes and their modes of operation have clearly borrowed from the major principles of deliberative democracy, namely: inclusion, transparency, freedom of expression, equality of participants, publicity, and responsiveness (in other words, the link between deliberation and decision). Although the practical functioning of these institutions calls for serious nuances regarding the empirical possibility of seeing these ideals achieved, the fact remains that we cannot underestimate the weight of this “deliberative imperative” [Blondiaux & Sintomer, 2002] in the functioning of Internet governance institutions.

On a more micro level, that of applications and software, this spirit of the Internet pioneers takes form and self-regulates according to a logic that is close to the proceduralisation advocated by the adepts of deliberative democracy. In his chapter, Dominique Cardon describes the complex organisation that governs the functioning of Wikipedia: the five founding principles are norms designed to facilitate the proper execution of a codified, and incidentally quite dense, procedure whose purpose is to ensure that the free and anarchic participation of all does not lead to the re-creation of logics of domination, of an academic nature for example.

However, participatory proceduralisation is not without a number of perverse effects. We refer to two of them here. The increase in the volume of procedures generates regulatory inflation: added to the norms and standards produced by the activity of participants is an incremental increase in associated rules (as with Wikipedia), whose purpose is to ensure that the principle of equal discussion is respected. All communities therefore end up rigidly structuring themselves around rules of operation, and the violation of these rules can lead to major crises, as was the case for the Debian community in 2009, when the release team, a sort of executive committee of the community, announced without consultation the biannual freezing of new versions of its Linux distribution.

Paradoxically, this regulatory inflation subsequently contributes to deterring potential members from joining the process, due to the high associated costs (cognitive or real). It is indeed one thing to have to meet criteria of competence in order to join an organisation (as is the case for W3C and ISOC, for example); it is another to be eligible but put off by the formal complexity of that organisation’s functioning. Especially since, as was the case with Wikipedia before the enactment of the Wikilove principle, the incapacity to master these rules leads to the disqualification of the content of those less practised with the device. We can understand, then, that the difficulty of the exercise lies precisely in the necessity of avoiding the transformation of rules that apply democratic principles into procedural lock-ins that undermine those very foundations.

Internet governance can therefore be approached as the result of a work-in-progress that juggles between models inherited from World War II and diplomatic tradition, business models responding to a logic of market dominance that exploits the infrastructures of the information superhighways, and axiological normative models that place value on a democratic ideal whose content is interpreted differently in different cultural traditions. The result is a real complexity in which the winner is not, in fine, guaranteed to be the citizen user.

Following this investigation into the general norms and principles challenged by the Internet, we will pause at three specific points in the search for normative models, which is a way of approaching the question on a plural and local level; it is an approach that assumes that the singular nature of the Internet overlays environments that cannot necessarily be aggregated. The search for normativity(ies) concentrates on three areas: the investigation of appropriate and specific institutional formats, the implementation of governing capabilities in online communities, and the reformatting of governing procedures by the architecture itself.

NEW INSTITUTIONAL FORMS IN INTERNET GOVERNANCE

The specific way in which the Internet is constructed – its multitude of actors, its globalisation of powers, its exponential growth, the relative autonomy of its technical spheres – has led to a proliferation of organisations that define, each according to its own idiosyncrasies, technical, legal and social norms; in other words, organisations that exercise or aim to exercise a form of governance of the Internet. The term “governance” is an appropriate designation for this alternative approach to a political organisation that must enable the management of differing and divergent interests, without being able or willing to defer to States or supranational organisations. This form of co-regulation does not however imply that the latter no longer exercise their authority, but simply that they compose, share and delegate [Marzouki & Méadel, 2004]. The actions of States, supranational organisations and private firms remain pivotal to certain issues that affect the Internet, as evidenced for example by the multitude of debates and pieces of legislation surrounding the responsibility of web hosts. But such action proves ineffective for global regulation, due to the worldwide nature of the network, the power of a few economic actors, and the independence of stakeholders. The role of public powers is henceforth exercised in competition, in confrontation or – at best – in complementarity with new organisational forms, which intervene to set new technical norms, explain the legal or quasi-legal rules, secure forms of co-habitation, or define the specifics of the architecture, etc. These co-regulation organisations participate in the rise of the “poles of normativity”, to employ the words of the legal expert Pierre Trudel [2000 & 2002], that have been observed on the Internet:

“Cyberspace is also made of mediators through which normativity and its consequences clarify themselves and spread out. Rules emanating from these ‘poles of normativity’ relay each other and spread out in various virtual spaces. They coexist in the cyberspace either complementarily with other rules or in competition with them, offering an alternative to those emanating from other poles of normativity”.

Institutions, groups and organisations, each with diverse statutes, make up hybrid “arenas” whose rules and productions challenge the existing order, without their norms and processes necessarily being explicit.

Moreover, we have already mentioned that the phenomenon is by no means limited to the Internet. Against a background of challenges to expertise and sovereign power, and of the rise of the deliberative imperative [Blondiaux & Sintomer, 2002; Monnoyer-Smith, 2006 & 2011], these attempts at innovative institutions, which keep State power at a distance while taking over some of its prerogatives, are on the increase [Rosanvallon, 2006]. Within these organisations, public powers, States, local communities, administrations, supranational institutions, corporate bodies, etc., are present and delegate, to varying degrees and in various ways, the functions of expertise, coordination between actors, the building up of competencies, and public awareness [Frydman, 2004]. The distribution of power between private and public actors is therefore blurred. This can be seen in the application of environmental regulation [Beck, 1992] surrounding climate change, with the Intergovernmental Panel on Climate Change.

Why, then, the particular example of the Internet? As demonstrated by Mireille Delmas-Marty, the Internet disrupts the rule of law in a very particular way, by raising conflicts between legality and jurisdiction, values and freedom. This disruption of the law, coupled with the globalised nature of the network, makes the Internet an ideal environment in which to explore and experiment with new regulatory arenas where norms are established. Organisations that are neither private nor public have emerged and have taken on (whether they conceive it or receive it) the function of co-regulation of the Internet. These are extremely heterogeneous in various respects: their objective, their size, their authority, their “number of divisions”, their articulation with other agencies and public authorities. Some of them are iconic: ICANN and its control of the entire naming system [Froomkin, 2003], the former Forum of Internet Rights (Forum des Droits sur l’Internet), which aimed to establish a national forum for debate in France [Marzouki & Méadel, 2006], the W3C, involved in developing technical standards for the web [Russell, 2003], or the Creative Commons community and its format for regulating intellectual property rights [Bourcier, Casanovas, Dulong de Rosnay & Maracke, 2010] over and above national or international legislation [Dusollier, 2006]… Others are less well known, but they all have the capacity to exercise forms of localised sovereignty, within a geographical space, a technical norm, a particular jurisdiction. These arenas presuppose a specific articulation between local governance and a globalised environment which they cannot necessarily master. Finally, private orders can replace or be added to public orders.

Co-regulation may be challenged by the actors if they believe that its legitimacy is open to question: the legitimacy of its institutions, but also that of its monitoring procedures, its operating rules, and its selection and designation of members. This was largely the case for ICANN [Antonova, 2008; Weinberg, 2000]; it is also true when control is deemed to have been taken by one category of actors whose only warrant is themselves, their seniority in the field, or a representativeness whose basis does not appear established [Massit-Folléa, 2008b]. Victor Pickard [2007], referring to the WSIS and “its self-legitimating rhetoric and self-congratulation regarding its own process”, also recognises this trend as a process of legitimisation of corporations in a neo-liberal state.

HOW TO MANAGE COMMON DIGITAL GOODS?

These critical approaches, although fundamentally legitimate, remain slightly outside their subject: the “social reason” of these hybrid organisations is in keeping with their image; it defines itself as and when these institutions inscribe their actions, and it is continuously updated in their processes. In the case of ICANN, one could go so far as to say that its incompleteness has been the historical condition of its success, i.e. of its capacity to be recognised, despite repeated objections and protests, as the dominant regulating body for domain names. A poorly defined field of focus and constitutive modes of operation can bring these hybrid organisations into competition with each other; the example cited here by Herbert Burkert (Chapter 3) is the competition between the ITU and ICANN. Rather than adding to the large body of texts that seek to discuss and determine their areas of authority, which are often difficult to define, or their constitution, which is never set in stone, Chapter 4 examines their formal procedures. These institutional forms, which the authors refer to as “arrangements” in reference to Foucault, should indeed be considered both in terms of their mode of organisation and of their outcomes.

Which model is best suited to managing a digital common good? The question has often been raised regarding ICANN, as we have said, and the responses tend either to lament its democratic guise – in the ICANN At-Large elections [Massit-Folléa, 2002] as much as in the thin powers attributed to its Governmental Advisory Committee (GAC) – or simply to underline its undeniable subjection to the United States government. Others applaud its capacity to impose an order that is all the more credible for being correlated first (though not exclusively) to the technical architecture of the network, before being correlated to compromises and negotiations between national policies, as international organisations are. Like many other forums whose point of focus is their technical vocation, ICANN implements its decisions by the now well-described process of rough consensus, which is reputed to enable an operational functioning pragmatically named running code (both notions inherited from the technical community). In this collaborative process of exchange between experts who “roughly” agree, no recognition other than effectiveness and the absence of dissent is required, and no overseeing authority is mobilised. Applied to the regulation of domain names, this model has led to a form of capture of common goods: the UDRP (Uniform Domain-Name Dispute-Resolution Policy) has systematically favoured a category of actors connected to trademarks (the trademark community); inequality between Internet users was not written into the general rules, but resulted from the practical organisation of the dispute resolution procedures put in place [Geist, 2002; Komaitis, 2010; Méadel & Marzouki, 2004].

Nothing obliges the stakeholders involved, even if they hold a part of the common good, to subject themselves to the terms and rules of representative democracy – which, incidentally, they sometimes even reject – without however adopting a hierarchical model. This is true of the “governance” (which does not call itself such but operates as such, as Dominique Cardon demonstrates) of Wikipedia, or that of electronic communities of patients [Akrich & Méadel, 2007]. Authority is not delegated; it appears as the result of work rather than as its prerequisite. The majority of conflicts are managed locally by reference to the rule; the punishment of individuals remains the exception compared with the correction of wrong behaviour, and falls to a central authority whose intervention remains exceptional. On Wikipedia, monitoring is decentralised, the responsibility of all, and constantly invoked by each participant [Levrel, 2006]. Rough consensus, the independence or individualism of actors, procedural devices, etc., all make it possible to avoid the appointment of representatives and recourse to a vote. In this universe, which bears no overriding principle capable of arbitrating, how then is conflict between parties regulated? Doesn’t the multitude of institutions, norms and groups of stakeholders lead to a multitude of normativities and modes of regulation? This is the sense of an article published by three legal experts [Johnson, Crawford & Palfrey Jr, 2004] under the heading The Accountable Net, which promotes a “peer production of Internet governance”, a form of regulation distributed among the end actors. Or is it the notion of governance that gives new meaning to the range of issues raised by community life on the Internet and enables a common policy to be defined, by means of its imperatives and institutional requirements? This is the view of Milton Mueller [Mueller, 2010], and also that of Chris Marsden [Marsden, 2010], who advocates a light governance, not unifying as such but employing partial solutions, while always pursuing the shared goals of transparency and investigation.

Thus, with the multitude of Internet governance arenas, their lack of hierarchy and articulation, and the plurality of their principles of authority, the quest for a global Internet governance appears less a utopian ideal than a contradiction in terms. As Mireille Delmas-Marty explains, this should not lead to the rejection of all forms of regulation, but rather to the search for an “orderly pluralism”; in other words, to the construction and maintenance of devices (or “arenas” in our vocabulary) that are compatible not only with the various national orders, but also, and perhaps more profoundly, with the various registers of the Internet (registers of use, architecture, policies, etc.). Paul Mathias also advocates this pluralism and emphasises the necessity of diversity in terms of regulations, access, governing principles and regulatory agencies. The capacity to adapt to the overall complexity of the Internet system, and to the rapid evolution of its features, its uses and its standards, therefore requires that we abandon the quest for one unifying principle capable of ordering the different worlds involved.

THE REGULATION OF ONLINE COMMUNITIES

If the plurality of norms and modes of governance is obvious in terms of the arenas of Internet regulation, it is hardly documented when it comes to electronic communities. Their “governmentality”, to borrow Foucault’s terminology [1994], appears almost natural, probably because research has concentrated more on their social impact, their specific production or their “virtual” ethnology. This unawareness of the issue also stems from the fact that it is difficult to agree on a common definition of these communities, because it is hard to identify what brings them together [Preece, 2000] – apart from the fact that they remotely gather, around a common interest, a shared task or a shared condition, contributors who may be connected by only very tenuous links. And yet there is good reason to examine the forms of regulation put in place, with varying degrees of success, by the communities themselves, over and above the existing laws, technical norms and contracts imposed on them by outside authorities. Indeed it appears that, even in those communities whose purpose seems to be to transgress the law (such as traders in sex pictures [Slater, 2002]), a social order is set in place that necessitates formalised rules and forms of internal policing.

Organising exchanges, defining members and their acceptance or exclusion, fixing the rules of participation, resolving inevitable conflicts, managing the group’s production and meeting all the conditions necessary to ensure that no-one feels excluded: all these tasks, and the many more that enable the group to thrive, develop and produce, result from norms that are forged both by the group’s practices and by the digital tools that are entrusted, to a certain extent at least, with a proportion of these tasks. The incorporation of political and social principles into technical devices is simultaneously the result of collective activity and the prerequisite that enables it to occur. This is true, for example, of Wikipedia, which continues to develop through progressive adjustments between the encyclopaedist ambition and the technical device: what is at work is not, therefore, contrary to approaches in line with Winner [1986], the autonomous power of a technology acting directly on its users, but rather a mutual transformation in which technological devices and users perform one another reciprocally.

Since these groups are formed online and do not necessarily have a pre-existing identity, it follows that their regulation can only come from the activities of the group itself and its technical artefacts; it is therefore defined as the result of collective and mutual learning. Hence the interest of looking into how these groups define, implement and update their norms. Certain features deserve particular scrutiny. As with Internet governance organisations, we can see a recurrent refusal of delegation and a proceduralisation of the organisation of work. Some research points to the non-hierarchical functioning of these groups, their preference for consensus over traditional processes of associative democracy, the extensive efforts of animation and control necessary for the group to thrive, and the sophisticated nature of the tools put in place to ensure a capacity for collective sanction acceptable to a group that rejects top-down operations [Akrich & Méadel, 2007 & 2012]. One could also underline their high tolerance of varying degrees of member engagement, the community accepting both those who invest intensely and those whose investment is minimal. Hence the recurring debates over the reasons why “parasites” or “free riders”, who bring nothing to the group but reap its benefits, are considered acceptable or even useful. This is read by some as a gift/counter-gift economy (cf. Mathias, infra) or as a symbolic economy that allows people to safeguard their position, to save face [Auray, forthcoming].

These observations may lead to questions regarding the development of online communities: are we headed towards the creation of empowered communities, detached from the offline world, conducting their business in a totally independent manner, self-regulating, and without recourse to shared normative principles? Is the world of online communities properly ungovernable? The many concerns that arise from the Internet and its communities – consider, for example, the vast amount of critical writing surrounding social networks, their derivatives and their alleged wrongdoings – might point in this direction. However, as will be observed in the present work, such assumptions are based on the idea of an autonomy of the online world that nothing allows us to presuppose: even in the most autonomous of communities, it remains impossible to draw the boundaries between the standards and practices of online communities and their inscription, however partial, however limited, in existing standards and architectures, in offline activities and practices.

INTERNET ARCHITECTURE AND TECHNOLOGIES

As we have said, the standards created by organisational innovations or specific communities are also written into what Lawrence Lessig calls architectures: the technical artefacts or black boxes that contain, for a time at least, the organisational terms of Internet practices. Information and communication technologies (ICT) can reconfigure the devices of control and authority: they equip political processes (for example by enabling new deliberative formats or offering greater organisational capacity…); and, as demonstrated by Lessig, they allow the implementation of standards in which practices will be embedded, at least until new uses, new technical devices or new regulations come to challenge them. As such, technical standards can be strong agents of centralisation or even monopoly. However, for some time now they have no longer been the preserve of States, ever since private firms and technicians intervened and obtained recognition of their role from standardisation authorities (see the ITU). In terms of the Internet, the latter are themselves contested and no longer control the production of Internet standards, a production now further complicated by the tangle between States, private businesses, international authorities and, above all, expert communities. Moreover, the law is not always in a position to dictate its rules to the technical standard. Thus, DRM (digital rights management) technologies that were supposed to protect the actors, authors as well as users, have, as demonstrated by Séverine Dusollier [2008], dictated to the law the limits of the authorisations that it delivers: the opportunities left to the user have been defined according to the configuration of technical tools, and not according to the principles of law that regulate the exchange of cultural content and intellectual property rights. Thus, the architecture can write the law into the artefacts, but it can also deflect and reformat it. In the words of Mireille Delmas-Marty, the architecture becomes the “structuring force of the law”.
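To make this point concrete, here is a minimal sketch, entirely hypothetical and not drawn from any actual DRM product, of how an artefact’s configuration, rather than a legal principle, ends up defining what a user may do: a copy permitted by a legal exception is simply impossible if the device does not implement it.

```python
from dataclasses import dataclass

@dataclass
class LicensedWork:
    """Hypothetical DRM-style wrapper: the artefact, not the law, sets the limits."""
    title: str
    plays_allowed: int     # usage cap hard-coded by the vendor
    copying_enabled: bool  # a legal private-copy exception is irrelevant if False

    def play(self) -> str:
        if self.plays_allowed <= 0:
            return f"{self.title}: playback refused by the device"
        self.plays_allowed -= 1
        return f"{self.title}: playing ({self.plays_allowed} plays left)"

    def copy(self) -> str:
        # Even where the law grants a private-copy exception,
        # the technical configuration silently overrides it.
        if not self.copying_enabled:
            return f"{self.title}: copy refused by the device"
        return f"{self.title}: copy made"

work = LicensedWork("Some Film", plays_allowed=2, copying_enabled=False)
print(work.play())  # allowed, because the vendor's counter permits it
print(work.copy())  # refused, whatever the legal exception says
```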

But in this instance, “structuring” depends on differentiated formats and principles, specific to certain applications, to certain worlds. Is this disparity of models, so productive in the history of the Internet [Mounier, 2002], now compromised? This is the view held by some, who see the emergence of a generalised client-server model (cf. Boullier, infra). We are therefore witnessing the capture of the capacities of Internet users by a number of private firms, encouraged by the comparative advantages of centralised systems.

However, two remarks tend to temper this concern. First of all, the history of the Internet is paved with risk: dominant players have lost at a stroke what once made them a mandatory passage point; others have won, just as suddenly, a dominant position by capturing a maximum number of users. Secondly, we see the development of multiple projects and applications that, alongside these strong incentives towards centralisation, explore or revive the peer-to-peer model, whether in the delegated form of cloud computing or, in a more decentralised vein, in architectures that rethink the role of the server and distribute it [Musiani, 2010].

Finally, in this environment of uncertainty, the combination of normative experiences that the Internet is so clearly enabling could be the starting point for framing a democratic Internet politics, by avoiding the sterile controversy between the exclusive prerogatives of States and the promotion of non-governmental stakeholders in decision-making. It demonstrates the complex combination of principles, practices and values that are currently at stake in the regulation of the Internet.

Internet governance, as we have observed, is at once an opaque managerial system, a place of techno-economic confrontation and a field for socio-political experimentation. If its future consists in establishing a technical coordination that is both effective and fair within an internationally accepted legal framework, this will probably require accepting the radical complexity of the norms and avoiding a monolithic and definitive instrumental response in terms of power.

To conclude, our work highlights, on the one hand, the expression of a radical tension between the principle of creation and the principle of conservation, which spills over beyond the traditional notions of the normativity of the rule of law and the political control of actions; and on the other hand, the need to re-politicise the concept of governance, so that it is no longer viewed as the limp convergence of unequal groups of interests but as a dynamic collective articulation of its normative pluralism.

PRESENTATION OF THE CHAPTERS

The chapters in this book were compiled and revised following presentations and discussions that took place during three scientific events organised or co-organised by the Vox Internet research programme:

  • Joint workshop with the “Innovation and Regulation of Digital Services” Chair (Orange – École Polytechnique – ParisTech), theme of discussion: Technical Regulation of the Internet: From Standardisation to Behavioural and Societal Norms (Paris, 31 March 2009) [14];
  • European seminar “Internet Governance: Transparency, Trust and Tools”, theme of discussion: Forms of Execution and Institutionalisation of Internet Governance (Paris, 12 June 2009) [15];
  • International symposium closing the Vox Internet programme, theme of discussion: Internet Governance and the Dynamics of Commons: The Issue of Access (Paris, 26-27 March 2010) [16].

The questions and issues addressed by this book pertain to three main sectors: technical standardisation, the management of cyberspace, and actors’ behaviour vis-à-vis the rule of law. Between a global regime and specific arrangements, so-called Internet governance is a form of techno-political regulation suffering from a democratic deficit that is all the more obvious in that multiple devices are intended to reduce it. Standards are born in private places. State regulations are erratic. The creation of fora (Forum des Droits sur l’Internet, Internet Governance Forum…) is one of the means by which civil society is supposed to exert an influence on the development of norms and decision-making. However, an empirical examination of these organisations and their operations reveals important gaps: don’t these authorities ultimately legitimise existing techno-political practices and ignore or neutralise some of the dynamics of collective action? At the same time, technical choices incorporate values and interests that can be contradictory.

Pointing, on one side, to the claim of an anarchic cyberspace and, on the other, to a proliferation of laws quickly circumvented by technical innovation, many scholars consider the Internet a threat to any kind of regulation. In the foreword, Mireille Delmas-Marty argues that these analyses, however partly founded, miss the regulatory potential of the Internet itself. This is a core element of the following chapters: an unprecedented process of norm creation is currently underway for the Internet.

Technical infrastructures and information exchanges with strong social and economic value give rise, at the global level, to a confrontation between political and cultural systems of reference, inasmuch as they transcend national boundaries and question national juridical systems. This is Paul Mathias’s approach to the Code vs Law dilemma (Chapter 1).

Taking into account the strong impact of technical standardisation, Françoise Massit-Folléa studies the ways in which the rupture with ISO-like intergovernmental mechanisms takes place. The case of the domain name system illustrates the different sources and limits of functional legitimacy, in terms of both the creation and the adoption of standards (Chapter 2).

The search for an articulation between different legal regimes (hard law, soft law, common law) and between formal and informal instruments is far from evident. As Herbert Burkert tells us in Chapter 3, looking for a unique institutional tool is the wrong way to go.

Chapter 4 (by Romain Badouard, Cécile Méadel, Laurence Monnoyer-Smith & Francesca Musiani), selecting several organisations involved in Internet governance, analyses their respective criteria for normative production and makes an attempt at modelisation.

Given this multipolarity, normative reflection then engages with crucial questions: how are the interests of multiple, different actors to be harmonised? What deliberation, decision and institutionalisation processes are put in place? In Chapter 5, Claudia Padovani applies this questioning to a hybrid arena, the Internet Governance Forum.

Bernhard Rieder deals with another source of complexity in Chapter 6: “liberal/libertarian” discourses and practices that previously pleaded for the autonomy of cyberspace and currently put at the forefront its pronounced specificity in terms of self-regulation. This generates hostility towards any reinforcement of the State, in the traditional sense of “statutory power”.

But this does not mean an absence of power altogether. If we want to answer questions such as “How does the frontier between individual expression and the constitution of collective actors evolve?” or “What forms of autonomy, responsibility and solidarity are enacted in the invisible universe of networks?”, we need to go back to the practical arrangements.

In Chapter 7, Dominique Boullier argues for preserving diversity in the architecture of the Internet as a pre-condition for keeping users from being captured by conventional private devices that appear to them as familiar and unquestionable universes.

And Dominique Cardon demonstrates, with the Wikipedia case, that we must take seriously the emergence of a “capillary power” producing a kind of community governance within a benevolent and creative process of mutual surveillance (Chapter 8).

At the crossroads of the chapters, we consider: i) the problems arising from the idea of a global regime for the Internet and the sectorial realities of its management; ii) the unveiling of the prerequisites that have underpinned Internet governance since the nineties, and the conditions that would make agreement possible between market-related aspects, public policies and the requirements of the “digital common good”; and iii) the organisation of normative pluralism in contextual devices in which ethics and policies are brought up to date.

Notes

1 See [Lascoumes & Le Galès, 2004] and [Campbell, Crépeau & Lamarche, 2000].

2 The Internet Assigned Numbers Authority (IANA) is the body responsible for coordinating some of the key elements that keep the Internet running smoothly. Whilst the Internet is renowned for being a worldwide network free from central coordination, some key parts of the Internet are globally coordinated – and this coordination role is undertaken by IANA. Specifically, IANA allocates and maintains the unique codes and numbering systems that are used in the technical standards (“protocols”) that drive the Internet. As regards the Domain Name System, IANA manages the DNS root and the .int and .arpa domains. It is currently operated by the Internet Corporation for Assigned Names and Numbers (ICANN), a not-for-profit Californian organization linked to the Department of Commerce of the US Government (source: IANA website).

3 http://www.itu.int/wsis/docs2/tunis/off/6rev1.html

4 Expression taken from Mireille Delmas-Marty (see Foreword).

5 The programme was organised, under the scientific responsibility of Françoise Massit-Folléa, according to a multidisciplinary framework, as a network, on a European scale. From 2006 to 2010, it received the support of the French “Agence Nationale de la Recherche”, under the ministry for higher education and research.

6 See http://www.voxInternet.org/spip.php?rubrique12&lang=en.

7 Ibid.

8 This apt expression comes from Meryem Marzouki, co-founder of GigaNet (Global Internet Governance Academic Network).

9 [Grimmelmann, 2010]: see details at the end of Chapter 2.

10 In the sense of [Deleuze, 1988].

11 Without the limits of this exploitation always being negotiated by international agencies, as is obvious from the difficult negotiations currently underway regarding gas and oil reserves in the Arctic.

12 In March 1993, the Telecommunications Council of Japan recommended urgent measures to facilitate the development of information highways so that by 2010 all homes would have access to the Internet. In January 1994, the European Commission published a white paper which placed this development at the heart of its strategy for employment. In June 1994, the Bangemann Report set out EU economic policies for the next century based on the development of ICT infrastructures, public-private partnership and the creation of new services and applications.

13 Facebook, for instance, only presents an interest to users if they are certain to find connections there. Similarly, the services provided by Amazon are based on an analysis of clients’ purchasing histories (those who bought this book also bought this CD…).

14 Co-leader: Pierre-Jean Benghozi. See http://innovation-regulation2.telecom-paristech.fr/.

15 Co-leaders: Philippe Goujon & Sylvain Lavelle. See http://www.info.fundp.ac.be/IG3T.

16 See http://www.voxInternet.org/spip.php?article351&lang=en.

© Presses des Mines, 2012
