

What’s in a password?

Decentralized data storage and privacy by design

Francesca Musiani

Abstract

This chapter builds on the analysis of a “peer-to-peer cloud” storage service to discuss how changes in the architectural design of networked services affect the circulation, storage and protection of data and, in doing so, reconfigure the articulation of local and global functionalities in the network. In particular, the chapter focuses on the role of the user-generated password in the system. Stored locally, in the user’s P2P client, the password becomes a form of disengagement of the service provider from security issues: a detail whose importance may seem minimal at first, but which leads to a redefinition of privacy by design, to changes in the forms of technical solidarity created within the system and, finally, to a reformulation and “re-balancing” of the relationship between the user and the service provider.


Introduction


Early 2007. The industry of online data storage—a service allowing users to store, save and share data on one or several terminals connected to the Internet—has “never felt better.”1 Google, Amazon, Microsoft and Oracle, to name but a few, propose their storage platforms, each with its specificities and one common denominator: the “cloud.” According to this model, the service provider is in charge of both the physical infrastructure and the software. Thus, it hosts applications and data at once—in a location, and according to modalities, unknown or at best ambiguous for the user.2 The so-called “server farms” proliferate, to support and manage this increasing remoteness of data from users and users’ terminals.


In this context, Drizzle,3 a small Swiss start-up founded by Dietrich Poell and Kurt Gueden, both graduates of the École Polytechnique Fédérale de Lausanne, makes an unusual foundational choice: its cloud storage platform will mainly be composed—alongside more “classical” data centres—of portions of the users’ hard disks, directly linked in a peer-to-peer, decentralized network architecture.4 The “communitarian” model subtending the platform will allow its users to barter their local disk space for an equivalent space in the decentralized cloud, thereby improving the quality of this storage space, which will become permanently available and accessible. In shaping their decentralized storage service, the developers of Drizzle carry out a twofold experiment: with the frontier between centralization and decentralization, and with sharing modalities that blend peer-to-peer with social networking.


Building on the analysis of Drizzle’s “peer-to-peer cloud,” this chapter discusses how changes in the architectural design of networked services affect data circulation, storage and privacy—and in doing so, reconfigure the articulation of the ‘locality’ and the ‘centrality’ in the network.5 Ultimately, we will see how decentralizing the cloud leads to a reformulation and “re-balancing” of the relationship between the user and the service provider. The local, client-side encryption of data first, and their fragmentation afterwards—both operations conducted within the P2P client installed by the user, and entirely on his terminal—are proposed by Drizzle as evidence that the firm, in its own words, “does not even have the technical means” to betray the trust of users. In particular, this conception of privacy by design takes shape around two entities, the password and an encryption-enabling “friendship key.” The chapter addresses the negotiations leading to the development of these two features and, in particular, it examines how the password, which remains locally stored in the user’s P2P client, becomes a form of disengagement of the service provider with respect to security issues: a detail whose importance may seem small at first, but eventually leads to changes in the forms of technical solidarity6 established between users and provider.

Approach: Infrastructure studies meet privacy by design


“Study an information system and neglect its standards, wires, and settings, and you miss equally essential aspects of aesthetics, justice, and change,” once wrote science and technology studies (STS) scholar Susan Leigh Star.7 Indeed, the history of internet innovation suggests that the shaping of the technical architectures populating the network of networks is, in the words of philosopher Bruno Latour, ‘politics by other means.’8 This chapter situates itself within, and contributes to, the interdisciplinary body of work that seeks to investigate how, in networked services and media, architecture is politics, protocols are law, and code shapes rights. In doing so, it seeks to build bridges between approaches grounded in STS—infrastructure studies in particular—and the emerging studies of privacy-by-design approaches, in which computer science and legal scholarship feature far more prominently than the social sciences.


The relationship between the design of technical architecture for networked media and the shaping of law and rights has been an increasingly central interdisciplinary preoccupation since the late 1990s and early 2000s. Early uses of the metaphor ‘code is law’ can be found in William Mitchell’s City of Bits9 and in Joel Reidenberg’s article on lex informatica, the formation of information policy rules through technology.10 However, legal scholars Yochai Benkler and Lawrence Lessig have arguably been the ‘scene-setters’ in this field, with their work on sharing as a paradigm of economic production in its own right11 and on technical architecture as politics,12 respectively. While the former argued for the rise of a ‘networked information economy’ as a system of ‘production, distribution, and consumption of information goods characterized by decentralized individual action carried out through widely distributed, nonmarket means,’13 the latter introduced technical architecture as one of the four main (and interconnected) regulators of society, the other three being law, the market and norms. The application of this principle to the text of computer programs led to what remains, perhaps, the best-known incarnation of the ‘code is law’ label.14


Drawing on Star’s “call to study boring things”—the idea that architectural design choices, technical specifications, standards and number sequences are no less important to the study of information systems because they are “hidden mechanisms subtending those processes more familiar to social scientists”15—emergent bodies of work such as software studies, critical code studies and cyberinfrastructure studies16 seek to balance “the deployment of critical terms like ‘virtuality’ […with] a commitment to meticulous documentary research to recover and stabilize the material traces of new media.”17


One can easily see how this body of work may be suitable to address the increasing relevance of so-called privacy-by-design (PbD) approaches.18 These approaches establish that privacy protection should be taken into account and “embedded” into the whole engineering process, and adhered to throughout the entire life cycle of a particular technology, so as to offer privacy “by technical design” rather than “by policy.” The protection of privacy is built into the technology, incorporating a technical device of legal protection into the design of Internet services. While the concept and the very ontology of PbD are hotly debated, objects, markets, and economic realities are beginning to take shape around this concept, eliciting the interest and scrutiny of national, supranational, and international regulatory authorities. In Canada, largely due to the work of Ann Cavoukian as the Information and Privacy Commissioner of Ontario, PbD has been proposed for mandatory integration into those ICT and security technologies, such as video surveillance, which are based on the collection, analysis, and exchange of personal data. In Europe, data protection by technical design has been incorporated into the prospective General Data Protection Regulation. In short, technical infrastructures and architectures are increasingly being recognized as a locus of privacy enactment, control and protection, one that brings to the foreground the agency of different actors—from users to service providers—in shaping and providing particular definitions of privacy.

Methodology


This work blends internet studies with science and technology studies (STS)—with particular attention to the methods that Star19 has described as the ‘ethnography of infrastructure.’ This blend of qualitative methods has proven fruitful for tracing the “ballet between programmers, software and users”20 that builds decentralization into internet-based services. The underlying hypothesis of this method is that the ‘lower layers’ of a networked system have, or may have, consequences for the purpose that the system serves, the dynamics that are enacted within it, and the techno-legal procedures it entails. They thus contribute to shaping the present and future of Internet users’ rights, via the “sinews of power embedded in the architectures and infrastructures of the internet.”21 In the case of the present chapter, space constraints allow me to present only a very limited part of the ethnographic work, which spans three chapters of my book Nains sans géants22; the aim here is to give a ‘flavor’ of this ethnography, which was conducted (over a period of approximately a year and a half, in 2010–2011) via in-depth, semi-structured interviews with the members of the Drizzle developer team, consultation of working documents and observation of the Drizzle user forum.

Defining users via “encrypted fragments” of data


As both a technical experiment and an attempt to differentiate their commercial offer from the proliferation of cloud storage services provided by the “giants” of Internet-based services, Dietrich Poell and Kurt Gueden make the choice of decentralization as the defining feature of their storage system Drizzle, born in 2007. The storage platform of their creation will mainly be composed of portions of the users’ hard disks, directly linked in a peer-to-peer, decentralized network architecture. This choice entails—alongside other implications less closely related to this chapter’s focus, which we have addressed elsewhere23—the implementation of a technical feature defined by Dietrich as “encrypted fragmentation.”24 This feature consists in encrypting locally—on the user’s computer, and by means of a previously installed Drizzle P2P client—the content that is to be stored. The content will then be divided into fragments, duplicated to ensure redundancy, and spread out to the network.
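
To make the mechanism concrete, here is a minimal Python sketch of “encrypted fragmentation” as described above. It is not Drizzle’s implementation: the cipher (Fernet, from the third-party cryptography package), the fragment size and the use of plain replication in place of redundant encoding are illustrative assumptions only.

```python
# Minimal sketch of "encrypted fragmentation": encrypt on the user's machine,
# split the ciphertext into fragments, duplicate them for redundancy.
# Fernet, the fragment size and the replication factor are assumptions,
# not Drizzle's actual parameters (which the chapter does not document).
from cryptography.fernet import Fernet  # third-party package: cryptography

FRAGMENT_SIZE = 64 * 1024   # hypothetical fragment size (64 KiB)
REDUNDANCY = 3              # hypothetical replication factor

def encrypt_and_fragment(plaintext: bytes, key: bytes) -> list:
    """Encrypt locally, then split the ciphertext into fragments.

    A fragment retrieved on its own is just an opaque slice of ciphertext:
    it cannot be decrypted, executed or interpreted independently.
    """
    ciphertext = Fernet(key).encrypt(plaintext)          # client-side encryption
    return [ciphertext[i:i + FRAGMENT_SIZE]
            for i in range(0, len(ciphertext), FRAGMENT_SIZE)]

def replicate(fragments: list) -> list:
    """Duplicate each fragment so that losing some peers does not lose the file.

    Drizzle's actual scheme relied on redundant *encoding* (erasure-coding-like);
    plain replication is used here only to keep the sketch short.
    """
    return [(index, fragment)
            for index, fragment in enumerate(fragments)
            for _ in range(REDUNDANCY)]

# Everything below runs on the user's terminal, inside the P2P client.
key = Fernet.generate_key()
copies = replicate(encrypt_and_fragment(b"some file content", key))
# `copies` is what gets spread out to the network; `key` never leaves the client.
```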

The “encrypted fragments” resulting from this process will become a central feature of the system, moving beyond the technical process to incarnate social, political and quasi-juridical aspects. They become the very symbol of the P2P storage solution proposed by the firm, a solution on which the credibility and the specificity of the service rely to a large extent; the technical mechanism that ensures the security of data, not only from unauthorized users, but also from the firm itself (an aspect which we will come back to later in more detail); and the arena where the controversy between users, “trustful” or “doubtful” about the system’s capacity to fulfil its promise of confidentiality and security, will play out.

The developers of Drizzle program the functions of encryption and file fragmentation to run on every computer that, having installed the P2P client, has selected the “shared storage” option. Selecting this option is not mandatory: over its first months of existence, Drizzle was programmed to give users needing less than one gigabyte of storage the choice of obtaining it not from the network of connected users, but from the firm’s data centres; in this case, the encrypted fragments would not be stored on that specific user’s computer. However, this option—which ranks among the “precautionary measures” implemented by the firm during the bootstrapping phase—only meets with a very modest level of interest from pioneer users; as Dietrich underlines, “people who tried Wuala were interested in the P2P storage mechanism, and that involved the storage trading, thus, the encryption.” A specific profile of Drizzle’s pioneer user/client begins to emerge: someone with an intrinsic interest in the P2P storage mechanism, both its technical functioning and its community dimension—people who “follow” the system because they possess the technical capacity and expertise to do so, and see the technical features of decentralization as an added value for the different levels of control and responsibility they entail. The choice of decentralization is not that of delegation. This is how Dietrich explains the storage and sharing processes in the Drizzle system, on the occasion of a 2007 Google Tech Talk:


“Let’s say that Alice wants to store a file […] The first thing that happens when Alice drags the file into Wuala is that the file gets encrypted on her computer. Then the encrypted file is split into non-executable25 fragments, and these fragments are redundantly encoded into other fragments. Now let’s say that Alice wants to share the file with Bob. What Bob then tries to do is to find enough fragments, a subset of all fragments, from the P2P network. Then the application would decode, decrypt, and open the file.”


In this process, an interesting articulation of the local and global dimensions of the system takes shape, and contributes to defining the role of the user as envisaged by the system’s developers.26 Until the operations of encryption and fragmentation of files are concluded, the heart of the service is the user’s terminal: it is there that, by means of the P2P client, these operations take place. As Kurt states, “all encryption and decryption is performed locally—the advantage of having your software running on the client.” Once the data have taken the shape of the encrypted fragments, these are stored and shared, “spread out” to the network. As the shift from the local dimension of a single machine to the network of users takes place, the material location of the stored data (or rather, the multiple locations of these data) is constituted by sub-groups of domestic computers, forming networks that are constantly evolving and whose shape is temporary. The “global” dimension of the system is, then, not so much the network of computers on which Drizzle is installed as the algorithm that makes it possible to determine, at any moment, which computers are hosting the data of any particular user, at what time and for what duration.
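
The chapter does not describe the placement algorithm itself; the sketch below is only a hypothetical illustration of the idea that the “global” dimension is a deterministic rule which, given the set of currently online peers, decides where each fragment lives at a given moment. Rendezvous hashing is an assumption chosen for brevity, not a claim about Drizzle’s design.

```python
# Hypothetical placement rule: every client can recompute, from the current
# peer list alone, which machines host a given fragment right now.
import hashlib

def place_fragment(fragment_id: str, online_peers: list, copies: int = 3) -> list:
    """Pick which currently online peers host a fragment (rendezvous hashing)."""
    def score(peer: str) -> int:
        digest = hashlib.sha256(f"{fragment_id}:{peer}".encode()).digest()
        return int.from_bytes(digest, "big")
    return sorted(online_peers, key=score, reverse=True)[:copies]

# As peers join and leave, the same call yields a different, yet still
# deterministic, set of hosts: the temporary, evolving shape of the network.
print(place_fragment("file42/frag007", ["peerA", "peerB", "peerC", "peerD"]))
```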

At the same time, unless a sharing operation is explicitly initiated, each user does not know which fragments, at a specific moment, are stored on her own machine rather than on any other machine in the network. As a result, a file stored this way on the Drizzle network is effectively invisible to users, even if their machines are storing parts of it. Users have direct knowledge only of their agreement to pool the computational and material resources necessary for the encryption and fragmentation of files, as well as for their storage. The CPU capacity of their machines is mobilized, but they will not know for what particular purpose these resources are being used at any given time. Thus, since the first version of the system was released, Drizzle developers have thought it necessary to include in the system’s Terms of Use a very peculiar clause, clarifying what resources the system will be able to use and for what set of tasks—while not detailing which resource will serve which task at any specific moment:

“[…] Services of the User: The user acknowledges that Drizzle may use processor, bandwidth and hard disk (or other storage media) of his computer for the purpose of storing, encrypting, caching and serving data that has been stored in Drizzle by the user or any other users. The user can specify the extent to which local resources are used in the settings of the Drizzle client software. The amount of resources the user is allowed to use in Drizzle depends on the amount of local resources the user is contributing to Drizzle. Resources are allocated and monitored in accordance with the Privacy Policy.”
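
The clause leaves the concrete mechanics to “the settings of the Drizzle client software.” A hypothetical sketch of such user-set caps might look like the following; the setting names and default values are invented for illustration and are not Drizzle’s actual options.

```python
# Hypothetical client-side resource caps: the user decides how much of the
# local machine the rest of the network may use. Names and defaults are invented.
from dataclasses import dataclass

@dataclass
class ResourceLimits:
    shared_disk_gb: float = 10.0   # portion of the hard disk offered to the cloud
    max_upload_kbps: int = 256     # bandwidth ceiling for serving others' fragments
    max_cpu_percent: int = 20      # ceiling for encryption and caching work

    def allows(self, requested_disk_gb: float) -> bool:
        """The P2P client refuses any allocation beyond the user-set limit."""
        return requested_disk_gb <= self.shared_disk_gb

limits = ResourceLimits(shared_disk_gb=25.0)
print(limits.allows(30.0))   # False: the network cannot exceed the user's cap
```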

Peer-to-peer storage: cloud and social networking meet privacy by design


“In 2007, it was all starting to get social,” Dietrich recalls three years later. This reference to the explosion of social media—which, with Facebook and Twitter first and foremost, were at that moment entering the daily life of millions of Internet users in an increasingly pervasive way—situates Drizzle’s first steps within a context, and a community of research and development, that works on hybrid systems crossing P2P systems with social networks.27


In 2007, Facebook has been in existence for three years. Millions of users take part in it, and contribute to the massive success of these Web-based services that allow individuals to build a public or semi-public profile within a system, define a list of other users with whom to interact, and see and browse the list of their connections and those made by others in “public mode” within the system.28 In parallel with their spectacular growth, these social networks give rise to lively discussions and controversies, both within the expert community and among the public at large, because of the ways in which social networking service providers leverage the personal information and data of users, sometimes allowing external applications to access them, sometimes for direct commercial purposes.29 Researchers in computer science and in technology law voice their concern,30 underlining that very few users are fully aware that, in using these applications, they leave access to some of their private information, stored on the servers of the companies offering such services, open to ill-defined or undefined publics—a behavior deemed risky by several commentators.31 The rise of the so-called cloud—the model, applying to an increasing number of services, according to which the service provider is in charge of both the physical infrastructure and the software—does nothing to mitigate the impression of risk for informed users, as applications and data are increasingly hosted in locations and ways unknown or at best ambiguous, and the so-called “server farms” proliferate to support and manage this increasing remoteness of data from users and their terminals.


In this context, several developers—including Drizzle’s—work at the crossroads of two main “tendencies.” On the one hand, they identify in a peer-to-peer network architecture a possible way of approaching the protection of personal data privacy from a different angle: through the relocation and “re-appropriation” of data within the terminals of users, who would be able to host their own profiles and the information they contain (figure 1).32

Figure 1. Schematic representation of a hybrid P2P storage network

Source: https://bradlaura.wordpress.com/2010/04/06/file-sharing/


On the other hand, they see in the reproduction, at the architectural level, of the dynamics that are now ensuring the spectacular success of social networks—friendship links, community- and group-building, the attribution of different degrees of trust—a way to counter a number of fragilities of classic P2P file-sharing networks. Indeed, in order to make filtering, blocking and identification more difficult for external parties, “classic” P2P file-sharing software tends to safeguard as much as possible the anonymity of users, even when a reciprocal knowledge of the identity of communicating nodes would make the network more robust.33


The idea of making Drizzle a “social P2P storage cloud” had in fact pre-dated the explosion of social networks in Europe by a few months; thus, as Kurt ironically points out, Dietrich and his team find themselves working on something potentially “fashionable.” Drizzle developers work on a network architecture that should be able to mix a distributed structure—with most operations of securitization and conservation of data conducted at the local level and within the P2P client—with dynamics peculiar to social networking, where direct links can be established between users for sharing purposes: direct at both levels, the “lower layers” and the interface. Social networking infrastructures are skewed towards finding and establishing social links, but scarcely adapted to allowing connections between users and their peers by means of network links (at the logical or infrastructural layers). Thus, the challenge for the developers becomes to envisage an innovative architecture capable of integrating networking at both the interface and application levels.34


A conception of privacy and confidentiality of personal data that is conceived and enforced via technical means—what has been called privacy by design35—takes shape with the development of Drizzle, and is proposed to its users/clients. This conceptualization of privacy is defined by means of the constraints and the opportunities linked to the treatment and the location of data, according to the different moments and the variety of operations taking place within the system. In particular, the confidentiality of data (personal data as well as the content stored in the P2P cloud) is defined by the implementation of the resource allocation system on which Drizzle is based. This definition of privacy, as we will see, is shaped in the discussions about the evolution of two objects and their role within the system: the password that identifies the user vis-à-vis the network, and the “friendship key” permitting two users to share content between them.

Password and (/as) user responsibility

In Dietrich’s intentions, the role of the user-selected and -generated password in the Drizzle system should have “stri[cken] the user as soon as he had access to the system for the very first time.” Indeed, the form that appears to users at the moment of their first subscription to the service may surprise them: it informs them that

“We do not know your password as it never leaves your computer. Please, do not forget your password and use, if needed, your password hint.”

The status of the password is thus negotiated, beyond its usual meaning of unique identifier vis-à-vis the system, to define, detail and legitimize the process of local encryption and decryption of data within the Drizzle system. This feature comes to embody the specificity of Drizzle’s promise of security and privacy, as well as users’ trust: the password becomes the symbol and the graphical representation of the “local” dimension of the encryption process, as it never leaves the computer of the user who created it. The operations, mostly managed in an automatic fashion, that are linked to the protection of personal data are thus hosted on the terminals of users. Indeed, this entails a modification of the user’s role within the service’s architecture: a node among equal nodes, the user’s terminal becomes a server in its own right, instead of a mere starting and end point for operations that are otherwise conducted on another machine or group of machines.
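
One plausible way to read “your password never leaves your computer” in technical terms is sketched below: the password is turned into an encryption key locally, through a standard key-derivation function, and only a hint is ever sent to the provider. The chapter does not specify Drizzle’s actual scheme; the function, parameters and record format here are assumptions.

```python
# Sketch of a client-side password workflow (assumed, not Drizzle's actual code):
# the key is derived locally; the server only ever receives a hint.
import hashlib
import os

def derive_key(password: str, salt: bytes) -> bytes:
    """Derive a symmetric key from the user's password, entirely on the client."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

# What runs on the user's terminal:
salt = os.urandom(16)                     # stored locally alongside the client
key = derive_key("correct horse battery", salt)

# What is sent to the provider at sign-up in this sketch: a hint, never the
# password or the derived key -- hence "we cannot help you recovering it".
account_record_sent_to_server = {"username": "alice", "password_hint": "the usual phrase"}
```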

Through the attribution of this status to the password, the developers of Drizzle are also proposing an alternative to the balance between the rights exerted by users over their own data and the rights acquired by the service provider over these same data—a balance that is usually heavily tilted in the provider’s favor. However, this reconfiguration in the balance of rights comes with a trade-off. As the password stays with the user and is not sent to the servers controlled by the firm, the latter cannot retrieve the password if needed. Thus, users do not only see their privacy reinforced: at the same time and for the same reasons, their responsibility for their actions is augmented—while the service provider renounces some of its control over the content that circulates through the service it manages. Drizzle’s terms of use, and more specifically its privacy policy, make this “trade-off” official:

“Drizzle encrypts and decrypts all files locally. Your password never leaves your computer, hence, we do not know your password. This protects your privacy, but also means that apart from sending you password hint (specified when creating your account or checking your account settings) we cannot help you recovering it. (If you forget your password, you can request your password hint. If your password hint does not help you, we cannot do anything about it. Since your password never leaves your computer, we do not know your password. We cannot recover your password.)”

The meaning of this “renunciation,” Dietrich explains, is twofold. On the one hand, the Drizzle team wishes to make evident, almost to translate into a specific object the user can easily relate to, the “obscure” and unfamiliar process of client-side encryption, which is an ongoing source of controversies and perplexities. On the other hand, it is also a matter of Drizzle’s business model: the more the firm knows about its users, the more it is obliged to subject them to regular surveillance and control—and this requires an investment of material resources and time that, in its first phases of existence, the firm does not have:

“If we can know what is in your account, starting with your password, we have heightened obligations to police the content and to make sure nobody can eavesdrop on the traffic.”

Direct links and “friendship keys”


The second critical point confronted by the development team is the moment when the content users introduce into the system, once encrypted and fragmented, leaves the local dimension of the P2P client installed on each user’s machine, to be spread out, dispersed and stored in the P2P network—the ensemble of user computers—and in Drizzle’s data centres. The challenge confronted by the developers at this stage is twofold: not only to make sure that, at any moment, the extraction of a file is possible,36 but also—and this is what interests us more within the scope of this chapter—that no user of the system has access to content of which she is not the intended recipient. In short, the developers need to strike a balance between the agility of the system—the rapidity of its response—and the confidentiality of the data circulating within it.

The approach described above—based on the three aspects of encryption, fragmentation and redundancy—is paralleled by the development of a layer meant to secure sharing and exchange practices between users. The ways in which this layer should take shape are discussed at length. Finally, the team decides to work on a system able to foster direct exchanges at the two levels of the interface and the application—a decision that is perhaps informed, Dietrich recognizes, by the media buzz surrounding Facebook and its opaque and ever-present servers.

The choice, already made, of founding the system on a P2P architecture facilitates the implementation of this decision, as the direct relationship established between two machines exchanging data packets at the “lower layers” has an intuitive counterpart in the user’s action of sharing or exchanging by means of the social tool:

“The distributed system that made programming more complicated in other areas, for once, could enable us to build a system in which the experience of the user would mirror what was happening below. Whenever a user was publishing a file, the encryption key of that file would be revealed to the public; but similarly, when the user wanted to share with other specific users, the key of these files would be securely revealed only to the friends and groups he shared the file with. Among us [the developers], at some point we called it the friendship key, and it was something that stayed afterwards.”

For sharing operations, the “friendship key” comes into the picture: an authorization of exchange that must be known only by the user sending it and by the user receiving it, and that allows the file to be reconstructed. The introduction of an “add friends” mechanism, typical of social networks, and the continuity between the established friendship link and access to the friend’s data—a continuity from which intermediaries are excluded—are elements that further reinforce the dimension of privacy and confidentiality in sharing and storage activities. During the above-mentioned Google Tech Talk, Dietrich depicts the “key” as the basic principle of exchanges happening within Drizzle, and he does not resist the temptation to add to his primarily technical discourse a humorous reference to the current debates on Facebook’s “almighty” servers and the risk they represent for privacy:

“When they first got friends [sic], Alice and Bob exchanged the friendship key, now Alice encrypts the file key with the friendship key—this is a simplification of what actually happens, but it serves for this example—and exchanges it with Bob, who proceeds to download the file. No strings attached, and no servers attached, either.”
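
Dietrich’s simplification can be rendered as a short sketch: the key protecting the file is itself encrypted (“wrapped”) with the friendship key that Alice and Bob exchanged when they became friends, so neither key ever needs to pass through a server. The cipher and the key-exchange step are assumed here, not documented in the chapter.

```python
# Sketch of the "friendship key" idea: wrap the file key with a key shared
# only by the two friends. Fernet stands in for whatever cipher was used;
# how the friendship key was exchanged at "friending" time is not shown.
from cryptography.fernet import Fernet  # third-party package: cryptography

file_key = Fernet.generate_key()          # protects the stored file itself
friendship_key = Fernet.generate_key()    # exchanged when Alice and Bob became friends

# Alice's client: wrap the file key with the friendship key and send the result.
wrapped_file_key = Fernet(friendship_key).encrypt(file_key)

# Bob's client: unwrap, then fetch enough fragments and decrypt the file locally.
recovered_file_key = Fernet(friendship_key).decrypt(wrapped_file_key)
assert recovered_file_key == file_key     # "no servers attached"
```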

While the interface of Drizzle’s alpha version is developed to closely resemble a file system—with a number of directories painted in different colors, according to the more or less strict level of confidentiality they are endowed with—the “social” features of the system are underlined by the user interface of the subsequent beta version. Indeed, the interface is built, from then on, in such a way as to suggest the similarity of the platform with those social networks that are becoming increasingly popular: labels such as “profile,” “sharing,” “friends” and “groups” are borrowed and paralleled with a feature that allows each user to attribute different degrees of trust to her friends (or to the whole network, for public content), for each of the files she stores in Drizzle.

The notions of “social network” and “friendship key” are understood here not only as a “marketing layer,” or as a way to structure the organization of storage in practice, but as a factor of change that begins at the architectural level. They propose a definition of social organization for the users, shaped by a reciprocal “technical trust,” by the action of opening up and pooling a portion of their computational resources (figure 2)—paralleled with a reconfiguration of the chain of intermediaries in the practice of sharing and exchanging.

Figure 2. “Reciprocal technical trust” as summarized on Drizzle’s blog

Data privacy and resource allocation

The third aspect that contributes to defining privacy within Drizzle comes with the fine-tuning of the system’s privacy policy. Indeed, with the release of the beta version of the platform, Drizzle introduces in this document a section detailing the conditions for the allocation of the computational resources of the different computers participating in the system.

As we have seen above, the release of the first version of the platform had already made it necessary, due to the very particular status of the resources used by the system it supported, to detail several aspects in the terms of use: the role of computers belonging to users, the types of resources that Drizzle would be able to use, and their purpose. It had also become necessary to detail the extent to which users would be able to decide—and communicate to their P2P client, thus to the system—the maximum quantity of local resources that the rest of the network/storage system could use. However, the first version of the terms of use had not made explicit the articulation between the availability of resources allowed by users, on the one hand, and the way in which these resources would be assigned to different operations within the system, on the other.

The articulation of these two aspects, however, has important implications—the Drizzle team ultimately concludes—for the confidentiality of data circulating in the system (both personal information and the content stored by users). Several users eventually frame the resource allocation process enacted by the Drizzle system—and by the team of developers proposing it—as a possible “surveillance” or “monitoring” of these resources, in a way that could potentially be highly automated, invasive and privacy-threatening. After a discussion between these concerned users and the developers, via the Drizzle forum, two modifications are applied to the terms of use: while the general terms now state that “resources are allocated and monitored in accordance with the Privacy Policy,” the privacy policy itself details the extent of automation and pervasiveness of the system that allocates and monitors resources:

“In order to ensure a fair allocation of resources within Drizzle, various data about the computers participating in the Drizzle network is collected. This data includes their IP addresses, disposability and the amount of resources they are contributing (e.g. bandwidth, memory). (…) Drizzle keeps track of how much storage space (e.g. bandwidth, memory) you have used and earned […] Drizzle collects statistical information for the purposes of monitoring, debugging and improving the system. This includes automatically generated problem, performance, network analysis and general usage reports, as well as logs of the connections and queries made to Drizzle’s servers (including the involved IP addresses), as well as analytical data about the usage of the Drizzle website. However, none of this data contains information from your private or shared files.”


Thus, the correct functioning of the allocation system indeed implies the gathering of several pieces of information concerning the material, computational and memory resources pooled by each participating computer. However, the information is processed automatically and in an aggregate manner, for statistical purposes. The collection of information, affirms the revised version of the document, has the purpose of automatically computing the storage space made available by each user—and, as we have analyzed elsewhere,37 of establishing the extent to which each user can reclaim her place in the “P2P cloud,” an equivalent storage space in the network of participating users.
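
The principle of “reclaiming” an equivalent storage space in the P2P cloud can be illustrated with a deliberately naive formula; the chapter reports the principle, not Drizzle’s actual accounting rule, so the function below is purely hypothetical.

```python
# Hypothetical accounting rule: cloud space earned is proportional to the local
# space shared, discounted by how often the machine is actually reachable.
def earned_storage_gb(shared_disk_gb: float, online_fraction: float) -> float:
    """A peer sharing 100 GB but online half the time earns less than one
    that is online around the clock."""
    online_fraction = max(0.0, min(1.0, online_fraction))
    return shared_disk_gb * online_fraction

print(earned_storage_gb(shared_disk_gb=100.0, online_fraction=0.5))  # 50.0
```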

Moreover, the developers call on the collaborative spirit of users, asking for feedback on the forum and on other platforms, and underlining the usefulness of the information they will obtain for the improvement and further development of the system. Most importantly, by redefining the essential issue at stake with this resource allocation system—the stability and durability of Drizzle—the privacy policy contributes to separating this issue from the question of the confidentiality of personal data and of stored content. The pooling of the storage equipment (i.e., users’ local resources, made available by each of them) does not imply an intrusion into the stored content itself. Such content remains, as a rule, secured by a three-fold system embedded in the technical architecture: the content encryption and fragmentation processes; the local safeguard of the password; and finally, the “direct” sharing procedures based on the exchange of friendship keys.

Conclusions


This chapter has explored the “decentralized cloud” system Drizzle, as an example of a platform whose P2P architecture becomes its differentiating aspect. The work on “plumbing”38 reconfigures the security of the system, the privacy of stored data, the rapidity of downloads and uploads, as well as the modalities of sharing practices. The “encrypted fragments” resulting from Drizzle’s treatment of stored content become, in the eyes of both developers and users, the very symbol of the P2P storage solution proposed by the firm, a solution on which the credibility and the specificity of the service rely to a large extent; the technical mechanism that ensures the security of data, not only from unauthorized users, but also from the firm itself; and the arena where the controversy between users, “trustful” or “doubtful” about the system’s capacity to fulfil its promise of confidentiality and security, plays out. Technical infrastructures and architectures become the primary locus where privacy is, almost “materially,” enacted, controlled and protected; a close observation of how their development unfolds brings to the foreground the agency of different actors, from users to service providers, in shaping and indeed defining privacy.


The articulation of the local and global dimensions of the storing and sharing process in the Drizzle system shapes the envisaged role of users: “the ‘locality’ and the ‘centrality’ are exchanged in multiple places […] as the network is an aggregator and at the same time, it is built on a compromise.”39 A system is built in which, until the encryption and the fragmentation of data are completed, the heart of the service is the terminal of the user: it is on this terminal that, thanks to operations conducted within the P2P client, the encryption and division into fragments take place. Once the data have assumed the shape of “encrypted fragments,” each file present in the system is stored and shared, “spread out” to the network. It is at that moment that the shift from the local dimension of a single machine to the network of users takes place, and the stored content becomes global. The relationship between the user and the service provider is, consequently, redefined: the result is a technical model in which the local encryption first, and the fragmentation next—before any sharing or downloading operation that would imply the circulation of data within the network—are proposed by the Drizzle developers as evidence that the firm “does not even have the technical means” to betray the trust of users, since in the worst-case scenario—one in which the network would be compromised—a potential attacker could only obtain non-executable fragments, devoid of any independent meaning.


Drizzle’s early steps, within a research and development context that, after 2005, works on hybrid systems crossing peer-to-peer with social networking, influence its taking shape as a “P2P social cloud,” and the definition of data privacy and confidentiality that this implies. Operations of securitization and preservation of data at the local level are merged with a dynamic in which direct links—direct at both the “lower layers,” logical and applicative, and the interface, thus from the user’s viewpoint—can be established by users for sharing operations. This feeds into the privacy-by-design approach, which shapes privacy via the constraints and opportunities deriving from the treatment and the location of data, according to the different moments and operations taking place within the system. Within Drizzle, this conception of privacy has taken shape particularly by means of two “objects,” the password and the friendship key; the relationship between the confidentiality of data (both the personal information and the content stored by users in the P2P cloud) and the resource allocation system also plays an important role. The password, in particular, ultimately constitutes the service provider’s device for disengaging itself from security issues: a detail whose importance may seem modest at first sight, but which, in the end, entails the reconfiguration of the forms of technical solidarity40 created between the user and the service provider.


This case study hints at the broader implications that privacy-by-design solutions, and more broadly technology-based approaches to the rights of networked actors, have for our increasingly data-driven economy. Evolutions in architectural design affect the repartition of competences and responsibilities between service providers, content producers, users and network operators. They affect forms of engagement and intéressement41 in networked systems, of users first and foremost, but also of other actors concerned by the implementation and the operations of internet services. They shape the sustainability of the underlying economic models and the technical and legal approaches to the management of digital content and personal data. They make visible, in various configurations, the forms of interaction between the local and the global, the patterns of articulation between the individual and the collective. Architectural design also contributes to shifting the boundary between public and private uses of the internet as a global facility. As it helps define what a contributor to internet-based services is—in terms of the computing resources required for operating the system, and of content itself—it is also a crucial factor in defining the right to privacy of users/clients, their right of access to content, and their right not to be subjected to excessively intrusive surveillance. In this sense, privacy by design, as implemented in decentralized architectures, appears as a possible “instrument” for users to “reclaim Internet services,”42 and the related rights—an issue that has never appeared more relevant than in our post-Snowden, big-data-filled times.

Bibliography

Abbate, Janet, “L’histoire de l’Internet au prisme des STS,” Le Temps des médias, 18, 2012: 170–180.

Acquisti, Alessandro and Gross, Ralph, “Imagined communities: Awareness, information sharing, and privacy on the Facebook,” in G. Danezis and P. Golle (eds.), Privacy Enhancing Technologies (6th International Workshop, PET 2006, Cambridge, UK, June 28–30, 2006), Berlin, Springer, 2006: 36–58.

Agre, Philip E., “Peer-to-Peer and the promise of Internet equality,” Communications of the ACM, 46(2), 2003: 39–42.

Aigrain, Philippe, “Declouding freedom: Reclaiming servers, services and data,” paper presented at the 3rd edition of the 2020 FLOSS Roadmap (Paris, September 30–October 1, 2010), available at <https://flossroadmap.co-ment.com/text/NUFVxf6wwK2/view/> [accessed 09/11/2016].

Aigrain, Philippe, “Another narrative: Addressing research challenges and other open issues session,” paper presented at the 2nd PARADISO conference (Brussels, September 7–9, 2011).

Akrich, Madeleine, “De la position relative des localités. Systèmes électriques et réseaux socio-politiques,” Cahiers du Centre d’Études pour l’Emploi, 32, 1989: 117–166, available at <https://halshs.archives-ouvertes.fr/halshs-00081709/document> [accessed 09/11/2016].

Arrington, Michael, “Facebook privacy issue won’t die” [on line], TechCrunch, November 26, 2007, available at <http://techcrunch.com/2007/11/26/facebook-privacy-issue-wont-die/> [accessed 09/11/2016].

Benkler, Yochai, “Sharing nicely: On shareable goods and the emergence of sharing as a modality of economic production,” The Yale Law Journal, 114(2), 2004: 273–358.

boyd, danah m., “Facebook’s privacy trainwreck: Exposure, invasion, and social convergence,” Convergence, 14(1), 2008: 13–20.

boyd, danah m. and Ellison, Nicole B., “Social network sites: Definition, history, and scholarship,” Journal of Computer-Mediated Communication, 13(1), 2008: 210–230.

Callon, Michel, “Sociologie de l’acteur-réseau,” in M. Akrich, M. Callon and B. Latour, Sociologie de la traduction. Textes fondateurs, Paris, Presses des Mines, 2006: 267–276.

Callon, Michel, Lascoumes, Pierre and Barthe, Yannick, Agir dans un monde incertain. Essai sur la démocratie technique, Paris, Seuil, 2001.

Cavoukian, Ann, “Privacy by Design: The 7 foundational principles. Implementation and mapping of fair information practices,” Information & Privacy Commissioner, Ontario, Canada, 2006, available at <https://www.privacyassociation.org/media/presentations/11Summit/RealitiesHO1.pdf> [accessed 09/11/2016].

Cavoukian, Ann (ed.), “Special Issue: Privacy by Design: The next generation in the evolution of privacy,” Identity in the Information Society, 3(2), 2010.

DeNardis, Laura, The Global War for Internet Governance, New Haven (CT), Yale University Press, 2014.

Dodier, Nicolas, Les hommes et les machines. La conscience collective dans les sociétés technicisées, Paris, Métailié, 1995.

Elkin-Koren, Niva, “Making technology visible: Liability of Internet service providers for peer-to-peer traffic,” New York University Journal of Legislation and Public Policy, 9, 2006: 15–76.

Figueiredo, Renato J., Boykin, P. Oscar, St. Juste, Pierre and Wolinsky, David, “Social VPNs: Integrating overlay and social networks for seamless P2P networking,” Proceedings of the 2008 IEEE 17th Workshop on Enabling Technologies: Infrastructure for Collaborative Enterprises (June 23–25, 2008), Washington (DC), IEEE Computer Society, 2008.

Fuller, Matthew (ed.), Software Studies: A Lexicon, Cambridge (MA), MIT Press, 2008.

Guerrini, Yannick, “Wuala : le P2P comme solution de stockage,” Tom’s Hardware, April 21, 2010, available at <http://www.tomshardware.fr/articles/Wuala-stockage-cloud-P2P,1-3212.html> [accessed 09/11/2016].

Kirschenbaum, Matthew G., “Virtuality and VRML: Software studies after Manovich,” Electronic Book Review, August 29, 2003, available at <http://www.electronicbookreview.com/thread/technocapitalism/morememory> [accessed 11/28/2011].

Latour, Bruno, The Pasteurization of France, A. Sheridan and J. Law (transl.), Cambridge (MA), Harvard University Press, 1988.

Le Fessant, Fabrice, “Les réseaux sociaux au secours des réseaux pair-à-pair,” Défense nationale et sécurité collective, 717, 2009: 29–35.

Lessig, Lawrence, Code and Other Laws of Cyberspace, New York, Basic Books, 1999.

Manovich, Lev, The Language of New Media, Cambridge (MA), MIT Press, 2001.

Marino, Mark C., “Critical code studies,” Electronic Book Review, December 4, 2006, available at <http://www.electronicbookreview.com/thread/electropoetics/codology> [accessed 11/28/2011].

Mitchell, William J., City of Bits: Space, Place and the Infobahn, Cambridge (MA), MIT Press, 1995.

Moglen, Eben, “Freedom in the cloud: Software freedom, privacy and security for Web 2.0 and cloud computing,” paper presented at the Internet Society’s New York Branch conference (ISOC-NY, February 5, 2010).

Mowbray, Miranda, “The Fog over the Grimpen Mire: Cloud computing and the law,” SCRIPTed, 6(1), 2009: 132–146.

Musiani, Francesca, “Ménager le droit à la vie privée, entre anonymat et connaissance de l’identité : les débuts des réseaux sociaux en pair-à-pair,” Terminal, 105, 2010: 107–116.

Musiani, Francesca, “When social links are network links: The dawn of peer-to-peer social networks and its implications for privacy,” Observatorio, 4(3), 2010: 185–207.

Musiani, Francesca, “Caring about the plumbing: On the importance of architectures in social studies of (peer-to-peer) technology” [on line], Journal of Peer Production, 1, 2012, available at <http://peerproduction.net/issues/issue-1/peer-reviewed-papers/caring-about-the-plumbing/> [accessed 09/11/2016].

Musiani, Francesca, Nains sans géants. Architecture décentralisée et services Internet, 2nd edition, Paris, Presses des Mines, 2015.

Reidenberg, Joel R., “Lex informatica: The formulation of Internet policy rules through technology,” Texas Law Review, 76(3), 1997–1998: 553–593, available at <http://ir.lawnet.fordham.edu/faculty_scholarship/42/> [accessed 09/11/2016].

Ribes, David and Lee, Charlotte P., “Sociotechnical studies of cyberinfrastructure and e-research: Current themes and future trajectories,” Computer Supported Cooperative Work, 19(3), 2010: 231–244, available at <http://link.springer.com/article/10.1007%2Fs10606-010-9120-0> [accessed 09/11/2016].

Schaar, Peter, “Privacy by Design,” Identity in the Information Society, 3(2), 2010: 267–274, available at <http://link.springer.com/article/10.1007/s12394-010-0055-x> [accessed 09/11/2016].

Schollmeier, Rüdiger, “A definition of peer-to-peer networking for the classification of peer-to-peer architectures and applications,” Proceedings of the First International Conference on Peer-to-Peer Computing, Washington (DC), IEEE Computer Society, 2001: 101–102, available at <https://www.computer.org/csdl/proceedings/p2p/2001/1503/00/15030101.pdf> [accessed 09/11/2016].

Star, Susan L., “The ethnography of infrastructure,” American Behavioral Scientist, 43(3), 1999: 377–391.

Taylor, Ian J. and Harrison, Andrew, From P2P and Grids to Services on the Web: Evolving Distributed Communities, 2nd edition, London, Springer, 2009.

van Schewick, Barbara, Internet Architecture and Innovation, Cambridge (MA), MIT Press, 2010.

Vinck, Dominique (ed.), Everyday Engineering. An Ethnography of Design and Innovation, Cambridge (MA), MIT Press, 2003.

Notes

1 Yannick Guerrini, “Wuala : le P2P comme solution de stockage,” Tom’s Hardware, April 21, 2010, available at <http://www.tomshardware.fr/articles/Wuala-stockage-cloud-P2P,1-3212.html> [accessed 09/11/2016].

2 Miranda Mowbray, “The Fog over the Grimpen Mire: Cloud computing and the law,” SCRIPTed, 6(1), 2009: 132–146.

3 The name is fictitious (“light rain”) and recalls the fragmentation and the distribution of data in the system’s storage mechanism. The developers’ names are pseudonyms, as well.

4 Philip E. Agre, “Peer-to-Peer and the promise of Internet equality,” Communications of the ACM, 46(2), 2003: 39–42; Rüdiger Schollmeier, “A definition of peer-to-peer networking for the classification of peer-to-peer architectures and applications,” Proceedings of the First International Conference on Peer-to-Peer Computing, Washington (DC), IEEE Computer Society, 2001: 101–102, available at <https://www.computer.org/csdl/proceedings/p2p/2001/1503/00/15030101.pdf> [accessed 09/11/2016]; Ian J. Taylor and Andrew Harrison, From P2P and Grids to Services on the Web: Evolving Distributed Communities, 2nd edition, London, Springer, 2009.

5 Madeleine Akrich, “De la position relative des localités. Systèmes électriques et réseaux socio-politiques,” Cahiers du Centre d’Études pour l’Emploi, 32, 1989, available at <https://halshs.archives-ouvertes.fr/halshs-00081709/document> [accessed 09/11/2016].

6 Nicolas Dodier, Les hommes et les machines. La conscience collective dans les sociétés technicisées, Paris, Métailié, 1995.

7 Susan L. Star, “The ethnography of infrastructure,” American Behavioral Scientist, 43(3), 1999: 339.

8 Bruno Latour, The Pasteurization of France, A. Sheridan and J. Law (transl.), Cambridge (MA), Harvard University Press, 1988: 229.

9 William J. Mitchell, City of Bits: Space, Place and the Infobahn, Cambridge (MA), MIT Press, 1995.

10 Joel R. Reidenberg, “Lex informatica: The formulation of Internet policy rules through technology,” Texas Law Review, 76(3), 1997–1998: 553–593, available at <http://ir.lawnet.fordham.edu/faculty_scholarship/42/> [accessed 09/11/2016].

11 Yochai Benkler, “Sharing nicely: On shareable goods and the emergence of sharing as a modality of economic production,” The Yale Law Journal, 114(2), 2004: 273–358.

12 Lawrence Lessig, Code and Other Laws of Cyberspace, New York, Basic Books, 1999.

13 Yochai Benkler, “Sharing nicely…,” op. cit.

14 Lawrence Lessig, Code and Other Laws of Cyberspace, op. cit.

15 Susan Leigh Star, “The ethnography of infrastructure,” op. cit.: 337.

16 Lev Manovich, The Language of New Media, Cambridge (MA), MIT Press, 2001; Matthew Fuller (ed.), Software Studies: A Lexicon, Cambridge (MA), MIT Press, 2008; Mark C. Marino, “Critical code studies,” Electronic Book Review, December 4, 2006, available at <http://www.electronicbookreview.com/thread/electropoetics/codology> [accessed 11/28/2011]; David Ribes and Charlotte P. Lee, “Sociotechnical studies of cyberinfrastructure and e-research: Current themes and future trajectories,” Computer Supported Cooperative Work, 19(3), 2010: 231–244, available at <http://link.springer.com/article/10.1007%2Fs10606-010-9120-0> [accessed 09/11/2016].

17 Matthew G. Kirschenbaum, “Virtuality and VRML: Software studies after Manovich,” Electronic Book Review, August 29, 2003, available at <http://www.electronicbookreview.com/thread/technocapitalism/morememory> [accessed 11/28/2011].

18 Ann Cavoukian, “Privacy by Design: The 7 foundational principles. Implementation and mapping of fair information practices,” Information & Privacy Commissioner, Ontario, Canada, 2006, available at <https://www.privacyassociation.org/media/presentations/11Summit/RealitiesHO1.pdf> [accessed 09/11/2016]; Id., “Special Issue: Privacy by Design: The next generation in the evolution of privacy,” Identity in the Information Society, 3(2), 2010; Peter Schaar, “Privacy by Design,” Identity in the Information Society, 3(2), 2010: 267–274, available at <http://link.springer.com/article/10.1007/s12394-010-0055-x> [accessed 09/11/2016].

19 Susan L. Star, “The ethnography of infrastructure,” op. cit.

20 Janet Abbate, “L’histoire de l’Internet au prisme des STS,” Le Temps des médias, 18, 2012: 170–180.

21 Laura DeNardis, The Global War for Internet Governance, New Haven (CT), Yale University Press, 2014.

22 Francesca Musiani, Nains sans géants. Architecture décentralisée et services Internet, 2nd edition, Paris, Presses des Mines, 2015.

23 Ibid.

24 Unless otherwise noted, citations are derived from interviews with the developers of Drizzle, conducted within a period of online and “live” ethnography of Drizzle’s development, design and innovation process [see Dominique Vinck, Everyday Engineering. An Ethnography of Design and Innovation, Cambridge (MA), MIT Press, 2003] between 2010 and 2011 (see also “Methodology” section above).

25 This roughly implies that, retrieved independently, those fragments have no meaning of their own.

26 See Madeleine Akrich, “De la position relative des localités…,” op. cit.

27 Fabrice Le Fessant, “Les réseaux sociaux au secours des réseaux pair-à-pair,” Défense nationale et sécurité collective, 717, 2009: 29–35; Francesca Musiani, “Ménager le droit à la vie privée, entre anonymat et connaissance de l’identité : les débuts des réseaux sociaux en pair-à-pair,” Terminal, 105, 2010: 107–116; Id., “When social links are network links: The dawn of peer-to-peer social networks and its implications for privacy,” Observatorio, 4(3), 2010: 185–207.

28 danah m. boyd and Nicole B. Ellison, “Social network sites: Definition, history, and scholarship,” Journal of Computer-Mediated Communication, 13(1), 2008: 210–230.

29 danah m. boyd, “Facebook’s privacy trainwreck: Exposure, invasion, and social convergence,” Convergence, 14(1), 2008: 13–20.

30 Alessandro Acquisti and Ralph Gross, “Imagined communities: Awareness, information sharing, and privacy on the Facebook,” in G. Danezis and P. Golle (eds.), Privacy Enhancing Technologies (6th International Workshop, PET 2006, Cambridge, UK, June 28–30, 2006), Berlin, Springer, 2006: 36–58; Niva Elkin-Koren, “Making technology visible: Liability of Internet service providers for peer-to-peer traffic,” New York University Journal of Legislation and Public Policy, 9, 2006: 15–76.

31 E.g. Michael Arrington, “Facebook privacy issue won’t die” [on line], TechCrunch, November 26, 2007, available at <http://techcrunch.com/2007/11/26/facebook-privacy-issue-wont-die/> [accessed 09/11/2016].

32 See also Eben Moglen, “Freedom in the cloud: Software freedom, privacy and security for Web 2.0 and cloud computing,” paper presented at the Internet Society’s New York Branch conference (ISOC-NY, February 5, 2010); Philippe Aigrain, “Declouding freedom: Reclaiming servers, services and data,” paper presented at the 3rd edition of the 2020 FLOSS Roadmap (Paris, September 30–October 1, 2010), available at <https://flossroadmap.co-ment.com/text/NUFVxf6wwK2/view/> [accessed 09/11/2016]; Id., “Another narrative: Addressing research challenges and other open issues session,” paper presented at the 2nd PARADISO conference (Brussels, September 7–9, 2011).

33 Renato J. Figueiredo et al., “Social VPNs: Integrating overlay and social networks for seamless P2P networking,” Proceedings of the 2008 IEEE 17th Workshop on Enabling Technologies: Infrastructure for Collaborative Enterprises (June 23–25, 2008), Washington (DC), IEEE Computer Society, 2008.

34 Renato J. Figueiredo et al., “Social VPNs…,” op. cit.; Francesca Musiani, “When social links are network links…,” op. cit.

35 Ann Cavoukian (ed.), “Special Issue: Privacy by Design: The next generation in the evolution of privacy,” Identity in the Information Society, 3(2), 2010; Peter Schaar, “Privacy by Design,” op. cit.

36 I.e. that the degree of redundancy of the fragments is such that there are enough fragments in the system at any time to be able to recompose the file.

37 Francesca Musiani, Nains sans géants…, op. cit.

38 Francesca Musiani, “Caring about the plumbing: On the importance of architectures in social studies of (peer-to-peer) technology” [on line], Journal of Peer Production, 1, 2012, available at <http://peerproduction.net/issues/issue-1/peer-reviewed-papers/caring-about-the-plumbing/> [accessed 09/11/2016].

39 Madeleine Akrich, “De la position relative des localités…,” op. cit.

40 Nicolas Dodier, Les hommes et les machines…, op. cit.

41 Michel Callon, “Sociologie de l’acteur-réseau,” in M. Akrich, M. Callon and B. Latour, Sociologie de la traduction. Textes fondateurs, Paris, Presses des Mines, 2006: 267–276.

42 Philippe Aigrain, “Declouding freedom…,” op. cit.


Author

Francesca Musiani (PhD, MINES ParisTech, 2012) is Associate Research Professor (chargée de recherche), French National Centre for Scientific Research (CNRS), Institute for Communication Sciences (ISCC-CNRS/Paris-Sorbonne/UPMC), associate researcher at the Centre for the Sociology of Innovation of MINES ParisTech-PSL, and academic editor for the Internet Policy Review. Her current research focuses on science and technology studies approaches to Internet governance.

