From the Secret Dispatch to the Crypto Wars (a brief political history of encryption)

This text was published in issue 365 of La Chronique, Amnesty International’s magazine (April 2017).

The intensity of contemporary debates around the right to encrypt our digital communications and data can make us forget that it is in fact an old right. It is inextricably tied to the secrecy of correspondence, itself protected since the middle of the eighteenth century. As for cryptology – the “science of secrecy” – it is an even older art. And contrary to received wisdom, it seems to have always been put to trivial and popular uses. Admittedly, such uses long remained marginal, since writing itself was for a long time essentially reserved for the elites. But one of the oldest known documents using a coding system is a clay tablet from Antiquity found in Iraq, in which a potter had concealed his manufacturing techniques by playing on the consonants.

From the sixteenth century onwards, the concomitant development of printing and postal services led to a proliferation of treatises on the art of encryption, not only for emperors, spies and diplomats, but also for merchants and businessmen, for scholars, and even for intimate correspondence, driven by the desire to escape the “cabinets noirs” in charge of royal surveillance, or more simply prying eyes. Encryption was also, already, a prized tool of political dissidents. We know, for instance, that American “founding fathers” such as Benjamin Franklin, James Madison and Thomas Jefferson encoded their correspondence. It was, moreover, in a partially coded letter that, on 27 May 1789, Madison laid out to Jefferson his idea of adding a Bill of Rights to the American constitution.

It was with the arrival of telegraphy, and the gradual opening of its use to the general public in the second half of the nineteenth century, that cryptography really began to become democratized. For telegraph users, the point was notably to protect one’s privacy from the technicians in charge of transmitting messages. What did the law say at the time? In France, what were then called “secret” (or “unintelligible”) dispatches were authorized, provided they were written in Roman characters or Arabic numerals. Such codes were supposed to be easily decipherable. Above all, the telegraph offices kept a record of every communication – what we would today call “metadata.” A director of telegraph transmissions in Versailles thus praised, in an 1870 text, the telegraph’s contribution to public order: “telegraphy achieves for public safety the ideal of M. Vidocq, of terrible memory.”

From the 1970s onwards, with the arrival of the first computer networks, the tension between the confidentiality of communications and their surveillance went up a notch. At the time, the NSA issued a call for proposals for the development of an encryption standard, which it deliberately sought to weaken so as to be able to break the codes more easily. This episode drew the interest of mathematicians who, through their work, would take modern cryptography out of its military fold and lay the theoretical groundwork for its democratization in the digital age.

The genie was out of the bottle. At the very end of the 1980s, within the “Cypherpunk” movement, activists passionate about computing turned cryptography into an openly contestatory art. It was no longer merely a means of thwarting state surveillance of the people, but also the tool through which it became possible to systematize leaks of classified documents, and thus to radically challenge state secrets. The Cypherpunks’ eponymous mailing list would, moreover, be one of the intellectual matrices of the young Julian Assange in the early 1990s.

Just as the Internet was about to reach the general public, intelligence and law-enforcement agencies nevertheless tried to preserve their hold over these techniques, triggering a political and legal conflict remembered as the first “Crypto War.” In 1993, the New York Times revealed that the NSA wanted to grant itself the ability to read all computer data and communications, through an encryption chip fitted with a backdoor. In the United States as in France, the law was also mobilized to prevent the distribution of encryption software over the networks, through the regime applicable to the export of “dual-use goods” (goods with both civilian and military applications). It was thanks to the mobilization of the first movements defending civil liberties in the digital environment – but also because of the influence of a private sector that needed encryption to develop electronic commerce – that these plans were defeated. The applicable law was, moreover, gradually liberalized in the second half of the 1990s.

Yet, despite these legal advances, the issue of encryption long remained the preserve of a handful of experts. It took Edward Snowden’s revelations for a genuine awareness of the importance of these practices to take hold. Since 2013, many technical deployments have indeed helped democratize the exercise of this right, which is essential to the defense of privacy and freedom of communication. A right that is today threatened by those among our leaders who would like to go on conducting mass surveillance of the Internet.

Alt. vs. Ctrl.: Thinking About Alternative Internets

Editorial notes for the Journal of Peer Production #9 on Alternative Internets

By Félix Tréguer, Panayotis Antoniadis, Johan Söderberg

The hopes of past generations of hackers weigh like a delirium on the brains of the newbies. Back in the days when Bulletin Board Systems metamorphosed into the Internet, the world’s digital communications networks – hitherto confined to military, corporate and elite academic institutions – were within the grasp of ordinary individuals. To declare the independence of the Internet from nation states and the corporate world seemed like no more than stating the bare facts. Even encrypted communication – the brainchild of military research – had leaked into the public’s hands and had become a tool wielded against state power. Collectives of all stripes could make use of the new possibilities offered by the Web to bypass traditional media, broadcast their own voice and assemble in new ways in this new public sphere. For some time, at least, the Internet as a whole embodied “alternativeness.”

Already by the mid-nineties, however, states began to reshape the communications infrastructure into something more manageable. Through a series of international treaties, legislative changes and market developments, ownership over this infrastructure was concentrated in the hands of a few multinational companies (McChesney, 2013). On top of this legal and technical basis, a new breed of informational capitalism sprang up, where value is siphoned from deterritorialized “open” flows (Fuchs, 2015). Meanwhile, the ecological footprint of communication technologies has come to represent a formidable challenge (Flipo et al., 2013).

It is in the light of these transformations that the emancipatory promises inherited from the 1980s and 1990s must be assessed. With every new wave of high-tech products, these promises have been renewed. For instance, when WiFi-antennas were rolled out in the 2000s, community WiFi-activists hoped to rebuild the communications infrastructure bottom-up (Dunbar-Hester, 2014). With the advent of crypto-currencies, some claimed to believe that bankers’ control over global currency flows would be demolished (Karlström, 2014). The technology at hand might be new, but the storyline bundled with it is made up of recycled materials. It basically says: “Technology x has leveled the playing field, now individuals can outsmart the combined, global forces of state and capital.”

Underlying this claim is a grander narrative about (information) technology as the harbinger of a brighter future. Although progressivism goes all the way back to the Scientific Revolution, it was given a particular, informational twist during the Cold War. In the 1950s and 1960s, disillusioned US Trotskyists – most notably among them Daniel Bell – rebranded historical materialism as the post-industrialism hypothesis. With this remake of hist-mat, history no longer culminated in socialism, but in a global consumer village. Furthermore, the motor of transition was no longer class struggle, but the inert development of technology (Barbrook, 2007). Though a spark of conflict has of course survived in the post-industrial hypothesis, this technological determinism flares up anew every time hackers and Internet activists rally behind, say, the inevitable demise of copyright or the awaiting triumph of decentralised communication networks (Söderberg, 2013). Determinism is performative, and never more so than when it is mobilized in political struggles.

This observation points to the instability of the meanings invested in computers and in the Internet itself. It suffices to recall the twin roots of these technologies, one in the military-industrial complex (Agar, 2003; Edwards, 1996), the other in the counter-culture and peace movement (Turner, 2006; 2013). The same undecidedness prevails today, as exemplified by the global controversies unleashed by NSA whistleblower Edward Snowden. The documents leaked by Snowden revealed the extent to which communications surveillance has been built into the pipes of a supposedly flat network, giving rise to unprecedented mobilisations aimed at resisting it. But paradoxically, this wave of resistance is now leading to the legalisation of mass surveillance (Tréguer, 2016). Because of these persistent ambiguities, it would be as wrong to denounce the inherent oppressiveness of the Internet as it would be to celebrate the alternative essence of this technology. Either position amounts to the same thing: a foreclosing of the struggle in which the future meaning of the technology is determined. Both Alt. and Ctrl. are possible and competing scenarios. They evolve in constant interaction.

How can we, as scholars and/or activists, sort out this complexity and make an assessment of the balance of forces, while reinvigorating hope for the future? Can we learn from the past to ward off the eternal return of a dystopian future?  Posing these questions – and perhaps contributing to answers – is the task that we have set for ourselves in this special issue of Journal of Peer Production on “alternative Internets.”

If the meaning of the “Internet” is unstable, then the definition of “alternative” in “alternative Internets” is even more so. Alternativeness is never an absolute. It is relative to something else, the non-alternative, which must also be defined. In this respect, Paschal Preston notes that alternative Internets were found in online applications that “manage to challenge and resist domination by commercial and other sectional interests”, in particular those “operating as alternative and/or minority media for the exchanges of news and commentary on political and social developments which are marginalized in mainstream media and debates” (Preston, 2001). Likewise, Chris Atton writes that alternative Internets are “produced outside the forces of market economics and the state” (Atton, 2003). As seen from these rather conventional definitions, alternativeness is measured in distance from the centres of state and capital.

How can we move past this pair of “useful others” (the state, the market) to better grasp alternativeness? The tools, applications and media that form part of the Internet can be assessed as composites made up of different dimensions. Some important parameters include the underlying funding and economic models, the governance schemes for taking decisions and allocating tasks, or the modes of production. Nick Couldry puts emphasis on this latter dimension when discussing alternative online media, stressing that what matters most for them is to challenge big corporate mass media by overcoming “the entrenched division of labour (producers of stories vs. consumers of stories)” (Couldry, 2003:45).

Another crucial line of inquiry for evaluating an alternative Internet relates to the underlying content or ideology that it circulates. For Sandoval and Fuchs, this is the most important dimension, and anything claiming to be alternative must adopt a critical stance to “try to contribute to emancipatory societal transformation” and “question dominative social relations” (Sandoval & Fuchs, 2009). When we consider the Internet, ideology is found in the values that underlie the design of a technology or application, structure its uses or populate the online social space that this application brings about.

Of course, ideology is also embedded in the discourses and practices of the many actors trying to influence its development at the technical, social or legal level. The Internet is indeed a social space made up of a myriad of contentious actors such as hackers, software developers and makers who hack, code and make, of advocacy groups with their value-ridden proclamations and legalese, of Internet users making claims to an enlarged citizenship, and of course of all the entrepreneurs, crooks, bureaucrats, agents provocateurs and politicians they fight against or – less often – ally with. All of these actors produce, use or advocate for particular technologies, fight against or encourage dystopic trends, work towards or oppose emancipatory projects, and in doing so produce political discourses and imaginaries that weigh on the social construction of the Internet. As such, they are part of our field of inquiry when we talk about “alternative Internets.” Their own contradictions further complicate the analysis. A protagonist might go to bed as a subversive hacker but wake up the next day as a piece-rate worker in someone else’s pension plan, or worse.

This speaks to the more general fact that a socio-technical dispositif that is “alternative” on one level tends to be preconditioned by the status quo on some other level. For instance, openness in terms of software licenses often comes hand in hand with a closure in terms of technical expertise. To put it in more general terms, the alternative, if it is to be effective, is necessarily compromised by the dominant. Here as elsewhere, a maximising strategy is paralysing: as the proverb goes, “the perfect is the enemy of the good.” In this spirit, Marisol Sandoval and Christian Fuchs have argued for “politically effective alternative media that in order to advance transformative political [goals] can include certain elements of capitalist mass media” (Sandoval & Fuchs, 2009:147). According to the authors, subscription fees or even advertising might be required if a project is to break out of the niche to reach a broader audience. Assessing trade-offs is part of the alternative game.

In this issue of the JoPP, we present contributions that explore these questions and shed light on the blind spots of alternative Internets.

With “In Defense of the Digital Craftsperson,” James Losey and Sascha D. Meinrath offer a conceptual framework for analyzing control in Internet technical architectures along five dimensions: networks, devices, applications/services, content, and data. By updating prior analysis regarding threats to communicational autonomy and to the ability to tinker with digital technologies, they identify key challenges and help think systematically about strategies of resistance.

Stefano Crabu, Federica Giovanella, Leonardo Maccari, and Paolo Magaudda consider the bottom of the “network” layer of Losey and Meinrath’s framework by offering an interdisciplinary perspective on Ninux, a network of wireless community networks in Italy. Their paper, “Hacktivism, Infrastructures and Legal Frameworks in Community Networks: the Italian Case of Ninux.org,” benefits from the active participation of one of the authors in Ninux, and presents interesting evidence about the limited levels of decentralization in a network built precisely around this vision. It is also one of the very few papers that offer insights into the legal aspects of community networks, focusing on the question of liability and the different organizational forms that can protect these networks against legal actions.

Christina Haralanova and Evan Light offer an insider’s look at a much smaller community network in Montreal, called Réseau Libre. In their paper entitled “Enmeshed Lives? Examining the Potentials and the Limits in the Provision of Wireless Networks,” they try to understand two other important contradictions in community networks. First, they examine their possible role as both an “alternative Internet provider” and an “alternative to the Internet altogether,” that is to say a local infrastructure providing local services for the members of the network. They also identify the lack of adequate security against surveillance, despite the fact that many people cite enhanced privacy and security options as a reason for their participation in the community. As the paper shows, even though they might foster knowledge-sharing around issues such as computer security, these networks remain “as insecure as the Internet itself.”

The paper “Going Off-the-Cloud: The Role of Art in the Development of a User-Owned & Controlled Connected World” by Daphne Dragona and Dimitris Charitos also explores various forms of user-owned network infrastructure, this time focusing on an “alternative to the Internet altogether,” imagined and experimented with by artists and activists. The scale here is much smaller, with most networks consisting of a single wireless router acting as a hotspot allowing only local interactions between those in physical proximity. Such “off-the-cloud” networks have been given numerous telling names like Netless, PirateBox, Occupy here, Hot probs, Datafield, Hive networks, Autonomous Cube. According to the authors, these and many other similarly inspiring projects work towards “new modes of organization and responsibility (…) beyond the sovereignty of the cloud.”

In “Gesturing Towards ‘Anti-Colonial Hacking’ and its Infrastructure,” Sophie Toupin draws on a historical example to investigate the opportunities and limitations of appropriating cryptography today. Her interviews with some of the key actors in this glorious moment of hacker politics are particularly inspiring, as is Toupin’s willingness to expand our understanding of “hacktivism” by looking beyond Europe and North America.

Primavera De Filippi’s piece focuses on “The Interplay between Decentralization and Privacy,” using blockchain technologies as a case study. She shows that while decentralized architectures are often key to the design of alternative Internets, they come with important challenges with regard to privacy protection. Her critical assessment is particularly timely, as blockchain technologies are being rapidly co-opted by the bureaucratic organizations they were originally meant to subvert.

In “Finding an Alternate Route: Circumventing Conventional Models of Agricultural Commerce and Aid,” Stephen Quilley, Jason Hawreliak and Katie Kish present a case study on Open Source Ecology (OSE). OSE started in the United States but has sprouted similar initiatives in Europe and South America. It is now developing a series of open source industrial machines and publishes the designs online. One of the primary goals of OSE is to provide collaboratively produced blueprints for relatively inexpensive agricultural machinery, such as tractors, backhoes, and compressed earth brick presses for constructing buildings. The authors argue that the proliferation of open source networks can reshape domains that have traditionally relied on state and inter-state actors, such as international aid.

Lastly, Melanie Dulong de Rosnay’s experimental text on “Alternative Policies for Alternative Internets” raises awareness of the importance of the terms of use of Internet platforms. By quoting numerous such policies – from both mainstream and alternative platforms – on topics like copyright or data protection, she manages to create a diverse mix of feelings, all the way from anger to laughter. Most importantly, this collection warns us about the legal issues that alternative platforms have to deal with, and provides inspiration and useful information on how to address them in practice.

Each of these papers addresses one or more of the “layers” described by Losey and Meinrath, analysing different facets of alternativeness. But there are other dimensions outside this framework that we have not touched upon. For instance, the staggering ecological impact of Internet technologies and their environmental unsustainability is not addressed, despite the growing attention of scholars and engineers to these crucial issues (Chen, 2016). Although two papers focus on urban community networks, other aspects of the urban dimension of alternative Internets are overlooked. Together with the notion of locality, urbanity appears to be crucial in helping actualise the potential of alternative Internets to become autonomous infrastructures operating outside the commercial Internet. It is also an avenue for thinking about resistance strategies: as urban space becomes increasingly hybrid and renders the digital and the physical ever more intertwined, those movements fighting for the “right to the city” (Lefebvre, 1996) and those working towards the “right to the Internet” will have renewed opportunities to join forces (Antoniadis & Apostol, 2014).

For sure, advancing alternative Internets will require a very diverse set of actors to go beyond traditional boundaries and engage in effective collaboration. In academia too, transdisciplinary collaboration – though still in its infancy – is extremely promising. We hope that this issue of the JoPP will be read as an invitation to work further in that direction.

As editors, we would like to thank Bryan Hugill for helping us copy-edit the papers, and express our gratitude to both authors and reviewers. We hope that readers will be as inspired as we are by these very diverse contributions, which each in their own ways point towards a more democratic and more inclusive Internet.

References

Agar, J. (2003). The Government Machine: A Revolutionary History of the Computer, MIT Press.

Antoniadis, P. & Apostol, I. (2014). The right to the hybrid city and the role of DIY networking, Journal of Community Informatics, special issue on Community Informatics and Urban Planning, vol. 10.

Atton, C. (2005). An Alternative Internet: Radical Media, Politics and Creativity, Edinburgh University Press.

Barbrook, R. (2007). Imaginary Futures: From Thinking Machines to the Global Village, Pluto Press.

Chen, J. (2016). “A Strategy for Limits-aware Computing”, LIMITS’16, Irvine, California, June 9th.

Dunbar-Hester, C. (2014). Low Power to the People: Pirates, Protest, and Politics in FM Radio Activism, MIT Press.

Flipo, F., Dobré, M., & Michot, M. (2013). La Face cachée du numérique. L’impact environnemental des nouvelles technologies, L’Échappée.

Fuchs, C. (2015). Culture and Economy in the Age of Social Media, New York: Routledge.

Edwards, P. (1996). The Closed World: Computers and the Politics of Discourse in Cold War America, MIT Press.

Karlström, H. (2014). “Do Libertarians Dream of Electric Coins? The Material Embeddedness of Bitcoin”, Scandinavian Journal of Social Theory, 15(1), pp. 23–36.

Lefebvre, H. (1996 [1968]). “The right to the city”. In H. Lefebvre (auth), E. Kofman & E. Lebas (Eds.), Writings on Cities, Blackwell, pp. 63-184.

Preston, P. (2001). Reshaping Communications: Technology, Information and Social Change, Sage.

Sandoval, M. & Fuchs, C. (2010). “Towards a Critical Theory of Alternative Media”, Telematics and Informatics, 27(2), pp. 141–150.

Söderberg, J. (2013). “Determining Social Change: The Role of Technological Determinism in the Collective Action Framing of Hackers”, New Media & Society. 15(8) pp. 1277–1293.

Tréguer, F. (2016). “From Deep State Illegality to Law of the Land: The Case of Internet Surveillance in France”, 7th Biennial Surveillance & Society Conference, Barcelona, April 20th.

Turner, F. (2006).  From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism, University of Chicago Press.

——— (2013). The Democratic Surround: Multimedia & American Liberalism From World War II to the Psychedelic Sixties, University of Chicago Press.

 

Supporting Community Networks Through Law and Policy

This post was first published on the website of the netCommons project.

During the workshop on community networking infrastructures held in Barcelona on June 17th, 2016, I talked about the regulatory hurdles faced by Community Networks (CNs) in Europe, as well as a few potential solutions. Here is the long version of the talk…

For the most part, the following is based on research conducted for an article I co-authored with Primavera De Filippi after interviewing several leading community networks in Europe that use wireless networks to provide Internet connectivity to their members, such as Guifi.net, Freifunk, Ninux or Tetaneutral.net. As we at netCommons start looking into the legal landscape surrounding community networks, I thought it would be useful to provide an updated version as a starting point for our research.

Regulation creates hurdles for Community Networks

First, it is clear that despite their potential for fostering public interest goals in telecom policy, policy-makers have so far failed to support the efforts of community networks. More often than not, public policy actually puts important hurdles in their way by focusing solely on the needs of big incumbent players.

Exclusion from public networks

The most striking example of such hurdles is the fact that several community ISPs have been precluded from using public broadband networks funded with taxpayers’ money. In France, for instance, many local governments have invested in rolling out fiber networks in both urban and rural areas. These networks are built and managed by a private company contracted by the public authority, which then leases access to traditional access providers. ISPs then sell their Internet access offers to subscribers. Yet the fee charged to access the network is designed for big commercial ISPs, and is often prohibitive for nonprofit community networks. Several French community ISPs in the Federation FDN have been unable to afford such fees, and are thus effectively denied access.

There is also an issue of transparency and access to public sector information. In at least one reported case, the network operator even refused to communicate its price listing to an interested CN. In a neighboring country, a community ISP similarly underlined “the lack of collaboration with public administrations” in securing access to landline infrastructure.

Short-term policies: the case of radio spectrum

Another, related issue with current telecom policies is their short-term orientation, much of which can be traced back to what economists call “regulatory capture”: the fact that regulators and policy-makers end up listening to and serving those they are supposed to regulate, who have the resources to develop full-fledged lobbying strategies.

Let’s look at the issue of spectrum management. Here, as in many other areas, regulatory capture by commercial interests leads to regulatory choices that systematically overlook the potential of more flexible and citizen-centric policies. The allocation of the so-called “digital dividend” (i.e. the frequencies left vacant by the switch from analog to digital television) is a textbook case. In France, for instance, it was proposed to use part of the spectrum dividend to create new digital TV channels and develop mobile television as well as digital radio (neither of these two technologies has taken off thus far). The remaining half of these “golden frequencies” of the UHF band (sought-after for their long-range propagation) was then auctioned off to telecom operators for their 4G mobile Internet access offers (the lucrative license auctioning took place between October 2011 and January 2012 and brought €3.5bn to the French state). Similar policies have been devised in other European countries.

In the process, one option has never been seriously considered: extending “unlicensed” access to some of these frequencies – that is, effectively turning them into a commons open for all to use, regardless of public, private or non-profit status. Long thought to be unreasonable because of the risk of radio interference, opening up the spectrum to multiple, non-coordinated radio users was actually tried on a worldwide basis more than a decade ago with the WiFi frequencies. Needless to say, it has proved to be a very wise policy choice. At the time, those frequencies were referred to as “junk bands,” because few actually thought they could have valuable applications. Today, they carry about half of Internet communications worldwide. Even exclusive licensees in the telecom sector providing Internet access over 3G and 4G increasingly resort to WiFi’s open spectrum to offload their Internet traffic.

Though property-based allocations of spectrum and exclusive licensing still have the upper hand, they have in many regards fallen short of fostering public interest goals, creating a very significant underutilization of a public resource. Moreover, the regulatory focus on exclusive licensing not only creates an enormous opportunity cost by favoring established players over innovative new entrants (such as CNs); human rights NGOs have even argued that it may breach international law on freedom of expression.

But despite the successes of WiFi and the fact that, as Yochai Benkler has shown, market adoption favors open spectrum policies, unlicensed access remains marginal. For CNs, this is worrying considering that they are increasingly victims of the rapid growth of WiFi traffic. For instance, Guifi.net and Freifunk report having a hard time maintaining the quality of their networks in urban areas because of the saturation of the 5 GHz frequency bands. In some instances, they would theoretically be allowed to use the other portion of spectrum open to unlicensed uses in the 2.4 GHz band; yet this constitutes a niche market for manufacturers of radio transmitters, and the gear necessary to deploy wireless networks in these bands is simply too costly for them.

Another issue for CNs is linked to the topography of their environment: WiFi bands have some important technical limitations, in particular in terms of propagation, and signals are easily blocked by tall buildings or trees. In such cases, CNs are faced with the choice of either giving up on creating a new radio link in a given location, or pushing emission power levels beyond the legal limits to overcome these obstacles. A change in spectrum policy would therefore be most welcome.

New software restrictions on radio equipment

Another regulatory challenge for CNs (and many other actors in the radio field) relates to recent changes in legislation in both the US and the European Union. In the EU, a directive on radio equipment was adopted in 2014, and is currently being transposed at the national level. Article 3.3 of this directive might jeopardize the ability to flash radio hardware with “unauthorized” software (unauthorized by the manufacturer, that is). As the Free Software Foundation Europe explains in its analysis, this provision “implies that device manufacturers have to check every software which can be loaded on the device regarding its compliance with applicable radio regulations (e.g. signal frequency and strength). Until now, the responsibility for the compliance rested on the users if they modified something, no matter if hardware- or software-wise.”

How does this impact CNs? First, by shifting the responsibility for legal compliance onto manufacturers, the latter could decide to protect themselves by locking down the devices they sell (as is happening in the US). This would prevent CNs from installing custom software on the radio equipment that supports their infrastructure. Second, the FSFE notes that, anticipating this legal shift, manufacturers have already “installed modules on their devices checking which software is loaded.” According to the organization, “this is done by built-in non-free and non-removable modules disrespecting users’ rights and demands to use technology which they can control.” There is a fear that such software will evolve into a built-in spying system checking on the user’s behavior or location, which needless to say runs counter to fundamental rights and more generally to the political values defended by CNs.

In the past few months, in Sweden, France, Germany and elsewhere, radio professionals and hobbyists as well as CNs and digital rights groups have urged policymakers to ensure that national transposition texts clarify that radio hardware must remain open to free software and other forms of technical tinkering. Unfortunately, these last-minute advocacy efforts might have come too late.

Towards a public policy for the telecom commons

These hurdles already hint at policy reforms that could support the development of community networks. Here are a few other items that should be put on the agenda.

Lifting unnecessary regulatory burdens

First, there is a range of regulations which make CNs’ work and very existence significantly and often unnecessarily difficult. In a country such as Belgium, for instance, the registration fee that telecom operators must pay to the NRA is €676 for the first registration, plus €557 every following year (for operators whose revenues are below €1m). In France, Spain or Germany, registration is free, which may explain why the movement is much more dynamic in these countries. Registration procedures should therefore be harmonized at the EU level and made free for nonprofit ISPs.

Promoting open WiFi

Second, several laws seek to prevent the sharing of Internet connections amongst several users by making people responsible (and potentially liable) for all communications made through their WiFi connection. This is the case in France, for instance, where the 2009 three-strikes copyright law against peer-to-peer file-sharing (the infamous HADOPI) also introduced a tort for improperly securing one’s Internet connection against the unlawful activity of other users. As a result of such legal rules, many community ISPs that would like to establish open WiFi networks in public spaces, such as parks and streets, refrain from doing so. A case regarding the so-called “secondary liability” of the provider of an open WiFi hotspot currently pending before the EU Court of Justice – the McFadden case – could soon bring useful clarifications (for other liability issues surrounding CNs, see this paper by netCommons researcher Federica Giovanella and this other paper, also from a netCommons researcher, Mélanie Dulong de Rosnay).

Expanding the spectrum commons

Third, as I have already suggested, it is not just Internet wireless access points that can be shared, but also the intangible infrastructure on which radio signals travel. WiFi, as unlicensed spectrum, is a key asset for CNs willing to set up affordable and flexible last-mile infrastructure, but it is currently very limited. In the US, the FCC has initiated promising policies in this field in recent years. But for the moment, the EU has shied away from similar moves.

Yet, in 2012, the EU adopted its first Radio Spectrum Policy Programme (RSPP). During the legislative process, the EU Parliament voted in favor of ambitious amendments to open the spectrum to unlicensed uses. Even if some of these amendments were later scrapped by national governments, the final text still states for instance that “wireless access systems, including radio local area networks, may outgrow their current allocations on an unlicensed basis. The need for and feasibility of extending the allocations of unlicensed spectrum for wireless access systems, including radio local area networks, at 2,4 GHz and 5 GHz, should be assessed in relation to the inventory of existing uses of, and emerging needs for, spectrum (…).” On mesh networks, it adds that “member states shall, in cooperation with the Commission (…) take full account of (…) the shared and unlicensed use of spectrum to provide the basis for wireless mesh networks, which can play a key role in bridging the digital divide.”

In late 2012, as EU lawmakers were finalising the RSPP, a study (pdf) commissioned by the EU Commission also called for an additional 100 MHz of license-exempt bands (half in the sub-1 GHz bands and the other half at 1.4 GHz), as well as for higher power output limits in rural areas to reduce the cost of broadband Internet access deployment. It also warned against the underutilization of current spectrum allocations (by the military, by incumbent operators, etc.). Since then, however, EU work on unlicensed spectrum and on more flexible authorization schemes accessible to community ISPs has stalled. At the national level too, save for a few exceptions, concrete steps have been virtually non-existent.

Opening access to public networks

Fourth – and this also relates to what I was explaining before – networks built with taxpayers’ money could also be treated as a commons, and should, as such, remain free from corporate capture. Regulators should ensure that nonprofit community networks can access publicly-funded and subsidized physical infrastructures without unnecessary financial or administrative hurdles. Accordingly, they should review existing policies and current practices in this field, provide transparent information mapping publicly-funded networks, and mandate rules allowing grassroots, nonprofit ISPs to use these networks on a preferential basis.

Offering targeted, direct public support

Of course, countless other policy initiatives can help support grassroots networks: small grants, crowdfunding and subsidies to help these groups buy servers and radio equipment, communicate about their initiatives and access public infrastructures (for instance the roof of a church to install an antenna), but also support for their research on radio transmission, routing methods, software or encryption. The experience of the most successful of these groups, such as Guifi.net, suggests that even a little governmental support – whether municipal, regional or national – can make a big difference in their ability to accomplish the ambitious objectives they set for themselves.

Inviting CNs to the policy table

But all of these policies point to an overarching issue, namely the need to democratize telecom policy and establish procedures that can institutionalize “subversive rationalization” in this field. In some countries, regulators have already started to reach out to community networks. In Slovenia, on one occasion, Wlan-SI was asked to contribute to the policy discussion on a piece of telecom legislation. In Greece, the Athens Wireless Metropolitan Network has also been invited by the NRA to respond to consultations, and in France, FFDN has sometimes been invited to technical meetings. However, save for a few exceptions (like the Net neutrality provisions introduced in Slovenian law in late 2012), their input has so far never translated into actual policies. In many other countries, such as Italy, even though city councils may occasionally support these organizations actively insofar as they provide better Internet access to their citizens, regional governments and national regulators have so far largely neglected them. Finally, at the EU level, where much of the telecom regulation applicable in Europe is ultimately crafted, community networks are virtually absent from policy debates.

Given the revival of CNs in the past years, it is not enough for regulatory authorities to treat citizens as mere consumers by occasionally inviting consumer organizations to the table. Regulators and policy-makers need to recognize that the Internet architecture is a contested space, and that citizen groups across Europe and beyond are showing that, when it comes to providing Internet access, commons-based forms of governance are not only possible but also represent effective and viable alternatives to the most powerful telecom operators. What is more, their participants have both the expertise and the legitimacy to take an integral part in technical and legal debates over broadband policy, in which traditional, commercial ISPs are over-represented. They can bring an informed and dissenting view to these debates, and eventually help alleviate regulatory capture, allowing policy-making to become more aligned with the public interest.

Of course, a potential problem is the fact that these networks are often run by volunteers whose lack of time and resources may sometimes make it difficult for them to participate as actively as the full-time and well-resourced lobbyists of incumbent operators. But over time, as the movement grows, it may be able to sustain its engagement with public authorities, especially if the latter adapt and establish ad hoc contact channels and remote participation mechanisms.

Twenty years after the privatization of national networks in Europe, there is certainly a long way to go for telecom policy to balance the interests of all the various stakeholders. But it is clear that community networks have an important role to play in this process. As we move forward with the netCommons project, I hope we can help the policy debate move in that direction as the EU starts updating its telecom laws.