

comp.os.minix 640, 643, 646,
“Linux is obsolete” debate on, 646

Concurrent Versioning System (cvs) 318, 623, 686, 690, 693,
history of, 690,
Linux and, 693

coordination (component of Free Software) 52, 315-321, 616-619, 618, 626-635, 636, 648, 662, 666-716, 679, 697, 710, 728, 743, 748, 756, 813, 824,
individual virtuosity vs. hierarchical planning, 618, 648, 662, 679, 710,
modulations of, 697, 743, 748, 756, 813, 824,
of Linux vs. Minix, 636

forking 397-410, 677, 694, 827,
in Apache, 677,
in Connexions, 827,
in Linux, 694

Free Software Foundation 103, 297-298, 324, 328, 604-612, 614, 629, 644-645, 834-835,
cult status of, 629,
Linux and Apache vs., 614

fun, and development of Linux 623, 645-646, 697

Linux (Free Software project) 52, 209, 322, 335, 396, 511, 608, 619-627, 630, 649, 651, 684, 691, 693, 736, 805,
origins in Minix, 396, 630,
planning vs. adaptability in, 651,
process of decision making, 649,
Source Code Management tools and, 684,
VGER tree and, 693

ontology 383, 395, 412, 660,
of Linux, 660,
of UNIX operating system, 383, 395, 412

VA Linux (corporation) 298, 332-335

Sean Doyle and Adrian Gropper opened the doors to this project, providing unparalleled insight, hospitality, challenge, and curiosity. Axel Roch introduced me to Volker Grassmuck, and to much else. Volker Grassmuck introduced me to Berlin’s Free Software world and invited me to participate in the Wizards of OS conferences. Udhay Shankar introduced me to almost everyone I know, sometimes after the fact. Shiv Sastry helped me find lodging in Bangalore at his Aunt Anasuya Sastry’s house, which is called “Silicon Valley” and which was truly a lovely place to stay. Bharath Chari and Ram Sundaram let me haunt their office and cat-5 cables during one of the more turbulent periods of their careers. Glenn Otis Brown visited, drank, talked, invited, challenged, entertained, chided, encouraged, drove, was driven, and gave and received advice. Ross Reedstrom welcomed me to the Rice Linux Users’ Group and to Connexions. Brent Hendricks did yeoman’s work, suffering my questions and intrusions. Geneva Henry, Jenn Drummond, Chuck Bearden, Kathy Fletcher, Manpreet Kaur, Mark Husband, Max Starkenberg, Elvena Mayo, Joey King, and Joel Thierstein have been welcoming and enthusiastic at every meeting. Sid Burris has challenged and respected my work, which has been an honor. Rich Baraniuk listens to everything I say, for better or for worse; he is a magnificent collaborator and friend.

The fifth component, the practice of coordination and collaboration (chapter 7), is the most talked about: the idea of tens or hundreds of thousands of people volunteering their time to contribute to the creation of complex software. In this chapter I show how novel forms of coordination developed in the 1990s and how they worked in the canonical cases of Apache and Linux; I also highlight how coordination facilitates the commitment to adaptability (or modifiability) over against planning and hierarchy, and how this commitment resolves the tension between individual virtuosity and the need for collective control.

Empirically speaking, the actors in my stories are figuring something out, something unfamiliar, troubling, imprecise, and occasionally shocking to everyone involved at different times and to differing extents. 13 There are two kinds of figuring-out stories: the contemporary ones in which I have been an active participant (those of Connexions and Creative Commons), and the historical ones conducted through “archival” research and rereading of certain kinds of texts, discussions, and analyses-at-the-time (those of UNIX, EMACS, Linux, Apache, and Open Systems). Some are stories of technical figuring out, but most are stories of figuring out a problem that appears to have emerged. Some of these stories involve callow and earnest actors, some involve scheming and strategy, but in all of them the figuring out is presented “in the making” and not as something that can be conveniently narrated as obvious and uncontested with the benefit of hindsight. Throughout this book, I tell stories that illustrate what geeks are like in some respects, but, more important, that show them in the midst of figuring things out—a practice that can happen both in discussion and in the course of designing, planning, executing, writing, debugging, hacking, and fixing.

13.The language of “figuring out” has its immediate source in the work of Kim Fortun, “Figuring Out Ethnography.” Fortun’s work refines two other sources, the work of Bruno Latour in Science in Action and that of Hans-Jörg Rheinberger in Toward a History of Epistemic Things. Latour describes the difference between “science made” and “science in the making” and how the careful analysis of new objects can reveal how they come to be. Rheinberger extends this approach through analysis of the detailed practices involved in figuring out a new object or a new process—practices which participants cannot quite name or explain in precise terms until after the fact.

Because the stories I tell here are in fact recent by the standards of historical scholarship, there is not much by way of comparison in terms of the empirical material. I rely on a number of books and articles on the history of the early Internet, especially Janet Abbate’s scholarship and the single historical work on UNIX, Peter Salus’s A Quarter Century of Unix. 16 There are also a couple of excellent journalistic works, such as Glyn Moody’s Rebel Code: Inside Linux and the Open Source Revolution (which, like Two Bits, relies heavily on the novel accessibility of detailed discussions carried out on public mailing lists). Similarly, the scholarship on Free Software and its history is just starting to establish itself around a coherent set of questions. 17

16.In addition to Abbate and Salus, see Norberg and O’Neill, Transforming Computer Technology; Naughton, A Brief History of the Future; Hafner, Where Wizards Stay Up Late; Waldrop, The Dream Machine; Segaller, Nerds 2.0.1. For a classic autodocumentation of one aspect of the Internet, see Hauben and Hauben, Netizens.

17.Kelty, “Culture’s Open Sources”; Coleman, “The Social Construction of Freedom”; Ratto, “The Pressure of Openness”; Joseph Feller et al., Perspectives on Free and Open Source Software; see also http://freesoftware.mit.edu/, organized by Karim Lakhani, which is a large collection of work on Free Software projects. Early work in this area derived both from the writings of practitioners such as Raymond and from business and management scholars who noticed in Free Software a remarkable, surprising set of seeming contradictions. The best of these works to date is Steven Weber, The Success of Open Source. Weber’s conclusions are similar to those presented here, and he has a kind of cryptoethnographic familiarity (that he does not explicitly avow) with the actors and practices. Yochai Benkler’s Wealth of Networks extends and generalizes some of Weber’s argument.

Until the mid-1990s, hacker, geek, and computer nerd designated a very specific type: programmers and lurkers on relatively underground networks, usually college students, computer scientists, and “amateurs” or “hobbyists.” A classic mock self-diagnostic called the Geek Code, by Robert Hayden, accurately and humorously detailed the various ways in which one could be a geek in 1996—UNIX/Linux skills, love/hate of Star Trek, particular eating and clothing habits—but as Hayden himself points out, the geeks of the early 1990s exist no longer. The elite subcultural, relatively homogeneous group it once was has been overrun: “The Internet of 1996 was still a wild untamed virgin paradise of geeks and eggheads unpopulated by script kiddies, and the denizens of AOL. When things changed, I seriously lost my way. I mean, all the ‘geek’ that was the Internet was gone and replaced by Xfiles buzzwords and politicians passing laws about a technology they refused to comprehend.” 25

25.See The Geek Code, http://www.geekcode.com/.

Berlin, November 1999. I am in a very hip club in Mitte called WMF. It’s about eight o’clock—five hours too early for me to be a hipster, but the context is extremely cool. WMF is in a hard-to-find, abandoned building in the former East; it is partially converted, filled with a mixture of new and old furnishings, video projectors, speakers, makeshift bars, and dance-floor lighting. A crowd of around fifty people lingers amid smoke and Beck’s beer bottles, sitting on stools and chairs and sofas and the floor. We are listening to an academic read a paper about Claude Shannon, the MIT engineer credited with the creation of information theory. The author is smoking and reading in German while the audience politely listens. He speaks for about seventy minutes. There are questions and some perfunctory discussion. As the crowd breaks up, I find myself, in halting German that quickly converts to English, having a series of animated conversations about the GNU General Public License, the Debian Linux Distribution, open standards in net radio, and a variety of things for which Claude Shannon is the perfect ghostly technopaterfamilias, even if his seventy-minute invocation has clashed heavily with the surroundings.

Before long, I am talking with Volker Grassmuck, founding member of Mikro and organizer of the successful “Wizards of OS” conference, held earlier in the year, which had the very intriguing subtitle “Operating Systems and Social Systems.” Grassmuck is inviting me to participate in a planning session for the next WOS, held at the Chaos Computer Congress, a hacker gathering that occurs each year in December in Berlin. In the following months I will meet a huge number of people who seem, uncharacteristically for artists and activists, strangely obsessed with configuring their Linux distributions or hacking the http protocol or attending German Parliament hearings on copyright reform. The political lives of these folks have indeed mixed up operating systems and social systems in ways that are more than metaphorical.

Perhaps the most familiar and famous of these wars is that between Apple and Microsoft (formerly between Apple and IBM), a conflict that is often played out in dramatic and broad strokes that imply fundamental differences, when in fact the differences are extremely slight. 59 Geeks are also familiar with a wealth of less well-known “holy wars”: EMACS versus vi; KDE versus Gnome; Linux versus BSD; Oracle versus all other databases. 60

59.The Apple-Microsoft conflict was given memorable expression by Umberto Eco in a widely read piece that compared the Apple user interface to Catholicism and the PC user interface to Protestantism (“La bustina di Minerva,” Espresso, 30 September 1994, back page).

60.One entry on Wikipedia differentiates religious wars from run-of-the-mill “flame wars” as follows: “Whereas a flame war is usually a particular spate of flaming against a non-flamy background, a holy war is a drawn-out disagreement that may last years or even span careers” (“Flaming [Internet],” http://en.wikipedia.org/wiki/Flame_war [accessed 16 January 2006]).

In addition to the obvious pleasure with which they deploy the sectarian aspects of the Protestant Reformation, geeks also allow themselves to see their struggles as those of Luther-like adepts, confronted by powerful worldly institutions that are distinct but intertwined: the Catholic Church and absolutist monarchs. Sometimes these comparisons are meant to mock theological argument; sometimes they are more straightforwardly hagiographic. For instance, a 1998 article in Salon compares Martin Luther and Linus Torvalds (originator of the Linux kernel).

A critical point in the emergence of Free Software occurred in 1998-99: new names, new narratives, but also new wealth and new stakes. “Open Source” was premised on dotcom promises of cost-cutting and “disintermediation” and various other schemes to make money on it (Cygnus Solutions, an early Free Software company, playfully tagged itself as “Making Free Software More Affordable”). VA Linux, for instance, which sold personal-computer systems pre-installed with Open Source operating systems, had the largest single initial public offering (IPO) of the stock-market bubble, seeing a 700 percent share-price increase in one day. “Free Software” by contrast fanned kindling flames of worry over intellectual-property expansionism and hitched itself to a nascent legal resistance to the 1998 Digital Millennium Copyright Act and Sonny Bono Copyright Term Extension Act. Prior to 1998, Free Software referred either to the Free Software Foundation (and the watchful, micromanaging eye of Stallman) or to one of thousands of different commercial, avocational, or university-research projects, processes, licenses, and ideologies that had a variety of names: sourceware, freeware, shareware, open software, public domain software, and so on. The term Open Source, by contrast, sought to encompass them all in one movement.

Fomenting Movements

The period from 1 April 1998, when the Mozilla source code was first released, to 1 April 1999, when Zawinski announced its failure, couldn’t have been a headier, more exciting time for participants in Free Software. Netscape’s decision to release the source code was a tremendous opportunity for geeks involved in Free Software. It came in the midst of the rollicking dotcom bubble. It also came in the midst of the widespread adoption of key Free Software tools: the Linux operating system for servers, the Apache Web server for Web pages, the perl and python scripting languages for building quick Internet applications, and a number of other lower-level tools like Bind (an implementation of the DNS protocol) or sendmail for e-mail.

Perhaps most important, Netscape’s decision came in a period of fevered and intense self-reflection among people who had been involved in Free Software in some way, stretching back to the mid-1980s. Eric Raymond’s article “The Cathedral and the Bazaar,” delivered at the Linux Kongress in 1997 and the O’Reilly Perl Conference the same year, had started a buzz among Free Software hackers. It was cited by Frank Hecker and Eric Hahn at Netscape as one of the sources for their thinking about the decision to free Mozilla; Raymond and Bruce Perens had both been asked to consult with Netscape on Free Software strategy. In April 1998 Tim O’Reilly, a publisher of handbooks for Free Software, organized a conference called the Freeware Summit.

All through 1998 and 1999, buzz around Open Source built. Little-known companies such as Red Hat, VA Linux, Cygnus, Slackware, and SuSe, which had been providing Free Software support and services to customers, suddenly entered media and business consciousness. Articles in the mainstream press circulated throughout the spring and summer of 1998, often attempting to make sense of the name change and whether it meant a corresponding change in practice. A front-cover article in Forbes, which featured photos of Stallman, Larry Wall, Brian Behlendorf, and Torvalds (figure 2), was noncommittal, cycling between Free Software, Open Source, and Freeware. 105

105.Josh McHugh, “For the Love of Hacking,” Forbes, 10 August 1998, 94-100.

By December 1999, the buzz had reached a fever pitch. When VA Linux, a legitimate company which actually made something real—computers with Linux installed on them—went public, its share value gained 700 percent in one day, making it the single most valuable initial public offering of the era. VA Linux took the unconventional step of allowing contributors to the Linux kernel to buy into the stock before the IPO, thus bringing at least a partial set of these contributors into the mainstream Ponzi scheme of the Internet dotcom economy. Those who managed to sell their stock ended up benefiting from the boom, whether or not their contributions to Free Software truly merited it. In a roundabout way, Raymond, O’Reilly, Perens, and others behind the name change had achieved recognition for the central role of Free Software in the success of the Internet—and now its true name could be known: Open Source.

The movement, as a practice of discussion and argument, is made up of stories. It is a practice of storytelling: affect- and intellect-laden lore that orients existing participants toward a particular problem, contests other histories, parries attacks from outside, and draws in new recruits. 107 This includes proselytism and evangelism (and the usable pasts of protestant reformations, singularities, rebellion and iconoclasm are often salient here), whether for the reform of intellectual-property law or for the adoption of Linux in the trenches of corporate America. It includes both heartfelt allegiance in the name of social justice as well as political agnosticism stripped of all ideology. 108 Every time Free Software is introduced to someone, discussed in the media, analyzed in a scholarly work, or installed in a workplace, a story of either Free Software or Open Source is used to explain its purpose, its momentum, and its temporality. At the extremes are the prophets and proselytes themselves: Eric Raymond describes Open Source as an evolutionarily necessary outcome of the natural tendency of human societies toward economies of abundance, while Richard Stallman describes it as a defense of the fundamental freedoms of creativity and speech, using a variety of philosophical theories of liberty, justice, and the defense of freedom. 109 Even scholarly analyses must begin with a potted history drawn from the self-narration of geeks who make or advocate free software. 110 Indeed, as a methodological aside, one reason it is so easy to track such stories and narratives is because geeks like to tell and, more important, like to archive such stories—to create Web pages, definitions, encyclopedia entries, dictionaries, and mini-histories and to save every scrap of correspondence, every fight, and every resolution related to their activities. 
This “archival hubris” yields a very peculiar and specific kind of fieldsite: one in which a kind of “as-it-happens” ethnographic observation is possible not only through “being there” in the moment but also by being there in the massive, proliferating archives of moments past. Understanding the movement as a changing entity requires constantly glancing back at its future promises and the conditions of their making.

107.It is, in the terms of Actor Network Theory, a process of “enrollment” in which participants find ways to rhetorically align—and to disalign—their interests. It does not constitute the substance of their interest, however. See Latour, Science in Action; Callon, “Some Elements of a Sociology of Translation.”

108.Coleman, “Political Agnosticism.”

109.See, respectively, Raymond, The Cathedral and the Bazaar, and Williams, Free as in Freedom.

110.For example, Castells, The Internet Galaxy, and Weber, The Success of Open Source both tell versions of the same story of origins and development.

Throughout the 1970s, the low licensing fees, the inclusion of the source code, and its conceptual integrity meant that UNIX was ported to a remarkable number of other machines. In many ways, academics found it just as appealing, if not more so, to be involved in the creation and improvement of a cutting-edge system by licensing and porting the software themselves, rather than by having it provided to them, without the source code, by a company. Peter Salus, for instance, suggests that people experienced the lack of support from Bell Labs as a kind of spur to develop and share their own fixes. The means by which source code was shared, and the norms and practices of sharing, porting, forking, and modifying source code were developed in this period as part of the development of UNIX itself—the technical design of the system facilitates and in some cases mirrors the norms and practices of sharing that developed: operating systems and social systems. 133

133.The simultaneous development of the operating system and the norms for creating, sharing, documenting, and extending it are often referred to as the “UNIX philosophy.” It includes the central idea that one should build on the ideas (software) of others (see Gancarz, The Unix Philosophy and Linux and the UNIX Philosophy). See also Raymond, The Art of UNIX Programming.

Minix was not commercial software, but neither was it Free Software. It was copyrighted and controlled by Tanenbaum’s publisher, Prentice Hall. Because it used no AT&T source code, Minix was also legally independent, a legal object of its own. The fact that it was intended to be legally distinct from, yet conceptually true to, UNIX is a clear indication of the kinds of tensions that govern the creation and sharing of source code. The ironic apotheosis of Minix as the pedagogical gold standard for studying UNIX came in 1991-92, when a young Linus Torvalds created a “fork” of Minix, also rewritten from scratch, that would go on to become the paradigmatic piece of Free Software: Linux. Tanenbaum’s purpose for Minix was that it remain a pedagogically useful operating system—small, concise, and illustrative—whereas Torvalds wanted to extend and expand his version of Minix to take full advantage of the kinds of hardware being produced in the 1990s. Both, however, were committed to source-code visibility and sharing as the swiftest route to complete comprehension of operating-systems principles.

According to Don Libes, Bell Labs allowed Berkeley to distribute its extensions to UNIX so long as the recipients also had a license from Bell Labs for the original UNIX (an arrangement similar to the one that governed Lions’s Commentary). 144 From about 1976 until about 1981, BSD slowly became an independent distribution—indeed, a complete version of UNIX—well-known for the vi editor and the Pascal compiler, but also for the addition of virtual memory and its implementation on DEC’s VAX machines. 145 It should be clear that the unusual quasi-commercial status of AT&T’s UNIX allowed for this situation in a way that a fully commercial computer corporation would never have allowed. Consider, for instance, the fact that many UNIX users—students at a university, for instance—essentially could not know whether they were using an AT&T product or something called BSD UNIX created at Berkeley. The operating system functioned in the same way and, except for the presence of copyright notices that occasionally flashed on the screen, did not make any show of asserting its brand identity (that would come later, in the 1980s). Whereas a commercial computer manufacturer would have allowed something like BSD only if it were incorporated into and distributed as a single, marketable, and identifiable product with a clever name, AT&T turned something of a blind eye to the proliferation and spread of AT&T UNIX, and the result was a series of forks in the project: distinct bodies of source code, each an instance of something called UNIX.

144.Libes and Ressler, Life with UNIX, 16-17.

145.A recent court case between the Utah-based SCO—the current owner of the legal rights to the original UNIX source code—and IBM raised yet again the question of how much of the original UNIX source code exists in the BSD distribution. SCO alleges that IBM (and Linus Torvalds) inserted SCO-owned UNIX source code into the Linux kernel. However, the incredibly circuitous route of the “original” source code makes these claims hard to ferret out: it was developed at Bell Labs, licensed to multiple universities, used as a basis for BSD, sold to an earlier version of the company SCO (then known as the Santa Cruz Operation), which created a version called Xenix in cooperation with Microsoft. See the diagram by Eric Lévénez at http://www.levenez.com/unix/. For more detail on this case, see www.groklaw.com.

Openness and open systems are key to understanding the practices of Free Software: the open-systems battles of the 1980s set the context for Free Software, leaving in their wake a partially articulated infrastructure of operating systems, networks, and markets that resulted from figuring out open systems. The failure to create a standard UNIX operating system opened the door for Microsoft Windows NT, but it also set the stage for the Linux operating-system kernel to emerge and spread. The success of the TCP/IP protocols forced multiple competing networking schemes into a single standard—and a singular entity, the Internet—which carried with it a set of built-in goals that mirror the moral-technical order of Free Software.

The rest of the story is quickly told: Stallman resigned from the AI Lab at MIT and started the Free Software Foundation in 1985; he created a raft of new tools, but ultimately no full UNIX operating system, and issued General Public License 1.0 in 1989. In 1990 he was awarded a MacArthur “genius grant.” During the 1990s, he was involved in various high-profile battles among a new generation of hackers; those controversies included the debate around Linus Torvalds’s creation of Linux (which Stallman insisted be referred to as GNU/Linux), the forking of EMACS into Xemacs, and Stallman’s own participation in—and exclusion from—conferences and events devoted to Free Software.

The Free Software Foundation represents a recognition on his part that individual and communal independence would come at the price of a legally and bureaucratically recognizable entity, set apart from MIT and responsible only to itself. The Free Software Foundation took a classic form: a nonprofit organization with a hierarchy. But by the early 1990s, a new set of experiments would begin that questioned the look of such an entity. The stories of Linux and Apache reveal how these ventures both depended on the work of the Free Software Foundation and departed from the hierarchical tradition it represented, in order to innovate new, similarly embedded sociotechnical forms of coordination.

The final component of Free Software is coordination. For many participants and observers, this is the central innovation and essential significance of Open Source: the possibility of enticing potentially huge numbers of volunteers to work freely on a software project, leveraging the law of large numbers, “peer production,” “gift economies,” and “self-organizing social economies.” 261 Coordination in Free Software is of a distinct kind that emerged in the 1990s, directly out of the issues of sharing source code, conceiving open systems, and writing copyright licenses—all necessary precursors to the practices of coordination. The stories surrounding these issues find continuation in those of the Linux operating-system kernel, of the Apache Web server, and of Source Code Management tools (SCMs); together these stories reveal how coordination worked and what it looked like in the 1990s.

261.Research on coordination in Free Software forms the central core of recent academic work. Two of the most widely read pieces, Yochai Benkler’s “Coase’s Penguin” and Steven Weber’s The Success of Open Source, are directed at classic research questions about collective action. Rishab Ghosh’s “Cooking Pot Markets” and Eric Raymond’s The Cathedral and the Bazaar set many of the terms of debate. Josh Lerner’s and Jean Tirole’s “Some Simple Economics of Open Source” was an early contribution. Other important works on the subject are Feller et al., Perspectives on Free and Open Source Software; Tuomi, Networks of Innovation; Von Hippel, Democratizing Innovation.

Coordination is important because it collapses and resolves the distinction between technical and social forms into a meaningful whole for participants. On the one hand, there is the coordination and management of people; on the other, there is the coordination of source code, patches, fixes, bug reports, versions, and distributions—but together there is a meaningful technosocial practice of managing, decision-making, and accounting that leads to the collaborative production of complex software and networks. Such coordination would be unexceptional, essentially mimicking long-familiar corporate practices of engineering, except for one key fact: it has no goals. Coordination in Free Software privileges adaptability over planning. This involves more than simply allowing any kind of modification; the structure of Free Software coordination actually gives precedence to a generalized openness to change, rather than to the following of shared plans, goals, or ideals dictated or controlled by a hierarchy of individuals. 262

262.On the distinction between adaptability and adaptation, see Federico Iannacci, “The Linux Managing Model,” http://opensource.mit.edu/papers/iannacci2.pdf. Matt Ratto characterizes the activity of Linux-kernel developers as a “culture of re-working” and a “design for re-design,” and captures the exquisite details of such a practice both in coding and in the discussion between developers, an activity he dubs the “pressure of openness” that “results as a contradiction between the need to maintain productive collaborative activity and the simultaneous need to remain open to new development directions” (“The Pressure of Openness,” 112-38).

Adaptability does not mean randomness or anarchy, however; it is a very specific way of resolving the tension between the individual curiosity and virtuosity of hackers, and the collective coordination necessary to create and use complex software and networks. No man is an island, but no archipelago is a nation, so to speak. Adaptability preserves the “joy” and “fun” of programming without sacrificing the careful engineering of a stable product. Linux and Apache should be understood as the results of this kind of coordination: experiments with adaptability that have worked, to the surprise of many who have insisted that complexity requires planning and hierarchy. Goals and planning are the province of governance—the practice of goal-setting, orientation, and definition of control—but adaptability is the province of critique, and this is why Free Software is a recursive public: it stands outside power and offers powerful criticism in the form of working alternatives. It is not the domain of the new—after all, Linux is just a rewrite of UNIX—but the domain of critical and responsive public direction of a collective undertaking.

Linux and Apache are more than pieces of software; they are organizations of an unfamiliar kind. My claim that they are “recursive publics” is useful insofar as it gives a name to a practice that is neither corporate nor academic, neither profit nor nonprofit, neither governmental nor nongovernmental. The concept of recursive public includes, within the spectrum of political activity, the creation, modification, and maintenance of software, networks, and legal documents. While a “public” in most theories is a body of people and a discourse that give expressive form to some concern, “recursive public” is meant to suggest that geeks not only give expressive form to some set of concerns (e.g., that software should be free or that intellectual property rights are too expansive) but also give concrete infrastructural form to the means of expression itself. Linux and Apache are tools for creating networks by which expression of new kinds can be guaranteed and by which further infrastructural experimentation can be pursued. For geeks, hacking and programming are variants of free speech and freedom of assembly.

From UNIX to Minix to Linux

Linux and Apache are the two paradigmatic cases of Free Software in the 1990s, both for hackers and for scholars of Free Software. Linux is a UNIX-like operating-system kernel, bootstrapped out of the Minix operating system created by Andrew Tanenbaum. 263 Apache is the continuation of the original National Center for Supercomputing Applications (NCSA) project to create a Web server (Rob McCool’s original program, called httpd), bootstrapped out of a distributed collection of people who were using and improving that software.

263.Linux is often called an operating system, which Stallman objects to on the theory that a kernel is only one part of an operating system. Stallman suggests that it be called GNU/Linux to reflect the use of GNU operating-system tools in combination with the Linux kernel. This not-so-subtle ploy to take credit for Linux reveals the complexity of the distinctions. The kernel is at the heart of hundreds of different “distributions”—such as Debian, Red Hat, SuSe, and Ubuntu Linux—all of which also use GNU tools, but [pg 338] which are often collections of software larger than just an operating system. Everyone involved seems to have an intuitive sense of what an operating system is (thanks to the pedagogical success of UNIX), but few can draw any firm lines around the object itself.

Linux and Apache are both experiments in coordination. Both projects evolved decision-making systems through experiment: a voting system in Apache’s case and a structured hierarchy of decision-makers, with Linus Torvalds as benevolent dictator, in Linux’s case. Both projects also explored novel technical tools for coordination, especially Source Code Management (SCM) tools such as Concurrent Versioning System (cvs). Both are also cited as exemplars of how “fun,” “joy,” or interest determine individual participation and of how it is possible to maintain and encourage that participation and mutual aid instead of narrowing the focus or eliminating possible routes for participation.

Beyond these specific experiments, the stories of Linux and Apache are detailed here because both projects were actively central to the construction and expansion of the Internet of the 1990s by allowing a massive number of both corporate and noncorporate sites to cheaply install and run servers on the Internet. Were Linux and Apache nothing more than hobbyist projects with a few thousand [pg 213] interested tinkerers, rather than the core technical components of an emerging planetary network, they would probably not represent the same kind of revolutionary transformation ultimately branded a “movement” in 1998-99.

Linus Torvalds’s creation of the Linux kernel is often cited as the first instance of the real “Open Source” development model, and it has quickly become the most studied of the Free Software projects. 264 Following its appearance in late 1991, Linux grew quickly from a small, barely working kernel to a fully functional replacement for the various commercial UNIX systems that had resulted from the UNIX wars of the 1980s. It has become versatile enough to be used on desktop PCs with very little memory and small CPUs, as well as in “clusters” that allow for massively parallel computing power.

264.Eric Raymond directed attention primarily to Linux in The Cathedral and the Bazaar. Many other projects preceded Torvalds’s kernel, however, including the tools that form the core of both UNIX and the Internet: Paul Vixie’s implementation of the Domain Name System (DNS) known as BIND; Eric Allman’s sendmail for routing e-mail; the scripting languages perl (created by Larry Wall), python (Guido van Rossum), and tcl/tk (John Ousterhout); the X Windows research project at MIT; and the derivatives of the original BSD UNIX, FreeBSD and OpenBSD. On the development model of FreeBSD, see Jorgensen, “Putting It All in the Trunk” and “Incremental and Decentralized Integration in FreeBSD.” The story of the genesis of Linux is very nicely told in Moody, Rebel Code, and Williams, Free as in Freedom; there are also a number of papers—available through Free/Opensource Research Community, http://freesoftware.mit.edu/—that analyze the development dynamics of the Linux kernel. See especially Ratto, “Embedded Technical Expression” and “The Pressure of Openness.” I have conducted much of my analysis of Linux by reading the Linux Kernel Mailing List archives, http://lkml.org. There are also annotated summaries of the Linux Kernel Mailing List discussions at http://kerneltraffic.org.

When Torvalds started, he was blessed with an eager audience of hackers keen on seeing a UNIX system run on desktop computers and a personal style of encouragement that produced enormous positive feedback. Torvalds is often given credit for creating, through his “management style,” a “new generation” of Free Software—a younger generation than that of Stallman and Raymond. Linus and Linux are not in fact the causes of this change, but the results of being at the right place at the right time and joining together a number of existing components. Indeed, the title of Torvalds’s semi-autobiographical reflection on Linux—Just for Fun: The Story of an Accidental Revolutionary—captures some of the character of its genesis.

The fact of Linus Torvalds’s pedagogical embedding in the world of UNIX, Minix, the Free Software Foundation, and the Usenet should not be underestimated, as it often is in hagiographical accounts of the Linux operating system. Without this relatively robust moral-technical order or infrastructure within which it was possible to be at the right place at the right time, Torvalds’s late-night dorm-room project would have amounted to little more than that—but the pieces were all in place for his modest goals to be transformed into something much more significant.

So the system is based on Minix, just as Minix had been based on UNIX—piggy-backed or bootstrapped, rather than rewritten in an entirely different fashion, that is, rather than becoming a different kind of operating system. And yet there are clearly concerns about the need to create something that is not Minix, rather than simply extending or “debugging” Minix. This concern is key to understanding what happened to Linux in 1991.

By all accounts, Prentice Hall was not restrictive in its sublicensing of the operating system, if people wanted to create an “enhanced” [pg 217] version of Minix. Similarly, Tanenbaum’s frequent presence on comp.os.minix testified to his commitment to sharing his knowledge about the system with anyone who wanted it—not just paying customers. Nonetheless, Torvalds’s pointed use of the word free and his decision not to reuse any of the code are a clear indication of his desire to build a system completely unencumbered by restrictions, based perhaps on a kind of intuitive folkloric sense of the dangers associated with cases like that of EMACS. 270

270.Indeed, initially, Torvalds’s terms of distribution for Linux were more restrictive than the GPL, including limitations on distributing it for a fee or for handling costs. Torvalds eventually loosened the restrictions and switched to the GPL in February 1992. Torvalds’s release notes for Linux 0.12 say, “The Linux copyright will change: I’ve had a couple of requests [pg 339] to make it compatible with the GNU copyleft, removing the ‘you may not distribute it for money’ condition. I agree. I propose that the copyright be changed so that it conforms to GNU—pending approval of the persons who have helped write code. I assume this is going to be no problem for anybody: If you have grievances (‘I wrote that code assuming the copyright would stay the same’) mail me. Otherwise The GNU copyleft takes effect as of the first of February. If you do not know the gist of the GNU copyright—read it” (http://www.kernel.org/pub/linux/kernel/Historic/old-versions/RELNOTES-0.12).

The most significant aspect of Torvalds’s initial message, however, is his request: “I’d like to know what features most people would want. Any suggestions are welcome, but I won’t promise I’ll implement them.” Torvalds’s announcement and the subsequent interest it generated clearly reveal the issues of coordination and organization that would come to be a feature of Linux. The reason Torvalds had so many eager contributors to Linux, from the very start, was that he enthusiastically took them off of Tanenbaum’s hands.

Tanenbaum’s role in the story of Linux is usually that of the straw man—a crotchety old computer-science professor who opposes the revolutionary young Torvalds. Tanenbaum did have a certain revolutionary reputation himself, since Minix was used in classrooms around the world and could be installed on IBM PCs (something no other commercial UNIX vendors had achieved), but he was also a natural target for people like Torvalds: the tenured professor espousing the textbook version of an operating system. So, despite the fact that a very large number of people were using or knew of Minix as a UNIX operating system (estimates of comp.os.minix subscribers were at 40,000), Tanenbaum was emphatically not interested in collaboration or collaborative debugging, especially if debugging also meant creating extensions and adding features that would make the system bigger and harder to use as a stripped-down tool for teaching. For Tanenbaum, this point was central: “I’ve been repeatedly offered virtual memory, paging, symbolic links, window systems, and all manner of features. I have usually declined because I am still trying to keep the system simple enough for students to understand. You can put all this stuff in your version, but I won’t [pg 218] put it in mine. I think it is this point which irks the people who say ‘MINIX is not free,’ not the $60.” 271

271.Message-ID: 12667@star.cs.vu.nl.

By contrast, Torvalds’s “fun” project had no goals. Being a cocky nineteen-year-old student with little better to do (no textbooks to write, no students, grants, research projects, or committee meetings), Torvalds was keen to accept all the ready-made help he could find to make his project better. And with 40,000 Minix users, he had a more or less instant set of contributors. Stallman’s audience for EMACS in the early 1980s, by contrast, was limited to about a hundred distinct computers, which may have translated into thousands, but certainly not tens of thousands of users. Tanenbaum’s work in creating a generation of students who not only understood the internals of an operating system but, more specifically, understood the internals of the UNIX operating system created a huge pool of competent and eager UNIX hackers. It was the work of porting UNIX not only to various machines but to a generation of minds as well that set the stage for this event—and this is an essential, though often overlooked, component of the success of Linux.

Many accounts of the Linux story focus on the fight between Torvalds and Tanenbaum, a fight carried out on comp.os.minix with the subject line “Linux is obsolete.” 272 Tanenbaum argued that Torvalds was reinventing the wheel, writing an operating system that, as far as the state of the art was concerned, was now obsolete. Torvalds, by contrast, asserted that it was better to make something quick and dirty that worked, invite contributions, and worry about making it state of the art later. Far from illustrating some kind of outmoded conservatism on Tanenbaum’s part, the debate highlights the distinction between forms of coordination and the meanings of collaboration. For Tanenbaum, the goals of Minix were either pedagogical or academic: to teach operating-system essentials or to explore new possibilities in operating-system design. By this model, Linux could do neither; it couldn’t be used in the classroom because [pg 219] it would quickly become too complex and feature-laden to teach, and it wasn’t pushing the boundaries of research because it was an out-of-date operating system. Torvalds, by contrast, had no goals. What drove his progress was a commitment to fun and to a largely inarticulate notion of what interested him and others, defined at the outset almost entirely against Minix and other free operating systems, like FreeBSD. In this sense, it could only emerge out of the context—which set the constraints on its design—of UNIX, open systems, Minix, GNU, and BSD.

272.Message-ID: 12595@star.cs.vu.nl. Key parts of the controversy were reprinted in DiBona et al., Open Sources.

Both Tanenbaum and Torvalds operated under a model of coordination in which one person was ultimately responsible for the entire project: Tanenbaum oversaw Minix and ensured that it remained true to its goals of serving a pedagogical audience; Torvalds would oversee Linux, but he would incorporate as many different features as users wanted or could contribute. Very quickly—with a pool of 40,000 potential contributors—Torvalds would be in the same position Tanenbaum was in, that is, forced to make decisions about the goals of Linux and about which enhancements would go into it and which would not. What makes the story of Linux so interesting to observers is that it appears that Torvalds made no decision: he accepted almost everything.

Tanenbaum’s goals and plans for Minix were clear and autocratically formed. Control, hierarchy, and restriction are after all appropriate in the classroom. But Torvalds wanted to do more. He wanted to go on learning and to try out alternatives, and with Minix as the only widely available way to do so, his decision to part ways starts to make sense; clearly he was not alone in his desire to explore and extend what he had learned. Nonetheless, Torvalds faced the problem of coordinating a new project and making similar decisions about its direction. On this point, Linux has been the subject of much reflection by both insiders and outsiders. Despite images of Linux as either an anarchic bazaar or an autocratic dictatorship, the reality is more subtle: it includes a hierarchy of contributors, maintainers, and “trusted lieutenants” and a sophisticated, informal, and intuitive sense of “good taste” gained through reading and incorporating the work of co-developers.

While it was possible for Torvalds to remain in charge as an individual for the first few years of Linux (1991-95, roughly), he eventually began to delegate some of that control to people who would make decisions about different subcomponents of the kernel. [pg 220] It was thus possible to incorporate more of the “patches” (pieces of code) contributed by volunteers, by distributing some of the work of evaluating them to people other than Torvalds. This informal hierarchy slowly developed into a formal one, as Steven Weber points out: “The final de facto ‘grant’ of authority came when Torvalds began publicly to reroute relevant submissions to the lieutenants. In 1996 the decision structure became more formal with an explicit differentiation between ‘credited developers’ and ‘maintainers.’ . . . If this sounds very much like a hierarchical decision structure, that is because it is one—albeit one in which participation is strictly voluntary.” 273

273.Steven Weber, The Success of Open Source, 164.
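The “patches” at issue in this decision structure are plain-text unified diffs: a contributor posts the difference between the tree as it stands and the tree with a proposed change, and a maintainer decides whether to apply it. A minimal, hypothetical example (the file, function, and helper names here are invented for illustration) looks like this:

```diff
--- a/drivers/example.c
+++ b/drivers/example.c
@@ -10,7 +10,7 @@ static int example_init(void)
 {
 	int err;
 
-	err = old_setup();
+	err = new_setup();	/* hypothetical fix */
 	if (err)
 		return err;
 	return 0;
```

A maintainer or “trusted lieutenant” would apply such a patch with the standard `patch` utility, evaluate the result, and forward it up the hierarchy or reject it; distributing that work of evaluation is precisely what the hierarchy described above made possible.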

By 1995-96, Torvalds and his lieutenants faced considerable challenges with regard to hierarchy and decision-making, as the project had grown in size and complexity. The first widely remembered response to the ongoing crisis of benevolent dictatorship in Linux was the creation of “loadable kernel modules,” conceived as a way to release some of the constant pressure to decide which patches would be incorporated into the kernel. The decision to modularize [pg 221] Linux was simultaneously technical and social: the software-code base would be rewritten to allow for external loadable modules to be inserted “on the fly,” rather than all being compiled into one large binary chunk; at the same time, it meant that the responsibility to ensure that the modules worked devolved from Torvalds to the creator of the module. The decision repudiated Torvalds’s early opposition to Tanenbaum in the “monolithic vs. microkernel” debate by inviting contributors to separate core from peripheral functions of an operating system (though the Linux kernel remains monolithic compared to classic microkernels). It also allowed for a significant proliferation of new ideas and related projects. It both contracted and distributed the hierarchy; now Linus was in charge of a tighter project, but more people could work with him according to structured technical and social rules of responsibility.

Creating loadable modules changed the look of Linux, but not because of any planning or design decisions set out in advance. The choice is an example of the privileged adaptability of Linux, resolving the tension between the curiosity and virtuosity of individual contributors to the project and the need for hierarchical control in order to manage complexity. The commitment to adaptability dissolves the distinction between the technical means of coordination and the social means of management. It is about producing a meaningful whole by which both people and code can be coordinated—an achievement vigorously defended by kernel hackers.

The adaptable organization and structure of Linux is often described in evolutionary terms, as something without teleological purpose, but responding to an environment. Indeed, Torvalds himself has a weakness for this kind of explanation.

Let’s just be honest, and admit that it [Linux] wasn’t designed.

And I know better than most that what I envisioned 10 years ago has nothing in common with what Linux is today. There was certainly no premeditated design there. 274

274.Quoted in Zack Brown, “Kernel Traffic #146 for 17Dec2001,” Kernel Traffic, http://www.kerneltraffic.org/kernel-traffic/kt20011217_146.html; also quoted in Federico Iannacci, “The Linux Managing Model,” http://opensource.mit.edu/papers/iannacci2.pdf.

Adaptability does not answer the questions of intelligent design. Why, for example, does a car have four wheels and two headlights? Often these discussions are polarized: either technical objects are designed, or they are the result of random mutations. What this opposition overlooks is the fact that design and the coordination of collaboration go hand in hand; one reveals the limits and possibilities of the other. Linux represents a particular example of such a problematic—one that has become the paradigmatic case of Free Software—but there have been many others, including UNIX, for which the engineers created a system that reflected the distributed collaboration of users around the world even as the lawyers tried to make it conform to legal rules about licensing and practical concerns about bookkeeping and support.

Because it privileges adaptability over planning, Linux is a recursive public: at once an operating system and a social system. It privileges openness to new directions, at every level. It privileges the right to propose changes by actually creating them and trying to convince others to use and incorporate them. It privileges the right to fork the software into new and different kinds of systems. Given what it privileges, Linux ends up evolving differently than do systems whose life and design are constrained by corporate organization, or by strict engineering design principles, or by legal or marketing definitions of products—in short, by clear goals. What makes this distinction between the goal-oriented design principle and the principle of adaptability important is its relationship to politics. Goals and planning are the subject of negotiation and consensus, or of autocratic decision-making; adaptability is the province of critique. It should be remembered that Linux is by no means an attempt to create something radically new; it is a rewrite of a UNIX operating system, as Torvalds points out, but one that through adaptation can end up becoming something new.

The Apache Web server and the Apache Group (now called the Apache Software Foundation) provide a second illuminating example of the how and why of coordination in Free Software of the 1990s. As with the case of Linux, the development of the Apache project illustrates how adaptability is privileged over planning [pg 223] and, in particular, how this privileging is intended to resolve the tensions between individual curiosity and virtuosity and collective control and decision-making. It is also the story of the progressive evolution of coordination, the simultaneously technical and social mechanisms of coordinating people and code, patches and votes.

Harthill’s injunction to collaborate seems surprising in the context of a mailing list and project created to facilitate collaboration, but the injunction is specific: collaborate by making plans and sharing goals. Implicit in his words is the tension between a project with clear plans and goals, an overarching design to which everyone contributes, and a group platform without clear goals that provides individuals with a setting to try out alternatives. Implicit in his words is the spectrum between debugging an existing piece of software with a stable identity and rewriting the fundamental aspects of it to make it something new. The meaning of collaboration bifurcates here: on the one hand, the privileging of the autonomous work of individuals which is submitted to a group peer review and then incorporated; on the other, the privileging of a set of shared goals to which the actions and labor of individuals are subordinated. 292

292.Gabriella Coleman captures this nicely in her discussion of the tension between the individual virtuosity of the hacker and the corporate populism of groups like Apache or, in her example, the Debian distribution of Linux. See Coleman, The Social Construction of Freedom.

The technical and social forms that Linux and Apache take are enabled by the tools they build and use, from bug-tracking tools and mailing lists to the Web servers and kernels themselves. One such tool plays a very special role in the emergence of these organizations: Source Code Management systems (SCMs). SCMs are tools for coordinating people and code; they allow multiple people in dispersed locales to work simultaneously on the same object, the same source code, without the need for a central coordinating overseer and without the risk of stepping on each other’s toes. The history of SCMs—especially in the case of Linux—also illustrates the recursive-depth problem: namely, is Free Software still free if it is created with non-free tools?

Both the Apache project and the Linux kernel project use SCMs. In the case of Apache the original patch-and-vote system quickly began to strain the patience, time, and energy of participants as the number of contributors and patches began to grow. From the very beginning of the project, the contributor Paul Richards had urged the group to make use of cvs. He had extensive experience with the system in the FreeBSD project and was convinced that it provided a superior alternative to the patch-and-vote system. Few other contributors had much experience with it, however, so it wasn’t until over a year after Richards began his admonitions that cvs was eventually adopted. However, cvs is not a simple replacement for a patch-and-vote system; it necessitates a different kind of organization. Richards recognized the trade-off. The patch-and-vote system created a very high level of quality assurance and peer review of the patches that people submitted, while the cvs system allowed individuals to make more changes that might not meet the same level of quality assurance. The cvs system allowed branches—stable, testing, experimental—with different levels of quality assurance, while the patch-and-vote system was inherently directed at one final and stable version. As the case of Shambhala [pg 232] exhibited, under the patch-and-vote system experimental versions would remain unofficial garage projects, rather than serve as official branches with people responsible for committing changes.

The Linux kernel has also struggled with various issues surrounding SCMs and the management of responsibility they imply. The story of the so-called VGER tree and the creation of a new SCM called Bitkeeper is exemplary in this respect. 296 By 1997, Linux developers had begun to use cvs to manage changes to the source code, though not without resistance. Torvalds was still in charge of the changes to the official stable tree, but as other “lieutenants” came on board, the complexity of the changes to the kernel grew. One such lieutenant was Dave Miller, who maintained a “mirror” of the stable Linux kernel tree, the VGER tree, on a server at Rutgers. In September 1998 a fight broke out among Linux kernel developers over two related issues: one, the fact that Torvalds was failing to incorporate (patch) contributions that had been forwarded to him by various people, including his lieutenants; and two, as a result, the VGER cvs repository was no longer in synch with the stable tree maintained by Torvalds. Two different versions of Linux threatened to emerge.

296.See Steven Weber, The Success of Open Source, 117-19; Moody, Rebel Code, 172-78. See also Shaikh and Cornford, “Version Management Tools.”

A great deal of yelling ensued, as nicely captured in Moody’s Rebel Code, culminating in the famous phrase, uttered by Larry McVoy: “Linus does not scale.” The meaning of this phrase is that the ability of Linux to grow into an ever larger project with increasing complexity, one which can handle myriad uses and functions (to “scale” up), is constrained by the fact that there is only one Linus Torvalds. By all accounts, Linus was and is excellent at what he does—but there is only one Linus. The danger of this situation is the danger of a fork. A fork would mean one or more new versions would proliferate under new leadership, a situation much like [pg 233] the spread of UNIX. Both the licenses and the SCMs are designed to facilitate this, but only as a last resort. Forking also implies dilution and confusion—competing versions of the same thing and potentially unmanageable incompatibilities.

McVoy was well-known in geek circles before Linux. In the late stages of the open-systems era, as an employee of Sun, he had penned an important document called “The Sourceware Operating System Proposal.” It was an internal Sun Microsystems document that argued for the company to make its version of UNIX freely available. It was a last-ditch effort to save the dream of open systems. It was also the first such proposition within a company to “go open source,” much like the documents that would urge Netscape to Open Source its software in 1998. Despite this early commitment, McVoy chose not to create Bitkeeper as a Free Software project, but to make it quasi-proprietary, a decision that raised a very central question in ideological terms: can one, or should one, create Free Software using non-free tools?

On one side of this controversy, naturally, were Richard Stallman and those sharing his vision of Free Software. On the other were pragmatists like Torvalds claiming no goals and no commitment to “ideology”—only a commitment to “fun.” The tension laid bare the way in which recursive publics negotiate and modulate the core components of Free Software from within. Torvalds made a very strong and vocal statement concerning this issue, responding to Stallman’s criticisms about the use of non-free software to create Free Software: “Quite frankly, I don’t _want_ people using Linux for ideological reasons. I think ideology sucks. This world would be a much better place if people had less ideology, and a whole lot more ‘I do this because it’s FUN and because others might find it useful, not because I got religion.’” 297

297.Linus Torvalds, “Re: [PATCH] Remove Bitkeeper Documentation from Linux Tree,” 20 April 2002, http://www.uwsg.indiana.edu/hypermail/linux/kernel/0204.2/1018.html. Quoted in Shaikh and Cornford, “Version Management Tools.”

Torvalds emphasizes pragmatism in terms of coordination: the right tool for the job is the right tool for the job. In terms of licenses, [pg 234] however, such pragmatism does not play, and Torvalds has always been strongly committed to the GPL, refusing to let non-GPL software into the kernel. This strategic pragmatism is in fact a recognition of where experimental changes might be proposed, and where practices are settled. The GPL was a stable document, sharing source code widely was a stable practice, but coordinating a project using SCMs was, during this period, still in flux, and thus Bitkeeper was a tool well worth using so long as it remained suitable to Linux development. Torvalds was experimenting with the meaning of coordination: could a non-free tool be used to create Free Software?

McVoy, on the other hand, was on thin ice. He was experimenting with the meaning of Free Software licenses. He created three separate licenses for Bitkeeper in an attempt to play both sides: a commercial license for paying customers, a license for people who sell Bitkeeper, and a license for “free users.” The free-user license allowed Linux developers to use the software for free—though it required them to use the latest version—and prohibited them from working on a competing project at the same time. McVoy’s attempt to have his cake and eat it, too, created enormous tension in the developer community, a tension that built from 2002, when Torvalds began using Bitkeeper in earnest, to 2005, when he announced he would stop.

The developer Andrew Tridgell, well known for his work on a project called Samba and his reverse engineering of a Microsoft networking protocol, began a project to reverse engineer Bitkeeper by looking at the metadata it produced in the course of being used for the Linux project. By doing so, he crossed a line set up by McVoy’s experimental licensing arrangement: the “free as long as you don’t copy me” license. Lawyers advised Tridgell to stay silent on the topic while Torvalds publicly berated him for “willful destruction” and a moral lapse of character in trying to reverse engineer Bitkeeper. Bruce Perens defended Tridgell and censured Torvalds for his seemingly contradictory ethics. 298 McVoy never sued Tridgell, and Bitkeeper has limped along as a commercial project, because, [pg 235] much like the EMACS controversy of 1985, the Bitkeeper controversy of 2005 ended with Torvalds simply deciding to create his own SCM, called git.

298.Andrew Orlowski, “‘Cool it, Linus’—Bruce Perens,” Register, 15 April 2005, http://www.theregister.co.uk/2005/04/15/perens_on_torvalds/page2.html.
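The decentralized model that Bitkeeper pioneered and git generalized can be sketched in a few commands: every contributor clones the full history, commits locally, and changes move between repositories by merging, rather than through a single central overseer. The repository layout, file, and names below are invented for illustration:

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"

# An "upstream" repository, standing in for the mainline kernel tree.
git init -q upstream
cd upstream
git config user.email "maintainer@example.org"
git config user.name "Maintainer"
echo "hello" > main.c
git add main.c
git commit -qm "initial import"

# A contributor clones the entire history, not just a working copy.
cd "$tmp"
git clone -q upstream contributor
cd contributor
git config user.email "contributor@example.org"
git config user.name "Contributor"
echo "patched" >> main.c
git commit -qam "extend main.c"

# The maintainer pulls (merges) the contributor's work back in.
cd "$tmp/upstream"
git pull -q "$tmp/contributor" 2>/dev/null
git log --oneline          # both commits now in the mainline history
```

Each clone is a complete, equal copy of the project’s history, which is also what makes the “last resort” of forking technically trivial: a fork is just a clone whose changes are never merged back.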

The story of the VGER tree and Bitkeeper illustrates common tensions within recursive publics, specifically, the depth of the meaning of free. On the one hand, there is Linux itself, an exemplary Free Software project made freely available; on the other hand, however, there is the ability to contribute to this process, a process that is potentially constrained by the use of Bitkeeper. So long as the function of Bitkeeper is completely circumscribed—that is, completely planned—there can be no problem. However, the moment one user sees a way to change or improve the process, and not just the kernel itself, then the restrictions and constraints of Bitkeeper can come into play. While it is not clear that Bitkeeper actually prevented anything, developers clearly recognized it as a potential drag on a generalized commitment to adaptability. Or to put it in terms of recursive publics, only one layer is properly open, that of the kernel itself; the layer beneath it, the process of its construction, is not free in the same sense. It is ironic that Torvalds—otherwise the spokesperson for antiplanning and adaptability—willingly adopted this form of constraint, but not at all surprising that it was collectively rejected.

Novelty, both in the case of Linux and in intellectual property law more generally, is directly related to the interplay of social and technical coordination: goal direction vs. adaptability. The ideal of adaptability promoted by Torvalds suggests a radical alternative to the dominant ideology of creation embedded in contemporary intellectual-property systems. If Linux is “new,” it is new through adaptation and the coordination of large numbers of creative contributors who challenge the “design” of an operating system from the bottom up, not from the top down. By contrast, McVoy represents a moral imagination of design in which it is impossible to achieve novelty without extremely expensive investment in top-down, goal-directed, unpolitical design—and it is this activity that the intellectual-property system is designed to reward. Both are engaged, however, in an experiment; both are engaged in “figuring out” what the limits of Free Software are.

Coordination is a key component of Free Software, and is frequently identified as the central component. Free Software is the result of a complicated story of experimentation and construction, and the forms that coordination takes in Free Software are specific outcomes of this longer story. Apache and Linux are both experiments—not scientific experiments per se but collective social experiments in which there are complex technologies and legal tools, systems of coordination and governance, and moral and technical orders already present.

Free Software is an experimental system, a practice that changes with the results of new experiments. The privileging of adaptability makes it a peculiar kind of experiment, however, one not directed by goals, plans, or hierarchical control, but more like what John Dewey suggested throughout his work: the experimental praxis of science extended to the social organization of governance in the service of improving the conditions of freedom. What gives this experimentation significance is the centrality of Free Software—and specifically of Linux and Apache—to the experimental expansion of the Internet. As an infrastructure or a milieu, the Internet is changing the conditions of social organization, changing the relationship of knowledge to power, and changing the orientation of collective life toward governance. Free Software is, arguably, the best example of an attempt to make this transformation public, to ensure that it uses the advantages of adaptability as critique to counter the power of planning as control. Free Software, as a recursive public, proceeds by proposing and providing alternatives. It is a bit like Kant’s version of enlightenment: insofar as geeks speak (or hack) as scholars, in a public realm, they have a right to propose criticisms and changes of any sort; as soon as they relinquish [pg 240] that commitment, they become private employees or servants of the sovereign, bound by conscience and power to carry out the duties of their given office. The constitution of a public realm is not a universal activity, however, but a historically specific one: Free Software confronts the specific contemporary technical and legal infrastructure by which it is possible to propose criticisms and offer alternatives. What results is a recursive public filled not only with individuals who govern their own actions but also with code and concepts and licenses and forms of coordination that turn these actions into viable, concrete technical forms of life useful to inhabitants of the present.

At about the same time as his idea for a textbook, Rich’s research group was switching over to Linux, and Rich was first learning about Open Source and the emergence of a fully free operating system created entirely by volunteers. It isn’t clear what Rich’s aha! moment was, other than the simple realization that such a thing as Linux was actually possible. Nonetheless, at some point, Rich had the idea that his textbook could be an Open Source textbook, that is, a textbook created not just by him, but by DSP researchers all over the world, and made available to everyone to make use of and modify and improve as they saw fit, just like Linux. Together with Brent Hendricks, Yan David Erlich, [pg 249] and Ross Reedstrom, all of whom, as geeks, had a deep familiarity with the history and practices of Free and Open Source Software, Rich started to conceptualize a system; they started to think about modulations of different components of Free and Open Source Software. The idea of a Free Software textbook repository slowly took shape.

The modulated meaning of source code creates all kinds of new questions—specifically with respect to the other four components. In terms of openness, for instance, Connexions modulates this component very little; most of the actors involved are devoted to the ideals of open systems and open standards, insofar as it is a Free Software project of a conventional type. It builds on UNIX (Linux) and the Internet, and the project leaders maintain a nearly fanatical devotion to openness at every level: applications, programming languages, standards, protocols, mark-up languages, interface tools. Wherever an open (as opposed to a [pg 256] proprietary) solution exists, that choice trumps all others (with one noteworthy exception). 310 James Boyle recently stated it well: “Wherever possible, design the system to run with open content, on open protocols, to be potentially available to the largest possible number of users, and to accept the widest possible range of experimental modifications from users who can themselves determine the development of the technology.” 311

310. The most significant exception has been the issue of tools for authoring content in XML. For most of the life of the Connexions project, the XML mark-up language has been well-defined and clear, but there has been no way to write a module in XML, short of directly writing the text and the tags in a text editor. For all but a very small number of possible users, this feels too much like programming, and they experience it as too frustrating to be worth it. The solution (albeit temporary) was to encourage users to make use of a proprietary XML editor (like a word processor, but capable of creating XML content). Indeed, the Connexions project’s devotion to openness was tested by one of the most important decisions its participants made: to pursue the creation of an Open Source XML text editor in order to provide access to completely open tools for creating completely open content.

311. Boyle, “Mertonianism Unbound,” 14.

Perhaps unsurprisingly, the Connexions team spent a great deal of time at the outset of the project creating a pdf-document-creation system that would essentially mimic the creation of a conventional textbook, with the push of a button. 333 But even this process causes a subtle transformation: the concept of “edition” becomes much harder to track. While a conventional textbook is a stable entity that goes through a series of printings and editions, each of which is marked on its publication page, a Connexions document can go through as many versions as its author makes changes, all without necessarily changing editions. In this respect, the modulation of the concept of source code translates the practices of updating and “versioning” into the realm of textbook writing. Recall the cases ranging from the “continuum” of UNIX versions discussed by Ken Thompson to the complex struggles over version control in the Linux and Apache projects. In the case of writing source code, exactitude demands that the change of even a single character be tracked and labeled as a version change, whereas a [pg 278] conventional-textbook spelling correction or errata issuance would hardly create the need for a new edition.

333. Conventional here is actually quite historically proximate: the system creates a pdf document by translating the XML document into a LaTeX document, then into a pdf document. LaTeX has been, for some twenty years, a standard text-formatting and typesetting language used by some [pg 345] sectors of the publishing industry (notably mathematics, engineering, and computer science). Were it not for the existence of this standard from which to bootstrap, the Connexions project would have faced a considerably more difficult challenge, but much of the infrastructure of publishing has already been partially transformed into a computer-mediated and -controlled system whose final output is a printed book. Later in Connexions’s lifetime, the group coordinated with an Internet-publishing startup called Qoop.com to take the final step and make Connexions courses available as print-on-demand, cloth-bound textbooks, complete with ISBNs and back-cover blurbs.

In the case of shared software source code, one of the principal reasons for sharing it was to reuse it: to build on it, to link to it, to employ it in ways that made building more complex objects into an easier task. The very design philosophy of UNIX well articulates the necessity of modularity and reuse, and the idea is no less powerful in other areas, such as textbooks. But just as the reuse of software is not simply a feature of software’s technical characteristics, the idea of “reusing” scholarly materials implies all kinds of questions that are not simply questions of recombining texts. The ability to share source code—and the ability to create complex software based on it—requires modulations of both the legal meaning of software, as in the case of EMACS, and the organizational form, as in the [pg 290] emergence of Free Software projects other than the Free Software Foundation (the Linux kernel, Perl, Apache, etc.).

The Creative Commons licenses allow authors to grant the use of their work in about a dozen different ways—that is, the license itself comes in versions. One can, for instance, require attribution, prohibit commercial exploitation, allow derivative or modified works to be made and circulated, or some combination of all these. These different combinations actually create different licenses, each of which grants intellectual-property rights under slightly different conditions. For example, say Marshall Sahlins decides to write a paper about how the Internet is cultural; he copyrights the paper (“© 2004 Marshall Sahlins”), he requires that any use of it or any copies of it maintain the copyright notice and the attribution of [pg 295] authorship (these can be different), and he furthermore allows for commercial use of the paper. It would then be legal for a publishing house to take the paper off Sahlins’s Linux-based Web server and publish it in a collection without having to ask permission, as long as the paper remains unchanged and he is clearly and unambiguously listed as author of the paper. The publishing house would not get any rights to the work, and Sahlins would not get any royalties. If he had specified noncommercial use, the publisher would instead have needed to contact him and arrange for a separate license (Creative Commons licenses are nonexclusive), under which he could demand some share of revenue and his name on the cover of the book. 345 But say he was, instead, a young scholar seeking only peer recognition and approbation—then royalties would be secondary to maximum circulation. Creative Commons allows authors to assert, as its members put it, “some rights reserved” or even “no rights reserved.”

345. In December 2006 Creative Commons announced a set of licenses that facilitate the “follow up” licensing of a work, especially one initially issued under a noncommercial license.

Yochai Benkler, “Coase’s Penguin, or Linux and the Nature of the Firm,” Yale Law Journal 112.3 (2002): 369-446.

Mike Gancarz, Linux and the UNIX Philosophy (Boston: Digital Press, 2003).

Glyn Moody, Rebel Code: Inside Linux and the Open Source Revolution (Cambridge, Mass.: Perseus, 2001).

Matt Ratto, “The Pressure of Openness: The Hybrid Work of Linux Free/Open Source Kernel Developers” (San Diego, 2003).

Eric S. Raymond, The Cathedral and the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary (Sebastopol, Calif.: O’Reilly Press, 2001), 79-135.

Maha Shaikh and Tony Cornford, “Version Management Tools: CVS to BK in the Linux Kernel” (Portland, Oregon, 3 May 2003).

Peter Wayner, Free for All: How LINUX and the Free Software Movement Undercut the High-Tech Titans (New York: Harper Business, 2000).
