1. A Note on Terminology: There is still debate about how to refer to Free Software, which is also known as Open Source Software. The scholarly community has adopted either FOSS or FLOSS (or F/LOSS): the former stands for the Anglo-American Free and Open Source Software; the latter stands for the continental Free, Libre and Open Source Software. Two Bits sticks to the simple term Free Software to refer to all of these things, except where it is specifically necessary to differentiate two or more names, or to specify people or events so named. The reason is primarily aesthetic and political, but Free Software is also the older term, as well as the one that includes issues of moral and social order. I explain in chapter 3 why there are two terms.
2. Michael M. J. Fischer, “Culture and Cultural Analysis as Experimental Systems.”
3. So, for instance, when a professional society founded on charters and ideals for membership and qualification speaks as a public, it represents its members, as when the American Medical Association argues for or against changes to Medicare. However, if a new group—say, of nurses—seeks not only to participate in this discussion—which may be possible, even welcomed—but to change the structure of representation in order to give themselves status equal to doctors, this change is impossible, for it goes against the very aims and principles of the society. Indeed, the nurses will be urged to form their own society, not to join that of the doctors, a proposition which gives the lie to the existing structures of power. By contrast, a public is an entity that is less controlled and hence more agonistic, such that nurses might join, speak, and insist on changing the terms of debate, just as patients, scientists, or homeless people might. Their success, however, depends entirely on the force with which their actions transform the focus and terms of the public. Concepts of the public sphere have been roundly critiqued in the last twenty years for presuming that such “equality of access” is sufficient to achieve representation, when in fact other contextual factors (race, class, sex) inherently weight the representative power of different participants. But these are two different and overlapping problems: one cannot solve the problem of pernicious, invisible forms of inequality unless one first solves the problem of ensuring a certain kind of structural publicity. It is precisely the focus on maintaining publicity for a recursive public, over against massive and powerful corporate and governmental attempts to restrict it, that I locate as the central struggle of Free Software. 
Gender certainly influences who gets heard within Free Software, for example, but it is a mistake to focus on this inequality at the expense of the larger, more threatening form of political failure that Free Software addresses. And I think there are plenty of geeks—man, woman and animal—who share this sentiment.
4. Wikipedia is perhaps the most widely known and generally familiar example of what this book is about. Even though it is not identified as such, it is in fact a Free Software project and a “modulation” of Free Software as I describe it here. The non-technically inclined reader might keep Wikipedia in mind as an example with which to follow the argument of this book. I will return to it explicitly in part 3. However, for better or for worse, there will be no discussion of pornography.
5. Although the term public clearly suggests private as its opposite, Free Software is not anticommercial. A very large amount of money, both real and notional, is involved in the creation of Free Software. The term recursive market could also be used, in order to emphasize the importance (especially during the 1990s) of the economic features of the practice. The point is not to test whether Free Software is a “public” or a “market,” but to construct a concept adequate to the practices that constitute it.
6. See, for example, Warner, Publics and Counterpublics, 67-74.
7. Habermas, The Structural Transformation of the Public Sphere, esp. 27-43.
8. Critiques of the demand for availability and the putatively inherent superiority of transparency include Coombe and Herman, “Rhetorical Virtues” and “Your Second Life?”; Christen, “Gone Digital”; and Anderson and Bowrey, “The Imaginary Politics of Access to Knowledge.”
9. This description of Free Software could also be called an “assemblage.” The most recent source for this is Rabinow, Anthropos Today. The language of thresholds and intensities is most clearly developed by Manuel DeLanda in A Thousand Years of Nonlinear History and in Intensive Science and Virtual Philosophy. The term problematization, from Rabinow (which he channels from Foucault), is a synonym for the phrase “reorientation of knowledge and power” as I use it here.
10. See Kelty, “Culture’s Open Sources.”
11. The genealogy of the term commons has a number of sources. An obvious source is Garrett Hardin’s famous 1968 article “The Tragedy of the Commons.” James Boyle has done more than anyone to specify the term, especially during a 2001 conference on the public domain, which included the inspired guest-list juxtaposition of the appropriation-happy musical collective Negativland and the dame of “commons” studies, Elinor Ostrom, whose book Governing the Commons has served as a certain inspiration for thinking about commons versus public domains. Boyle, for his part, has ceaselessly pushed the “environmental” metaphor of speaking for the public domain as environmentalists of the 1960s and 1970s spoke for the environment (see Boyle, “The Second Enclosure Movement and the Construction of the Public Domain” and “A Politics of Intellectual Property”). The term commons is useful in this context precisely because it distinguishes the “public domain” as an imagined object of pure public transaction and coordination, as opposed to a “commons,” which can consist of privately owned things/spaces that are managed in such a fashion that they effectively function like a “public domain” is imagined to (see Boyle, “The Public Domain”; Hess and Ostrom, Understanding Knowledge as a Commons).
12. Marcus and Fischer, Anthropology as Cultural Critique; Marcus and Clifford, Writing Culture; Fischer, Emergent Forms of Life and the Anthropological Voice; Marcus, Ethnography through Thick and Thin; Rabinow, Essays on the Anthropology of Reason and Anthropos Today.
13. The language of “figuring out” has its immediate source in the work of Kim Fortun, “Figuring Out Ethnography.” Fortun’s work refines two other sources, the work of Bruno Latour in Science in Action and that of Hans-Jörg Rheinberger in Toward a History of Epistemic Things. Latour describes the difference between “science made” and “science in the making” and how the careful analysis of new objects can reveal how they come to be. Rheinberger extends this approach through analysis of the detailed practices involved in figuring out a new object or a new process—practices which participants cannot quite name or explain in precise terms until after the fact.
14. Raymond, The Cathedral and the Bazaar.
15. The literature on “virtual communities,” “online communities,” the culture of hackers and geeks, or the social study of information technology offers important background information, although it is not the subject of this book. A comprehensive review of work in anthropology and related disciplines is Wilson and Peterson, “The Anthropology of Online Communities.” Other touchstones are Miller and Slater, The Internet; Carla Freeman, High Tech and High Heels in the Global Economy; Hine, Virtual Ethnography; Kling, Computerization and Controversy; Star, The Cultures of Computing; Castells, The Rise of the Network Society; Boczkowski, Digitizing the News. Most social-science work in information technology has dealt with questions of inequality and the so-called digital divide, an excellent overview being DiMaggio et al., “From Unequal Access to Differentiated Use.” Beyond works in anthropology and science studies, a number of works from various other disciplines have recently taken up similar themes, especially Adrian MacKenzie, Cutting Code; Galloway, Protocol; Hui Kyong Chun, Control and Freedom; and Liu, Laws of Cool. By contrast, if social-science studies of information technology are set against a background of historical and ethnographic studies of “figuring out” problems of specific information technologies, software, or networks, then the literature is sparse. Examples of anthropology and science studies of figuring out include Barry, Political Machines; Hayden, When Nature Goes Public; and Fortun, Advocating Bhopal. Matt Ratto has also portrayed this activity in Free Software in his dissertation, “The Pressure of Openness.”
16. In addition to Abbate and Salus, see Norberg and O’Neill, Transforming Computer Technology; Naughton, A Brief History of the Future; Hafner, Where Wizards Stay Up Late; Waldrop, The Dream Machine; Segaller, Nerds 2.0.1. For a classic autodocumentation of one aspect of the Internet, see Hauben and Hauben, Netizens.
17. Kelty, “Culture’s Open Sources”; Coleman, “The Social Construction of Freedom”; Ratto, “The Pressure of Openness”; Joseph Feller et al., Perspectives on Free and Open Source Software; see also http://freesoftware.mit.edu/, organized by Karim Lakhani, which is a large collection of work on Free Software projects. Early work in this area derived both from the writings of practitioners such as Raymond and from business and management scholars who noticed in Free Software a remarkable, surprising set of seeming contradictions. The best of these works to date is Steven Weber, The Success of Open Source. Weber’s conclusions are similar to those presented here, and he has a kind of cryptoethnographic familiarity (that he does not explicitly avow) with the actors and practices. Yochai Benkler’s Wealth of Networks extends and generalizes some of Weber’s argument.
18. Max Weber, “Objectivity in the Social Sciences and Social Policy,” 68.
19. Despite what might sound like a “shoot first, ask questions later” approach, the design of this project was in fact conducted according to specific methodologies. The most salient is actor-network theory: Latour, Science in Action; Law, “Technology and Heterogeneous Engineering”; Callon, “Some Elements of a Sociology of Translation”; Latour, Pandora’s Hope; Latour, Reassembling the Social; Callon, The Laws of the Markets; Law and Hassard, Actor Network Theory and After. Ironically, there have been no actor-network studies of networks, which is to say, of particular information and communication technologies such as the Internet. The confusion of the word network (as an analytical and methodological term) with that of network (as a particular configuration of wires, waves, software, and chips, or of people, roads, and buses, or of databases, names, and diseases) means that it is necessary to always distinguish this-network-here from any-network-whatsoever. My approach shares much with the ontological questions raised in works such as Law, Aircraft Stories; Mol, The Body Multiple; Cussins, “Ontological Choreography”; Charis Thompson, Making Parents; and Dumit, Picturing Personhood.
20. I understand a concern with scientific infrastructure to begin with Steven Shapin and Simon Schaffer in Leviathan and the Air-Pump, but the genealogy is no doubt more complex. It includes Shapin, The Social History of Truth; Biagioli, Galileo, Courtier; Galison, How Experiments End and Image and Logic; Daston, Biographies of Scientific Objects; Johns, The Nature of the Book. A whole range of works explore the issue of scientific tools and infrastructure: Kohler, Lords of the Fly; Rheinberger, Toward a History of Epistemic Things; Landecker, Culturing Life; Keating and Cambrosio, Biomedical Platforms. Bruno Latour’s “What Rules of Method for the New Socio-scientific Experiments” provides one example of where science studies might go with these questions. Important texts on the subject of technical infrastructures include Walsh and Bayma, “Computer Networks and Scientific Work”; Bowker and Star, Sorting Things Out; Edwards, The Closed World; Misa, Brey, and Feenberg, Modernity and Technology; Star and Ruhleder, “Steps Towards an Ecology of Infrastructure.”
21. Dreyfus, On the Internet; Dean, “Why the Net Is Not a Public Sphere.”
22. In addition, see Lippmann, The Phantom Public; Calhoun, Habermas and the Public Sphere; Latour and Weibel, Making Things Public. The debate about social imaginaries begins alternately with Benedict Anderson’s Imagined Communities or with Cornelius Castoriadis’s The Imaginary Institution of Society; see also Chatterjee, “A Response to Taylor’s ‘Modes of Civil Society’”; Gaonkar, “Toward New Imaginaries”; Charles Taylor, “Modes of Civil Society” and Sources of the Self.
23. For the canonical story, see Levy, Hackers. Hack referred to (and still does) a clever use of technology, usually unintended by the maker, to achieve some task in an elegant manner. The term has been successfully redefined by the mass media to refer to computer users who break into and commit criminal acts on corporate or government or personal computers connected to a network. Many self-identified hackers insist that the criminal element be referred to as crackers (see, in particular, the entries on “Hackers,” “Geeks” and “Crackers” in The Jargon File, http://www.catb.org/~esr/jargon/, also published as Raymond, The New Hacker’s Dictionary). On the subject of definitions and the cultural and ethical characteristics of hackers, see Coleman, “The Social Construction of Freedom,” chap. 2.
24. One example of the usage of geek is in Star, The Cultures of Computing. Various denunciations (e.g., Barbrook and Cameron, “The Californian Ideology”; Borsook, Technolibertarianism) tend to focus on journalistic accounts of an ideology that has little to do with what hackers, geeks, and entrepreneurs actually make. A more relevant categorical distinction than that between hackers and geeks is that between geeks and technocrats; in the case of technocrats, the “anthropology of technocracy” is proposed as the study of the limits of technical rationality, in particular the forms through which “planning” creates “gaps in the form that serve as ‘targets of intervention’” (Riles, “Real Time,” 393). Riles’s “technocrats” are certainly not the “geeks” I portray here (or at least, if they are, it is only in their frustrating day jobs). Geeks do have libertarian, specifically Hayekian or Feyerabendian leanings, but are more likely to see technical failures not as failures of planning, but as bugs, inefficiencies, or occasionally as the products of human hubris or stupidity that is born of a faith in planning.
26. Geeks are also identified often by the playfulness and agility with which they manipulate these labels and characterizations. See Michael M. J. Fischer, “Worlding Cyberspace” for an example.
27. Taylor, Modern Social Imaginaries, 86.
28. On the subject of imagined communities and the role of information technologies in imagined networks, see Green, Harvey, and Knox, “Scales of Place and Networks”; and Flichy, The Internet Imaginaire.
29. Taylor, Modern Social Imaginaries, 32.
30. Ibid., 33-48. Taylor’s history of the transition from feudal nobility to civil society to the rise of republican democracies (however incomplete) is comparable to Foucault’s history of the birth of biopolitics, in La naissance de la biopolitique (The Birth of Biopolitics), as an attempt to historicize governance with respect to its theories and systems, as well as within the material forms it takes.
31. Ricoeur, Lectures on Ideology and Utopia, 2.
32. Geertz, “Ideology as a Cultural System”; Mannheim, Ideology and Utopia. Both, of course, also signal the origin of the scientific use of the term proximately with Karl Marx’s The German Ideology and more distantly in the Enlightenment writings of Destutt de Tracy.
33. Geertz, “Ideology as a Cultural System,” 195.
34. Ibid., 208-13.
35. The depth and extent of this issue are obviously huge. Ricoeur’s Lectures on Ideology and Utopia is an excellent analysis of the problem of ideology prior to 1975. Terry Eagleton’s books The Ideology of the Aesthetic and Ideology: An Introduction are Marxist explorations that include discussions of hegemony and resistance in the context of artistic and literary theory in the 1980s. Slavoj Žižek creates a Lacanian-inspired algebraic system of analysis that combines Marxism and psychoanalysis in novel ways (see Žižek, Mapping Ideology). There is even an attempt to replace the concept of ideology with a metaphor of “software” and “memes” (see Balkin, Cultural Software). The core of the issue of ideology as a practice (and the vicissitudes of materialism that trouble it) is also at the heart of works by Pierre Bourdieu and his followers (on the relationship of ideology and hegemony, see Laclau and Mouffe, Hegemony and Socialist Strategy). In anthropology, see Comaroff and Comaroff, Ethnography and the Historical Imagination.
36. Ricoeur, Lectures on Ideology and Utopia, 10.
37. Taylor, Modern Social Imaginaries, 23.
38. Ibid., 25.
39. Ibid., 26-27.
40. Ibid., 28.
41. The question of gender plagues the topic of computer culture. The gendering of hackers and geeks and the more general exclusion of women in computing have been widely observed by academics. I can do no more here than direct readers to the increasingly large and sophisticated literature on the topic. See especially Light, “When Computers Were Women”; Turkle, The Second Self and Life on the Screen. With respect to Free Software, see Nafus, Krieger, and Leach, “Patches Don’t Have Gender.” More generally, see Kirkup et al., The Gendered Cyborg; Downey, The Machine in Me; Faulkner, “Dualisms, Hierarchies and Gender in Engineering”; Grint and Gill, The Gender-Technology Relation; Helmreich, Silicon Second Nature; Herring, “Gender and Democracy in Computer-Mediated Communication”; Kendall, “‘Oh No! I’m a NERD!’”; Margolis and Fisher, Unlocking the Clubhouse; Green and Adam, Virtual Gender; P. Hopkins, Sex/Machine; Wajcman, Feminism Confronts Technology and “Reflections on Gender and Technology Studies”; and Fiona Wilson, “Can’t Compute, Won’t Compute.” Also see the novels and stories of Ellen Ullman, including Close to the Machine and The Bug: A Novel.
42. Originally coined by Stewart Brand, the phrase was widely cited after it appeared in Barlow’s 1994 article “The Economy of Ideas.”
43. On the genesis of “virtual communities” and the role of Stewart Brand, see Turner, “Where the Counterculture Met the New Economy.”
44. Warner, “Publics and Counterpublics,” 51.
45. Ibid., 51-52. See also Warner, Publics and Counterpublics, 69.
46. The rest of this message can be found in the Silk-list archives at http://groups.yahoo.com/group/silk-list/message/2869 (accessed 18 August 2006). The reference to “Fling” is to a project now available at http://fling.sourceforge.net/ (accessed 18 August 2006). The full archives of Silk-list can be found at http://groups.yahoo.com/group/silk-list/ and the full archives of the FoRK list can be found at http://www.xent.com/mailman/listinfo/fork/.
47. Vinge, “The Coming Technological Singularity.”
48. Moore’s Law—named for Gordon Moore, former head of Intel—states that the speed and capacity of computer central processing units (CPUs) double every eighteen months, which they have done since roughly 1970. Metcalfe’s Law—named for Robert Metcalfe, inventor of Ethernet—states that the utility of a network is proportional to the square of the number of its users, suggesting that the number of things one can do with a network grows quadratically even as members are added only linearly.
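The contrast between the two growth patterns in this note can be made concrete with a short sketch. This is purely illustrative; the function names and starting values are my own assumptions, not part of the note itself:

```python
def moore_capacity(base: float, months: int) -> float:
    """Capacity after `months`, assuming a doubling every 18 months (Moore's Law)."""
    return base * 2 ** (months / 18)

def metcalfe_utility(users: int) -> int:
    """Utility taken as the square of the number of users (Metcalfe's Law)."""
    return users ** 2

# Capacity doubles once per 18-month period: after 36 months it has quadrupled.
print(moore_capacity(1.0, 36))  # 4.0

# Adding members one at a time (linearly) makes utility grow quadratically:
print([metcalfe_utility(n) for n in range(1, 6)])  # [1, 4, 9, 16, 25]
```

The point of the juxtaposition is that linear growth in one quantity (time, or members) yields disproportionate growth in the other (capacity, or utility).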
49. This quotation from the 1990s is attributed to the Electronic Frontier Foundation cofounder and “cyber-libertarian” John Gilmore. Whether there is any truth to this widespread belief expressed in the statement is not clear. On the one hand, the protocol to which this folklore refers—the general system of “message switching” and, later, “packet switching” invented by Paul Baran at RAND Corporation—does seem to lend itself to robustness (on this history, see Abbate, Inventing the Internet). However, it is not clear that nuclear threats were the only reason such robustness was a design goal; simply to ensure communication in a distributed network was necessary in itself. Nonetheless, the story has great currency as a myth of the nature and structure of the Internet. Paul Edwards suggests that both stories are true (“Infrastructure and Modernity,” 216-20, 225n13).
50. Lessig, Code and Other Laws of Cyberspace. See also Gillespie, “Engineering a Principle” on the related history of the “end to end” design principle.
51. This is constantly repeated on the Internet and attributed to David Clark, but no one really knows where or when he stated it. It appears in a 1997 interview of David Clark by Jonathan Zittrain, the transcript of which is available at http://cyber.law.harvard.edu/jzfallsem//trans/clark/ (accessed 18 August 2006).
52. Ashish “Hash” Gulhati, e-mail to Silk-list mailing list, 9 September 2000, http://groups.yahoo.com/group/silk-list/message/3125.
53. Eugen Leitl, e-mail to Silk-list mailing list, 9 September 2000, http://groups.yahoo.com/group/silk-list/message/3127. Python is a programming language. Mojonation was a very promising peer-to-peer application in 2000 that has since ceased to exist.
54. In particular, this project focuses on the Transmission Control Protocol (TCP), the User Datagram Protocol (UDP), and the Domain Name System (DNS). The first two have remained largely stable over the last thirty years, but the DNS system has been highly politicized (see Mueller, Ruling the Root).
55. On Internet standards, see Schmidt and Werle, Coordinating Technology; Abbate and Kahin, Standards Policy for Information Infrastructure.
56. Foucault, “What Is Enlightenment?” 319.
57. Stephenson, In the Beginning Was the Command Line.
59. The Apple-Microsoft conflict was given memorable expression by Umberto Eco in a widely read piece that compared the Apple user interface to Catholicism and the PC user interface to Protestantism (“La bustina di Minerva,” Espresso, 30 September 1994, back page).
60. One entry on Wikipedia differentiates religious wars from run-of-the-mill “flame wars” as follows: “Whereas a flame war is usually a particular spate of flaming against a non-flamy background, a holy war is a drawn-out disagreement that may last years or even span careers” (“Flaming [Internet],” http://en.wikipedia.org/wiki/Flame_war [accessed 16 January 2006]).
62. Message-ID: email@example.com. It should be noted, in case the reader is unsure how serious this is, that EGCS stood for Experimental/Enhanced GNU Compiler System, not Ecumenical GNU Compiler Society.
63. “Martin Luther, Meet Linus Torvalds,” Salon, 12 November 1998, http://archive.salon.com/21st/feature/1998/11/12feature.html (accessed 5 February 2005).
65. Message-ID: firstname.lastname@example.org. In one very humorous case the comparison is literalized: “Microsoft acquires Catholic Church” (Message-ID: email@example.com).
66. Paul Fusco, “The Gospel According to Joy,” New York Times, 27 March 1988, Sunday Magazine, 28.
67. See, for example, Matheson, The Imaginative World of the Reformation. There is rigorous debate about the relation of print, religion, and capitalism: one locus classicus is Eisenstein’s The Printing Press as an Agent of Change, which was inspired by McLuhan, The Gutenberg Galaxy. See also Ian Green, Print and Protestantism in Early Modern England and The Christian’s ABCs; Chadwick, The Early Reformation on the Continent, chaps. 1-3.
68. Crain, The Story of A, 16-17.
69. Ibid., 20-21.
70. At a populist level, this was captured by John Perry Barlow’s “A Declaration of the Independence of Cyberspace,” http://homes.eff.org/~barlow/Declaration-Final.html.
71. Foucault, “What Is Enlightenment?” 309-10.
72. Ibid., 310.
73. Ibid., 310.
74. Adrian Gropper, interview by author, 28 November 1998.
75. Adrian Gropper, interview by author, 28 November 1998.
76. Sean Doyle, interview by author, 30 March 1999.
77. Feyerabend, Against Method, 215-25.
78. One of the ways Adrian discusses innovation is via the argument of The Innovator’s Dilemma, by the Harvard Business School professor Clayton Christensen. The book describes “sustaining vs. disruptive” technologies as less an issue of how technologies work or what they are made of, and more an issue of how their success and performance are measured. See Adrian Gropper, “The Internet as a Disruptive Technology,” Imaging Economics, December 2001, http://www.imagingeconomics.com/library/200112-10.asp (accessed 19 September 2006).
79. On kinds of civic duty, see Fortun and Fortun, “Scientific Imaginaries and Ethical Plateaus in Contemporary U.S. Toxicology.”
80. There is, in fact, a very specific group of people called transhumanists, about whom I will say very little. I invoke the label here because I think certain aspects of transhumanism are present across the spectrum of engineers, scientists, and geeks.
81. See the World Transhumanist Association, http://transhumanism.org/ (accessed 1 December 2003) or the Extropy Institute, http://www.extropy.org/ (accessed 1 December 2003). See also Doyle, Wetwares, and Battaglia, “For Those Who Are Not Afraid of the Future,” for a sidelong glance.
82. Huxley, New Bottles for New Wine, 13-18.
83. The computer scientist Bill Joy wrote a long piece in Wired warning of the outcomes of research conducted without ethical safeguards and the dangers of eugenics in the past, “Why the Future Doesn’t Need Us,” Wired 8.4 [April 2000], http://www.wired.com/wired/archive/8.04/joy.html (accessed 27 June 2005).
84. Vinge, “The Coming Technological Singularity.”
85. Eugen Leitl, e-mail to Silk-list mailing list, 16 May 2000, http://groups.yahoo.com/group/silk-list/message/2410.
86. Eugen Leitl, e-mail to Silk-list mailing list, 7 August 2000, http://groups.yahoo.com/group/silk-list/message/2932.
87. Friedrich A. Hayek, Law, Legislation and Liberty, 1:20.
88. For instance, Richard Stallman writes, “The Free Software movement and the Open Source movement are like two political camps within the free software community. Radical groups in the 1960s developed a reputation for factionalism: organizations split because of disagreements on details of strategy, and then treated each other as enemies. Or at least, such is the image people have of them, whether or not it was true. The relationship between the Free Software movement and the Open Source movement is just the opposite of that picture. We disagree on the basic principles, but agree more or less on the practical recommendations. So we can and do work together on many specific projects. We don’t think of the Open Source movement as an enemy. The enemy is proprietary software” (“Why ‘Free Software’ Is Better than ‘Open Source,’” GNU’s Not Unix! http://www.gnu.org/philosophy/free-software-for-freedom.html [accessed 9 July 2006]). By contrast, the Open Source Initiative characterizes the relationship as follows: “How is ‘open source’ related to ‘free software’? The Open Source Initiative is a marketing program for free software. It’s a pitch for ‘free software’ because it works, not because it’s the only right thing to do. We’re selling freedom on its merits” (http://www.opensource.org/advocacy/faq.php [accessed 9 July 2006]). There are a large number of definitions of Free Software: canonical definitions include Richard Stallman’s writings on the Free Software Foundation’s Web site, www.fsf.org, including the “Free Software Definition” and “Confusing Words and Phrases that Are Worth Avoiding.” From the Open Source side there is the “Open Source Definition” (http://www.opensource.org/licenses/). Unaffiliated definitions can be found at www.freedomdefined.org.
89. Moody, Rebel Code, 193.
90. Frank Hecker, quoted in Hamerly and Paquin, “Freeing the Source,” 198.
91. See Moody, Rebel Code, chap. 11, for a more detailed version of the story.
92. Bruce Perens, “The Open Source Definition,” 184.
93. Steven Weber, The Success of Open Source.
94. “Netscape Announces Plans to Make Next-Generation Communicator Source Code Available Free on the Net,” Netscape press release, 22 January 1998, http://wp.netscape.com/newsref/pr/newsrelease558.html (accessed 25 Sept 2007).
95. On the history of software development methodologies, see Mahoney, “The Histories of Computing(s)” and “The Roots of Software Engineering.”
96. Especially good descriptions of what this cycle is like can be found in Ullman, Close to the Machine and The Bug.
103. See Hamerly and Paquin, “Freeing the Source.” The story is elegantly related in Moody, Rebel Code, 182-204. Raymond gives Christine Peterson of the Foresight Institute credit for the term open source.
104. From Raymond, The Cathedral and the Bazaar. The changelog is available online only: http://www.catb.org/~esr/writings/cathedral-bazaar/cathedral-bazaar/.
105. Josh McHugh, “For the Love of Hacking,” Forbes, 10 August 1998, 94-100.
106. On social movements—the closest analog, developed long ago—see Gerlach and Hine, People, Power, Change, and Freeman and Johnson, Waves of Protest. However, the Free Software and Open Source Movements do not have “causes” of the kind that conventional movements do, other than the perpetuation of Free and Open Source Software (see Coleman, “Political Agnosticism”; Chan, “Coding Free Software”). Similarly, there is no single development methodology that would cover only Open Source. Advocates of Open Source are all too willing to exclude those individuals or organizations who follow the same “development methodology” but do not use a Free Software license—such as Microsoft’s oft-mocked “shared-source” program. The list of licenses approved by both the Free Software Foundation and the Open Source Initiative is substantially the same. Further, the Debian Free Software Guidelines and the “Open Source Definition” are almost identical (compare http://www.gnu.org/philosophy/license-list.html with http://www.opensource.org/licenses/ [both accessed 30 June 2006]).
107. It is, in the terms of Actor Network Theory, a process of “enrollment” in which participants find ways to rhetorically align—and to disalign—their interests. It does not constitute the substance of their interest, however. See Latour, Science in Action; Callon, “Some Elements of a Sociology of Translation.”
108. Coleman, “Political Agnosticism.”
109. See, respectively, Raymond, The Cathedral and the Bazaar, and Williams, Free as in Freedom.
110. For example, Castells, The Internet Galaxy, and Weber, The Success of Open Source both tell versions of the same story of origins and development.
111. “Sharing” source code is not the only kind of sharing among geeks (e.g., informal sharing to communicate ideas), and UNIX is not the only shared software. Other examples that exhibit this kind of proliferation (e.g., the LISP programming language, the TeX text-formatting system) are as ubiquitous as UNIX today. The inverse of my argument here is that selling produces a different kind of order: many products that existed in much larger numbers than UNIX have since disappeared because they were never ported or forked; they are now part of dead-computer museums and collections, if they have survived at all.
112. The story of UNIX has not been told, and yet it has been told hundreds of thousands of times. Every hacker, programmer, computer scientist, and geek tells a version of UNIX history—a usable past. Thus, the sources for this chapter include these stories, heard and recorded throughout my fieldwork, but also easily accessible in academic work on Free Software, which enthusiastically participates in this potted-history retailing. See, for example, Steven Weber, The Success of Open Source; Castells, The Internet Galaxy; Himanen, The Hacker Ethic; Benkler, The Wealth of Networks. To date there is but one detailed history of UNIX—A Quarter Century of UNIX, by Peter Salus—which I rely on extensively. Matt Ratto’s dissertation, “The Pressure of Openness,” also contains an excellent analytic history of the events told in this chapter.
113. The intersection of UNIX and TCP/IP occurred around 1980 and led to the famous switch from the Network Control Protocol (NCP) to the Transmission Control Protocol/Internet Protocol that occurred on 1 January 1983 (see Salus, Casting the Net).
114. Light, “When Computers Were Women”; Grier, When Computers Were Human.
116. There is a large and growing scholarly history of software: Wexelblat, History of Programming Languages and Bergin and Gibson, History of Programming Languages 2 are collected papers by historians and participants. Key works in history include Campbell-Kelly, From Airline Reservations to Sonic the Hedgehog; Akera and Nebeker, From 0 to 1; Hashagen, Keil-Slawik, and Norberg, History of Computing—Software Issues; Donald A. MacKenzie, Mechanizing Proof. Michael Mahoney has written by far the most about the early history of software; his relevant works include “The Roots of Software Engineering,” “The Structures of Computation,” “In Our Own Image,” and “Finding a History for Software Engineering.” On UNIX in particular, there is shockingly little historical work. Martin Campbell-Kelly and William Aspray devote a mere two pages in their general history Computer. As early as 1978, Ken Thompson and Dennis Ritchie were reflecting on the “history” of UNIX in “The UNIX Time-Sharing System: A Retrospective.” Ritchie maintains a Web site that contains a valuable collection of early documents and his own reminiscences (http://www.cs.bell-labs.com/who/dmr/). Mahoney has also conducted interviews with the main participants in the development of UNIX at Bell Labs. These interviews have not been published anywhere, but are drawn on as background in this chapter (interviews are in Mahoney’s personal files).
116. Turing, “On Computable Numbers.” See also Davis, Engines of Logic, for a basic explanation.
118. See Waldrop, The Dream Machine, 142-47.
119. A large number of editors were created in the 1970s; Richard Stallman’s EMACS and Bill Joy’s vi remain the most well known. Douglas Engelbart is somewhat too handsomely credited with the creation of the interactive computer, but the work of Butler Lampson and Peter Deutsch in Berkeley, as well as that of the Multics team, Ken Thompson, and others on early on-screen editors is surely more substantial in terms of the fundamental ideas and problems of manipulating text files on a screen. This story is largely undocumented, save for in the computer-science literature itself. On Engelbart, see Bardini, Bootstrapping.
120. See Campbell-Kelly, From Airline Reservations to Sonic the Hedgehog.
121. Ibid., 107.
122. Campbell-Kelly and Aspray, Computer, 203-5.
123. Ultimately, the Department of Justice case against IBM used bundling as evidence of monopolistic behavior, in addition to claims about the creation of so-called Plug Compatible Machines, devices that were reverse-engineered by meticulously constructing both the mechanical interface and the software that would communicate with IBM mainframes. See Franklin M. Fischer, Folded, Spindled, and Mutilated; Brock, The Second Information Revolution.
124. The story of this project and the lessons Brooks learned are the subject of one of the most famous software-development handbooks, The Mythical Man-Month, by Frederick Brooks.
125. The computer industry has always relied heavily on trade secret, much less so on patent and copyright. Trade secret also produces its own form of order, access, and circulation, which was carried over into the early software industry as well. See Kidder, The Soul of a New Machine for a classic account of secrecy and competition in the computer industry.
126. On time sharing, see Lee et al., “Project MAC.” Multics makes an appearance in nearly all histories of computing, the best resource by far being Tom van Vleck’s Web site http://www.multicians.org/.
127. Some widely admired technical innovations (many of which were borrowed from Multics) include: the hierarchical file system; the command shell for interacting with the system; the decision to treat everything, including external devices, as the same kind of entity (a file); and the “pipe” operator, which allowed the output of one tool to be “piped” as input to another tool, facilitating the easy creation of complex tasks from simple tools.
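As an illustration only (not drawn from the sources cited here), the logic of the pipe can be sketched in a few lines of Python, with hypothetical filter functions standing in for tools like sort and uniq -c:

```python
# A loose sketch of the UNIX pipe idea: each "tool" consumes lines of
# text and emits lines of text, so simple tools compose into complex
# tasks. The function names below are hypothetical stand-ins.

def sort_lines(lines):
    # analogous to the `sort` tool
    return sorted(lines)

def uniq_count(lines):
    # analogous to `uniq -c`: count duplicate lines
    counts = {}
    for line in lines:
        counts[line] = counts.get(line, 0) + 1
    return [f"{n} {line}" for line, n in counts.items()]

def pipe(data, *tools):
    # analogous to `cat data | sort | uniq -c` in the shell
    for tool in tools:
        data = tool(data)
    return data

print(pipe(["bob", "alice", "alice"], sort_lines, uniq_count))
# → ['2 alice', '1 bob']
```

The point of the design is visible even in this toy: no tool needs to know anything about the others, only about the shared format (lines of text) flowing between them.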
128. Salus, A Quarter Century of UNIX, 33-37.
129. Campbell-Kelly, From Airline Reservations to Sonic the Hedgehog, 143.
130. Ritchie’s Web site contains a copy of a 1974 license (http://cm.bell-labs.com/cm/cs/who/dmr/licenses.html) and a series of ads that exemplify the uneasy positioning of UNIX as a commercial product (http://cm.bell-labs.com/cm/cs/who/dmr/unixad.html). According to Don Libes and Sandy Ressler, “The original licenses were source licenses. . . . [C]ommercial institutions paid fees on the order of $20,000. If you owned more than one machine, you had to buy binary licenses for every additional machine [i.e., you were not allowed to copy the source and install it] you wanted to install UNIX on. They were fairly pricey at $8000, considering you couldn’t resell them. On the other hand, educational institutions could buy source licenses for several hundred dollars—just enough to cover Bell Labs’ administrative overhead and the cost of the tapes” (Life with UNIX, 20-21).
131. According to Salus, this licensing practice was also a direct result of Judge Thomas Meaney’s 1956 antitrust consent decree, which required AT&T to reveal and to license its patents for nominal fees (A Quarter Century of UNIX, 56); see also Brock, The Second Information Revolution, 116-20.
132. Even in computer science, source code was rarely formally shared, and more likely presented in the form of theorems and proofs, or in various idealized higher-level languages such as Donald Knuth’s MIX language for presenting algorithms (Knuth, The Art of Computer Programming). Snippets of actual source code are much more likely to be found in printed form in handbooks, manuals, how-to guides, and other professional publications aimed at training programmers.
133. The simultaneous development of the operating system and the norms for creating, sharing, documenting, and extending it are often referred to as the “UNIX philosophy.” It includes the central idea that one should build on the ideas (software) of others (see Gancarz, The Unix Philosophy and Linux and the UNIX Philosophy). See also Raymond, The Art of UNIX Programming.
134. Bell Labs threatened the nascent UNIX NEWS newsletter with trademark infringement, so “USENIX” was a concession that harkened back to the original USE users’ group for DEC machines, but avoided explicitly using the name UNIX. Libes and Ressler, Life with UNIX, 9.
135. Salus, A Quarter Century of UNIX, 138.
136. Ibid., emphasis added.
137. Ken Thompson and Dennis Ritchie, “The Unix Operating System,” Bell System Technical Journal (1974).
138. Greg Rose, quoted in Lions, Commentary, n.p.
139. Lions, Commentary, n.p.
141. Tanenbaum’s two most famous textbooks are Operating Systems and Computer Networks, which have seen three and four editions respectively.
142. Tanenbaum was not the only person to follow this route. The other acknowledged giant in the computer-science textbook world, Douglas Comer, created Xinu and Xinu-PC (UNIX spelled backwards) in Operating Systems Design in 1984.
143. McKusick, “Twenty Years of Berkeley Unix,” 32.
144. Libes and Ressler, Life with UNIX, 16-17.
145. A recent court case between the Utah-based SCO—the current owner of the legal rights to the original UNIX source code—and IBM raised yet again the question of how much of the original UNIX source code exists in the BSD distribution. SCO alleges that IBM (and Linus Torvalds) inserted SCO-owned UNIX source code into the Linux kernel. However, the incredibly circuitous route of the “original” source code makes these claims hard to ferret out: it was developed at Bell Labs, licensed to multiple universities, used as a basis for BSD, sold to an earlier version of the company SCO (then known as the Santa Cruz Operation), which created a version called Xenix in cooperation with Microsoft. See the diagram by Eric Lévénez at http://www.levenez.com/unix/. For more detail on this case, see www.groklaw.com.
146. See Vinton G. Cerf and Robert Kahn, “A Protocol for Packet Network Intercommunication.” For the history, see Abbate, Inventing the Internet; Norberg and O’Neill, A History of the Information Processing Techniques Office. Also see chapters 1 and 5 herein for more detail on the role of these protocols and the RFC process.
147. Waldrop, The Dream Machine, chaps. 5 and 6.
148. The exception being a not unimportant tool called Unix to Unix Copy Protocol, or uucp, which was widely used to transmit data by phone and formed the basis for the creation of the Usenet. See Hauben and Hauben, Netizens.
149. Salus, A Quarter Century of UNIX, 161.
151. See Andrew Leonard, “BSD Unix: Power to the People, from the Code,” Salon, 16 May 2000, http://archive.salon.com/tech/fsp/2000/05/16/chapter_2_part_one/.
152. Norberg and O’Neill, A History of the Information Processing Techniques Office, 184-85. They cite Comer, Internetworking with TCP/IP, 6 for the figure.
153. Quoted in Libes and Ressler, Life with UNIX, 67, and also in Critchley and Batty, Open Systems, 17. I first heard it in an interview with Sean Doyle in 1998.
154. Moral in this usage signals the “moral and social order” I explored through the concept of social imaginaries in chapter 1. Or, in the Scottish Enlightenment sense of Adam Smith, it points to the right organization and relations of exchange among humans.
155. There is, of course, a relatively robust discourse of open systems in biology, sociology, systems theory, and cybernetics; however, that meaning of open systems is more or less completely distinct from what openness and open systems came to mean in the computer industry in the period book-ended by the arrivals of the personal computer and the explosion of the Internet (ca. 1980-93). One relevant overlap between these two meanings can be found in the work of Carl Hewitt at the MIT Media Lab and in the interest in “agorics” taken by K. Eric Drexler, Bernardo Huberman, and Mark S. Miller. See Huberman, The Ecology of Computation.
156. Keves, “Open Systems Formal Evaluation Process,” 87.
157. General Motors stirred strong interest in open systems by creating, in 1985, its Manufacturing Automation Protocol (MAP), which was built on UNIX. At the time, General Motors was the second-largest purchaser of computer equipment after the government. The Department of Defense and the U.S. Air Force also adopted and required POSIX-compliant UNIX systems early on.
158. Paul Fusco, “The Gospel According to Joy,” New York Times, 27 March 1988, Sunday Magazine, 28.
160. Critchley and Batty, Open Systems, 10.
161. An excellent counterpoint here is Paul Edwards’s The Closed World, which clearly demonstrates the appeal of a thoroughly and hierarchically controlled system such as the Semi-Automatic Ground Environment (SAGE) of the Department of Defense against the emergence of more “green world” models of openness.
162. Critchley and Batty, Open Systems, 13.
163. McKenna, Who’s Afraid of Big Blue? 178, emphasis added. McKenna goes on to suggest that computer companies can differentiate themselves by adding services, better interfaces, or higher reliability—ironically similar to arguments that the Open Source Initiative would make ten years later.
164. Richard Stallman, echoing the image of medieval manacled wretches, characterized the blind spot thus: “Unix does not give the user any more legal freedom than Windows does. What they mean by ‘open systems’ is that you can mix and match components, so you can decide to have, say, a Sun chain on your right leg and some other company’s chain on your left leg, and maybe some third company’s chain on your right arm, and this is supposed to be better than having to choose to have Sun chains on all your limbs, or Microsoft chains on all your limbs. You know, I don’t care whose chains are on each limb. What I want is not to be chained by anyone” (“Richard Stallman: High School Misfit, Symbol of Free Software, MacArthur-certified Genius,” interview by Michael Gross, Cambridge, Mass., 1999, 5, http://www.mgross.com/MoreThgsChng/interviews/stallman1.html).
165. A similar story can be told about the emergence, in the late 1960s and early 1970s, of manufacturers of “plug-compatible” devices, peripherals that plugged into IBM machines (see Takahashi, “The Rise and Fall of the Plug Compatible Manufacturers”). Similarly, in the 1990s the story of browser compatibility and the World Wide Web Consortium (W3C) standards is another recapitulation.
166. McKenna, Who’s Afraid of Big Blue? 178.
167. Pamela Gray, Open Systems.
168. Eric Raymond, “Origins and History of Unix, 1969-1995,” The Art of UNIX Programming, http://www.faqs.org/docs/artu/ch02s01.html#id2880014.
169. Libes and Ressler, Life with UNIX, 22. Also noted in Tanenbaum, “The UNIX Marketplace in 1987,” 419.
170. Libes and Ressler, Life with UNIX, 67.
171. A case might be made that a third definition, the ANSI standard for the C programming language, also covered similar ground, which of course it would have had to in order to allow applications written on one operating system to be compiled and run on another (see Gray, Open Systems, 55-58; Libes and Ressler, Life with UNIX, 70-75).
172. “AT&T Deal with Sun Seen,” New York Times, 19 October 1987, D8.
173. Thomas C. Hayes, “AT&T’s Unix Is a Hit at Last, and Other Companies Are Wary,” New York Times, 24 February 1988, D8.
174. “Unisys Obtains Pacts for Unix Capabilities,” New York Times, 10 March 1988, D4.
175. Andrew Pollack, “Computer Gangs Stake Out Turf,” New York Times, 13 December 1988, D1. See also Evelyn Richards, “Computer Firms Get a Taste of ‘Gang Warfare,’” Washington Post, 11 December 1988, K1; Brit Hume, “IBM, Once the Bully on the Block, Faces a Tough New PC Gang,” Washington Post, 3 October 1988, E24.
176. “What Is Unix?” The Unix System, http://www.unix.org/what_is_unix/history_timeline.html.
177. “About the Open Group,” The Open Group, http://www.opengroup.org/overview/vision-mission.htm.
178. “What Is Unix?” The Unix System, http://www.unix.org/what_is_unix/history_timeline.html.
179. Larry McVoy was an early voice, within Sun, arguing for solving the open-systems problem by turning to Free Software. Larry McVoy, “The Sourceware Operating System Proposal,” 9 November 1993, http://www.bitmover.com/lm/papers/srcos.html.
180. The distinction between a protocol, an implementation, and a standard is important: protocols are descriptions of the precise terms by which two computers can communicate (i.e., a dictionary and a handbook for communicating); an implementation is the creation of software that uses a protocol (i.e., actually does the communicating); thus two implementations using the same protocol should be able to share data. A standard defines which protocol should be used by which computers, for what purposes. It may or may not define the protocol, but it will set limits on changes to that protocol.
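To make the distinction concrete, here is a toy sketch (hypothetical names, not drawn from any system discussed here). The “protocol” exists only on paper, as the rule that a message is a four-digit decimal length followed by that many characters; two independently written implementations interoperate because both follow it:

```python
# The "protocol" is the written rule stated above; it is not code.
# The two functions below are separate implementations of that rule
# (hypothetical names, for illustration only).

def encode_a(text):
    # Implementation A writes a message onto the "wire."
    return f"{len(text):04d}{text}"

def decode_b(wire):
    # Implementation B, written independently, reads the same format.
    length = int(wire[:4])
    return wire[4:4 + length]

message = encode_a("hello")
print(message)            # → 0005hello
print(decode_b(message))  # → hello
```

A standard, by contrast, would be the rule (issued by some standards body) that this protocol, and not a rival one, is the one such programs must use.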
181. The advantages of such an unplanned and unpredictable network have come to be identified in hindsight as a design principle. See Gillespie, “Engineering a Principle” for an excellent analysis of the history of “end to end” or “stupid” networks.
182. William Broad, “Global Network Split as Safeguard,” New York Times, 5 October 1983, A13.
184. Grier and Campbell, “A Social History of Bitnet and Listserv 1985-1991.”
185. On Usenet, see Hauben and Hauben, Netizens. See also Pfaffenberger, “‘A Standing Wave in the Web of Our Communications.’”
186. Schmidt and Werle, Coordinating Technology, chap. 7.
187. See, for example, Martin, Viewdata and the Information Society.
188. There is little information on the development of open systems; there is, however, a brief note from William Stallings, author of perhaps the most widely used textbook on networking, at “The Origins of OSI,” http://williamstallings.com/Extras/OSI.html.
189. Brock, The Second Information Revolution is a good introductory source for this conflict, at least in its policy outlines. The Federal Communications Commission issued two decisions (known as “Computer I” and “Computer II”) that attempted to deal with this conflict by trying to define what counted as voice communication and what as data.
190. Brock, The Second Information Revolution, chap. 10.
191. Drake, “The Internet Religious War.”
192. Malamud, Exploring the Internet; see also Michael M. J. Fischer, “Worlding Cyberspace.”
193. The usable past of Giordano Bruno is invoked by Malamud to signal the heretical nature of his own commitment to openly publishing standards that ISO was opposed to releasing. Bruno’s fate at the hands of the Roman Inquisition hinged in some part on his acceptance of the Copernican cosmology, so he has been, like Galileo, a natural figure for revolutionary claims during the 1990s.
194. Abbate, Inventing the Internet; Salus, Casting the Net; Galloway, Protocol; and Brock, The Second Information Revolution. For practitioner histories, see Kahn et al., “The Evolution of the Internet as a Global Information System”; Clark, “The Design Philosophy of the DARPA Internet Protocols.”
195. Kahn et al., “The Evolution of the Internet as a Global Information System,” 134-40; Abbate, Inventing the Internet, 114-36.
196. Kahn and Cerf, “A Protocol for Packet Network Intercommunication,” 637.
197. Clark, “The Design Philosophy of the DARPA Internet Protocols,” 54-55.
199. RFC Editor, RFC 2555, 6.
200. Ibid., 11.
201. This can be clearly seen, for instance, by comparing the various editions of the main computer-networking textbooks: cf. Tanenbaum, Computer Networks, 1st ed. (1981), 2d ed. (1988), 3d ed. (1996), and 4th ed. (2003); Stallings, Data and Computer Communications, 1st ed. (1985), 2d ed. (1991), 3d ed. (1994), 4th ed. (1997), and 5th ed. (2004); and Comer, Internetworking with TCP/IP (four editions between 1991 and 1999).
202. Sunshine, Computer Network Architectures and Protocols, 5.
203. The structure of the IETF, the Internet Architecture Board, and the ISOC is detailed in Comer, Internetworking with TCP/IP, 8-13; also in Schmidt and Werle, Coordinating Technology, 53-58.
205. The legal literature on Free Software expands constantly and quickly, and it addresses a variety of different legal issues. Two excellent starting points are Vetter, “The Collaborative Integrity of Open-Source Software” and “‘Infectious’ Open Source Software.”
206. Coleman, “The Social Construction of Freedom.”
207. “The GNU General Public Licence, Version 2.0,” http://www.gnu.org/licenses/old-licenses/gpl-2.0.html.
208. All existing accounts of the hacker ethic come from two sources: from Stallman himself and from the colorful and compelling chapter about Stallman in Steven Levy’s Hackers. Both acknowledge a prehistory to the ethic. Levy draws it back in time to the MIT Tech Model Railroad Club of the 1950s, while Stallman is more likely to describe it as reaching back to the scientific revolution or earlier. The stories of early hackerdom at MIT are avowedly Edenic, and in them hackers live in a world of uncontested freedom and collegial competition—something like a writer’s commune without the alcohol or the brawling. There are stories about a printer whose software needed fixing but was only available under a nondisclosure agreement; about a requirement to use passwords (Stallman refused, chose <return> as his password, and hacked the system to encourage others to do the same); about a programming war between different LISP machines; and about the replacement of the Incompatible Time-Sharing System with DEC’s TOPS-20 (“Twenex”) operating system. These stories are oft-told usable pasts, but they are not representative. Commercial constraints have always been part of academic life in computer science and engineering: hardware and software were of necessity purchased from commercial manufacturers and often controlled by them, even if they offered “academic” or “educational” licenses.
209. Delanda, “Open Source.”
210. Dewey, Liberalism and Social Action.
211. Copyright Act of 1976, Pub. L. No. 94-553, 90 Stat. 2541, enacted 19 October 1976; and Copyright Amendments, Pub. L. No. 96-517, 94 Stat. 3015, 3028 (amending §101 and §117, title 17, United States Code, regarding computer programs), enacted 12 December 1980. All amendments since 1976 are listed at http://www.copyright.gov/title17/92preface.html.
212. The history of the copyright and software is discussed in Litman, Digital Copyright; Cohen et al., Copyright in a Global Information Economy; and Merges, Menell, and Lemley, Intellectual Property in the New Technological Age.
213. See Wayner, Free for All; Moody, Rebel Code; and Williams, Free as in Freedom. Although this story could be told simply by interviewing Stallman and James Gosling, both of whom are still alive and active in the software world, I have chosen to tell it through a detailed analysis of the Usenet and Arpanet archives of the controversy. The trade-off is between a kind of incomplete, fly-on-the-wall access to a moment in history and the likely revisionist retellings of those who lived through it. All of the messages referenced here are cited by their “Message-ID,” which should allow anyone interested to access the original messages through Google Groups (http://groups.google.com).
214. Eugene Ciccarelli, “An Introduction to the EMACS Editor,” MIT Artificial Intelligence Laboratory, AI Lab Memo no. 447, 1978, 2.
215. Richard Stallman, “EMACS: The Extensible, Customizable Self-documenting Display Editor,” MIT Artificial Intelligence Laboratory, AI Lab Memo no. 519a, 26 March 1981, 19. Also published as Richard M. Stallman, “EMACS: The Extensible, Customizable Self-documenting Display Editor,” Proceedings of the ACM SIGPLAN SIGOA Symposium on Text Manipulation, 8-10 June (ACM, 1981), 147-56.
216. Richard Stallman, “EMACS: The Extensible, Customizable Self-documenting Display Editor,” MIT Artificial Intelligence Laboratory, AI Lab Memo no. 519a, 26 March 1981, 24.
217. Richard M. Stallman, “EMACS Manual for ITS Users,” MIT Artificial Intelligence Laboratory, AI Lab Memo no. 554, 22 October 1981, 163.
218. Back in January of 1983, Steve Zimmerman had announced that the company he worked for, CCA, had created a commercial version of EMACS called CCA EMACS (Message-ID: firstname.lastname@example.org ). Zimmerman had not written this version entirely, but had taken a version written by Warren Montgomery at Bell Labs (written for UNIX on PDP-11s) and created the version that was being used by programmers at CCA. Zimmerman had apparently distributed it by ftp at first, but when CCA determined that it might be worth something, they decided to exploit it commercially, rather than letting Zimmerman distribute it “freely.” By Zimmerman’s own account, this whole procedure required ensuring that there was nothing left of the original code by Warren Montgomery that Bell Labs owned (Message-ID: email@example.com ).
222. Message-ID: bnews.sri-arpa.988.
225. Various other people seem to have conceived of a similar scheme around the same time (if the Usenet archives are any guide), including Guido van Rossum (who would later become famous for the creation of the Python scripting language). The following is from Message-ID: firstname.lastname@example.org:
/* This software is copyright (c) Mathematical Centre, Amsterdam,
 * Permission is granted to use and copy this software, but not for
 * profit, and provided that these same conditions are imposed on any
 * person receiving or using the software.
 */
228. The main file of the controversy was called display.c. A version that was modified by Chris Torek appears in net.sources, Message-ID: email@example.com. A separate example of a piece of code written by Gosling bears a note that claims he had declared it public domain, but did not “include the infamous Stallman anti-copyright clause” (Message-ID: firstname.lastname@example.org ).
237. With the benefit of hindsight, the position that software could be in the public domain also seems legally uncertain, given that the 1976 changes to USC§17 abolished the requirement to register and, by the same token, rendered uncertain the status of code contributed to Gosling and incorporated into GOSMACS.
243. Joaquim Martillo, Message-ID: email@example.com : “Trying to forbid RMS from using discarded code so that he must spend time to reinvent the wheel supports his contention that ‘software hoarders’ are slowing down progress in computer science.”
244. Diamond v. Diehr, 450 U.S. 175 (1981), the Supreme Court decision, forced the patent office to grant patents on software. Interestingly, software patents had been granted much earlier, but went either uncontested or unenforced. An excellent example is patent 3,568,156, held by Ken Thompson, on regular expression pattern matching, granted in 1971.
245. Calvin Mooers, in his 1975 article “Computer Software and Copyright,” suggests that the IBM unbundling decision opened the doors to thinking about copyright protection.
247. Gosling’s EMACS 264 (Stallman copied EMACS 84) is registered with the Library of Congress, as is GNU EMACS 15.34. Gosling’s EMACS Library of Congress registration number is TX-3-407-458, registered in 1992. Stallman’s registration number is TX-1-575-302, registered in May 1985. The listed dates are uncertain, however, since there are periodic re-registrations and updates.
250. A standard practice well into the 1980s, and even later, was the creation of so-called clean-room versions of software, in which new programmers and designers who had not seen the offending code were hired to re-implement it in order to avoid the appearance of trade-secret violation. Copyright law is a strict liability law, meaning that ignorance does not absolve the infringer, so the practice of “clean-room engineering” seems not to have been as successful in the case of copyright, as the meaning of infringement remains murky.
251. Message-ID: firstname.lastname@example.org. AT&T was less concerned about copyright infringement than they were about the status of their trade secrets. Zimmerman quotes a statement (from Message-ID: email@example.com ) that he claims indicates this: “Beginning with CCA EMACS version 162.36z, CCA EMACS no longer contained any of the code from Mr. Montgomery’s EMACS, or any methods or concepts which would be known only by programmers familiar with BTL [Bell Labs] EMACS of any version.” The statement did not mention copyright, but implied that CCA EMACS did not contain any AT&T trade secrets, thus preserving their software’s trade-secret status. The fact that EMACS was a conceptual design—a particular kind of interface, a LISP interpreter, and extensibility—that was very widely imitated had no apparent bearing on the legal status of these secrets.
253. The cases that determine the meaning of the 1976 and 1980 amendments begin around 1986: Whelan Associates, Inc. v. Jaslow Dental Laboratory, Inc., et al., U.S. Third Circuit Court of Appeals, 4 August 1986, 797 F.2d 1222, 230 USPQ 481, affirming that “structure (or sequence or organization)” of software is copyrightable, not only the literal software code; Computer Associates International, Inc. v. Altai, Inc., U.S. Second Circuit Court of Appeals, 22 June 1992, 982 F.2d 693, 23 USPQ 2d 1241, arguing that the structure test in Whelan was not sufficient to determine infringement and thus proposing a three-part “abstraction-filtration-comparison” test; Apple Computer, Inc. v. Microsoft Corp., U.S. Ninth Circuit Court of Appeals, 1994, 35 F.3d 1435, finding that the “desktop metaphor” used in Macintosh and Windows was not identical and thus did not constitute infringement; Lotus Development Corporation v. Borland International, Inc. (94-2003), 1996, 513 U.S. 233, finding that the “look and feel” of a menu interface was not copyrightable.
254. The relationship between the definition of source and target befuddles software law to this day, one of the most colorful examples being the DeCSS case. See Coleman, “The Social Construction of Freedom,” chap. 1; Gallery of CSS Descramblers, http://www.cs.cmu.edu/~dst/DeCSS/gallery/.
255. An interesting addendum here is that the manual for EMACS was also released at around the same time as EMACS 16 and was available as a TeX file. Stallman also attempted to deal with the paper document in the same fashion (see Message-ID: firstname.lastname@example.org, 19 July 1985), and this would much later become a different and trickier issue that would result in the GNU Free Documentation License.
258. See Coleman, “The Social Construction of Freedom,” chap. 6, on the Debian New Maintainer Process, for an example of how induction into a Free Software project stresses the legal as much as the technical, if not more.
261. Research on coordination in Free Software forms the central core of recent academic work. Two of the most widely read pieces, Yochai Benkler’s “Coase’s Penguin” and Steven Weber’s The Success of Open Source, are directed at classic research questions about collective action. Rishab Ghosh’s “Cooking Pot Markets” and Eric Raymond’s The Cathedral and the Bazaar set many of the terms of debate. Josh Lerner and Jean Tirole’s “Some Simple Economics of Open Source” was an early contribution. Other important works on the subject are Feller et al., Perspectives on Free and Open Source Software; Tuomi, Networks of Innovation; Von Hippel, Democratizing Innovation.
262. On the distinction between adaptability and adaptation, see Federico Iannacci, “The Linux Managing Model,” http://opensource.mit.edu/papers/iannacci2.pdf. Matt Ratto characterizes the activity of Linux-kernel developers as a “culture of re-working” and a “design for re-design,” and captures the exquisite details of such a practice both in coding and in the discussion between developers, an activity he dubs the “pressure of openness” that “results as a contradiction between the need to maintain productive collaborative activity and the simultaneous need to remain open to new development directions” (“The Pressure of Openness,” 112-38).
263. Linux is often called an operating system, which Stallman objects to on the theory that a kernel is only one part of an operating system. Stallman suggests that it be called GNU/Linux to reflect the use of GNU operating-system tools in combination with the Linux kernel. This not-so-subtle ploy to take credit for Linux reveals the complexity of the distinctions. The kernel is at the heart of hundreds of different “distributions”—such as Debian, Red Hat, SuSe, and Ubuntu Linux—all of which also use GNU tools, but which are often collections of software larger than just an operating system. Everyone involved seems to have an intuitive sense of what an operating system is (thanks to the pedagogical success of UNIX), but few can draw any firm lines around the object itself.
264. Eric Raymond directed attention primarily to Linux in The Cathedral and the Bazaar. Many other projects preceded Torvalds’s kernel, however, including the tools that form the core of both UNIX and the Internet: Paul Vixie’s implementation of the Domain Name System (DNS) known as BIND; Eric Allman’s sendmail for routing e-mail; the scripting languages perl (created by Larry Wall), python (Guido van Rossum), and tcl/tk (John Ousterhout); the X Windows research project at MIT; and the derivatives of the original BSD UNIX, FreeBSD and OpenBSD. On the development model of FreeBSD, see Jorgensen, “Putting It All in the Trunk” and “Incremental and Decentralized Integration in FreeBSD.” The story of the genesis of Linux is very nicely told in Moody, Rebel Code, and Williams, Free as in Freedom; there are also a number of papers—available through Free/Opensource Research Community, http://freesoftware.mit.edu/—that analyze the development dynamics of the Linux kernel. See especially Ratto, “Embedded Technical Expression” and “The Pressure of Openness.” I have conducted much of my analysis of Linux by reading the Linux Kernel Mailing List archives, http://lkml.org. There are also annotated summaries of the Linux Kernel Mailing List discussions at http://kerneltraffic.org.
265. Howard Rheingold, The Virtual Community. On the prehistory of this period and the cultural location of some key aspects, see Turner, From Counterculture to Cyberculture.
266. Julian Dibbell’s “A Rape in Cyberspace” and Sherry Turkle’s Life on the Screen are two classic examples of the detailed forms of life and collaborative ethical creation that preoccupied denizens of these worlds.
267. The yearly influx of students to the Usenet and Arpanet in September earned that month the title “the longest month,” due to the need to train new users in use and etiquette on the newsgroups. Later in the 1990s, when AOL allowed subscribers access to the Usenet hierarchy, it became known as “eternal September.” See “September that Never Ended,” Jargon File, http://catb.org/~esr/jargon/html/S/September-that-never-ended.html.
270. Indeed, initially, Torvalds’s terms of distribution for Linux were more restrictive than the GPL, including limitations on distributing it for a fee or for handling costs. Torvalds eventually loosened the restrictions and switched to the GPL in February 1992. Torvalds’s release notes for Linux 0.12 say, “The Linux copyright will change: I’ve had a couple of requests [pg 339] to make it compatible with the GNU copyleft, removing the ‘you may not distribute it for money’ condition. I agree. I propose that the copyright be changed so that it conforms to GNU—pending approval of the persons who have helped write code. I assume this is going to be no problem for anybody: If you have grievances (‘I wrote that code assuming the copyright would stay the same’) mail me. Otherwise The GNU copyleft takes effect as of the first of February. If you do not know the gist of the GNU copyright—read it” (http://www.kernel.org/pub/linux/kernel/Historic/old-versions/RELNOTES-0.12).
273. Steven Weber, The Success of Open Source, 164.
274. Quoted in Zack Brown, “Kernel Traffic #146 for 17Dec2001,” Kernel Traffic, http://www.kerneltraffic.org/kernel-traffic/kt20011217_146.html; also quoted in Federico Iannacci, “The Linux Managing Model,” http://opensource.mit.edu/papers/iannacci2.pdf.
276. The original Apache Group included Brian Behlendorf, Roy T. Fielding, Rob Harthill, David Robinson, Cliff Skolnick, Randy Terbush, Robert S. Thau, Andrew Wilson, Eric Hagberg, Frank Peters, and Nicolas Pioch. The mailing list new-httpd eventually became the Apache developers list. The archives are available at http://mail-archives.apache.org/mod_mbox/httpd-dev/ and cited hereafter as “Apache developer mailing list,” followed by sender, subject, date, and time.
277. For another version of the story, see Moody, Rebel Code, 127-28. The official story honors the Apache Indian tribes for “superior skills in warfare strategy and inexhaustible endurance.” Evidence of the concern of the original members over the use of the name is clearly visible in the archives of the Apache project. See esp. Apache developer mailing list, Robert S. Thau, Subject: The political correctness question . . . , 22 April 1995, 21:06 EDT.
278. Mockus, Fielding, and Herbsleb, “Two Case Studies of Open Source Software Development,” 3.
279. Apache developer mailing list, Andrew Wilson, Subject: Re: httpd patch B5 updated, 14 March 1995, 21:49 GMT.
280. Apache developer mailing list, Rob Harthill, Subject: Re: httpd patch B5 updated, 14 March 1995, 15:10 MST.
281. Apache developer mailing list, Cliff Skolnick, Subject: Process (please read), 15 March 1995, 3:11 PST; and Subject: Patch file format, 15 March 1995, 3:40 PST.
282. Apache developer mailing list, Rob Harthill, Subject: patch list vote, 15 March 1995, 13:21:24 MST.
283. Apache developer mailing list, Rob Harthill, Subject: apache-0.2 on hyperreal, 18 March 1995, 18:46 MST.
284. Apache developer mailing list, Cliff Skolnick, Subject: Re: patch list vote, 21 March 1995, 2:47 PST.
285. Apache developer mailing list, Paul Richards, Subject: Re: vote counting, 21 March 1995, 22:24 GMT.
286. Roy T. Fielding, “Shared Leadership in the Apache Project.”
287. Apache developer mailing list, Robert S. Thau, Subject: Re: 0.7.2b, 7 June 1995, 17:27 EDT.
288. Apache developer mailing list, Robert S. Thau, Subject: My Garage Project, 12 June 1995, 21:14 GMT.
289. Apache developer mailing list, Rob Harthill, Subject: Re: Shambhala, 30 June 1995, 9:44 MDT.
290. Apache developer mailing list, Rob Harthill, Subject: Re: Shambhala, 30 June 1995, 14:50 MDT.
291. Apache developer mailing list, Rob Harthill, Subject: Re: Shambhala, 30 June 1995, 16:48 GMT.
292. Gabriella Coleman captures this nicely in her discussion of the tension between the individual virtuosity of the hacker and the corporate populism of groups like Apache or, in her example, the Debian distribution of Linux. See Coleman, “The Social Construction of Freedom.”
293. Apache developer mailing list, Robert S. Thau, Subject: Re: Shambhala, 1 July 1995, 14:42 EDT.
294. A slightly different explanation of the role of modularity is discussed in Steven Weber, The Success of Open Source, 173-75.
295. Tichy, “RCS.”
296. See Steven Weber, The Success of Open Source, 117-19; Moody, Rebel Code, 172-78. See also Shaikh and Cornford, “Version Management Tools.”
297. Linus Torvalds, “Re: [PATCH] Remove Bitkeeper Documentation from Linux Tree,” 20 April 2002, http://www.uwsg.indiana.edu/hypermail/linux/kernel/0204.2/1018.html. Quoted in Shaikh and Cornford, “Version Management Tools.”
298. Andrew Orlowski, “‘Cool it, Linus’—Bruce Perens,” Register, 15 April 2005, http://www.theregister.co.uk/2005/04/15/perens_on_torvalds/page2.html.
299. Similar debates have regularly appeared around the use of non-free compilers, non-free debuggers, non-free development environments, and so forth. There are, however, a large number of people who write and promote Free Software that runs on proprietary operating systems like Macintosh and Windows, as well as a distinction between tools and formats. So, [pg 341] for instance, using Adobe Photoshop to create icons is fine so long as they are in standard open formats like PNG or JPG, and not proprietary formats such as GIF or Photoshop’s native format.
301. Steven Weber, The Success of Open Source, 132.
302. Raymond, The Cathedral and the Bazaar.
303. Gabriella Coleman, in “The Social Construction of Freedom,” provides an excellent example of a programmer’s frustration with font-lock in EMACS, something that falls in between a bug and a feature. The programmer’s frustration is directed at the stupidity of the design (and implied designers), but his solution is not a fix, but a work-around—and it illustrates how debugging does not always imply collaboration.
304. Dan Wallach, interview, 3 October 2003.
305. Mitchell Waldrop’s The Dream Machine details the family history well.
306. In January 2005, when I first wrote this analysis, this was true. By April 2006, the Hewlett Foundation had convened the Open Educational Resources “movement” as something that would transform the production and circulation of textbooks like those created by Connexions. Indeed, in Rich Baraniuk’s report for Hewlett, the first paragraph reads: “A grassroots movement is on the verge of sweeping through the academic world. The open education movement is based on a set of intuitions that are shared by a remarkably wide range of academics: that knowledge should be free and open to use and re-use; that collaboration should be easier, not harder; that people should receive credit and kudos for contributing to education and research; and that concepts and ideas are linked in unusual and surprising ways and not the simple linear forms that textbooks present. Open education promises to fundamentally change the way authors, instructors, and students interact worldwide” (Baraniuk and King, “Connexions”). (In a nice confirmation of just how embedded participation can become in anthropology, Baraniuk cribbed the second sentence from something I had written two years earlier as part of a description of what I thought Connexions hoped to achieve.) The “movement” as such still does not quite exist, but the momentum for it is clearly part of the actions that Hewlett hopes to achieve.
307. See Chris Beam, “Fathom.com Shuts Down as Columbia Withdraws,” Columbia Spectator, 27 January 2003, http://www.columbiaspectator.com/. Also see David Noble’s widely read critique, “Digital Diploma Mills.”
308. “Provost Announces Formation of Council on Educational Technology,” MIT Tech Talk, 29 September 1999, http://web.mit.edu/newsoffice/1999/council-0929.html.
309. The software consists of a collection of different Open Source Software cobbled together to provide the basic platform (the Zope and Plone content-management frameworks, the PostgreSQL database, the python programming language, and the cvs version-control software).
310. The most significant exception has been the issue of tools for authoring content in XML. For most of the life of the Connexions project, the XML mark-up language has been well-defined and clear, but there has been no way to write a module in XML, short of directly writing the text and the tags in a text editor. For all but a very small number of possible users, this feels too much like programming, and they experience it as too frustrating to be worth it. The solution (albeit temporary) was to encourage users to make use of a proprietary XML editor (like a word processor, but capable of creating XML content). Indeed, the Connexions project’s devotion to openness was tested by one of the most important decisions its participants made: to pursue the creation of an Open Source XML text editor in order to provide access to completely open tools for creating completely open content.
311. Boyle, “Mertonianism Unbound,” 14.
312. The movement is the component that remains unmodulated: there is no “free textbook” movement associated with Connexions, even though many of the same arguments that lead to a split between Free Software and Open Source occur here: the question of whether the term free is confusing, for example, or the role of for-profit publishers or textbook companies. In the end, most (though not all) of the Connexions staff and many of its users are content to treat it as a useful tool for composing novel kinds of digital educational material—not as a movement for the liberation of educational content.
313. Boyle, “Conservatives and Intellectual Property.”
314. Lessig’s output has been prodigious. His books include Code and Other Laws of Cyber Space, The Future of Ideas, Free Culture, and Code: Version 2.0. He has also written a large number of articles and is an active blogger (http://www.lessig.org/blog/).
315. There were few such projects under way, though there were many in the planning stages. Within a year, the Public Library of Science had launched itself, spearheaded by Harold Varmus, the former director of the National Institutes of Health. At the time, however, the only other large scholarly project was the MIT OpenCourseWare project, which, although it had already agreed to use Creative Commons licenses, had demanded a peculiar one-off license.
316. The fact that I organized a workshop to which I invited “informants” and to which I subsequently refer as research might strike some, both in anthropology and outside it, as wrong. But it is precisely the kind of occasion I would argue has become central to the problematics of method in cultural anthropology today. On this subject, see Holmes and Marcus, “Cultures of Expertise and the Management of Globalization.” Such strategic and seemingly ad hoc participation does not exclude one from attempting to later disentangle oneself from such participation, in order to comment on the value and significance, and especially to offer critique. Such is the attempt to achieve objectivity in social science, an objectivity that goes beyond the basic notions of bias and observer-effect so common in the social sciences. “Objectivity” in a broader social sense includes the observation of the conceptual linkages that both precede such a workshop (constituted the need for it to happen) and follow on it, independent of any particular meeting. The complexity of mobilizing objectivity in discussions of the value and significance of social or economic phenomena was well articulated a century ago by Max Weber, and problems of method in the sense raised by him seem to me to be no less fraught today. See Max Weber, “Objectivity in the Social Sciences.”
317. Suntrust v. Houghton Mifflin Co., U.S. Eleventh Circuit Court of Appeals, 2001, 252 F. 3d 1165.
318. Neil Strauss, “An Uninvited Bassist Takes to the Internet,” New York Times, 25 August 2002, sec. 2, 23.
319. Indeed, in a more self-reflective moment, Glenn once excitedly wrote to me to explain that what he was doing was “code-switching” and that he thought that geeks who constantly involved themselves in technology, law, music, gaming, and so on would be prime case studies for a code-switching study by anthropologists.
320. See Kelty, “Punt to Culture.”
321. Lessig, “The New Chicago School.”
322. Hence, Boyle’s “Second Enclosure Movement” and “copyright conservancy” concepts (see Boyle, “The Second Enclosure Movement”; Bollier, Silent Theft). Perhaps the most sophisticated and compelling expression of the institutional-economics approach to understanding Free Software is the work of Yochai Benkler, especially “Sharing Nicely” and “Coase’s Penguin.” See also Benkler, Wealth of Networks.
323. Steven Weber’s The Success of Open Source is exemplary.
324. Carrington and King, “Law and the Wisconsin Idea.”
325. In particular, Glenn Brown suggested Oliver Wendell Holmes as a kind of origin point both for critical legal realism and for law and economics, a kind of filter through which lawyers get both their Nietzsche [pg 344] and their liberalism (see Oliver Wendell Holmes, “The Path of the Law”). Glenn’s opinion was that what he called “punting to culture” (by which he meant writing minimalist laws which allow social custom to fill in the details) descended more or less directly from the kind of legal reasoning embodied in Holmes: “Note that [Holmes] is probably best known in legal circles for arguing that questions of morality be removed from legal analysis and left to the field of ethics. this is what makes him the godfather of both the posners of the world, and the crits, and the strange hybrids like lessig” (Glenn Brown, personal communication, 11 August 2003).
326. Actor-network theory comes closest to dealing with such “ontological” issues as, for example, airplanes in John Law’s Aircraft Stories, the disease atheroscleroris in Annemarie Mol’s The Body Multiple, or in vitro fertilization in Charis Thompson’s Making Parents. The focus here on finality is closely related, but aims at revealing the temporal characteristics of highly modifiable kinds of knowledge-objects, like textbooks or databases, as in Geoffrey Bowker’s Memory Practices in the Sciences.
327. Merton, “The Normative Structure of Science.”
328. See Johns, The Nature of the Book; Eisenstein, The Printing Press as an Agent of Change; McLuhan, The Gutenberg Galaxy and Understanding Media; Febvre and Martin, The Coming of the Book; Ong, Ramus, Method, and the Decay of Dialogue; Chartier, The Cultural Uses of Print in Early Modern France and The Order of Books; Kittler, Discourse Networks 1800/1900 and Gramophone, Film, Typewriter.
329. There is less communication between the theorists and historians of copyright and authorship and those of the book; the former are also rich in analyses, such as Jaszi and Woodmansee, The Construction of Authorship; Mark Rose, Authors and Owners; Saint-Amour, The Copywrights; Vaidhyanathan, Copyrights and Copywrongs.
330. Eisenstein, The Printing Press as an Agent of Change. Eisenstein’s work makes direct reference to McLuhan’s thesis in The Gutenberg Galaxy, and Latour relies on these works and others in “Drawing Things Together.”
331. Johns, The Nature of the Book, 19-20.
332. On this subject, cf. Pablo Boczkowski’s study of the digitization of newspapers, Digitizing the News.
333. Conventional here is actually quite historically proximate: the system creates a pdf document by translating the XML document into a LaTeX document, then into a pdf document. LaTeX has been, for some twenty years, a standard text-formatting and typesetting language used by some [pg 345] sectors of the publishing industry (notably mathematics, engineering, and computer science). Were it not for the existence of this standard from which to bootstrap, the Connexions project would have faced a considerably more difficult challenge, but much of the infrastructure of publishing has already been partially transformed into a computer-mediated and -controlled system whose final output is a printed book. Later in Connexions’s lifetime, the group coordinated with an Internet-publishing startup called Qoop.com to take the final step and make Connexions courses available as print-on-demand, cloth-bound textbooks, complete with ISBNs and back-cover blurbs.
334. See Johns, The Nature of the Book; Warner, The Letters of the Republic.
335. On fixity, see Eisenstein’s The Printing Press as an Agent of Change, which cites McLuhan’s The Gutenberg Galaxy. The stability of texts is also questioned routinely by textual scholars, especially those who work with manuscripts and complicated variora (for an excellent introduction, see Bornstein and Williams, Palimpsest). Michel Foucault’s “What Is an Author?” addresses a related but orthogonal problematic and is unconcerned with the relatively sober facts of a changing medium.
336. A salient and recent point of comparison can be found in the form of Lawrence Lessig’s “second edition” of his book Code, which is titled Code: Version 2.0 (version is used in the title, but edition is used in the text). The first book was published in 1999 (“ancient history in Internet time”), and Lessig convinced the publisher to make it available as a wiki, a collaborative Web site which can be directly edited by anyone with access. The wiki was edited and updated by hordes of geeks, then “closed” and reedited into a second edition with a new preface. It is a particularly tightly controlled example of collaboration; although the wiki and the book were freely available, the modification and transformation of them did not amount to a simple free-for-all. Instead, Lessig leveraged his own authority, his authorial voice, and the power of Basic Books to create something that looks very much like a traditional second edition, although it was created by processes unimaginable ten years ago.
337. The most familiar comparison is Wikipedia, which was started after Connexions, but grew far more quickly and dynamically, largely due to the ease of use of the system (a bone of some contention among the Connexions team). Wikipedia has come under assault primarily for being unreliable. The suspicion and fear that surround Wikipedia are similar to those that face Connexions, but in the case of Wikipedia entries, the commitment to openness is stubbornly meritocratic: any article can be edited by anyone at any time, and it matters not how firmly one is identified as an expert by rank, title, degree, or experience—a twelve-year-old’s knowledge of the Peloponnesian War is given the same access and status as an eighty-year-old classicist’s. Articles are not owned by individuals, and [pg 346] all work is pseudonymous and difficult to track. The range of quality is therefore great, and the mainstream press has focused largely on whether Wikipedia is more or less reliable than conventional encyclopedias, not on the process of knowledge production. See, for instance, George Johnson, “The Nitpicking of the Masses vs. the Authority of the Experts,” New York Times, 3 January 2006, Late Edition—Final, F2; Robert McHenry, “The Faith-based Encyclopedia,” TCS Daily, 15 November 2004, http://www.techcentralstation.com/111504A.html.
338. Again, a comparison with Wikipedia is apposite. Wikipedia is, morally speaking, and especially in the persona of its chief editor, Jimbo Wales, totally devoted to merit-based equality, with users getting no special designation beyond the amount and perceived quality of the material they contribute. Degrees or special positions of employment are anathema. It is a quintessentially American, anti-intellectual-fueled, Horatio Alger-style approach in which the slate is wiped clean and contributors are given a chance to prove themselves independent of background. Connexions, by contrast, draws specifically from the ranks of intellectuals or academics and seeks to replace the infrastructure of publishing. Wikipedia is interested only in creating a better encyclopedia. In this respect, it is transhumanist in character, attributing its distinctiveness and success to the advances in technology (the Internet, wiki, broadband connections, Google). Connexions on the other hand is more polymathic, devoted to intervening into the already complexly constituted organizational practice of scholarship and academia.
339. An even more technical feature concerned the issue of the order of authorship. The designers at first decided to allow Connexions to simply display the authors in alphabetical order, a practice adopted by some disciplines, like computer science. However, in the case of the Housman example this resulted in what looked like a module authored principally by me, and only secondarily by A. E. Housman. And without the ability to explicitly designate order of authorship, many disciplines had no way to express their conventions along these lines. As a result, the system was redesigned to allow users to designate the order of authorship as well.
340. I refer here to Eric Raymond’s “discovery” that hackers possess unstated norms that govern what they do, in addition to the legal licenses and technical practices they engage in (see Raymond, “Homesteading the Noosphere”). For a critique and background on hacker ethics and norms, see Coleman, “The Social Construction of Freedom.”
341. Bruno Latour’s Science in Action makes a strong case for the centrality of “black boxes” in science and engineering for precisely this reason.
342. I should note, in my defense, that my efforts to get my informants to read Max Weber, Ferdinand Tönnies, Henry Maine, or Emile Durkheim [pg 347] proved far less successful than my creation of nice Adobe Illustrator diagrams that made explicit the reemergence of issues addressed a century ago. It was not for lack of trying, however.
343. Callon, The Laws of the Markets; Hauser, Moral Minds.
344. Oliver Wendell Holmes, “The Path of the Law.”
345. In December 2006 Creative Commons announced a set of licenses that facilitate the “follow up” licensing of a work, especially one initially issued under a noncommercial license.
346. Message from the cc-sampling mailing list, Glenn Brown, Subject: BACKGROUND: “AS APPROPRIATE TO THE MEDIUM, GENRE, AND MARKET NICHE,” 23 May 2003, http://lists.ibiblio.org/pipermail/cc-sampling/2003-May/000004.html.
347. Sampling offers a particularly clear example of how Creative Commons differs from the existing practice and infrastructure of music creation and intellectual-property law. The music industry has actually long recognized the fact of sampling as something musicians do and has attempted to deal with it by making it an explicit economic practice; the music industry thus encourages sampling by facilitating the sale between labels and artists of rights to make a sample. Record companies will negotiate prices, lengths, quality, and quantity of sampling and settle on a price.
This practice is set opposite the assumption, also codified in law, that the public has a right to a fair use of copyrighted material without payment or permission. Sampling a piece of music might seem to fall into this category of use, except that one of the tests of fair use is that the use not impact any existing market for such uses, and the fact that the music industry has effectively created a market for the buying and selling of samples means that sampling now routinely falls outside the fair uses codified in the statute. Creative Commons licenses, on the other hand, say that owners should be able to designate their material as “sample-able,” to give permission ahead of time, and by this practice to encourage others to do the same. They give an “honorable” meaning to the practice of sampling for free, rather than the dishonorable one created by the industry. It thus becomes a war over the meaning of norms, in the law-and-economics language of Creative Commons and its founders.
348. See http://cnx.org, http://www.creativecommons.org, http://www.earlham.edu/~peters/fos/overview.htm, http://www.biobricks.org, http://www.freebeer.org, http://freeculture.org, http://www.cptech.org/a2k, [pg 348] http://www.colawp.com/colas/400/cola467_recipe.html, http://www.elephantsdream.org, http://www.sciencecommons.org, http://www.plos.org, http://www.openbusiness.cc, http://www.yogaunity.org, http://osdproject.com, http://www.hewlett.org/Programs/Education/oer/, and http://olpc.com.
349. See Clive Thompson, “Open Source Spying,” New York Times Magazine, 3 December 2006, 54.
350. See especially Christen, “Tracking Properness” and “Gone Digital”; Brown, Who Owns Native Culture? and “Heritage as Property.” Crowdsourcing fits into other novel forms of labor arrangements, ranging from conventional outsourcing and off-shoring to newer forms of bodyshopping and “virtual migration” (see Aneesh, Virtual Migration; Xiang, “Global Bodyshopping”).
351. Golub, “Copyright and Taboo”; Dibbell, Play Money.
Copyright: © 2008 Duke University Press
Printed in the United States of America on acid-free paper ∞
Designed by C. H. Westmoreland
Typeset in Charis (an Open Source font) by Achorn International
Library of Congress Cataloging-in-Publication data and republication acknowledgments appear on the last printed pages of this book.
License: Licensed under the Creative Commons Attribution-NonCommercial-Share Alike License, available at http://creativecommons.org/licenses/by-nc-sa/3.0/ or by mail from Creative Commons, 559 Nathan Abbott Way, Stanford, Calif. 94305, U.S.A. "NonCommercial" as defined in this license specifically excludes any sale of this work or any portion thereof for money, even if sale does not result in a profit by the seller or if the sale is by a 501(c)(3) nonprofit or NGO.
Duke University Press gratefully acknowledges the support of HASTAC (Humanities, Arts, Science, and Technology Advanced Collaboratory), which provided funds to help support the electronic interface of this book.
Two Bits is accessible on the Web at twobits.net.