Diffstat (limited to 'data/v2/samples/two_bits.christopher_kelty.sst')
-rw-r--r--  data/v2/samples/two_bits.christopher_kelty.sst | 36 ++++++++++++++++++------------------
1 file changed, 18 insertions(+), 18 deletions(-)
diff --git a/data/v2/samples/two_bits.christopher_kelty.sst b/data/v2/samples/two_bits.christopher_kelty.sst
index 85efb46..1cff4f9 100644
--- a/data/v2/samples/two_bits.christopher_kelty.sst
+++ b/data/v2/samples/two_bits.christopher_kelty.sst
@@ -94,7 +94,7 @@ At first glance, the thread tying these projects together seems to be the Intern
={Internet+12:relation to Free Software;Free Software:relation to Internet;public sphere:theories of}
Both the Internet and Free Software are historically specific, that is, not just any old new media or information technology. But the Internet is many, many specific things to many, many specific people. As one reviewer of an early manuscript version of this book noted, "For most people, the Internet is porn, stock quotes, Al Jazeera clips of executions, Skype, seeing pictures of the grandkids, porn, never having to buy another encyclopedia, MySpace, e-mail, online housing listings, Amazon, Googling potential romantic interests, etc. etc." It is impossible to explain all of these things; the meaning and significance of the proliferation of digital pornography is a very different concern than that of the fall of the print encyclopedia ,{[pg 5]}, and the rise of Wikipedia. Yet certain underlying practices relate these diverse phenomena to one another and help explain why they have occurred at this time and in this technical, legal, and social context. By looking carefully at Free Software and its modulations, I suggest, one can come to a better understanding of the changes affecting pornography, Wikipedia, stock quotes, and many other wonderful and terrifying things.~{ Wikipedia is perhaps the most widely known and generally familiar example of what this book is about. Even though it is not identified as such, it is in fact a Free Software project and a "modulation" of Free Software as I describe it here. The non-technically inclined reader might keep Wikipedia in mind as an example with which to follow the argument of this book. I will return to it explicitly in part 3. However, for better or for worse, there will be no discussion of pornography. }~
-={Wikipedia}
+={Wikipedia (collaborative encyclopedia)}
Two Bits has three parts. Part I of this book introduces the reader to the concept of recursive publics by exploring the lives, works, and discussions of an international community of geeks brought together by their shared interest in the Internet. Chapter 1 asks, in an ethnographic voice, "Why do geeks associate with one another?" The answer—told via the story of Napster in 2000 and the standards process at the heart of the Internet—is that they are making a recursive public. Chapter 2 explores the words and attitudes of geeks more closely, focusing on the strange stories they tell (about the Protestant Reformation, about their practical everyday polymathy, about progress and enlightenment), stories that make sense of contemporary political economy in sometimes surprising ways. Central to part I is an explication of the ways in which geeks argue about technology but also argue with and through it, by building, modifying, and maintaining the very software, networks, and legal tools within which and by which they associate with one another. It is meant to give the reader a kind of visceral sense of why certain arrangements of technology, organization, and law—specifically that of the Internet and Free Software—are so vitally important to these geeks.
={geeks;Napster;technology:as argument}
@@ -209,7 +209,7 @@ The study of distributed phenomena does not necessarily imply the detailed, loca
={Weber, Max}
It is in this sense that the ethnographic object of this study is not geeks and not any particular project or place or set of people, but Free Software and the Internet. Even more precisely, the ethnographic object of this study is "recursive publics"—except that this concept is also the work of the ethnography, not its preliminary object. I could not have identified "recursive publics" as the object of the ethnography at the outset, and this is nice proof that ethnographic work is a particular kind of epistemological encounter, an encounter that requires considerable conceptual work during and after the material labor of fieldwork, and throughout the material labor of writing and rewriting, in order to make sense of and reorient it into a question that will have looked deliberate and ,{[pg 21]}, answerable in hindsight. Ethnography of this sort requires a long-term commitment and an ability to see past the obvious surface of rapid transformation to a more obscure and slower temporality of cultural significance, yet still pose questions and refine debates about the near future.~{ Despite what might sound like a "shoot first, ask questions later" approach, the design of this project was in fact conducted according to specific methodologies. The most salient is actor-network theory: Latour, Science in Action; Law, "Technology and Heterogeneous Engineering"; Callon, "Some Elements of a Sociology of Translation"; Latour, Pandora’s Hope; Latour, Re-assembling the Social; Callon, Laws of the Markets; Law and Hassard, Actor Network Theory and After. Ironically, there have been no actor-network studies of networks, which is to say, of particular information and communication technologies such as the Internet. The confusion of the word network (as an analytical and methodological term) with that of network (as a particular configuration of wires, waves, software, and chips, or of people, roads, and buses, or of databases, names, and diseases) means that it is necessary to always distinguish this-network-here from any-network-whatsoever. My approach shares much with the ontological questions raised in works such as Law, Aircraft Stories; Mol, The Body Multiple; Cussins, "Ontological Choreography"; Charis Thompson, Making Parents; and Dumit, Picturing Personhood. }~ Historically speaking, the chapters of part II can be understood as a contribution to a history of scientific infrastructure—or perhaps to an understanding of large-scale, collective experimentation.~{ I understand a concern with scientific infrastructure to begin with Steve Shapin and Simon Schaffer in Leviathan and the Air Pump, but the genealogy is no doubt more complex. It includes Shapin, The Social History of Truth; Biagioli, Galileo, Courtier; Galison, How Experiments End and Image and Logic; Daston, Biographies of Scientific Objects; Johns, The Nature of the Book. A whole range of works explore the issue of scientific tools and infrastructure: Kohler, Lords of the Fly; Rheinberger, Towards a History of Epistemic Things; Landecker, Culturing Life; Keating and Cambrosio, Biomedical Platforms. Bruno Latour’s "What Rules of Method for the New Socio-scientific Experiments" provides one example of where science studies might go with these questions. 
Important texts on the subject of technical infrastructures include Walsh and Bayma, "Computer Networks and Scientific Work"; Bowker and Star, Sorting Things Out; Edwards, The ,{[pg 316]}, Closed World; Misa, Brey, and Feenberg, Modernity and Technology; Star and Ruhleder, "Steps Towards an Ecology of Infrastructure." }~ The Internet and Free Software are each an important practical transformation that will have effects on the practice of science and a kind of complex technical practice for which there are few existing models of study.
-={actor network theory;Internet+1}
+={Actor Network Theory;Internet+1}
A methodological note about the peculiarity of my subject is also in order. The Attentive Reader will note that there are very few fragments of conventional ethnographic material (i.e., interviews or notes) transcribed herein. Where they do appear, they tend to be "publicly available"—which is to say, accessible via the Internet—and are cited as such, with as much detail as necessary to allow the reader to recover them. Conventional wisdom in both anthropology and history has it that what makes a study interesting, in part, is the work a researcher has put into gathering that which is not already available, that is, primary sources as opposed to secondary sources. In some cases I provide that primary access (specifically in chapters 2, 8, and 9), but in many others it is now literally impossible: nearly everything is archived. Discussions, fights, collaborations, talks, papers, software, articles, news stories, history, old software, old software manuals, reminiscences, notes, and drawings—it is all saved by someone, somewhere, and, more important, often made instantly available by those who collect it. The range of conversations and interactions that count as private (either in the sense of disappearing from written memory or of being accessible only to the parties involved) has shrunk demonstrably since about 1981.
={ethnographic data:availability of+5}
@@ -293,7 +293,7 @@ _1 2. Boyle, "The Second Enclosure Movement and the Construction of the Public D
2~ From the Facts of Human Activity
Boston, May 2003. Starbucks. Sean and Adrian are on their way to pick me up for dinner. I’ve already had too much coffee, so I sit at the window reading the paper. Eventually Adrian calls to find out where I am, I tell him, and he promises to show up in fifteen minutes. I get bored and go outside to wait, watch the traffic go by. More or less right on time (only post-dotcom is Adrian ever on time), Sean’s new blue VW Beetle rolls into view. Adrian jumps out of the passenger seat and into the back, and I get in. Sean has been driving for a little over a year. He seems confident, cautious, but meanders through the streets of Cambridge. We are destined for Winchester, a township on the Charles River, in order to go to an Indian restaurant that one of Sean’s friends has recommended. When I ask how they are doing, they say, "Good, good." Adrian offers, "Well, Sean’s better than he has been in two years." "Really?" I say, impressed.
-={Doyle, Sean+6;Groper Adrian+6}
+={Doyle, Sean+6;Gropper, Adrian+6}
Sean says, "Well, happier than at least the last year. I, well, let me put it this way: forgive me father for I have sinned, I still have unclean thoughts about some of the upper management in the company, I occasionally think they are not doing things in the best interest of the company, and I see them as self-serving and sometimes wish them ill." In this rolling blue confessional Sean describes some of the people who I am familiar with whom he now tries very hard not to think about. I look at him and say, "Ten Hail Marys and ten Our Fathers, and you will be absolved, my child." Turning to Adrian, I ask, "And what about you?" Adrian continues the joke: "I, too, have sinned. I have reached the point where I can see absolutely nothing good coming of this company but that I can keep my investments in it long enough to pay for my children’s college tuition." I say, "You, my son, I cannot help." Sean says, "Well, funny thing about tainted money . . . there just taint enough of it."
@@ -1106,7 +1106,7 @@ The absence of an economic or corporate mandate for Thompson’s and Ritchie’s
={AT&T+14;McIlroy, Douglas}
UNIX was unique for many technical reasons, but also for a specific economic reason: it was never quite academic and never quite commercial. Martin Campbell-Kelly notes that UNIX was a "non-proprietary operating system of major significance."~{ Campbell-Kelly, From Airline Reservations to Sonic the Hedgehog, 143. }~ Kelly’s use of "non-proprietary" is not surprising, but it is incorrect. Although business-speak regularly opposed open to proprietary throughout the 1980s and early 1990s (and UNIX was definitely the former), Kelly’s slip marks clearly the confusion between software ownership and software distribution that permeates both popular and academic understandings. UNIX was indeed proprietary—it was copyrighted and wholly owned by Bell Labs and in turn by Western Electric ,{[pg 127]}, and AT&T—but it was not exactly commercialized or marketed by them. Instead, AT&T allowed individuals and corporations to install UNIX and to create UNIX-like derivatives for very low licensing fees. Until about 1982, UNIX was licensed to academics very widely for a very small sum: usually royalty-free with a minimal service charge (from about $150 to $800).~{ Ritchie’s Web site contains a copy of a 1974 license (http://cm.bell-labs.com/cm/cs/who/dmr/licenses.html) and a series of ads that exemplify the uneasy positioning of UNIX as a commercial product (http://cm.bell-labs.com/cm/cs/who/dmr/unixad.html). According to Don Libes and Sandy Ressler, "The original licenses were source licenses. . . . [C]ommercial institutions paid fees on the order of $20,000. If you owned more than one machine, you had to buy binary licenses for every additional machine [i.e., you were not allowed to copy the source and install it] you wanted to install UNIX on. They were fairly pricey at $8000, considering you couldn’t resell them. On the other hand, educational institutions could buy source licenses for several hundred dollars—just enough to cover Bell Labs’ administrative overhead and the cost of the tapes" (Life with UNIX, 20-21). }~ The conditions of this license allowed researchers to do what they liked with the software so long as they kept it secret: they could not distribute or use it outside of their university labs (or use it to create any commercial product or process), nor publish any part of it. As a result, throughout the 1970s UNIX was developed both by Thompson and Ritchie inside Bell Labs and by users around the world in a relatively informal manner. Bell Labs followed such a liberal policy both because it was one of a small handful of industry-academic research and development centers and because AT&T was a government monopoly that provided phone service to the country and was therefore forbidden to directly enter the computer software market.~{ According to Salus, this licensing practice was also a direct result of Judge Thomas Meaney’s 1956 antitrust consent decree which required AT&T to reveal and to license its patents for nominal fees (A Quarter Century of UNIX, 56); see also Brock, The Second Information Revolution, 116-20. }~
-={AT&T:Bell Labratories+13;licensing, of UNIX+6;proprietary systems: open vs.;monopoly}
+={AT&T:Bell Laboratories+13;licensing, of UNIX+6;proprietary systems: open vs.;monopoly}
Being on the border of business and academia meant that UNIX was, on the one hand, shielded from the demands of management and markets, allowing it to achieve the conceptual integrity that made it so appealing to designers and academics. On the other, it also meant that AT&T treated it as a potential product in the emerging software industry, which included new legal questions from a changing intellectual-property regime, novel forms of marketing and distribution, and new methods of developing, supporting, and distributing software.
@@ -1160,7 +1160,7 @@ Unfortunately, Commentary was also legally restricted in its distribution. AT&T
={trade secret law+1}
Thus, these generations of computer-science students and academics shared a secret—a trade secret become open secret. Every student who learned the essentials of the UNIX operating system from a photocopy of Lions’s commentary also learned about AT&T’s attempt to control its legal distribution on the front cover of their textbook. The parallel development of photocopying has a nice resonance here; together with home cassette taping of music and the introduction of the video-cassette recorder, photocopying helped drive the changes to copyright law adopted in 1976.
-={copyright:changes in}
+={copyright:changes in 1976}
Thirty years later, and long after the source code in it had been completely replaced, Lions’s Commentary is still widely admired by geeks. Even though Free Software has come full circle in providing students with an actual operating system that can be legally studied, taught, copied, and implemented, the kind of "literary criticism" that Lions’s work represents is still extremely rare; even reading obsolete code with clear commentary is one of the few ways to truly understand the design elements and clever implementations that made the UNIX operating system so different from its predecessors and even many of its successors, few, if any of which have been so successfully ported to the minds of so many students.
={design+2}
@@ -1241,7 +1241,7 @@ The open-systems story is also a story of the blind spot of open systems—in th
={intellectual property;interoperability+21;openness (component of Free Software):intellectual property and}
Standardization was at the heart of the contest, but by whom and by what means was never resolved. The dream of open systems, pursued in an entirely unregulated industry, resulted in a complicated experiment in novel forms of standardization and cooperation. The creation of a "standard" operating system based on UNIX is the story of a failure, a kind of "figuring out" gone haywire, which resulted in huge consortia of computer manufacturers attempting to work together and compete with each other at the same time. Meanwhile, the successful creation of a "standard" networking protocol—known as the Open Systems Interconnection Reference Model (OSI)—is a story of failure that hides a larger success; OSI was eclipsed in the same period by the rapid and ad hoc adoption of the Transmission Control Protocol/Internet Protocol (TCP/IP), which used a radically different standardization process and which succeeded for a number of surprising reasons, allowing the Internet ,{[pg 145]}, to take the form it did in the 1990s and ultimately exemplifying the moral-technical imaginary of a recursive public—and one at the heart of the practices of Free Software.
-={figuring out;Open Systems Interconnection (OSI), as reference model;Openness (component of Free Software):standardization and;protocols:Open Systems Interconnection (OSI)|TCP/IP;standards organizations;TCP/IP (Transmission Control Protocol/Internet Protocol)}
+={figuring out;Open Systems Interconnection (OSI):as reference model;Openness (component of Free Software):standardization and;protocols:Open Systems Interconnection (OSI)|TCP/IP;standards organizations;TCP/IP (Transmission Control Protocol/Internet Protocol)}
The conceiving of openness, which is the central plot of these two stories, has become an essential component of the contemporary practice and power of Free Software. These early battles created a kind of widespread readiness for Free Software in the 1990s, a recognition of Free Software as a removal of open systems’ blind spot, as much as an exploitation of its power. The geek ideal of openness and a moral-technical order (the one that made Napster so significant an event) was forged in the era of open systems; without this concrete historical conception of how to maintain openness in technical and moral terms, the recursive public of geeks would be just another hierarchical closed organization—a corporation manqué—and not an independent public serving as a check on the kinds of destructive power that dominated the open-systems contest.
={Napster}
@@ -1427,7 +1427,7 @@ The growth of Free Software in the 1980s and 1990s depended on openness as a con
={Open Systems:networks and+28}
The struggle to standardize UNIX as a platform for open systems was not the only open-systems struggle; alongside the UNIX wars, another "religious war" was raging. The attempt to standardize networks—in particular, protocols for the inter-networking of multiple, diverse, and autonomous networks of computers—was also a key aspect of the open-systems story of the 1980s.~{ The distinction between a protocol, an implementation and a standard is important: Protocols are descriptions of the precise terms by which two computers can communicate (i.e., a dictionary and a handbook for communicating). An implementation is the creation of software that uses a protocol (i.e., actually does the communicating; thus two implementations using the same protocol should be able to share data). A standard defines which protocol should be used by which computers, for what purposes. It may or may not define the protocol, but will set limits on changes to that protocol. }~ The war ,{[pg 167]}, between the TCP/IP and OSI was also a story of failure and surprising success: the story of a successful standard with international approval (the OSI protocols) eclipsed by the experimental, military-funded TCP/IP, which exemplified an alternative and unusual standards process. The moral-technical orders expressed by OSI and TCP/IP are, like that of UNIX, on the border between government, university, and industry; they represent conflicting social imaginaries in which power and legitimacy are organized differently and, as a result, expressed differently in the technology.
-={moral and technical order;Networks:protools for+3;Open Systems Interconnection (OSI), as reference model+27;protocols:Open Systems Interconnection (OSI)+27|TCP/IP;TCP/IP (Transmission Control Protocol/Internet Protocol)+27;religious wars+3;social imaginary;standards process+3}
+={moral and technical order;Networks:protools for+3;Open Systems Interconnection (OSI):as reference model+27;protocols:Open Systems Interconnection (OSI)+27|TCP/IP;TCP/IP (Transmission Control Protocol/Internet Protocol)+27;religious wars+3;social imaginary;standards processes+3}
OSI and TCP/IP started with different goals: OSI was intended to satisfy everyone, to be the complete and comprehensive model against which all competing implementations would be validated; TCP/IP, by contrast, emphasized the easy and robust interconnection of diverse networks. TCP/IP is a protocol developed by bootstrapping between standard and implementation, a mode exemplified by the Requests for Comments system that developed alongside them as part of the Arpanet project. OSI was a "model" or reference standard developed by internationally respected standards organizations.
={Arpanet (network)+18;Request for Comments (RFC)}
@@ -1453,7 +1453,7 @@ One important feature united almost all of these experiments: the networks of th
={antitrust}
TCP/IP and OSI have become emblematic of the split between the worlds of telecommunications and computing; the metaphors of religious wars or of blood feuds and cold wars were common.~{ Drake, "The Internet Religious War." }~ A particularly arch account from this period is Carl Malamud’s Exploring the Internet: A Technical Travelogue, which documents Malamud’s (physical) visits to Internet sites around the globe, discussions (and beer) with networking researchers on technical details of the networks they have created, and his own typically geeky, occasionally offensive takes on cultural difference.~{ Malamud, Exploring the Internet; see also Michael M. J. Fischer, "Worlding Cyberspace." }~ A subtheme of the story is the religious war between Geneva (in particular the ITU) and the Internet: Malamud tells the story of asking the ITU to release its 19,000-page "blue book" of standards on the Internet, to facilitate its adoption and spread.
-={Malmud, Carl+1;standards process+4}
+={Malmud, Carl+1;standards processes+4}
The resistance of the ITU and Malamud’s heroic if quixotic attempts are a parable of the moral-technical imaginaries of openness—and indeed, his story draws specifically on the usable past of Giordano Bruno.~{ The usable past of Giordano Bruno is invoked by Malamud to signal the heretical nature of his own commitment to openly publishing standards that ISO was opposed to releasing. Bruno’s fate at the hands of the Roman Inquisition hinged in some part on his acceptance of the Copernican cosmology, so he has been, like Galileo, a natural figure for revolutionary claims during the 1990s. }~ The "bruno" project demonstrates the gulf that exists between two models of legitimacy—those of ISO and the ITU—in which standards represent the legal and legitimate consensus of a regulated industry, approved by member nations, paid for and enforced by governments, and implemented and adhered to by corporations.
={Bruno, Giordano;Usable pasts;International Organization for Standardization (ISO)+3}
@@ -1472,10 +1472,10 @@ Until the mid-1980s, the TCP/IP protocols were resolutely research-oriented, and
={Cerf, Vinton+2;Kahn, Robert;TCP/IP (Transmission Control Protocol/Internet Protocol):goals of+2}
The explicit goal of TCP/IP was thus to share computer resources, not necessarily to connect two individuals or firms together, or to create a competitive market in networks or networking software. Sharing between different kinds of networks implied allowing the different networks to develop autonomously (as their creators and maintainers saw best), but without sacrificing the ability to continue sharing. Years later, David Clark, chief Internet engineer for several years in the 1980s, gave a much more explicit explanation of the goals that led to the TCP/IP protocols. In particular, he suggested that the main overarching goal was not just to share resources but "to develop an effective technique for multiplexed utilization of existing interconnected networks," and he more explicitly stated the issue of control that faced the designers: "Networks represent administrative boundaries of control, and it was an ambition of this project to come to grips with the problem of integrating a number ,{[pg 173]}, of separately administrated entities into a common utility."~{ Clark, "The Design Philosophy of the DARPA Internet Protocols," 54-55. }~ By placing the goal of expandability first, the TCP/IP protocols were designed with a specific kind of simplicity in mind: the test of the protocols’ success was simply the ability to connect.
-={Clark,David}
+={Clark, David}
By setting different goals, TCP/IP and OSI thus differed in terms of technical details; but they also differed in terms of their context and legitimacy, one being a product of international-standards bodies, the other of military-funded research experiments. The technical and organizational differences imply different processes for standardization, and it is the peculiar nature of the so-called Requests for Comments (RFC) process that gave TCP/IP one of its most distinctive features. The RFC system is widely recognized as a unique and serendipitous outcome of the research process of Arpanet.~{ RFCs are archived in many places, but the official site is RFC Editor, http://www.rfc-editor.org/. }~ In a thirty-year retrospective (published, naturally, as an RFC: RFC 2555), Vint Cerf says, "Hiding in the history of the RFCs is the history of human institutions for achieving cooperative work." He goes on to describe their evolution over the years: "When the RFCs were first produced, they had an almost 19th century character to them—letters exchanged in public debating the merits of various design choices for protocols in the ARPANET. As email and bulletin boards emerged from the fertile fabric of the network, the far-flung participants in this historic dialog began to make increasing use of the online medium to carry out the discussion—reducing the need for documenting the debate in the RFCs and, in some respects, leaving historians somewhat impoverished in the process. RFCs slowly became conclusions rather than debates."~{ RFC Editor, RFC 2555, 6. }~
-={standards process;Request for Comments (RFC)+2}
+={standards processes;Request for Comments (RFC)+2}
Increasingly, they also became part of a system of discussion and implementation in which participants created working software as part of an experiment in developing the standard, after which there was more discussion, then perhaps more implementation, and finally, a standard. The RFC process was a way to condense the process of standardization and validation into implementation; which is to say, the proof of open systems was in the successful connection of diverse networks, and the creation of a standard became a kind of ex post facto rubber-stamping of this demonstration. Any further improvement of the standard hinged on an improvement on the standard implementation because the standards that resulted were freely and widely available: "A user could request an RFC by email from his host computer and have it automatically delivered to his mailbox. . . . RFCs were also shared freely with official standards ,{[pg 174]}, bodies, manufacturers and vendors, other working groups, and universities. None of the RFCs were ever restricted or classified. This was no mean feat when you consider that they were being funded by DoD during the height of the Cold War."~{ Ibid., 11. }~
={Software:implementation of;standards:implementation+9|validation of;Secrecy+1}
@@ -1554,7 +1554,7 @@ Stallman’s GNU General Public License "hacks" the federal copyright law, as is
={Copyleft licenses (component of Free Software):as hack of copyright law+1;Copyright+1}
Like all software since the 1980 copyright amendments, Free Software is copyrightable—and what’s more, automatically copyrighted as it is written (there is no longer any requirement to register). Copyright law grants the author (or the employer of the author) a number of strong rights over the dispensation of what has been written: rights to copy, distribute, and change the work.~{ Copyright Act of 1976, Pub. L. No. 94-553, 90 Stat. 2541, enacted 19 October 1976; and Copyright Amendments, Pub. L. No. 96-517, 94 Stat. 3015, 3028 (amending §101 and §117, title 17, United States Code, regarding computer programs), enacted 12 December 1980. All amendments since 1976 are listed at http://www.copyright.gov/title17/92preface.html. }~ Free Software’s hack is to immediately make use of these rights in order to abrogate the rights the programmer has been given, thus granting all subsequent licensees rights to copy, distribute, modify, and use the copyrighted software. Some licenses, like the GPL, add the further restriction that every licensee must offer the same terms to any subsequent licensee, others make no such restriction on subsequent uses. Thus, while statutory law suggests that individuals need strong rights and grants them, Free Software licenses effectively annul them in favor of other activities, such as sharing, porting, and forking software. It is for this reason that they have earned the name "copyleft."~{ The history of the copyright and software is discussed in Litman, Digital Copyright; Cohen et al., Copyright in a Global Information Economy; and Merges, Menell, and Lemley, Intellectual Property in the New Technological Age. }~
-={Copyright:changes in|rights granted by}
+={Copyright:changes in 1976|rights granted by}
This is a convenient ex post facto description, however. Neither Stallman nor anyone else started out with the intention of hacking copyright law. The hack of the Free Software licenses was a response to a complicated controversy over a very important invention, a tool that in turn enabled an invention called EMACS. The story of the controversy is well-known among hackers and geeks, but not often told, and not in any rich detail, outside of these small circles.~{ See Wayner, Free for All; Moody, Rebel Code; and Williams, Free as in Freedom. Although this story could be told simply by interviewing Stallman and James Gosling, both of whom are still alive and active in the software world, I have chosen to tell it through a detailed analysis of the Usenet and Arpanet archives of the controversy. The trade-off is between a kind of incomplete, fly-on-the-wall access to a moment in history and the likely revisionist retellings of those who lived through it. All of the messages referenced here are cited by their "Message-ID," which should allow anyone interested to access the original messages through Google Groups (http://groups.google.com). }~
@@ -1840,10 +1840,10 @@ The final component of Free Software is coordination. For many participants and
={Free Software:open source vs.;Open Source:Free Software vs.;peer production;practices:five components of Free Software+2;Source Code Management tools (SCMs)}
Coordination is important because it collapses and resolves the distinction between technical and social forms into a meaningful ,{[pg 211]}, whole for participants. On the one hand, there is the coordination and management of people; on the other, there is the coordination of source code, patches, fixes, bug reports, versions, and distributions—but together there is a meaningful technosocial practice of managing, decision-making, and accounting that leads to the collaborative production of complex software and networks. Such coordination would be unexceptional, essentially mimicking long-familiar corporate practices of engineering, except for one key fact: it has no goals. Coordination in Free Software privileges adaptability over planning. This involves more than simply allowing any kind of modification; the structure of Free Software coordination actually gives precedence to a generalized openness to change, rather than to the following of shared plans, goals, or ideals dictated or controlled by a hierarchy of individuals.~{ On the distinction between adaptability and adaptation, see Federico Iannacci, "The Linux Managing Model," http://opensource.mit.edu/papers/iannacci2.pdf. Matt Ratto characterizes the activity of Linux-kernel developers as a "culture of re-working" and a "design for re-design," and captures the exquisite details of such a practice both in coding and in the discussion between developers, an activity he dubs the "pressure of openness" that "results as a contradiction between the need to maintain productive collaborative activity and the simultaneous need to remain open to new development directions" ("The Pressure of Openness," 112-38). }~
-={adaptability:planning vs.+1|as a form of critique+1|adaptation vs.;coordination (component of Free Software):individual virtuosity vs. hierarchical planning+2;critique, Free Software+1;goals, lack of in Free Software+1;hackers:curiosity and virtuosity of+1;hierarchy, in coordination+5;planning+1}
+={adaptability:planning vs.+1|as a form of critique+1|adaptation vs.;coordination (component of Free Software):individual virtuosity vs. hierarchical planning+2;critique, Free Software as+1;goals, lack of in Free Software+1;hackers:curiosity and virtuosity of+1;hierarchy, in coordination+5;planning+1}
Adaptability does not mean randomness or anarchy, however; it is a very specific way of resolving the tension between the individual curiosity and virtuosity of hackers, and the collective coordination necessary to create and use complex software and networks. No man is an island, but no archipelago is a nation, so to speak. Adaptability preserves the "joy" and "fun" of programming without sacrificing the careful engineering of a stable product. Linux and Apache should be understood as the results of this kind of coordination: experiments with adaptability that have worked, to the surprise of many who have insisted that complexity requires planning and hierarchy. Goals and planning are the province of governance—the practice of goal-setting, orientation, and definition of control—but adaptability is the province of critique, and this is why Free Software is a recursive public: it stands outside power and offers powerful criticism in the form of working alternatives. It is not the domain of the new—after all Linux is just a rewrite of UNIX—but the domain of critical and responsive public direction of a collective undertaking.
-={Linux (Free Software project)+8;novelty, of free software;recursive public+1}
+={Linux (Free Software project)+8;novelty, of Free Software;recursive public+1}
Linux and Apache are more than pieces of software; they are organizations of an unfamiliar kind. My claim that they are "recursive publics" is useful insofar as it gives a name to a practice that is neither corporate nor academic, neither profit nor nonprofit, neither governmental nor nongovernmental. The concept of recursive public includes, within the spectrum of political activity, the creation, modification, and maintenance of software, networks, and legal documents. While a "public" in most theories is a body of ,{[pg 212]}, people and a discourse that give expressive form to some concern, "recursive public" is meant to suggest that geeks not only give expressive form to some set of concerns (e.g., that software should be free or that intellectual property rights are too expansive) but also give concrete infrastructural form to the means of expression itself. Linux and Apache are tools for creating networks by which expression of new kinds can be guaranteed and by which further infrastructural experimentation can be pursued. For geeks, hacking and programming are variants of free speech and freedom of assembly.
={public sphere:theories of;Apache (Free Software project)+4;experimentation;infrastructure}
@@ -2069,7 +2069,7 @@ Both the Apache project and the Linux kernel project use SCMs. In the case of Ap
While SCMs are in general good for managing conflicting changes, they can do so only up to a point. To allow anyone to commit a change, however, could result in a chaotic mess, just as difficult to disentangle as it would be without an SCM. In practice, therefore, most projects designate a handful of people as having the right to "commit" changes. The Apache project retained its voting scheme, for instance, but it became a way of voting for "committers" instead of for patches themselves. Trusted committers—those with the mysterious "good taste," or technical intuition—became the core members of the group.
The Linux kernel has also struggled with various issues surrounding SCMs and the management of responsibility they imply. The story of the so-called VGER tree and the creation of a new SCM called Bitkeeper is exemplary in this respect.~{ See Steven Weber, The Success of Open Source, 117-19; Moody, Rebel Code, 172-78. See also Shaikh and Cornford, "Version Management Tools." }~ By 1997, Linux developers had begun to use cvs to manage changes to the source code, though not without resistance. Torvalds was still in charge of the changes to the official stable tree, but as other "lieutenants" came on board, the complexity of the changes to the kernel grew. One such lieutenant was Dave Miller, who maintained a "mirror" of the stable Linux kernel tree, the VGER tree, on a server at Rutgers. In September 1998 a fight broke out among Linux kernel developers over two related issues: one, the fact that Torvalds was failing to incorporate (patch) contributions that had been forwarded to him by various people, including his lieutenants; and two, as a result, the VGER cvs repository was no longer in synch with the stable tree maintained by Torvalds. Two different versions of Linux threatened to emerge.
-={Miller, Dave;Source Code Management tools (SCMs):see also Bitkeeper;Concurrent Versioning System (cvs):Linux and;Linux (Free Software project):VGER tree and+2;Bitkeeper (Source Code Management software)+12;Torvalds, Linux:in bitkeeper controversy+12}
+={Miller, Dave;Source Code Management tools (SCMs):see also Bitkeeper;Concurrent Versioning System (cvs):Linux and;Linux (Free Software project):VGER tree and+2;Bitkeeper (Source Code Management software)+12;Torvalds, Linus:in bitkeeper controversy+12}
A great deal of yelling ensued, as nicely captured in Moody’s Rebel Code, culminating in the famous phrase, uttered by Larry McVoy: "Linus does not scale." The meaning of this phrase is that the ability of Linux to grow into an ever larger project with increasing complexity, one which can handle myriad uses and functions (to "scale" up), is constrained by the fact that there is only one Linus Torvalds. By all accounts, Linus was and is excellent at what he does—but there is only one Linus. The danger of this situation is the danger of a fork. A fork would mean one or more new versions would proliferate under new leadership, a situation much like ,{[pg 233]}, the spread of UNIX. Both the licenses and the SCMs are designed to facilitate this, but only as a last resort. Forking also implies dilution and confusion—competing versions of the same thing and potentially unmanageable incompatibilities.
={McVoy, Larry+11;Moody, Glyn;forking:in Linux+1}
@@ -2172,7 +2172,7 @@ In part III I confront this question directly. Indeed, it was this question that
={cultural significance;recursive public+3;Free Software:components of+1}
Connexions modulates all of the components except that of the movement (there is, as of yet, no real "Free Textbook" movement, but the "Open Access" movement is a close second cousin).~{ In January 2005, when I first wrote this analysis, this was true. By April 2006, the Hewlett Foundation had convened the Open Educational Resources "movement" as something that would transform the production and circulation of textbooks like those created by Connexions. Indeed, in Rich Baraniuk’s report for Hewlett, the first paragraph reads: "A grassroots movement is on the verge of sweeping through the academic world. The open education movement is based on a set of intuitions that are shared by a remarkably wide range of academics: that knowledge should be free and open to use and re-use; that collaboration should be easier, not harder; that people should receive credit and kudos for contributing to education and research; and that concepts and ideas are linked in unusual and surprising ways and not the simple linear forms that textbooks present. Open education promises to fundamentally change the way authors, instructors, and students interact worldwide" (Baraniuk and King, "Connexions"). (In a nice confirmation of just how embedded participation can become in anthropology, Baraniuk cribbed the second sentence from something I had written two years earlier as part of a description of what I thought Connexions hoped to achieve.) The "movement" as such still does not quite exist, but the momentum for it is clearly part of the actions that Hewlett hopes to achieve. }~ Perhaps the most complex modulation concerns coordination—changes to the practice of coordination and collaboration in academic-textbook creation in particular, and more generally to the nature of collaboration and coordination of knowledge in science and scholarship generally.
-={coordination (components of Free Software);movement (component of Free Software)+2}
+={coordination (component of Free Software);movement (component of Free Software)+2}
Connexions emerged out of Free Software, and not, as one might expect, out of education, textbook writing, distance education, or any of those areas that are topically connected to pedagogy. That is to say, the people involved did not come to their project by attempting to deal with a problem salient to education and teaching as much as they did so through the problems raised by Free Software and the question of how those problems apply to university textbooks. Similarly, a second project, Creative Commons, also emerged out of a direct engagement with and exploration of Free Software, and not out of any legal movement or scholarly commitment to the critique of intellectual-property law or, more important, out of any desire to transform the entertainment industry. Both projects are resolutely committed to experimenting with the given practices of Free Software—to testing their limits and changing them where they can—and this is what makes them vibrant, risky, and potentially illuminating as cases of a recursive public.
={affinity (of geeks);commons+1;Creative Commons+1;pedagogy;recursive public:examples of+1}
@@ -2194,7 +2194,7 @@ Around 1998 or 1999, Rich decided that it was time for him to write a textbook o
={Burris, C. Sidney;Connexions project:textbooks and+4;Rice University}
At about the same time as his idea for a textbook, Rich’s research group was switching over to Linux, and Rich was first learning about Open Source and the emergence of a fully free operating system created entirely by volunteers. It isn’t clear what Rich’s aha! moment was, other than simply when he came to an understanding that such a thing as Linux was actually possible. Nonetheless, at some point, Rich had the idea that his textbook could be an Open Source textbook, that is, a textbook created not just by him, but by DSP researchers all over the world, and made available to everyone to make use of and modify and improve as they saw fit, just like Linux. Together with Brent Hendricks, Yan David Erlich, ,{[pg 249]}, and Ross Reedstrom, all of whom, as geeks, had a deep familiarity with the history and practices of Free and Open Source Software, Rich started to conceptualize a system; they started to think about modulations of different components of Free and Open Source Software. The idea of a Free Software textbook repository slowly took shape.
-={Linux (Free Software project);Open Source:inspiration for Connexions+27;Reedstorm, Ross}
+={Linux (Free Software project);Open Source:inspiration for Connexions+27;Reedstrom, Ross}
Thus, Connexions: an "open content repository of high-quality educational materials." These "textbooks" very quickly evolved into something else: "modules" of content, something that has never been sharply defined, but which corresponds more or less to a small chunk of teachable information, like two or three pages in a textbook. Such modules are much easier to conceive of in sciences like mathematics or biology, in which textbooks are often multiauthored collections, finely divided into short chapters with diagrams, exercises, theorems, or programs. Modules lend themselves much less well to a model of humanities or social-science scholarship based in reading texts, discussion, critique, and comparison—and this bias is a clear reflection of what Brent, Ross, and Rich knew best in terms of teaching and writing. Indeed, the project’s frequent recourse to the image of an assembly-line model of knowledge production often confirms the worst fears of humanists and educators when they first encounter Connexions. The image suggests that knowledge comes in prepackaged and colorfully branded tidbits for the delectation of undergrads, rather than characterizing knowledge as a state of being or as a process.
={Connexions project:model of learning in|modules in+1}
@@ -2210,7 +2210,7 @@ Free Software—and, in particular, Open Source in the guise of "self-organizing
={Connexions project:relationship to education+2;distance learning+2}
Thus, Rich styled Connexions as more than just a factory of knowledge—it would be a community or culture developing richly associative and novel kinds of textbooks—and as much more than just distance education. Indeed, Connexions was not the only such project busy differentiating itself from the perceived dangers of distance education. In April 2001 MIT had announced that it would make the content of all of its courses available for free online in a project strategically called OpenCourseWare (OCW). Such news could only bring attention to MIT, which explicitly positioned the announcement as a kind of final death blow to the idea of distance education, by saying that what students pay $35,000 and up for per year is not "knowledge"—which is free—but the experience of being at MIT. The announcement created pure profit from the perspective of MIT’s reputation as a generator and disseminator of scientific knowledge, but the project did not emerge directly out of an interest in mimicking the success of Open Source. That angle was ,{[pg 252]}, provided ultimately by the computer-science professor Hal Abelson, whose deep understanding of the history and growth of Free Software came from his direct involvement in it as a long-standing member of the computer-science community at MIT. OCW emerged most proximately from the strange result of a committee report, commissioned by the provost, on how MIT should position itself in the "distance/e-learning" field. The surprising response: don’t do it, give the content away and add value to the campus teaching and research experience instead.~{ "Provost Announces Formation of Council on Educational Technology," MIT Tech Talk, 29 September 1999, http://web.mit.edu/newsoffice/1999/council-0929.html. }~
-={Abelson, Hal;Massachusetts Institute of Technology (MIT):open courseware and+2;Open CourseWare (OCW)+2;Connexions poject:Open CourseWare+2}
+={Abelson, Hal;Massachusetts Institute of Technology (MIT):open courseware and+2;Open CourseWare (OCW)+2;Connexions project:Open CourseWare+2}
OCW, Connexions, and distance learning, therefore, while all ostensibly interested in combining education with the networks and software, emerged out of different demands and different places. While the profit-driven demand of distance learning fueled many attempts around the country, it stalled in the case of OCW, largely because the final MIT Council on Educational Technology report that recommended OCW was issued at the same time as the first plunge in the stock market (April 2000). Such issues were not a core factor in the development of Connexions, which is not to say that the problems of funding and sustainability have not always been important concerns, only that the genesis of the project was not at the administrative level or due to concerns about distance education. For Rich, Brent, and Ross the core commitment was to openness and to the success of Open Source as an experiment with massive, distributed, Internet-based, collaborative production of software—their commitment to this has been, from the beginning, completely and adamantly unwavering. Nevertheless, the project has involved modulations of the core features of Free Software. Such modulations depend, to a certain extent, on being a project that emerges out of the ideas and practices of Free Software, rather than, as in the case of OCW, one founded as a result of conflicting goals (profit and academic freedom) and resulting in a strategic use of public relations to increase the symbolic power of the university over its fiscal growth.
={Reedstrom, Ross}
@@ -2278,7 +2278,7 @@ Creative Commons provided more than licenses, though. It was part of a social im
={moral and technical order;social imaginary}
Creative Commons was thus a back-door approach: if the laws could not be changed, then people should be given the tools they needed to work around those laws. Understanding how Creative Commons was conceived requires seeing it as a modulation of both the notion of "source code" and the modulation of "copyright licenses." But the modulations take place in that context of a changing legal system that was so unfamiliar to Stallman and his EMACS users, a legal system responding to new forms of software, networks, and devices. For instance, the changes to the Copyright Act of 1976 created an unintended effect that Creative Commons would ultimately seize on. By eliminating the requirement to register copyrighted works (essentially granting copyright as soon as the ,{[pg 261]}, work is "fixed in a tangible medium"), the copyright law created a situation wherein there was no explicit way in which a work could be intentionally placed in the public domain. Practically speaking an author could declare that a work was in the public domain, but legally speaking the risk would be borne entirely by the person who sought to make use of that work: to copy it, transform it, sell it, and so on. With the explosion of interest in the Internet, the problem ramified exponentially; it became impossible to know whether someone who had placed a text, an image, a song, or a video online intended for others to make use of it—even if the author explicitly declared it "in the public domain." Creative Commons licenses were thus conceived and rhetorically positioned as tools for making explicit exactly what uses could be made of a specific work. They protected the rights of people who sought to make use of "culture" (i.e., materials and ideas and works they had not authored), an approach that Lessig often summed up by saying, "Culture always builds on the past."
-={copyright:requirement to register;sharing source code (component of Free Software):modulations of;creative commons:activism of+1;public domain+4}
+={copyright:requirement to register;sharing source code (component of Free Software):modulations of;Creative Commons:activism of+1;public domain+4}
The background to and context of the emergence of Creative Commons was of course much more complicated and fraught. Concerns ranged from the plights of university libraries with regard to high-priced journals, to the problem of documentary filmmakers unable to afford, or even find the owners of, rights to use images or snippets in films, to the high-profile fights over online music trading, Napster, and the RIAA. Over the course of four years, Lessig and the other founders of Creative Commons would address all of these issues in books, in countless talks and presentations and conferences around the world, online and off, among audiences ranging from software developers to entrepreneurs to musicians to bloggers to scientists.
={Napster;Recording Industry Association of America (RIAA)}