
Engineering Collectives: Technology From the Coop


From: lcplvaughn2_8

Sgt. Linley,

Not sure you remember me but I served with you back in the late 80’s early 90’s…2/8…Comm platoon. I want you to know that a lot of young Marines looked up to you back in the day. I was one of them. That platoon didn’t have a lot of great leaders but you were one of the great. I just want to let you know I still feel that way. I still think you are a great leader.

It devistated [sic] me to read about the trouble you have gone through and how bad PTSD got to you. The first thing I thought when I read the article about what happened was something inside must have brought you down. This was NOT the Linley (Chesty) I knew. Now I read your blogs and I can see the you I knew then.

Lance Corporal Vaughn’s reply to Sergeant David Linley was one of dozens left on Linley’s blog at betweenthebars.org.[1] Between the Bars is a blogging platform for the (other) 1% of Americans who are incarcerated, the vast majority of whom have no access to the Internet. Prisoners send handwritten letters to the site; if the letters are not censored by the prison, they are collaboratively transcribed and published as blog entries. Visitors to the blog can leave replies, which are then printed and mailed back to the prisoner. Linley, a Marine veteran suffering from PTSD, posted a few entries to Between the Bars, at first receiving only a few courteous responses from readers. About six months later, one of his fellow servicemen discovered his posts, and over the next few weeks more than a dozen veterans wrote to let him know how much he meant to them, sent care packages, and even visited him in prison. Linley’s case demonstrates how online media can help transform loose online social links into significant “real world” support in times of need. In the case of Between the Bars (BtB), this was no coincidence: the site was designed by Charles DeTar, a researcher at the Center for Civic Media (C4) at MIT, specifically to help prisoners exercise online self-advocacy, an important prerequisite for collective action and social change.

Though DeTar is a PhD candidate at MIT, the Institute (where I taught for 10 years) is not a hotbed of technology development for progressive causes like prisoners’ rights. Indeed, with most of its money coming from the US government (over 70% of its funding, most of that military) and massive corporations (nearly all the rest, from companies like BP and Bank of America), MIT largely embeds the needs of the most powerful in society into durable technologies. Nearly every contemporary engineering research institution is funded through similar models, and as a result the bulk of technologies entering the world ultimately reinforce the status quo. For example, technologies for prisons and law enforcement are a significant market for high-technology research and development, while technologies for prisoners’ rights and public defenders are not. Engineering education covers thermodynamics and differential equations, but its funding structure also means that engineers are trained to work comfortably only in areas that serve the most powerful entities in society. DeTar’s work, then, is a form of activist technology, a stark contrast to the normative values of the institution in which he works. He is one of a growing number of technologists who bypass the normative structures of technology education and professional identity by anchoring their work in a different dialog: the Free/Libre Open Source Software movement. F/LOSS is not an inherently progressive movement, but it does offer the largest, most powerful, and most sustained alternative to conventional technology education, development, and distribution. In addition, the free software movement has provided working models for new methods of Internet-enabled collective action that inspired Between the Bars and many other platforms for community collective action that we developed at C4.

1: Engineering Identities

Engineering is socially regressive for several reasons, but perhaps first and foremost because the vast majority of engineering employers (governments and corporations) expect their employees to help maintain the status quo. Schools like MIT, RPI, and Carnegie Mellon receive vast sums in government- and corporate-directed research funding, and entire areas of enquiry (Artificial Intelligence, Aeronautical and Astronautical Engineering) are primarily, and in some cases exclusively, funded by commercial and military contracts. Undergraduates interested in these disciplines must either acquiesce to working on military or, in many cases, regressive government or corporate research, or drop their vocation and find another major. For some time I wondered why my own research group, which focused on developing technologies for social justice and agonistic politics, received so many enquiries from sophomores and juniors in Aeronautical and Astronautical Engineering; later I realized that many were uncomfortable with the nature of their other funded research opportunities. These students, attracted to flight, were gradually realizing that advanced research in the field was less about flight than about fight. No one explicitly told them they must toe this line; rather, faculty rehearse subtle narratives of professionalization, rationalizing why military funding is not only necessary but also irrelevant to the content of advanced research. Take, for example, the following dialog, nearly identical to countless exchanges I witnessed at MIT, between an NPR interviewer and NYU computer science professor Peter Belhumeur:

NPR: Do you get government funding in part?

PB: Yes.

NPR: From DARPA or one of those?

PB: We just say the Department of Defense.

NPR: Uh huh. You just say it. Do they say “This is what we’re interested in?” or do they keep their cards close to their chest and say “We like this, here’s some money”?

PB: They definitely say what they’re interested in. I think they want to do face recognition and verification in the wild, in unconstrained environments. So, where the person in the photograph does not necessarily cooperate. And you can imagine why that sort of thing is important to them.

NPR: So, especially in war as we have come to know it, in counterinsurgency operations and all that, it would be useful, as opposed to in the traditional battlefield where who cares who that guy is?

PB: That’s, that’s, that’s right. And one of the reasons that face recognition doesn’t go away is because it’s basically this passive biometric, and you can acquire the data at a distance.

NPR: Does the ultimate application of this, the ways it could be used, ever give you pause?

PB: Well, you know we think about it a lot, certainly within the group, and I don’t think we’re at the point at which these sorts of biometrics can essentially label people with perfect identity or any of that. And I think that there are interesting policy questions that surround that. And you know personally I’m on the side of this that less is better.

NPR: Uh huh.

PB: But I think it’s a really interesting scientific question.

NPR: Will you reach a point though, as you get better and better and better in your scientific research some years hence, where your predisposition to think that “less is more” comes to a head with “look how good we’ve gotten”?

PB: Yeah, I think at that point I’ll stop. But there’s no danger of that yet.

NPR: Really? You’ve still got another 10 years of making it better?

PB: There’s still a lot of work to do.

What can we learn from this dialog? First, a computer scientist should do the work the DOD asks for, and should not make public the details of that work. Second, that work is a “really interesting scientific question” with “interesting policy questions,” of which only the former is the researcher’s purview. Third, the engineer’s personal feelings, explicitly at odds with the DOD ethos, are immaterial; he will work on the problem nonetheless. Fourth, he will keep working on the problem the DOD is paying him to solve until his misgivings are realized and something awful has entered the world, at which point he might stop.

Many students accept and repeat these narratives, and learn to redirect their ideals or interests toward militaristic or market-oriented shadows of what they had gone to school to study. Students interested in imaging are steered toward computer vision for weapons; those interested in robots are directed into military drone research; others interested in the environment or green energy are funneled into the interesting-sounding MIT Energy Initiative. (MITEI, pronounced by its members “mighty,” is funded primarily by BP, Schlumberger, and Halliburton, and seems bent on maintaining petrochemical hegemony.) Engineering education can thus be seen as the first in a series of filters within professional engineering that systematically remove individuals interested in challenging societal power, or strip them of the will to do so.

A second filter is the professional identity of American engineers. Engineers learn that they cannot influence (and thus should not bother thinking about) the course of technologies. This is partly due to their tacit client relationship to power, but it is also part of a larger social and intellectual history described by Matthew Wisnioski in his dissertation, Engineers and the Intellectual Crisis of Technology, 1957-1973 (PhD diss., Princeton, 2005). Wisnioski describes how, through the 1960s, an intellectual firestorm raged over how to think about technology. One faction argued that technology was a semiautonomous agent, able to drive history and change society, though not itself influenced by society or history. Any negative quality of a technology (what engineers call “unintended consequences”) derived from the natural tendencies of the technology itself. Somehow, “through proper study technology could be managed” (even though it could not be influenced by society or history!). The second faction argued that technology was shaped by society, and indeed reflected the values of its builders. On this more political view, if technology seemed to be running amok, that was a reflection of the priorities of the society behind it, and it was society itself that should be changed. The majority of engineers adopted the former theory, of technological agency, which absolved them of responsibility for technology’s negative effects but undermined the engineer’s role as a creative, autonomous agent.

In choosing to limit their liability, engineers had to construct a complex, self-denying logic that dissimulated their own daily thinking, planning, and choices: indeed, their labor. Although no engineer believes that technology is autonomous in the particular, at the scale of their own daily work, they nonetheless adopted a “zoomed out” view that erased the social aspects of their profession. Furthermore, while they might have personal identities, and might think that a particular kind of work is immoral or unethical, these factors matter less than the techno-scientific “interestingness” of the problem. They are oracles of technology, taking orders from political agents (the DOD or DARPA, Monsanto or Schlumberger) yet somehow purporting to remain apolitical themselves. Engineering education and professional identity do not so much inculcate ethics as systematically separate technical work from ethical thought and action. Ultimately, professional engineers must subsume their own moral, political, and intellectual agency, channeling instead the interests of their clients.

2: I freed my software, so I freed my mind.

The origin story of the Free Software movement has been well described: generally it is said to have launched when a bearded and poorly socialized programmer named Richard Stallman, frustrated that copyright prevented him from fixing buggy commercial software, developed a nice bit of legal jiu-jitsu that enables software to be simultaneously copyrighted and forced into the public commons (Kelty 2008). Different historians concentrate on different aspects of this history, and certainly this legal coup was important, as was Stallman’s development of the GNU compiler and operating system, and later Linus Torvalds’ related work on Linux. What is often under-described is the actual mechanism of online collaboration: the technically enabled tools and communication methods that coordinate a geographically distributed labor pool of heterogeneous individuals, many of whom have never met, share no client, and have no formal technical education.

Three things are important to take away from this history: First, the technologies developed through the Free Software movement have routinely proven to be superior to, and more popular than, those developed by corporations and governments. Second, many participants in the Free Software movement have not gone through traditional engineering education, though some do so later. Third, Free Software has an ideological component, but it is also a grounded set of technologies and practices that have reduced the advantages that large-scale enterprises like companies or governments had in developing technologies.

On the first point, technical superiority, I was a close witness to this process: when I joined MIT in 2001, it was common for research projects to be built with proprietary systems like Microsoft Visual Studio (a programming environment) or Access and dBase (databases), on closed operating systems like Windows. In my last five years at MIT I did not see a single project launched on proprietary software. F/LOSS software and open data have proven so technically superior that they have displaced commercial alternatives, and their influence is gradually moving outward from the tools of hacker production to higher-level, more consumer-oriented software, as evidenced by Wikipedia, Firefox, and many other crossover technologies.

The second point, that many Free Software participants have not gone through traditional engineering education, means that they have bypassed the inculcation described in the first section of this article. Not only have they been spared the seminars that teach them to deny their moral agency and the pressure to choose regressive research projects; the Free Software movement also offers them new standards of exemplary engineering. Whereas emblematic programming languages were once the brainchildren of famous university professors (LISP at MIT) or industry researchers (C at Bell Labs), the new heroes are often independent or only loosely affiliated with institutions, like Python’s Guido van Rossum or Linux’s Linus Torvalds, who was a university student when he began the Linux project. The distributed nature of free software has created an alternative structure of education and ethical inculcation to that of conservative engineering education. One can learn practical software engineering almost entirely online, through free books and tutorials or through intense social interaction on websites like Stack Overflow and GitHub, or on IRC channels.

The third point is that the free software movement has developed a variety of concrete, technology-augmented methods of collaboration. These include forms of self-governance, systems for managing many simultaneous authors (like version control or source management software), and even methods for resisting hostile opposition or sabotage (like the Debian initiation system). Many of these practices have been researched and described by management scholars. Baldwin and Clark (2000), for instance, have stressed the importance of design modularity, which allows many simultaneous changes without worry that they will conflict, while Von Hippel (2005) has described how “user innovators” (like F/LOSS developers) become increasingly competitive with “producer innovators” (like companies) as communication costs decrease. Overall, these strategies and techniques are orthogonal to the means of technical production that defined the 20th century, namely the co-location of labor and capital in research universities, labs, and companies. Schools, labs, and firms are important actors in the free software movement, whether as allies, hosts, or opponents, but ultimately free software works well without them.
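To make the version control point concrete, the following is a minimal sketch, in Python and purely for illustration (it is not how git or any production system is implemented, and it assumes no lines are added or removed), of the line-based three-way merge at the heart of such tools: two contributors edit the same file independently, and their changes are combined automatically wherever they touched different lines.

# A deliberately naive, line-based "three-way merge": the basic move that lets
# version control systems combine work from many simultaneous authors.
# Illustration only; real systems (git, Mercurial) use far more sophisticated
# diff and merge algorithms and handle inserted and deleted lines.

def three_way_merge(base, ours, theirs):
    """Merge two edited versions of `base`; all three are equal-length lists of lines."""
    assert len(base) == len(ours) == len(theirs), "sketch assumes no added or removed lines"
    merged, conflicts = [], []
    for i, (b, o, t) in enumerate(zip(base, ours, theirs)):
        if o == t:
            merged.append(o)        # both contributors agree (or neither touched the line)
        elif o == b:
            merged.append(t)        # only the second contributor changed this line
        elif t == b:
            merged.append(o)        # only the first contributor changed this line
        else:
            merged.append(o)        # both changed it differently: record a conflict
            conflicts.append(i)
    return merged, conflicts

if __name__ == "__main__":
    base   = ["def greet():", "    print('hello')"]
    ours   = ["def greet(name):", "    print('hello')"]        # contributor A edits line 1
    theirs = ["def greet():", "    print('hello, world')"]     # contributor B edits line 2
    merged, conflicts = three_way_merge(base, ours, theirs)
    print("\n".join(merged))                 # both edits are kept automatically
    print("conflicts at lines:", conflicts)  # [] -- no overlap, so no conflict

Conflicts arise only when two authors change the same line, which is why the design modularity Baldwin and Clark describe keeps coordination costs low: contributors working on different modules rarely collide at all.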

We have seen how technology education and professional identity in engineering ultimately lead heterogeneous individuals into a dependent relationship with government and private enterprise, which in turn leads to the development of conservative technologies that reinforce the status quo. A new collective process of technology development, F/LOSS, offers an alternative to this enculturation and thus liberates technologies from the goals of the most powerful in society. Technologies like Between the Bars rely on F/LOSS not simply for the engines that make them run, but also for its model of productive, task- and product-oriented collective action and for the accompanying techniques and software, like version control systems, that make F/LOSS possible.

 

About the author

Chris Csíkszentmihályi is a scholar and technologist whose work draws from the humanities, design, and art.

Bibliography

Baldwin, Carliss Y., and Kim B. Clark. Design Rules: The Power of Modularity. Vol. 1. Cambridge, Mass.: MIT Press, 2000.

Kelty, Christopher M. Two Bits: The Cultural Significance of Free Software. Durham, N.C.: Duke University Press, 2008.

Von Hippel, Eric. Democratizing Innovation. Cambridge, Mass.: MIT Press, 2005.