NB: This piece evolved from comments I made in the 2013 #DHPoco Summer School forums, and was originally slated to be part of an ill-fated special issue of the Journal of Digital Humanities. I’m finally getting around to posting it myself, as of December 2014.

Scholars in science and technology studies tell us that computers, like all technologies, are embedded in a web of social relations which profoundly shape their development and use.1 Feminist and critical race theorists reveal how “objectivity” conceals and perpetuates gender and racial biases in the construction of knowledge.2 In light of these insights, along with the persistence of race in the visible aspects of digital media,3 we may ask: does race structure computing in profound, imperceptible ways? Furthermore, if race and racism are in fact implicated in the design of our machines, how can we work against their pernicious effects within the field of the digital humanities?

During the course of #DHPoco Summer School, I began to consider these issues more deeply in response to our discussion of Tara McPherson’s provocative essay “Why Are the Digital Humanities So White?” Her work compares the design principles of the operating system UNIX with the precepts of a regime of covert racism, based on analogous ideas of separation and containment. Although McPherson is careful not to assert a causal relationship between computing and racist ideology, she suggests that the two may be more tightly coupled than we realize: “If […] UNIX hardwired an emerging system of covert racism into our mainframes and our minds, then computation responds to culture as much as it controls it.”4

In McPherson’s essay, modularity emerges as a guiding principle of both programming and neoliberalism. It undergirds a host of harmful phenomena, including racial segregation, the splintering of academic disciplines, and the concealment of the machinations of global capital. McPherson therefore exhorts us to resist modularity in our thinking, and to strive to keep the complete picture in view. This advice is, on its face, unimpeachable – but I fear it may encourage some readers to reject modularity altogether, when in fact the concept can (in the right context) serve as an aid rather than a hindrance to thinking globally.

Modularity, at its most basic, entails the composition of a larger whole out of smaller pieces. Modularity in the context of computing extends human capacities; it offers us a framework to deal with otherwise intractable complexity, abstracting away unnecessary details so that the whole can be perceived. In contrast, modularity in the context of neoliberal capitalism restricts human capacities; it alienates us from each other, our labor, and ourselves, precluding us from developing the “cognitive map” needed to comprehend and transform it.5
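To make the computing sense of modularity concrete, here is a minimal sketch in Python (the function names and the word-counting task are my own illustration, not drawn from any system discussed here): each piece can be written, understood, and reused on its own, while the top-level function keeps the whole in view.

```python
# Modularity as a cognitive aid: each function hides its internal
# details, so the top-level pipeline can be read as a whole.

def normalize(text: str) -> str:
    """Lowercase the text and strip punctuation (details hidden from callers)."""
    return "".join(ch for ch in text.lower() if ch.isalnum() or ch.isspace())

def tokenize(text: str) -> list[str]:
    """Split normalized text into words."""
    return text.split()

def word_counts(tokens: list[str]) -> dict[str, int]:
    """Tally occurrences of each word."""
    counts: dict[str, int] = {}
    for token in tokens:
        counts[token] = counts.get(token, 0) + 1
    return counts

def summarize(text: str) -> dict[str, int]:
    # The whole is legible precisely because the parts are modular.
    return word_counts(tokenize(normalize(text)))

print(summarize("The whole, out of smaller pieces: the whole."))
# {'the': 2, 'whole': 2, 'out': 1, 'of': 1, 'smaller': 1, 'pieces': 1}
```

Here abstraction serves comprehension rather than concealment: a reader who trusts each module’s one-line description can grasp `summarize` at a glance, and inspect any part in detail when needed.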

Neoliberal modularity in computer production is admittedly a prerequisite to modularity in computer programming.6 Still, the two are not equivalent. Indeed, the slippage between these two ideas is one instance of a larger problem, which should raise the ire of critical race theorists and techies alike. When we draw analogies from the social world to technology and vice versa, we almost invariably obscure the properties of both to our collective detriment.

This is evident, for example, in the history of the “master/slave” terminology used in engineering and computer science. Ron Eglash traces the use of these terms back to a white South African inventor who in 1904 created a two-part clock mechanism with a “master” clock and a “slave” controlled by it.7 These terms later came to be used to describe arrangements of electronic devices wherein one mechanism delivers orders and the other obeys – however, as Eglash observes, the words are often a poor fit for the technical relationship they purport to describe. For example, computer drives have historically been designated as “master” and “slave”; but in actuality, neither drive controls the other. The two could as easily have been called 0 and 1, but the more contentious terminology remains.8

This use of “master” and “slave” is not directly linked with slavery as a social institution, as it did not come about until decades after slavery had lost state approval both in South Africa and the United States. Neither is it inherently racial, although these terms do have a painful valence to those of us whose ancestors were enslaved. Nevertheless, it is indefensible both because it is alienating and because, like all bad metaphors, it leads to sloppy and reductive thinking.

In cases like the “master/slave” metaphor or that of “male” and “female” plugs,9 we project specific configurations of social relations onto our technologies, after which the implied relationship comes to seem even more “natural” despite its social origins. These terms carry harmful and misleading associations both in the wider world and within the technical domains in which they are applied. It behooves us to discard all such metaphors that distort the truth and alienate people, and to instead produce the most illuminating and affirming metaphors we can.

What is to be done, if we wish to forge new metaphors and pursue the goals of feminist, anti-colonial, and critical race computing? Far from opposing modularity, those of us who program should make our code as modular as possible – releasing it under a “copyleft”-style license, to ensure that others can reuse and reinterpret it as they will. For comparison, think of hip-hop pioneers in the South Bronx building sound systems from the technological scraps of the post-industrial landscape,10 or Angolan youth making music from cell phone ringtones.11 People have always engaged with and transformed technologies in accordance with their own priorities; free software, in its replicability and mutability, is a prime candidate for such treatment.

We must also combat unequal access to technology and education, and the gender and racial biases in computing’s social context. As Jane Margolis documents in Stuck in the Shallow End: Education, Race, and Computing, black and Latin@ students receive inadequate institutional support to learn computer science even when equipment is present in schools, and face negative societal attitudes that discourage them and their teachers from even trying.12 These dynamics are challenged by groups like Black Girls Code or the Empowermentors and events such as Trans*H4CK, which seek to empower those at the intersections of a variety of marginalized identities to gain technical skills and work in solidarity with each other.

In the digital humanities, coding has been privileged over other modes of participation without consideration of the barriers to entry imposed by race, gender, class, disability, and the legacies of colonialism. The promise of #DHPoco lies in its ability to spark conversation across disciplinary and national boundaries, and to add a sorely needed analysis of power to a field otherwise predisposed to reproduce dominant ideology. If we encourage broader participation by sharing reusable code and transforming the exclusionary culture surrounding it, computing as a whole and DH in particular can be harnessed to help dismantle all forms of hegemony.

Finally, toward this end, we must expand the very concept of what computing is and can be. Matti Tedre and Ron Eglash suggest the term “ethnocomputing” to describe a variety of cultural practices involving data structures, algorithms, and physical or linguistic realizations thereof.13 This framework encompasses everything from electronic computers, to the Inka khipu, to the Akan strategy game Oware. Efforts like artist-theorist D. Fox Harrell’s experiments with interactive Afro-Diasporic storytelling and his Advanced Identity Representation project illustrate some of the possibilities for cultural and epistemological syntheses in this vein.14 As the aforementioned activist groups indicate, we must also remain cognizant of the practical needs of learning to code as a means to gain one’s livelihood. If we stop regarding algorithmic thinking as a Western invention, and help each other to secure the material conditions needed to engage in it, we may yet succeed in reconfiguring computing as a mode of liberatory praxis.

  1. Donna Haraway, Simians, Cyborgs, and Women: The Reinvention of Nature (New York: Routledge, 1991), 165; Bruno Latour, We Have Never Been Modern (Cambridge, MA: Harvard University Press, 1993).

  2. Kimberlé Crenshaw, Critical Race Theory: The Key Writings That Formed the Movement (New York: New Press: Distributed by W.W. Norton & Co., 1995); Chela Sandoval, Methodology of the Oppressed (Minneapolis, MN: University of Minnesota Press, 2000); José Medina, The Epistemology of Resistance: Gender and Racial Oppression, Epistemic Injustice, and Resistant Imaginations (Oxford; New York: Oxford University Press, 2013).

  3. See for instance Lisa Nakamura’s work on the ways that images of racialized bodies function as visual capital online, as part of a “digital racial formation.” Lisa Nakamura, Digitizing Race: Visual Cultures of the Internet (Minneapolis: University of Minnesota Press, 2008).

  4. Tara McPherson, “Why Are the Digital Humanities So White? or Thinking the Histories of Race and Computation,” in Debates in the Digital Humanities, 2012, http://dhdebates.gc.cuny.edu/debates/text/29.

  5. Fredric Jameson, Postmodernism, or, The Cultural Logic of Late Capitalism (Durham: Duke University Press, 1991), 416. Wendy Chun argues that software enables the creation of precisely the kind of cognitive map identified by Jameson, as it simulates the invisible networks of capital at a comprehensible scale. This, I would argue, is an exemplary use of modularity as a cognitive tool. Wendy Hui Kyong Chun, “On Software, or the Persistence of Visual Knowledge,” Grey Room 18 (Winter 2004): 42, doi:10.1162/1526381043320741.

  6. Neoliberal modularity is that which hides from us the material conditions of life for workers in Shenzhen, assembling computers from parts made by workers in the Philippines or the Czech Republic, who have in turn utilized aluminum mined by workers in Brazil and coltan mined by workers in the Democratic Republic of the Congo… I cannot be the only one who uses a laptop daily without thinking of the astonishing amount of human labor involved in its production.

  7. Ron Eglash, “Broken Metaphor: The Master-Slave Analogy in Technical Literature,” Technology and Culture 48, no. 2 (2007), http://www.historyoftechnology.org/eTC/v48no2/eglash.html.

  8. The “master” and “slave” appellations only apply to a now-outmoded standard, known as parallel ATA (PATA) or IDE. New computers have nearly all switched to the serial ATA (SATA) standard created in 2003, which thankfully has done away with these terms.

  9. Calling electrical connectors “male” and “female” belies the diversity of physical configurations that such connectors possess, which often go against the cis-heteronormative assumptions that these terms may suggest. The arbitrary assignation of “male” and “female” to these objects can be considered a symbolic side-effect of the violent “disambiguation” of socially constructed biological sex, on which subject see, for example, Anne Fausto-Sterling, Sexing the Body: Gender Politics and the Construction of Sexuality (New York: Basic Books, 2000).

  10. Tricia Rose, Black Noise: Rap Music and Black Culture in Contemporary America (Hanover, NH: University Press of New England, 1994), 63.

  11. Jayna Brown, “Buzz and Rumble: Global Pop Music and Utopian Impulse,” Social Text 28, no. 1 (2010): 125–146, doi:10.1215/01642472-2009-063.

  12. Jane Margolis, Stuck in the Shallow End: Education, Race, and Computing (Cambridge, MA: MIT Press, 2008).

  13. Matti Tedre and Ron Eglash, “Ethnocomputing,” in Software Studies: a Lexicon, ed. Matthew Fuller (Cambridge, MA: MIT Press, 2008), 92–100.

  14. D. Fox Harrell, “Toward a Theory of Critical Computing: The Case of Social Identity Representation in Digital Media Applications,” Code Drift: Essays in Critical Digital Studies, http://www.ctheory.net/articles.aspx?id=641.