large black
Newsletters begin with nominal apologies. I’ve only a video, in lieu of innuendo. These Brit-bred guys are endangered and precisely what they’re claimed to be. I’m fond of them. [Sus, sure, but do pigs deserve their historically bad rep? They’re communal, plausibly self-recognitive, and farmed for energy and enjoyment.]
Apropos of nothing: here I’ll write about black philosophy of technology.
---
I’m just going to posit that settlement, scramble, and slavery render all Africana thought societal (or civilizational) critique, and that the prototypical automata probably thought about material culture. Solid enough hypotheses. Less believable may be my rendering of the context in question: “Modernity” is most likely reducible to accounting; thought has, after all, considered itself proprietary (or appropriative) for a hot minute. Today’s ubiquity of actuarial technics should then come as no surprise.
Perhaps my least contentious claim [see?!] is that there is a surprising lack of ethics in AI Ethics. It’s the consensus position in that discourse. Neither maldistributed predictions nor violated compacts (however conceived) encapsulate all that is concerning about today’s technology, and much of AI Ethics testifies to this ethical lack.
Oft-cited is Ruha Benjamin’s term, the “New Jim Code,” which tracks and portends the institutional uptake of technologies that “reflect and reproduce existing inequities” yet are legitimated as impartial. Safiya U. Noble identifies the same phenomenon as “technological redlining,” lamenting how computation is socially tantamount to verity. Together they represent arguably the most critical set of technologists today: black feminists who have documented, for some time now, how tech’s legitimacy augments discrimination, deadens political contestation, and paves the way for tech firms’ unchecked consolidation of power. They identify this irreflexive trend as a crisis tendency, one which, if unimpeded, will continue to produce catastrophic results, and not only for raced populations.
If tech can be so laden with validity as to appear incontestable, we really should ask, “incontestable to whom?” Do Benjamin and Noble consider a swath of the public “judgmental dopes”? Are they exempt from this delusion—due to expertise or social status? What actually enables them to identify a phenomenon that seizes others (especially those with political influence) like a spell?
Raising the stakes further is the view undergirding a substantial bit of AI Ethics today: “[c]omputer systems are part of a larger matrix of systemic racism.” That proposition is the very implication of a compound found in innumerable computing papers these days: sociotechnics. Benjamin neatly sums the word’s meaning in two postulates: (i) “that any given social order is impacted by technological development;” and (ii) “that social norms, ideologies, and practices are a constitutive part of technical design.”
Those in the know recognize this as a version of a millennia-old dichotomy. Technical capacity and societal tendencies are depicted in determinate and/or genetic terms with one force constant, the other variable: e.g., a particular social arrangement (its habits, norms, and/or “values”) is the effect of incessant technical progress, or vice versa. Though, to be sure, it’s the reciprocal shape of sociotechnics that raises those earlier questions. The two-way sociotechnical arrangement must, necessarily, bear on the technologist herself, for she is of the relations she analyzes. She thinks with and within the racist matrix.
Now we can finger-point all day at computing idioms that betray histories of imperial adventure: e.g., “master” and “slave” databases, the phrase “in the wild” to signal the public release of software, the fact that “robot” derives from the Czech “robota” (forced labor), whose linguistic root is “rab” (slave). Yet what’s most pressing in the diversity of “sociotechnical” studies and theories is the tension of disparate institutional norms. For some time now, universities have produced, on a daily basis, a mountain of speech and writing with the same conclusion: there is misalignment, not between machine functions and social “values,” but amidst the various common senses of computing, industrial, and public institutions. Statistical fairness is distinct from social justice. A startup CEO is compelled in a direction contrary to the drives of medicine.
Further, much ink is spilled about the anemic status of polity in the social use of technology. Where, many ask, are the equal opportunity and parity amongst the “stakeholders” of sociotechnical initiatives? Citizens lack seats at the table during the problem-posing, resource-gathering, design, implementation, and rolling-review stages of, say, the technology that surveils them. Such participation is, many argue, a requirement for the rightful installation of various tech systems into institutions. All this rocks, no complaints, but are we all saying the same thing, and what do our claims demand of us?
How, if at all, do the incentives, offerings, and responsibilities of the city’s institutions undermine such citizen design? In truth, by positing a racist matrix, the black technologists above identify a similar hindrance: what if the citizens in question are the problems the technology is recruited to solve, and on what grounds can those problems articulate their condition as problematic?
---
Cynics enjoy saying that “the system is working as intended,” while knowing intent is surely a red herring. That systems function for many obfuscates how they malfunction for some, and concealed, particular inhibition under a legitimate guise is the very crux of so-called institutional (or systemic) racism. People throw around Hamilton and Ture’s language in a consequentialist way and so deviate from their indictment of “antiblack attitudes and practices.” Alas, these authors didn’t thematize this appeal to normative commitments beyond a nod to the “operation” of “a sense of superior group position.”
In this little substack, I’ll periodically pose a couple ways to make sense of that sense and its operation inasmuch as they bear on the making and use of what we call technology. I will probably fuck up, and if you’re nice, you’ll rip me to shreds.
