Benjamin ch2 glitch - homework


--

Ruha Benjamin

RACE AFTER

TECHNOLOGY

Abolitionist Tools for the

New Jim Code

polity

Copyright © Ruha Benjamin 2019

The right of Ruha Benjamin to be identified as Author of this Work has been asserted in accordance with the UK Copyright, Designs and Patents Act 1988.

First published in 2019 by Polity Press
Reprinted: 2019 (twice)

Polity Press, 65 Bridge Street, Cambridge CB2 1UR, UK
Polity Press, 101 Station Landing, Suite 300, Medford, MA 02155, USA

All rights reserved. Except for the quotation of short passages for the purpose of criticism and review, no part of this publication may be reproduced, stored in a retrieval system or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission of the publisher.

ISBN-13: 978-1-5095-2639-0
ISBN-13: 978-1-5095-2640-6 (pb)

A catalogue record for this book is available from the British Library.

Library of Congress Cataloging-in-Publication Data
Names: Benjamin, Ruha, author.
Title: Race after technology : abolitionist tools for the new Jim code / Ruha Benjamin.
Description: Medford, MA : Polity, 2019. | Includes bibliographical references and index.
Identifiers: LCCN 2018059981 (print) | LCCN 2019015243 (ebook) | ISBN 9781509526437 (Epub) | ISBN 9781509526390 (hardback) | ISBN 9781509526406 (paperback)
Subjects: LCSH: Digital divide--United States--21st century. | Information technology--Social aspects--United States--21st century. | African Americans--Social conditions--21st century. | Whites--United States--Social conditions--21st century. | United States--Race relations--21st century. | BISAC: SOCIAL SCIENCE / Demography.
Classification: LCC HN90 (ebook) | LCC HN90 .B46 2019 (print) | DDC 303/330973--dc
LC record available at lccn.loc/

Typeset in 11 on 14 pt Sabon by Servis Filmsetting Ltd, Stockport, Cheshire
Printed and bound in the United States by LSC Communications

The publisher has used its best endeavours to ensure that the URLs for external websites referred to in this book are correct and active at the time of going to press. However, the publisher has no responsibility for the websites and can make no guarantee that a site will remain live or that the content is or will remain appropriate.

Every effort has been made to trace all copyright holders, but if any have been overlooked the publisher will be pleased to include any necessary credits in any subsequent reprint or edition.

For further information on Polity, visit our website: politybooks

Contents

Figures vi
Preface ix
Introduction 1
1 Engineered Inequity 49
2 Default Discrimination 77
3 Coded Exposure 97
4 Technological Benevolence 137
5 Retooling Solidarity, Reimagining Justice 160
Acknowledgments 198
Notes 202
Appendix 235
References 240
Index 274


[Screenshot of a tweet by @alliebland]
Then Google Maps was like, "turn right on Malcolm Ten Boulevard" and I knew there were no black engineers working there
9:42 PM - 19 Nov 2013

Figure 2 Malcolm Ten
Source: Twitter @alliebland, November 19, 2013, 9:42 p.m.

Database design, in that way, is "an exercise in world-building," a normative process in which programmers are in a position to project their world views - a process that all too often reproduces the technology of race. 2 Computer systems are a part of the larger matrix of systemic racism. Just as legal codes are granted an allure of objectivity - "justice is (color)blind" goes the fiction - there is enormous mystique around computer codes, which hides the human biases involved in technical design.

The Google Maps glitch is better understood as a form of displacement or digital gentrification mirroring the widespread dislocation underway in urban areas across the United States. In this case, the cultural norms and practices of programmers - who are drawn from a narrow racial, gender, and classed demographic - are coded into technical systems that, literally, tell people where to go. These seemingly innocent directions, in turn, reflect and reproduce racialized commands that instruct people where they belong in the larger social order. 3



Ironically, this problem of misrecognition actually reflects a solution to a difficult coding challenge. A computer's ability to parse Roman numerals, interpreting an "X" as "ten," was a hard-won design achievement. 4 That is, from a strictly technical standpoint, "Malcolm Ten Boulevard" would garner cheers. This illustrates how innovations reflect the priorities and concerns of those who frame the problems to be solved, and how such solutions may reinforce forms of social dismissal, regardless of the intentions of individual programmers.

While most observers are willing to concede that technology can be faulty, acknowledging the periodic breakdowns and "glitches" that arise, we must be willing to dig deeper. 5 A narrow investment in technical innovation necessarily displaces a broader set of social interests. This is more than a glitch. It is a form of exclusion and subordination built into the ways in which priorities are established and solutions defined in the tech industry. As Andrew Russell and Lee Vinsel contend, "[t]o take the place of progress, 'innovation,' a smaller, and morally neutral, concept arose. Innovation provided a way to celebrate the accomplishments of a high-tech age without expecting too much from them in the way of moral and social improvement." 6 For this reason, it is important to question "innovation" as a straightforward social good and to look again at what is hidden by an idealistic vision of technology. How is technology already raced?

This chapter probes the relationship between glitch and design, which we might be tempted to associate with competing conceptions of racism. If we think of racism as something of the past or requiring a particular visibility to exist, we can miss how the New Jim Code operates and what seeming glitches reveal about the structure of racism. Glitches are generally considered a fleeting interruption of an otherwise benign system, not an enduring and constitutive feature of social life. But what if we understand glitches instead to be a slippery place (with reference to the possible Yiddish origin of the word) between fleeting and durable, micro-interactions and macro-structures, individual hate and institutional indifference? Perhaps in that case glitches are not spurious, but rather a kind of signal of how the system operates. Not an aberration but a form of evidence, illuminating underlying flaws in a corrupted system.
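
[Editorial aside, not part of Benjamin's text: to make the "Malcolm Ten" mechanism concrete, here is a minimal, purely hypothetical Python sketch of a text-to-speech street-name normalizer that expands Roman numerals. It is not Google's code and is not described in the book; every name and rule in it is an assumption for illustration only.]

    # Hypothetical sketch only - NOT Google Maps' implementation.
    # It shows how a generic "expand Roman numerals" rule, written for names
    # like "Henry VIII Road", also reads "Malcolm X Boulevard" as "Malcolm Ten".
    import re

    ROMAN_VALUES = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}
    NUMBER_WORDS = {1: "One", 4: "Four", 5: "Five", 8: "Eight", 10: "Ten", 14: "Fourteen"}

    def roman_to_int(token):
        # Standard right-to-left Roman numeral evaluation (e.g. "XIV" -> 14).
        total, prev = 0, 0
        for ch in reversed(token):
            value = ROMAN_VALUES[ch]
            total = total - value if value < prev else total + value
            prev = max(prev, value)
        return total

    def speak_street_name(name):
        # Expand any all-Roman-numeral token into a number word for speech output.
        # The rule "works" for the anticipated cases, but no one flagged names
        # like "Malcolm X" as exceptions.
        spoken = []
        for token in name.split():
            if re.fullmatch(r"[IVXLCDM]+", token):
                number = roman_to_int(token)
                spoken.append(NUMBER_WORDS.get(number, str(number)))
            else:
                spoken.append(token)
        return " ".join(spoken)

    print(speak_street_name("Malcolm X Boulevard"))  # -> "Malcolm Ten Boulevard"
    print(speak_street_name("Henry VIII Road"))      # -> "Henry Eight Road"

Under this sketch, the rule behaves "correctly" for the cases its hypothetical author imagined and fails for "Malcolm X" precisely because that name was never treated as a case worth anticipating - which is the chapter's point about who frames the problems to be solved.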

Default Discrimination

At a recent workshop sponsored by a grassroots organization called Stop LAPD Spying, the facilitator explained that community members with whom she works might not know what algorithms are, but they know what it feels like to be watched. Feelings and stories of being surveilled are a form of "evidence," she insisted, and community testimony is data. 7 As part of producing those data, the organizers interviewed people about their experiences with surveillance and their views on predictive policing. They are asked, for example: "What do you think the predictions are based on?" One person, referring to the neighborhood I grew up in, responded:

Because they over-patrol certain areas - if you're only looking on Crenshaw and you only pulling Black people over then it's only gonna make it look like, you know, whoever you pulled over or whoever you searched or whoever you criminalized that's gonna be where you found something. 8

Comments like this remind us that people who are most directly impacted by the New Jim Code have a keen sense of the default discrimination facilitated by these technologies. As a form of social technology, institutional racism, past and present, is the precondition for the carceral technologies that underpin the US penal system. At every stage of the process - from policing, sentencing, and imprisonment to parole - automated risk assessments are employed to determine people's likelihood of committing a crime. 9 They determine the risk profile of neighborhoods in order to concentrate police surveillance, or the risk profile of individuals in order to determine whether or for how long to release people on parole. In a recent study of the recidivism risk scores assigned to thousands of people arrested in Broward County, Florida, ProPublica investigators found that the score was remarkably unreliable in forecasting violent crime. They also uncovered significant racial disparities:

In forecasting who would re-offend, the algorithm made mistakes with black and white defendants at roughly the same rate but in very different ways. The formula was particularly likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants. White defendants were mislabeled as low risk more often than black defendants. 10

The algorithm generating the risk score builds upon already existing forms of racial domination and reinforces


strategy put it, "it sounds like fiction, but it's more like science fact." 17

Predictive Glitches

One of the most iconic scenes from The Matrix film trilogy deals with the power of predictions and self-fulfilling prophecies. The main protagonist, Neo, goes to visit the Oracle, a software program depicted as a Black woman in her late sixties. Neo is trying to figure out whether he is who others think he is - "the one" who is supposed to lead humanity in the war against the machines. As he tries to get a straight answer from the Oracle and to figure out whether she really has the gift of prophecy, she says, "I'd ask you to sit down, but you're not going to anyway. And don't worry about the vase."

NEO: What vase? [Neo knocks a vase to the floor]
THE ORACLE: That vase.
NEO: I'm sorry.
THE ORACLE: I said don't worry about it. I'll get one of my kids to fix it.
NEO: How did you know?
THE ORACLE: What's really going to bake your noodle later on is, would you still have broken it if I hadn't said anything. 18

This scene invites a question about real-life policing: Would cops still have warrants to knock down the doors in majority Black neighborhoods if predictive algorithms hadn't said anything?

The Matrix offers a potent allegory for thinking about power, technology, and society. It is set in a dystopian future in which machines overrun the world, using the energy generated by human brains as a vital source of computing power. Most of humanity is held captive in battery-like pods, their minds experiencing an elaborate life-like simulation of the real world in order to pacify humans and maximize the amount of energy brains produce. The film follows a small band of freedom fighters who must convince Neo that the simulated life he was living is in fact a digital construction. Early on in his initiation to this new reality, Neo experiences a fleeting moment of déjà vu when a black cat crosses his path - twice. Trinity, his protector and eventual love interest, grows alarmed and explains that this "glitch in the matrix" is not at all trivial but a sign that something about the program has been changed by the agents of the Matrix. The sensation of déjà vu is a warning sign that a confrontation is imminent and that they should prepare to fight.

The film's use of déjà vu is helpful for considering the relationship between seemingly trivial technical glitches and meaningful design decisions. The glitch in this context is not an insignificant "mistake" to be patched over, but rather serves as a signal of something foundational about the structure of the world meant to pacify humans. It draws attention to the construction and reconstruction of the program and functions as an indication that those seeking freedom should be ready to spring into action.

A decade before the Matrix first hit the big screen, Black feminist theorist Patricia Hill Collins conceptualized systemic forms of inequality in terms of a "matrix of domination" in which race, class, gender, and other axes of power operated together, "as sites of domination and as potential sites of resistance." 19 This interlocking matrix operates at individual, group, and institutional levels, so that empowerment "involves rejecting the dimensions of knowledge, whether personal, cultural, or institutional, that perpetuate objectification and dehumanization." 20 Relating this dynamic to the question of how race "gets inside" technology, the Roman numeral glitch of Google Maps and others like it urge us to look again at the way our sociotechnical systems are constructed - by whom and to what ends.

Racist glitches - such as celebrity chef Paula Deen's admission that "yes, of course" she has used the N-word alongside her desire to host a "really southern plantation wedding" with all-Black servers; 21 or a tape-recorded phone call in which former Los Angeles Clippers owner and real estate mogul Donald Sterling told a friend "[i]t bothers me a lot that you want to broadcast that you're associating with black people" 22 - come and go, as provocative sound bites muffling a deeper social reckoning. In my second example, the scandal associated with Sterling's racist remarks stands in stark contrast with the hush and acceptance of a documented pattern of housing discrimination exercised over many years, wherein he refused to rent his properties to Black and Latinx tenants in Beverly Hills and to non-Korean tenants in LA's Koreatown. 23 In the midst of the suit brought by the Department of Justice, the Los Angeles chapter of the National Association for the Advancement of Colored People nevertheless honored Sterling with a lifetime achievement award in 2009. Only once his tape-recorded remarks went public in 2014 did the organization back out of plans to award him this highest honor for a second time, forcing the chapter president to resign amid criticism.

Dragging individuals as objects of the public condemnation of racist speech has become a media ritual and pastime. Some may consider it a distraction from the more insidious, institutionalized forms of racism typified by Sterling's real estate practices. The déjà vu regularity of all those low-hanging N-words would suggest that stigmatizing individuals is not much of a deterrent and rarely addresses all that gives them license and durability. But, as with Trinity's response to Neo in the Matrix regarding his path being crossed twice by a black cat, perhaps if we situated racist "glitches" in the larger complex of social meanings and structures, we too could approach them as a signal rather than as a distraction. Sterling's infamous phone call, in this case, would alert us to a deeper pattern of housing discrimination, with far-reaching consequences.

Systemic Racism Reloaded

Scholars of race have long challenged the focus on individual "bad apples," often to be witnessed when someone's racist speech is exposed in the media - which is typically followed by business as usual. 24 These individuals are treated as glitches in an otherwise benign system. By contrast, sociologists have worked to delineate how seemingly neutral policies and norms can poison the entire "orchard" or structure of society, systematically benefiting some while subjugating others. 25


race remains undertheorized in Internet studies and urges more attention to the technology of structural racism. In line with the focus on glitches, researchers tend to concentrate on how the Internet perpetuates or mediates racial prejudice at the individual level rather than analyze how racism shapes infrastructure and design. And, while Daniels does not address this problem directly, an investigation of how algorithms perpetuate or disrupt racism should be considered in any study of discriminatory design.

Architecture and Algorithms

On a recent visit that I made to the University of California at San Diego, my hosts explained that the design of the campus made it almost impossible to hold large outdoor gatherings. The "defensive" architecture designed to prevent skateboarding and cycling in the interest of pedestrians also deliberately prevented student protests at a number of campuses following the Berkeley free speech protests in the mid-1960s. This is not so much a trend in urban planning as an ongoing feature of stratified societies. For some years now, as I have been writing and thinking about discriminatory design of all sorts, I keep coming back to the topic of public benches: benches I tried to lie down on but was prevented because of intermittent arm rests, then benches with spikes that retreat after you feed the meter, and many more besides.

Like the discriminatory designs we are exploring in digital worlds, hostile architecture can range from the more obvious to the more insidious - like the oddly shaped and artistic-looking bench that makes it uncomfortable but not impossible to sit for very long. Whatever the form, hostile architecture reminds us that public space is a permanent battleground for those who wish to reinforce or challenge hierarchies. So, as we explore the New Jim Code, we can observe connections in the building of physical and digital worlds, even starting with the use of "architecture" as a common metaphor for describing what algorithms - those series of instructions written and maintained by programmers that adjust on the basis of human behavior - build. But, first, let's take a quick detour ...

The era commonly called "Jim Crow" is best known for the system of laws that mandated racial segregation and upheld White supremacy in the United States between 1876 and 1965. Legal codes, social codes, and building codes intersected to keep people separate and unequal. The academic truism that race is "constructed" rarely brings to mind these concrete brick and mortar structures, much less the digital structures operating today. Yet if we consider race as itself a technology, as a means to sort, organize, and design a social structure as well as to understand the durability of race, its consistency and adaptability, we can understand more clearly the literal architecture of power.

Take the work of famed "master builder" Robert Moses, who in the mid-twentieth century built hundreds of structures, highways, bridges, stadiums, and more, prioritizing suburbanization and upper-middle-class mobility over public transit and accessibility to poor and working-class New Yorkers. In a now iconic (yet still disputed) account of Moses' approach to public works, science and technology studies scholar Langdon Winner describes the low-hanging overpasses that line the Long Island parkway system. In Winner's telling, the design prevented buses from using the roads, which enabled predominantly White, affluent car owners to move freely, while working-class and non-White people who relied on buses were prevented from accessing the suburbs and the beaches. And while the veracity of Winner's account continues to be debated, the parable has taken on a life of its own, becoming a narrative tool for illustrating how artifacts "have politics." 33

For our purpose, Moses' bridges symbolize the broader architecture of Jim Crow. But, whereas Jim Crow laws explicitly restricted Black people from numerous "White only" spaces and services, the physical construction of cities and suburbs is central to the exercise of racial power, including in our post-civil rights era. And, while some scholars dispute whether Moses intended to exclude Black people from New York suburbs and beaches, one point remains clear: the way we engineer the material world reflects and reinforces (but could also be used to subvert) social hierarchies. Yet plans to engineer inequity are not foolproof.

In April 2018 a group of high school students and their chaperones returning from a spring break trip to Europe arrived at Kennedy Airport and boarded a charter bus that was headed to a Long Island shopping center where parents waited to pick up their kids. As they drove to the mall, the bus driver's navigation system failed to warn him about the low-hanging bridges that line the Long Island parkway and the bus slammed violently into the overpass, crushing the roof, seriously wounding six, and leaving dozens more injured. As news reports pointed out, this was only the latest of hundreds of similar accidents that happened over the years, despite numerous warning signs and sensor devices intended to alert oncoming traffic of the unusually low height of overpasses. Collateral damage, we might say, is part and parcel of discriminatory design.

From what we know about the people whom city planners have tended to prioritize in their designs, families such as the ones who could send their children to Europe for the spring break loom large among them. But a charter bus with the roof shaved off reminds us that tools of social exclusion are not guaranteed to impact only those who are explicitly targeted to be disadvantaged through discriminatory design. The best-laid plans don't necessarily "stay in their lane," as the saying goes. Knowing this, might it be possible to rally more people against social and material structures that immobilize some to the benefit of others? If race and other axes of inequity are constructed, then perhaps we can construct them differently?

When it comes to search engines such as Google, it turns out that online tools, like racist robots, reproduce the biases that persist in the social world. They are, after all, programmed using algorithms that are constantly updated on the basis of human behavior and are learning and replicating the technology of race, expressed in the many different associations that the users make. This issue came to light in 2016, when some users searched the phrase "three Black teenagers" and were presented with criminal mug shots. Then when they changed the phrase to "three White teenagers," users were presented with photos of smiling, happy-go-lucky youths; and a search for "three Asian teenagers" presented images of scantily clad girls and women. Taken together, these images



unbiased, warning that, "if we build an intelligent system that learns enough about the properties of language to be able to understand and produce it, in the process it will also acquire historic cultural associations, some of which can be objectionable. Already, popular online translation systems incorporate some of the biases we study ... Further concerns may arise as AI is given agency in our society." 44 And, as we shall see in the following chapters, the practice of codifying existing social prejudices into a technical system is even harder to detect when the stated purpose of a particular technology is to override human prejudice.


3

Coded Exposure

Is Visibility a Trap?

I think my Blackness is interfering with the computer's ability to follow me.
Webcam user 1

EXPOSURE

  • the amount of light per unit area
  • the disclosure of something secret
  • the condition of being unprotected
  • the condition of being at risk of financial loss
  • the condition of being presented to view or made known. 2

In the short-lived TV sitcom Better Off Ted, the writers parody the phenomena of biased technology in an episode titled "Racial Sensitivity." This episode presents the corporation where the show takes place installing a "new state of the art system that's gonna save money," but employees soon find there is a "glitch in the system that keeps it from recognizing people with dark skin." 3 When the show's protagonist confronts


a series of choices that designers and technologists have made. Many of them small: what a button says, where a data set comes from. But each of these choices reinforces beliefs about the world, and the people in it."
55 Batsman 2017.
56 Nguyen 2016.
57 Morris 2018.
58 State Council 2014.
59 State Council 2014.
60 Tufekci 2017, p. 128.
61 Nopper 2019, p. 170.
62 Hacking 2007.

Notes to Chapter 2

1 Merriam-Webster Online, n.
2 Personal interview conducted by the author with Princeton digital humanities scholar Jean Bauer, October 11, 2016.
3 See references to "digital gentrification" in "White Flight and Digital Gentrification," posted on February 28 at https://untsocialmedias13.wordpress/2013/02/28/white-flight-and-digital-gentrification by jalexander716.
4 Sampson 2009.
5 As Noble (2018, p. 10) writes, "[a]lgorithmic oppression is not just a glitch in the system but, rather, is fundamental to the operating system of the web."
6 Russell and Vinsel 2016.
7 See the conference "Dismantling Predictive Policing in Los Angeles," May 8, 2018, at stoplapdspying/wp-content/uploads/2018/05/Before-the-Bullet-Hits-the-Body-May-8-2018.
8 "Dismantling predictive policing in Los Angeles," pp. 38-9.
9 Ferguson 2017.
10 Angwin et al. 2016.



11 According to Sharpe (2016, p. 106), "the weather necessitates changeability and improvisation," which are key features of innovative systems that adapt, in this case, to postracial norms where racism persists through the absence of race.
12 Meredith Broussard, data journalist and author of Artificial Unintelligence, explains: "The fact that nobody at Northpointe thought that the questionnaire or its results might be biased has to do with technochauvinists' unique worldview. The people who believe that math and computation are 'more objective' or 'fairer' tend to be the kind of people who think that inequality and structural racism can be erased with a keystroke. They imagine that the digital world is different and better than the real world and that by reducing decisions to calculations, we can make the world more rational. When development teams are small, like-minded, and not diverse, this kind of thinking can come to seem normal. However, it doesn't move us toward a more just and equitable world" (Broussard 2018, p. 156).
13 Brayne 2014.
14 As Wang (2018, p. 236) puts it, "the rebranding of policing in a way that foregrounds statistical impersonality and symbolically removes the agency of individual officers is a clever way to cast police activity as neutral, unbiased, and rational. This glosses over the fact that using crime data gathered by the police to determine where officers should go simply sends police to patrol the poor neighborhoods they have historically patrolled when they were guided by their intuitions and biases. This 'new paradigm' is not merely a reworking of the models and practices used by law enforcement, but a revision of the police's public image through the deployment of science's claims to objectivity."
15 I am indebted to Naomi Murakawa for highlighting for
