To Live in Their Utopia: Why Algorithmic Systems Create Absurd Outcomes

tags
Predictive Analytics, Ali Alkhatib, David Graeber

Notes

explain not just the instances of errors, but how the environment surrounding these systems precipitates those instances

NOTER_PAGE: (1 . 0.2418372993912562)

structural tendency for powerful algorithmic systems to cause tremendous harm.

NOTER_PAGE: (1 . 0.27393469839513)

the future we’ve imagined and promoted for decades, as designers of technical systems, is woefully misaligned from people’s experiences

NOTER_PAGE: (1 . 0.6347537354731599)

cause substantial harms in myriad domains, often surprising the designers

NOTER_PAGE: (1 . 0.6701715550636413)

ML systems surface patterns in the data, generating models that reward recognizable expressions, identities, and behaviors. And, quite often, they punish new cases and expressions of intersectionality, and marginalized groups.

NOTER_PAGE: (2 . 0.26895406751521855)
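
A minimal sketch of this mechanism, with entirely hypothetical synthetic data (numpy and scikit-learn assumed): a single model fit to skewed data learns the majority's pattern and systematically misclassifies a rare subgroup whose relationship between features and labels differs.

```python
# Illustrative sketch: a model fit to skewed data rewards the common
# pattern and errs disproportionately on a rare subgroup.
# All data are synthetic and hypothetical; numpy/scikit-learn assumed.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Majority group (95% of the data): the label follows the feature.
X_maj = rng.normal(size=(950, 1))
y_maj = (X_maj[:, 0] > 0).astype(int)

# Rare subgroup (5%): the same feature means the opposite thing.
X_min = rng.normal(size=(50, 1))
y_min = (X_min[:, 0] < 0).astype(int)

X = np.vstack([X_maj, X_min])
y = np.concatenate([y_maj, y_min])

model = LogisticRegression().fit(X, y)

# The model "recognizes" the majority and punishes the subgroup.
print("majority accuracy:", model.score(X_maj, y_maj))  # close to 1.0
print("subgroup accuracy:", model.score(X_min, y_min))  # close to 0.0
```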

novel cases, such as when societal norms on the issues most salient to that case have changed.

NOTER_PAGE: (2 . 0.37133370226895407)

power as one of the factors designers need to identify and struggle with

NOTER_PAGE: (2 . 0.5954620918649696)

massive algorithmic systems that harm marginalized groups as functionally similar to massive, sprawling administrative states that James Scott describes in Seeing Like a State [1]

NOTER_PAGE: (2 . 0.630879911455451)

harm emerges later, as these systems impose the worldview instantiated by the “abridged maps” of the world they generate

NOTER_PAGE: (2 . 0.7703375760929717)

Legibility

absurdity follows when algorithmic systems deny the people they mistreat the status to lodge complaints, let alone the power to repair, resist, or escape the world that these systems create.

NOTER_PAGE: (3 . 0.16214720531267293)

algorithmic systems try to insist that they live in their utopias.

NOTER_PAGE: (3 . 0.19867183176535694)

absurdity and tragedy tend to manifest when bureaucratic imaginations diverge from reality and when people can’t override the delusions baked into those imaginations.

NOTER_PAGE: (3 . 0.3901494189263973)

our community as researchers, as designers, and crucially as participants in society should focus our efforts on designing and supporting resistance & escape from these domains.

NOTER_PAGE: (3 . 0.5096845600442722)

for the first several decades, scientific foresters enjoyed unprecedented success and bounty. But after 70 or 80 years - that is, after a generation of trees had run its full course - things started to go terribly wrong.

NOTER_PAGE: (4 . 0.39125622578859987)

a physician decides to diagnose a patient using the categories that the insurance company will accept. The patient then self-describes, using that label to get consistent help from the next practitioner seen. The next practitioner accepts this as part of the patient’s history of illness

NOTER_PAGE: (5 . 0.20199225235196458)

the data themselves won’t directly implicate the complex histories of slavery, white supremacy, redlining, and other factors that bear on relative inequities in health and the relative inequities of wealth

NOTER_PAGE: (5 . 0.5218594355285002)

The model doesn’t have a place for these people, and the algorithm will flag them as anomalies

NOTER_PAGE: (5 . 0.624792473713337)
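
A sketch of what "flagging as anomalies" can look like mechanically (hypothetical data; scikit-learn's IsolationForest is used here only as a stand-in for whatever detector a deployed system might run): a detector fit solely on the population its designers modeled marks anyone outside that pattern as an outlier.

```python
# Illustrative sketch: an anomaly detector fit only on the population
# its designers modeled flags anyone outside that pattern as an outlier.
# Synthetic, hypothetical data; IsolationForest is a stand-in detector.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

modeled_population = rng.normal(loc=0.0, scale=1.0, size=(1000, 2))
detector = IsolationForest(random_state=0).fit(modeled_population)

# A person whose life doesn't resemble the modeled pattern:
unmodeled_person = np.array([[4.0, -4.0]])
print(detector.predict(unmodeled_person))  # [-1], i.e. "anomaly"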

some ostensibly objective goal that designers insist is better than decisions humans make in some or many ways

NOTER_PAGE: (6 . 0.1627006087437742)

models of the world that totally fail to account for the history of criminalizing Blackness

NOTER_PAGE: (6 . 0.7166574432761482)

computational models have none of the tools necessary to account for biases of which these systems aren’t aware in the first place.

NOTER_PAGE: (6 . 0.7670171555063641)

observe the world that the computer vision system has constructed: a world without a concept of gender identity beyond “male” or “female”

NOTER_PAGE: (7 . 0.2379634753735473)

This system doesn’t understand that it reaffirms harmful gender binaries because it can’t understand anything.

NOTER_PAGE: (7 . 0.25345877144438295)
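
The point is structural, not behavioral: the binary isn't a conclusion the system reaches but the only vocabulary it has. A toy sketch (labels and logits are hypothetical, not any real system's API):

```python
# Illustrative sketch: with a closed, binary label set, the output is
# always one of two labels; "neither" is architecturally unrepresentable.
# Labels and logits are hypothetical, not from any real system.
import numpy as np

LABELS = ["male", "female"]  # the system's entire "world" of gender

def classify(logits: np.ndarray) -> str:
    probs = np.exp(logits - logits.max())  # softmax over the closed set
    probs /= probs.sum()
    return LABELS[int(np.argmax(probs))]

# Even near-uniform evidence yields a confident-looking binary verdict.
print(classify(np.array([0.01, -0.02])))  # "male"
```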

For graduate student applicants, for whom this system largely decided the trajectory of their adult lives, these systems were tremendously consequential; for the designers of the system, what mattered was that it “reduc[ed] the total time spent on reviews by at least 74%”

NOTER_PAGE: (7 . 0.588267847260653)

When designers produce and impose navigation systems without regard for bodies of water and buildings, those systems demand that couriers literally traverse a world that only exists in the model of the system

NOTER_PAGE: (9 . 0.3735473159933591)
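
A sketch of this "abridged map" failure (hypothetical grids; plain BFS stands in for whatever routing a real system uses): the planner searches the map the system believes in, so its "optimal" route can cross a river that exists only in reality, not in the model.

```python
# Illustrative sketch: a router plans over the map the system believes
# in, so its "optimal" route can cross a river that exists in reality
# but not in the model. Grids and BFS are hypothetical stand-ins.
from collections import deque

MODEL = [[0] * 5 for _ in range(5)]      # abridged map: all passable
REALITY = [[0, 0, 1, 0, 0],              # 1 = a river the model omits
           [0, 0, 1, 0, 0],
           [0, 0, 1, 0, 0],
           [0, 0, 1, 0, 0],
           [0, 0, 1, 0, 0]]

def shortest_path(grid, start, goal):
    queue, seen = deque([(start, [start])]), {start}
    while queue:
        (r, c), path = queue.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < 5 and 0 <= nc < 5 and grid[nr][nc] == 0
                    and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))
    return None

route = shortest_path(MODEL, (2, 0), (2, 4))
print("planned route:", route)
print("crosses water in reality:",
      any(REALITY[r][c] == 1 for r, c in route))  # True
```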

music streaming services’ influence on expressed patterns in music [55]; what this paper shows is that this kind of homogenizing effect, producing a monoculture by coercing, erasing, and oppressing those who don’t already fit the existing pattern, takes place wherever AI wields overbearing power

NOTER_PAGE: (9 . 0.4764803541781959)

the totalizing need for data is one that motivates technology and AI but dooms it

NOTER_PAGE: (10 . 0.12949640287769784)

structures of transparency inevitably become structures of stupidity as soon as [formalization of cliques into committees] takes place

NOTER_PAGE: (10 . 0.4272274488101826)

One option Graeber offers is to “reduce all forms of power to a set of clear and transparent rules”.

NOTER_PAGE: (10 . 0.45877144438295514)

intentionally limit the power that these informal structures have

NOTER_PAGE: (10 . 0.5467625899280575)

the ambition to capture everything in order to comprehensively describe a phenomenon reveals a certain degree of hubris

NOTER_PAGE: (11 . 0.2501383508577753)

study “…how we all travel through the thicket of time and space…”, and accept with humility that we will never be able to fully understand or even document everything

NOTER_PAGE: (11 . 0.2850027670171555)

Footnotes:

[1] James C. Scott, Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed, Yale Agrarian Studies (New Haven, Conn.: Yale Univ. Press, 2008).