Terms of inclusion: Data, discourse, violence

tags
Datafication

Notes

As Ruha Benjamin (2019) summarizes, inclusion is part “of a larger repertoire of ‘happy talk,’ which involves a willingness to acknowledge and even revel in cultural difference without seriously challenging ongoing structural inequality”

NOTER_PAGE: (2 . 0.35014005602240894)

inclusion discourses work as a convenient cover for problems that are—as Oscar Gandy (1993) demonstrated—not separate from, but intrinsic to technologies predicated on surveillance, social sorting, and optimization.

NOTER_PAGE: (2 . 0.4208683473389356)

discursive violence as, in particular, a form of violence that operates by diffusing resistance, deepening dependency on oppressive structural conditions, and preserving the potential for other forms of violence,

NOTER_PAGE: (2 . 0.6211484593837535)

a society and its people ultimately come to depend on and internalize oppressive (or, in her case, militaristic) values and ideals, making those values and ideals appear “not only valuable but also normal”

NOTER_PAGE: (5 . 0.4789915966386555)

Or, as Iris Marion Young (1990) put it, what makes violence a form of oppression “is less the acts themselves, though these are often utterly horrible, than the social context surrounding them, which makes them possible and even acceptable”

NOTER_PAGE: (5 . 0.6239495798319328)

data science and technology reproduce, as Louise Amoore (2009) puts it, “a militarization of thinking” (p. 65) that “reinscribes the imaginative geography of the deviant, atypical, abnormal ‘other’” (p. 56) in new, often routinized ways.

NOTER_PAGE: (6 . 0.09803921568627451)

many of the activities that fall under this heading are not antagonistic toward, but congenial to the further proliferation of data science and technology.

NOTER_PAGE: (7 . 0.15266106442577032)

Against inclusion, or: the discursive bases of data violence

NOTER_PAGE: (10 . 0.21498599439775912)

Perhaps most notable are the ways inclusion discourses too neatly pose and resolve the problem of violence.

NOTER_PAGE: (10 . 0.45938375350140054)

Insidiously, this reasoning subtly normalizes a near total absence of the word “no” in discussions of data ethics

NOTER_PAGE: (10 . 0.5868347338935574)

“compensatory individualism” (Walcott, 2019: 402) that offloads the work of security, leveraging iterative updates and design tweaks to insulate themselves from deeper or more meaningful challenges to their power and position

NOTER_PAGE: (10 . 0.8396358543417367)

such “inclusive” solutions evade accountability and instead “empower” individuals to regulate their own safety and visibility.

NOTER_PAGE: (11 . 0.09943977591036415)

inclusion efforts that convert broad social aspirations to matters of individual responsibility ultimately exacerbate—rather than alleviate—existing inequalities.

NOTER_PAGE: (11 . 0.15266106442577032)

If mass politics and structural transformation are off the table, then ethical intervention is largely reduced to questions of design and development.

NOTER_PAGE: (11 . 0.5315126050420168)

perversely reify the exclusive nature of technical expertise—as Greene et al. (2019: 2129) put it, they frame ethical problems as best solved by those best positioned to technically intervene, especially in areas like machine learning or AI.

NOTER_PAGE: (11 . 0.5861344537815126)

discourses of inclusion ultimately reproduce, rather than subvert the legitimating power of dominant actors—that is, the power to mark off and bestow recognition upon diverse “others.”

NOTER_PAGE: (12 . 0.09663865546218488)

the discursive excess of inclusion; inclusion discourses do not simply normalize, but dupe us into celebrating the very power structures that generate asymmetrical vulnerabilities to violence in the first place.

NOTER_PAGE: (12 . 0.40476190476190477)