Abolish the #TechToPrisonPipeline

tags
Algorithmic Discrimination

Notes

expected by-product of any field that evaluates the quality of its research almost exclusively on the basis of “predictive performance.”

NOTER_PAGE: (4 . 0.7611697806661251)

there is no way to develop a system that can predict or identify “criminality” that is not racially biased — because the category of “criminality” itself is racially biased.

NOTER_PAGE: (5 . 0.27213647441104794)

the assumption that data regarding criminal arrest and conviction can serve as reliable, neutral indicators of underlying criminal activity

NOTER_PAGE: (5 . 0.43135662063363117)
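Why that assumption fails is easy to make concrete. A minimal sketch, with entirely hypothetical numbers (the group labels, offense rate, and policing rates below are illustrative assumptions, not data from the letter): if two groups offend at the same underlying rate but are policed at different intensities, the arrest record alone makes the more heavily policed group look more "criminal."

```python
import random

random.seed(0)

N = 100_000                              # people per group (assumed)
OFFENSE_RATE = 0.05                      # identical true offending rate in both groups
POLICING_RATE = {"A": 0.10, "B": 0.40}   # assumed differential enforcement intensity

for group, policing in POLICING_RATE.items():
    # True offending is identical across groups by construction.
    offenses = sum(random.random() < OFFENSE_RATE for _ in range(N))
    # An offense only enters the arrest record if it is observed/enforced.
    arrests = sum(random.random() < policing for _ in range(offenses))
    print(f"group {group}: true offense rate {offenses / N:.3f}, "
          f"recorded arrest rate {arrests / N:.4f}")
```

Both groups offend at roughly 5%, but group B's recorded arrest rate comes out about four times group A's. A model trained on arrest labels "learns" this enforcement gap as if it were a difference in criminality.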

any effort to identify “criminal faces” is an application of machine learning to a problem domain it is not suited to investigate, a domain in which context and causality are essential and also fundamentally misinterpreted

NOTER_PAGE: (6 . 0.1397238017871649)

Predictive Analytics

research of this nature creates dangerous feedback loops

NOTER_PAGE: (7 . 0.21283509341998375)
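The feedback-loop mechanism can also be sketched directly. The toy simulation below is loosely modeled on the Pólya-urn analysis in Ensign et al. (2018, "Runaway Feedback Loops in Predictive Policing") and the critique in Lum & Isaac (2016, "To predict and serve?"); all quantities (two districts, the crime rate, the initial arrest record) are illustrative assumptions. Patrols are allocated wherever past arrests were recorded, and arrests can only be recorded where patrols go.

```python
import random

random.seed(1)

CRIME_RATE = 0.5          # same true crime rate in BOTH districts (assumed)
arrest_history = [6, 4]   # small initial imbalance in the recorded arrests

for day in range(1, 5001):
    # The "predictive" model sends today's patrol to a district with
    # probability proportional to its arrest record, treating past
    # arrests as if they were ground-truth crime.
    total = sum(arrest_history)
    district = 0 if random.random() < arrest_history[0] / total else 1
    # Crime is only *recorded* in the district the patrol visits.
    if random.random() < CRIME_RATE:
        arrest_history[district] += 1
    if day % 1000 == 0:
        share = arrest_history[0] / sum(arrest_history)
        print(f"day {day}: district 0 holds {share:.2f} of the arrest record")
```

Although the two districts have identical true crime rates, the allocation drifts and locks in an imbalance determined by noise and the initial record: the new data keeps confirming the model because the model decides where data gets collected.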

explicitly recognizes the active and crucial role that the data scientist (and the institution they’re embedded in) plays in constructing meaning

NOTER_PAGE: (8 . 0.33143785540211207)

criminality has no stable empirical existence

NOTER_PAGE: (26 . 0.29894394800974816)