- tags
- Prediction Deskilling
Notes
NOTER_PAGE: (1 0.5568993952721276 . 0.5557184750733137)
increasingly involves asserting probable outcomes as bases for definitive judgments – despite mounting evidence that much of what we call AI is “fundamentally dubious”
NOTER_PAGE: (1 0.5975810885101704 . 0.15175953079178886)
novel mechanisms of capture and analysis regularly reproduce pre-existing social asymmetries.
NOTER_PAGE: (1 0.6179219351291919 . 0.6363636363636364)
prediction is primarily not a technological instrument for knowing future outcomes, but a social model for extracting and concentrating discretionary power: that is, people’s ordinary capacity to define their situation.
NOTER_PAGE: (1 0.6717976910390324 . 0.23093841642228738)
frequently means making life and work unpredictable for the target of prediction.
NOTER_PAGE: (1 0.7234744365035735 . 0.6114369501466276)
strategies and priorities of over a century of workplace automation.
NOTER_PAGE: (1 0.7537108301264431 . 0.8255131964809385)
“render all agonistic political difficulty as tractable and resolvable”
NOTER_PAGE: (1 0.7822979659153382 . 0.7390029325513197)
how new methods of producing knowledge generate a redistribution of epistemic power: that is, who declares what kind of truth about me, to count for what kinds of decisions?
NOTER_PAGE: (1 0.8246289169873557 . 0.10483870967741936)
it is us – and our discretion – that is being defined as the risk and error.
NOTER_PAGE: (2 0.09785596481583288 . 0.2558651026392962)
framing questions are being settled not in the proverbial roundtable of independent thinkers engaged in good faith, but in a quagmire of industry-funded lobby groups, corporate ethics teams that mistreat and fire their own ethics experts, and active co-option of critical vocabulary
NOTER_PAGE: (2 0.2314458493677845 . 0.08284457478005865)
remains the case even when many such predictions remain unproven or, at times, specifically disproven.
NOTER_PAGE: (2 0.23474436503573393 . 0.6026392961876833)
NOTER_PAGE: (2 0.2638812534359538 . 0.5740469208211144)
NOTER_PAGE: (2 0.30896096756459596 . 0.5997067448680352)
Big Tech “make[s] the water in which AI research swims”
NOTER_PAGE: (2 0.32435404068169327 . 0.34750733137829914)
corporate-affiliated authorship in AI research
NOTER_PAGE: (2 0.38537658053875756 . 0.2470674486803519)
NOTER_PAGE: (2 0.4161627267729522 . 0.13269794721407624)
NOTER_PAGE: (2 0.4766355140186916 . 0.093841642228739)
NOTER_PAGE: (2 0.6113249037932931 . 0.5491202346041055)
social impact of technologies also tends to exceed their actual capabilities or implementation.
NOTER_PAGE: (2 0.6597031335898845 . 0.0813782991202346)
institutional and economic responses to what people expected automation to be and to do – responses which often persisted even when the technology never quite arrived
NOTER_PAGE: (2 0.6882902693787796 . 0.42888563049853373)
‘relation’, in its modern sense, allows us to think the world in terms of discrete phenomena which may be joined, separated, sorted, mixed, in highly modular and indiscriminate ways.
NOTER_PAGE: (2 0.7476635514018692 . 0.5)
NOTER_PAGE: (2 0.8246289169873557 . 0.6664222873900293)
computing imposes its own ways of seeing on those of other domains.
NOTER_PAGE: (3 0.12864211105002749 . 0.5652492668621701)
dismiss such consequences as ‘side effects’ that simply could not be helped.
NOTER_PAGE: (3 0.1753710830126443 . 0.1693548387096774)
an expectation of calculability which drives out everything that does not fit.
hidden mathematical order that is ontologically superior
NOTER_PAGE: (3 0.4183617372182518 . 0.1063049853372434)
As Scott himself would later note: “legibility doesn’t come cheap.”
NOTER_PAGE: (3 0.5981308411214954 . 0.8607038123167156)
hear one investor fall back on exactly this kind of broad article of faith: “you can’t ban technology. Sure, that might lead to a dystopian future or something, but you can’t ban it.”
NOTER_PAGE: (3 0.6432105552501375 . 0.2060117302052786)
“all epistemology begins in fear—fear that the world is too labyrinthine to be threaded by reason; fear that the senses are too feeble and the intellect too frail”.
NOTER_PAGE: (3 0.7619571192963167 . 0.10117302052785923)
Peter Galison
patterns of extraction shape the research questions and the choice of what to measure (and what to dismiss without measuring). When predictive successes are defined as ‘beating’ humans in tests of accuracy, they reify the model’s own parameters as the world that matters,
NOTER_PAGE: (3 0.8092358438702584 . 0.656158357771261)
capitalism constructs regimes of justification: the rules of the game by which we assess its own success and failure.
NOTER_PAGE: (4 0.1011544804837823 . 0.08870967741935484)
We learn to feel that it is surely unrealistic to expect that an employer will not squeeze their workers as hard as they can
NOTER_PAGE: (4 0.1423859263331501 . 0.07991202346041056)
Crime and policing are emblematic, partly because the very history of crime as a measurable object, and of modern policing as an institution, are defined by this project of extracting discretion.
NOTER_PAGE: (4 0.4612424409015943 . 0.250733137829912)
decisions like what kind of data is and is not gathered, or where the resulting ‘insights’ are deployed, are written off as someone else’s problem.
NOTER_PAGE: (4 0.4612424409015943 . 0.7148093841642229)
NOTER_PAGE: (4 0.5981308411214954 . 0.12096774193548387)
‘criminality’ appears as a state that inheres in the person,
NOTER_PAGE: (4 0.685541506322155 . 0.1217008797653959)
The grammar of data-driven prediction allows, and even encourages, the researcher to avoid asking such questions in the first place
NOTER_PAGE: (4 0.851566794942276 . 0.17521994134897362)
unnecessary to understand what criminality is as long as we can produce actionable measurements bearing its name.
NOTER_PAGE: (4 0.902693787795492 . 0.3980938416422287)
high-profile police killings of Black Americans precipitate repeated public debate on just how prevalent such killings are. Yet each time, it is data on Black crime that tends to be readily available, while data on police misconduct and violence is underfunded, neglected, and sometimes actively suppressed.
NOTER_PAGE: (5 0.22100054975261132 . 0.7639296187683284)
Chicago’s police union was working to destroy records of police misconduct, as they had been doing for years
NOTER_PAGE: (5 0.32545354590434306 . 0.6231671554252199)
deployment of predictive systems can serve to protect and relegitimise bureaucratic discretion
NOTER_PAGE: (5 0.3672347443650357 . 0.18695014662756598)
The mundane work of documenting, photographing, and ‘writing up’ targets thus becomes a site where police workers’ discretion feeds into large data-driven systems.
NOTER_PAGE: (5 0.4595931830676196 . 0.25806451612903225)
condemn certain kinds of suffering and lived experience as ‘merely anecdotal’, forced into an uphill struggle to count as ‘data’, while the police receive the numbers they need by default.
NOTER_PAGE: (5 0.4606926882902694 . 0.6744868035190615)
LSI-R has fared poorly in tests of interrater consistency,
NOTER_PAGE: (5 0.6849917537108302 . 0.09310850439882698)
do not simply shift the entire apparatus towards inhuman objectivity, but rather empower new norms on who gets to impose their discretion upon whom.
NOTER_PAGE: (5 0.7355689939527214 . 0.16788856304985336)
Predictive systems are often used to delegitimise the kinds of lives and experiences that are already too disadvantaged to generate rich ‘data’ in the first place.
NOTER_PAGE: (5 0.7493128092358439 . 0.626099706744868)
data for me, and not for thee.
NOTER_PAGE: (5 0.7943925233644861 . 0.31451612903225806)
discretion as a means of managing the tension between document and person, government policy and individual needs,
NOTER_PAGE: (6 0.11214953271028037 . 0.3291788856304985)
NOTER_PAGE: (6 0.12699285321605278 . 0.6730205278592375)
the most important use of such discretion was to judge when the rules should not apply,
NOTER_PAGE: (6 0.3122594832325454 . 0.12683284457478006)
predictive technologies often redistribute discretionary power within bureaucracies as well, from individual truck drivers to clerks monitoring them through electronic logging devices
NOTER_PAGE: (6 0.3397471137987906 . 0.7038123167155426)
link prediction indelibly to discretion.
NOTER_PAGE: (6 0.3996701484332051 . 0.21407624633431085)
ground-level customs officers in airports to centralised data analytics teams
NOTER_PAGE: (6 0.40131940626717977 . 0.5359237536656891)
algorithmic systems are enacting a broader shift from street-level bureaucracies to systems-level bureaucracies
NOTER_PAGE: (6 0.4310060472787246 . 0.5659824046920822)
establish rule-bound processes for decision-making,
NOTER_PAGE: (6 0.4485981308411215 . 0.08357771260997067)
provide scripts for justifying that judgment
NOTER_PAGE: (6 0.4617921935129192 . 0.1348973607038123)
Discretion, then, describes the always unequal distribution of the power to define the situation – a distribution which data-driven prediction seeks to actively reconfigure.
NOTER_PAGE: (6 0.4738867509620671 . 0.5219941348973607)
prediction serves to reallocate discretionary power across different actors, and additionally to obfuscate the continuing role of discretionary power in decision-making.
NOTER_PAGE: (6 0.5382078064870809 . 0.11950146627565983)
NOTER_PAGE: (6 0.6877405167674546 . 0.5021994134897361)
sucked into not only the opprobrium of rule and all its painstaking requirements, but also the prosecution’s ability to exercise discretion over that rule.
NOTER_PAGE: (6 0.7493128092358439 . 0.08431085043988269)
NOTER_PAGE: (6 0.7949422759758109 . 0.2287390029325513)
NOTER_PAGE: (6 0.8680593732820231 . 0.843108504398827)
wrap their own discretionary judgment in the cloak of the rule
NOTER_PAGE: (6 0.8697086311159978 . 0.20307917888563048)
NOTER_PAGE: (7 0.14293567894447498 . 0.5652492668621701)
the exact rate that a worker must hit at a given warehouse is never disclosed to the workers themselves.
NOTER_PAGE: (7 0.20505772402418912 . 0.1429618768328446)
‘project loving energy’, counsels a workstation screen
NOTER_PAGE: (7 0.35459043430456294 . 0.14516129032258066)
I'm going to vomit
cater to managers’ and employers’ desire for a certain kind of inhuman clarity, in which the many variations and ambiguities inherent in any act of labour are not actually eliminated, but simply neglected.
NOTER_PAGE: (7 0.40131940626717977 . 0.17961876832844575)
The Amazon ‘picker’ is constantly adapting to the algorithmic redistribution of boxes and goods, unable to accumulate their own rhythms for effective and safe work.
NOTER_PAGE: (7 0.4909290819131391 . 0.1407624633431085)
For the worker, it is their lived time chunked up into a pulsating mess of alarms and nudges, distractions and panic; for the manager and employer, it is a vast predictive matrix which externalises everything that the model would prefer not to predict and funnels the cost to the worker. What the model refuses to count cannot hurt it.
NOTER_PAGE: (7 0.502473886750962 . 0.7785923753665689)
This disparity between the predictor and the predicted reprises over a century of labour struggle,
NOTER_PAGE: (7 0.6558548653106102 . 0.5227272727272727)
making labour more predictable for some requires making it less predictable for others.
NOTER_PAGE: (7 0.6723474436503574 . 0.2859237536656892)
significantly higher rates of severe injury
NOTER_PAGE: (7 0.7317207256734469 . 0.24120234604105573)
NOTER_PAGE: (7 0.7647058823529412 . 0.7895894428152492)
Amazon’s response to these problems is to intensify this dynamic of profit-oriented datafication, rather than to restore discretion at the floor level.
NOTER_PAGE: (7 0.8526663001649258 . 0.1906158357771261)
data grist for improving predictions of the stowing process. Individual judgments and tacit knowledge are siphoned into a unique source of knowledge for managers, but not for the workers themselves. The key innovation is not merely to pack seven boxes instead of six, but to ensure that it is the manager who can set the ‘rate’ of seven, or eight, or two hundred, in ways that are precise for the manager, and opaque to the worker.
NOTER_PAGE: (8 0.10720175920835624 . 0.2316715542521994)
After a presentation on technologies for mapping the ground to identify oil sources, the managers in the room – mostly American expats – ask instead for AI-driven worker surveillance.
NOTER_PAGE: (8 0.26608026388125344 . 0.717741935483871)
The worker is the suspect, and it is this a priori declaration that determines what role the data will play to begin with.
NOTER_PAGE: (8 0.36558548653106104 . 0.6444281524926686)
General Electric’s postwar investment into machine shop automation was an explicit response to the success of nationwide union strikes in 1946,
NOTER_PAGE: (8 0.3996701484332051 . 0.1744868035190616)
NOTER_PAGE: (8 0.4590434304562947 . 0.5102639296187683)
must “eradicate” the “fantasy” that “the employees … were in the driver’s seat.”
NOTER_PAGE: (8 0.5222649807586586 . 0.19794721407624633)
It is the subjects of measurement who are preemptively defined as objects of suspicion and danger, whose exercise of discretionary power over their own circumstances is primarily seen as a source of unwelcome uncertainty.
NOTER_PAGE: (8 0.5634964266080265 . 0.7052785923753666)
‘record-playback’ approach to automation eventually lost out to numeric control in part because the managers and employers making the acquisition decisions wanted the technology to transfer discretionary power away from the worker.
NOTER_PAGE: (8 0.5788894997251237 . 0.3343108504398827)
such presumptions enter into the social life of prediction before and beyond any question of statistical bias in a dataset or the appropriateness of particular object labels.
NOTER_PAGE: (8 0.7768004398020891 . 0.36656891495601174)
“what is today called ‘automation’ is conceptually a logical extension of Taylor’s scientific management [in which] productivity required that ‘doing’ be divorced from ‘planning’.”
NOTER_PAGE: (8 0.8086860912589335 . 0.8350439882697948)
NOTER_PAGE: (9 0.18526663001649257 . 0.12976539589442815)