The Silicon Ceiling: How Artificial Intelligence Constructs an Invisible Barrier to Opportunity

tags
Algorithmic Discrimination

Notes

a few dominant companies draw on the same data and apply similar criteria.

NOTER_PAGE: (1 . 0.35815602836879434)

prioritize options with the highest probability of success. This leads to less diverse outcomes

NOTER_PAGE: (1 . 0.3900709219858156)

automated opportunity system precludes, rather than denies, access to opportunity.

NOTER_PAGE: (1 . 0.5510638297872341)

Regulators and activists in the pursuit of equity frequently focus on the criteria and processes that schools and employers use to evaluate applicants. However, it is crucial to consider how structural inequality shapes which candidates are even considered.

NOTER_PAGE: (5 . 0.2127659574468085)

includes education and employment components of the “pipeline” leading to socioeconomic success.

NOTER_PAGE: (5 . 0.32269503546099293)

rarely result in final “decisions,” but instead drive transient search results and rankings.

NOTER_PAGE: (6 . 0.32056737588652484)

Opportunity Brokers

NOTER_PAGE: (8 . 0.1624113475177305)

affordances of the automated opportunity system permit opportunity providers—and brokers on their behalf—to evaluate individuals at informal inflection points as well.

NOTER_PAGE: (10 . 0.18014184397163122)

Schools and companies rely on automated resume-parsing tools as essential for coping with an avalanche of digital applications.

NOTER_PAGE: (10 . 0.32269503546099293)

“datafication of employment” creates records that employers can use to determine which workers to mentor, promote, or fire.

NOTER_PAGE: (14 . 0.2843971631205674)

big data systems allow organizations to “see like a market.”

NOTER_PAGE: (16 . 0.16099290780141845)

algorithmic assessments in the potential candidate market do not fit with decision-focused regulatory regimes.

NOTER_PAGE: (16 . 0.2851063829787234)

The scale and pressures of the platform economy encourage both organizations and opportunity seekers to prioritize options that algorithms suggest have the highest probability of success.

NOTER_PAGE: (17 . 0.299290780141844)

often exclude qualified candidates who do not use standardized terminology. In one experiment, fake resumes meeting all job description requirements scored 43 out of 100 based on the formatting of the graduate education section.

NOTER_PAGE: (19 . 0.23617021276595745)
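The failure mode in this excerpt (exact-match keyword screening penalizing non-standard terminology) can be sketched as a toy scorer. This is illustrative only; the function name, required-term list, and sample resume are hypothetical, not taken from the article or any real parsing tool:

```python
# Toy illustration of how exact keyword matching can reject a qualified
# candidate who describes the same skills in non-standard terms.
# All names and term lists here are hypothetical.

REQUIRED_TERMS = {"python", "machine learning", "sql"}

def keyword_score(resume_text: str) -> float:
    """Score a resume by the fraction of required terms matched verbatim."""
    text = resume_text.lower()
    matched = sum(1 for term in REQUIRED_TERMS if term in text)
    return 100 * matched / len(REQUIRED_TERMS)

# A qualified candidate who writes "ML" and "Postgres" instead of the
# expected phrases matches only "python" and scores far below threshold:
resume = "Built ML pipelines in Python; queried Postgres databases daily."
print(round(keyword_score(resume), 1))  # -> 33.3
```

The point of the sketch is that the low score reflects vocabulary, not qualifications, which is consistent with the experiment quoted above where formatting alone drove the result.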

calculated that the factors most predictive of job performance were being named Jared or having played high school lacrosse.

NOTER_PAGE: (19 . 0.43049645390070923)

errors that occur in the automated opportunity system are difficult to detect.

NOTER_PAGE: (20 . 0.39078014184397164)

an anti-bias auditing company may analyze outcomes according to race and gender, but not new axes of discrimination based on thousands of variables that reflect protected class status.

NOTER_PAGE: (22 . 0.124822695035461)

data subjects must not only have access to relevant information but also an ability to understand its significance.

NOTER_PAGE: (22 . 0.4113475177304965)

They simply create a slightly expanded “new boys” network that still promotes candidates with elite class markers.

NOTER_PAGE: (24 . 0.27092198581560284)

Employers, for example, increasingly add non-essential qualifications in their job descriptions to weed out more applications automatically.

NOTER_PAGE: (25 . 0.1773049645390071)

Employers often exclude applicants who meet “knockout” criteria that opportunity providers view as negative, such as a gap in full-time employment.

NOTER_PAGE: (25 . 0.3028368794326241)
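The "knockout" mechanism described here is a hard filter applied before any holistic evaluation. A minimal sketch, assuming a hypothetical rule that auto-rejects applicants with an employment gap over six months (the rule, data shape, and function names are my illustration, not the article's):

```python
from datetime import date

def has_long_gap(jobs: list[tuple[date, date]], max_gap_days: int = 180) -> bool:
    """Return True if any gap between consecutive jobs exceeds max_gap_days."""
    jobs = sorted(jobs)  # order employment spells by start date
    return any((nxt_start - prev_end).days > max_gap_days
               for (_, prev_end), (nxt_start, _) in zip(jobs, jobs[1:]))

def knockout_screen(applicants: list[dict]) -> list[dict]:
    """Drop any applicant who trips the knockout rule, regardless of merit."""
    return [a for a in applicants if not has_long_gap(a["jobs"])]

# An applicant with a one-year gap is removed before anyone reads the resume:
pool = [
    {"name": "A", "jobs": [(date(2019, 1, 1), date(2020, 1, 1)),
                           (date(2021, 1, 1), date(2022, 1, 1))]},
    {"name": "B", "jobs": [(date(2019, 1, 1), date(2020, 1, 1)),
                           (date(2020, 3, 1), date(2022, 1, 1))]},
]
print([a["name"] for a in knockout_screen(pool)])  # -> ['B']
```

Because the rule is binary and runs first, no later stage can recover applicant A, which is why the article treats knockout criteria as a distinct source of exclusion.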

This exclusionary outcome is not inevitable, but it is a foreseeable one in a platform economy that pressures organizations to minimize risk and maximize speed.

NOTER_PAGE: (26 . 0.14326241134751774)

Focusing on whether users can technically access information outside their filter bubbles ignores the realities of the attention economy.

NOTER_PAGE: (27 . 0.31631205673758866)

Opportunity brokers may repeatedly draw on older records or use the same information scraped from public digital footprints or purchased from data brokers.

NOTER_PAGE: (28 . 0.21914893617021278)

Opportunity brokers may evaluate candidates using information or inferences from earlier applications, what Ifeoma Ajunwa calls “algorithmic blackballing.”

NOTER_PAGE: (28 . 0.35815602836879434)

Ex ante measures can ameliorate some problematic aspects of predictive analytics. These include running internal and external audits, conducting algorithmic impact assessments, and hiring more diverse developers.

NOTER_PAGE: (31 . 0.4787234042553192)

However, structural reforms better address the problems posed by the Silicon Ceiling. These would undermine standardization by diversifying algorithmic decision making, reducing reliance on the same data inputs, and reducing horizontal and vertical market domination.

NOTER_PAGE: (31 . 0.5843971631205674)

Technology can help make these automated assessments less deterministic.

NOTER_PAGE: (32 . 0.1425531914893617)

Regulators could also enact statutes to diversify the information input into algorithmic decisions.

NOTER_PAGE: (32 . 0.2822695035460993)

Privacy regulation could reduce reliance on older or particularly problematic records.

NOTER_PAGE: (32 . 0.3007092198581561)

“right to be forgotten”

NOTER_PAGE: (32 . 0.4042553191489362)

proposed “reputation bankruptcy,”

NOTER_PAGE: (32 . 0.44539007092198585)

Economic reforms promoting competition among opportunity platforms and brokers

NOTER_PAGE: (33 . 0.14397163120567377)