Information Fiduciaries in Practice: Data Privacy and User Expectations

tags
Privacy

Notes

The law does nothing to manage this relationship

NOTER_PAGE: (2 . 0.41038771031455745)

a company took advantage of personal information it possessed (information with which users had entrusted them) for its own benefit

NOTER_PAGE: (4 . 0.5730994152046783)

the threat of trust disappearing is not enough to influence service providers to protect user privacy on their own

NOTER_PAGE: (6 . 0.21198830409356723)

markets rely on consumers having enough knowledge to inform their decision-making.

NOTER_PAGE: (6 . 0.26973684210526316)

As designated information fiduciaries, service providers would have "special duties to act in ways that do not harm the interests of the people whose information they collect, analyze, use, sell, and distribute."

NOTER_PAGE: (8 . 0.3467446964155084)
NOTER_PAGE: (8 . 0.6437454279444038)

II. BACKGROUND

NOTER_PAGE: (9 . 0.30926331145149527)

III. BREACHING FIDUCIARY STATUS: FOUR MAIN PRINCIPLES

NOTER_PAGE: (18 . 0.1485003657644477)

what separates an acceptable practice from an unacceptable one is users' expectations

NOTER_PAGE: (18 . 0.3152889539136796)

a reasonable user's expectations can, and should, shift in response to various prompts

NOTER_PAGE: (18 . 0.5910753474762254)

user notification should not be a complete safe harbor.

NOTER_PAGE: (18 . 0.7073884418434528)

A. ANTI-MANIPULATION OF THE USER

NOTER_PAGE: (19 . 0.5273922571219869)

often, the user has no easy way of knowing whether and how it is happening

NOTER_PAGE: (19 . 0.5982468955441929)

a manipulative statement or action is one that "does not sufficiently engage or appeal to people's capacity for reflective and deliberative choice."

NOTER_PAGE: (19 . 0.6216216216216216)

an action is not manipulative simply because it is an attempt to alter another person's behavior;

NOTER_PAGE: (19 . 0.716581446311176)

it must also attempt to circumvent the other person's "capacity for reflection and deliberation."

NOTER_PAGE: (19 . 0.7772096420745069)

1. A Dignity- and Autonomy-Focused Conception of Manipulation

NOTER_PAGE: (20 . 0.27578639356254575)

2. A Welfarist Conception of Manipulation

NOTER_PAGE: (20 . 0.640087783467447)

3. Targeted Advertising

NOTER_PAGE: (21 . 0.4514952589350839)

users are conditioned for this; they are familiar with the concept of advertising and know that ads are meant to manipulate them

NOTER_PAGE: (21 . 0.6148796498905909)

there is no evidence to suggest that people trust Facebook as an advisor on the variety of products advertised through its platform

NOTER_PAGE: (22 . 0.246525237746891)

4. Hypotheticals

NOTER_PAGE: (22 . 0.7029992684711046)

drivers' ultimate decisions may incorporate Uber's influence without them realizing it

NOTER_PAGE: (24 . 0.6217995610826628)

using what [it knows] about drivers, [its] control over the interface, and the terms of transaction to channel the behavior of the driver

NOTER_PAGE: (25 . 0.17153284671532845)

the expectation is that the algorithm changes in the same way for every user

NOTER_PAGE: (25 . 0.6978102189781021)

B. ANTIDISCRIMINATION

NOTER_PAGE: (27 . 0.3897810218978102)

must not offer different services or prices to individuals based on their membership (or non-membership) in a protected class

NOTER_PAGE: (27 . 0.5408759124087591)

How meaningful is the concept of a protected class in this context? Does it count if the service has only inferred, e.g., gender from past purchasing patterns? If so, how is any recommender system possible? And what if the attribute was never inferred directly but the effects are still discriminatory?

Big Data makes it unwise to focus only on traditional targets of discrimination, such as racial minorities. As more data emerges, it may become the case that the people against whom firms discriminate do not correspond with the traditional categories

NOTER_PAGE: (28 . 0.27066569129480617)

discriminate against a new group: those who are less "valuable."

NOTER_PAGE: (28 . 0.3847841989758596)

antidiscrimination principle, then, involves a moving target that will need to be reassessed periodically to identify who may be harmed or left behind by algorithmic decision-making.

NOTER_PAGE: (28 . 0.48354059985369424)

1. Access to Services

NOTER_PAGE: (28 . 0.5647403072421361)

tailored services are often seen as a feature of Big Data, rather than a bug

NOTER_PAGE: (29 . 0.5284671532846715)

not discrimination as long as the services are available to both people.

NOTER_PAGE: (29 . 0.664963503649635)

The runner may have to actively search for the Game of Thrones pages if she wants to view them, but as long as they are available to her, the practice is consistent with the fiduciary duty

NOTER_PAGE: (29 . 0.7423357664233576)

2. Price Discrimination

NOTER_PAGE: (30 . 0.12435991221653256)

The change in Pop-Tart pricing would be based on publicly available data, such as timing and weather patterns. But the second example should be prohibited for fiduciaries, since it disadvantages a group based on data about users' gender obtained for another purpose

NOTER_PAGE: (30 . 0.31382589612289685)

3. Digital Redlining

NOTER_PAGE: (31 . 0.3773722627737226)

4. Hypotheticals

NOTER_PAGE: (33 . 0.47337709700948216)

C. LIMITED SHARING WITH THIRD PARTIES

NOTER_PAGE: (37 . 0.2961342086068563)

D. VIOLATING THE COMPANY'S OWN PRIVACY POLICY

NOTER_PAGE: (44 . 0.7461594732991953)