- tags
- Automation
Notes
Abstract
NOTER_PAGE: (1 0.3104125736738704 . 0.14915254237288136)
the concept of a moral crumple zone to describe how responsibility for an action may be misattributed to a human actor who had limited control over the behavior of an automated or autonomous system.
NOTER_PAGE: (1 0.3935821872953504 . 0.3711864406779661)
Introduction
NOTER_PAGE: (1 0.7066142763588736 . 0.15)
At the steering wheel of the putative “autonomous vehicle,” a safety driver sat. Her job was to monitor the car’s systems and take over in the event of an emergency. The safety driver may now face criminal charges of vehicular manslaughter
NOTER_PAGE: (1 0.7452521283562541 . 0.3889830508474576)
I articulate the concept of a moral crumple zone to describe how responsibility for an action may be misattributed to a human actor who had limited control over the behavior of an automated or autonomous system.
NOTER_PAGE: (2 0.32678454485920105 . 0.6898305084745763)
Just as the crumple zone in a car is designed to absorb the force of impact in a crash, the human in a highly complex and automated system may become simply a component—accidentally or intentionally—that bears the brunt of the moral and legal responsibilities when the overall system malfunctions. While the crumple zone in a car is meant to protect the human driver, the moral crumple zone protects the integrity of the technological system, at the expense of the nearest human operator.
NOTER_PAGE: (2 0.3759004584151932 . 0.1483050847457627)
The Accident at Three Mile Island
NOTER_PAGE: (3 0.42043222003929276 . 0.1423728813559322)
It would later come to light that these filters had consistently caused problems that the plant management had ignored.
NOTER_PAGE: (4 0.39685658153241654 . 0.4152542372881356)
Unintentionally, the workers choked off the flow of the entire feedwater system, preventing the secondary cooling water from circulating.
NOTER_PAGE: (4 0.4623444662737394 . 0.3347457627118644)
relief valve designed to release pressure in the core had been triggered. The valve opened as designed, but the mechanism jammed and the valve never closed as it should have.
NOTER_PAGE: (4 0.5455140798952194 . 0.15)
two days earlier, during a routine testing procedure, the valves in question had accidentally been left closed. The incorrect position of the valves was not linked to any indicators in the control room, and the mistake went unnoticed.
NOTER_PAGE: (4 0.6070726915520629 . 0.15508474576271186)
The operators, in the midst of multiple visual and audio error messages, misinterpreted the situation and relied on system readings
NOTER_PAGE: (4 0.710543549443353 . 0.21525423728813559)
news coverage in the weeks and months following the accident focused on the role of operator error,
NOTER_PAGE: (5 0.3117223313686968 . 0.3898305084745763)
Only at the end of the article was it stated that the plant design made it especially hard to control
NOTER_PAGE: (5 0.4315651604453176 . 0.6152542372881356)
Without a doubt, actions taken by the plant operators led to the accident and exacerbated its severity.
NOTER_PAGE: (5 0.49836280288146695 . 0.2076271186440678)
the design of the control room played a central role in compounding human misinterpretations
NOTER_PAGE: (5 0.6090373280943026 . 0.2389830508474576)
the physical conditions of the system were not adequately represented in the control interface
NOTER_PAGE: (5 0.6444007858546169 . 0.19915254237288135)
no direct indicators of the level of cooling water in the steam generator tank.
NOTER_PAGE: (5 0.6640471512770137 . 0.5389830508474576)
workers had been directed to test the valves and document the testing in a way that cut corners and saved money and time for the plant managers.
NOTER_PAGE: (5 0.8264571054354944 . 0.15169491525423728)
the clogged pipe in question had been generating issues for weeks prior, but plant management chose not to shut down the reactor.
NOTER_PAGE: (6 0.19580877537655533 . 0.18559322033898304)
management climate that viewed regulations as empty bureaucratic hoops
NOTER_PAGE: (6 0.2632612966601179 . 0.5745762711864406)
the narrative placing blame on the operators took hold following the accident and persisted even as expert reports complicated it.
NOTER_PAGE: (6 0.60445317616241 . 0.39576271186440676)
The Crash of Air France Flight 447
NOTER_PAGE: (7 0.29862475442043224 . 0.1440677966101695)
Most accidents are edge cases.
NOTER_PAGE: (8 0.19580877537655533 . 0.3016949152542373)
As both a practical response and a liability shield, autopilots are certified to operate as closed systems that do not function under every condition.
NOTER_PAGE: (8 0.1990831696136215 . 0.535593220338983)
At this point, the pilots should have had enough knowledge and time to fix this relatively simple problem
NOTER_PAGE: (8 0.6149312377210217 . 0.21101694915254238)
the design of the Airbus controls allows only one pilot to be in control at a time. The design also does not provide haptic feedback to indicate what the other pilot is doing, or even which pilot is in control if both are operating the controls. One pilot was pushing forward, the other pulling back. Neither was aware of the actions of the other.
NOTER_PAGE: (9 0.17812704649639818 . 0.26864406779661015)
a recovery was theoretically well within reach. But the chaos in the cockpit and the breakdown in communication and coordination rendered all the pilots helpless,
NOTER_PAGE: (9 0.2652259332023576 . 0.42118644067796607)
Every time one of the pilots lowered the nose and reduced the angle of attack, the reading would fall back into the range the system treated as valid, and a stall state would be announced. Every effective corrective move he made perversely resulted in the synthesized male voice announcing “STALL,”
NOTER_PAGE: (9 0.329404060248854 . 0.7110169491525423)
subsumed under a narrative in which the pilots lost “cognitive control,”
NOTER_PAGE: (9 0.6142763588736084 . 0.5389830508474576)
Airbus had recognized an issue with Pitot tube failures due to icing in the A330 model, and was beginning to replace the parts.
NOTER_PAGE: (9 0.7884741322855272 . 0.4991525423728813)
the autopilot and associated automation are smart enough to outsmart and save the human every time, the same narrative we saw in nuclear power plant design. The possibility that the automation and its software could fail was never entertained.
NOTER_PAGE: (10 0.306483300589391 . 0.45084745762711864)
social tendency to overestimate the capacity of machines and underestimate the abilities of humans
NOTER_PAGE: (10 0.5245579567779961 . 0.3635593220338983)
pilot error has been a consistent catchall for explaining commercial and private aircraft accidents
NOTER_PAGE: (10 0.5553372626064178 . 0.7016949152542372)
when “human error” is invoked, it generally refers to operator error, not the error of human designers or systems architects.
NOTER_PAGE: (10 0.6227897838899804 . 0.28220338983050847)
automation is seen as safer and superior in most instances, unless something goes wrong, at which point humans are regarded as safer and superior.
NOTER_PAGE: (10 0.704649639816634 . 0.14152542372881355)
jumping into an emergency situation at the last minute is something humans do not do well
NOTER_PAGE: (11 0.18140144073346431 . 0.23728813559322035)
While automation is generally assumed to relieve humans of menial tasks, freeing them to think about more important decisions, this has proven not to be the case
NOTER_PAGE: (11 0.22593320235756387 . 0.535593220338983)
pilot awareness generally decreases with increased automation
NOTER_PAGE: (11 0.283562540929928 . 0.5050847457627119)
skills atrophy when automation takes over
NOTER_PAGE: (11 0.3097576948264571 . 0.38813559322033897)
Deskilling has been suggested to be a primary component of the pilots’ inability to implement the stall corrective procedure
NOTER_PAGE: (11 0.3621480026195154 . 0.1483050847457627)
Discussion
NOTER_PAGE: (11 0.610347085789129 . 0.1483050847457627)
four main barriers to the establishment of accountability, or what she [Nissenbaum] termed answerability, in the development and use of computational technologies. Each of these barriers (the problem of many hands, bugs, blaming the computer, and software ownership without liability) implicates a set of development practices as well as a set of social attitudes toward accountability.
NOTER_PAGE: (11 0.6437459070072037 . 0.22118644067796608)
the causes of accidents are multiple, and pointing to one error is usually a vast oversimplification of the problem
NOTER_PAGE: (12 0.32678454485920105 . 0.502542372881356)
Therac-25 accidents as an example of “the problem of many hands,”
NOTER_PAGE: (12 0.36345776031434185 . 0.1771186440677966)
In the case of Therac-25, the operator had no way of knowing that the system had malfunctioned, except for reports from patients who felt pain.
NOTER_PAGE: (12 0.5952848722986248 . 0.20254237288135593)
protecting the integrity of the technological system at the expense of the nearest human operator. The technology is maintained as faultless, while the human operator becomes the faulty feature of the system.
NOTER_PAGE: (13 0.16306483300589392 . 0.14915254237288136)
Robots on the Road
NOTER_PAGE: (13 0.5455140798952194 . 0.14745762711864407)
The system used to detect and classify objects around the car failed to recognize the pedestrian as a person, registering her only as an object.
NOTER_PAGE: (14 0.24361493123772104 . 0.2389830508474576)
software that might have enabled automatic braking had been disabled.
NOTER_PAGE: (14 0.26195153896529144 . 0.24152542372881355)
Given the known existence of the “hand-off problem,” described in the aviation context above, it is reasonable to question the appropriateness of the role and expectations of the safety driver in and of itself.
NOTER_PAGE: (14 0.444007858546169 . 0.5271186440677966)
While elsewhere the autonomy of the Tesla Autosteer is emphasized, here we see how the human retains all responsibility.
NOTER_PAGE: (14 0.7328094302554028 . 0.21440677966101696)
Google’s self-driving car program has switched focus after making the decision that it could not reasonably solve the “handoff problem,”
NOTER_PAGE: (15 0.18140144073346431 . 0.3347457627118644)
in all except one of the accidents, the fault lay not with the Google car but, in some way, with human drivers.
NOTER_PAGE: (15 0.37786509495743287 . 0.15169491525423728)
Conclusion
NOTER_PAGE: (16 0.23968565815324167 . 0.15169491525423728)
A prevailing rhetoric of human-computer interaction design suggests that keeping a “human in the loop” assures that human judgment will always be able to supplement automation as needed. This rhetoric emphasizes fluid cooperation and shared control. In practice, the dynamics of shared control between human and computer system are more complicated,
NOTER_PAGE: (16 0.4603798297314997 . 0.5805084745762712)