Should autonomous cars be designed to minimize harm?


Assignment task:

"You are riding in your autonomous SUV on a two-lane road when a car coming toward you blows a front tire.  Rather than stopping and absorbing the impact your 'self-driving' car does a quick calculation and concludes that stopping will save you but will almost certainly lead to the deaths of the four passengers in the other car.  Rather than stopping "the robotic vehicle steers right. . .you are over a cliff, in free fall. . . .Your robot, the one you paid good money for, has chosen to kill you. . . There were [four] people in that car, to your one. The math couldn't be simpler." Think about these questions:

1. Should autonomous cars (or technology in general) be designed to minimize harm?

2. Assuming you drive, should you be forced to drive one of these cars?

3. Assume, for the sake of argument, that 'self-driving' cars and ethical decision-making algorithms save thousands of lives each year compared to human decisions and reactions. Does this change your view?

4. What would Mill, Kant, enlightened ethical egoism, or feminist ethics say about this case? For example, on grounds of caring, a feminist ethicist might argue that since you have obligations to your kids, family, and friends, it is morally permitted to disable or hack the system to save yourself.
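The "simple math" the scenario attributes to the car is a naive utilitarian calculation: choose whichever action minimizes expected deaths. A minimal sketch of that reasoning follows; the option names, probabilities, and structure are hypothetical illustrations for discussion, not a real vehicle-control algorithm.

```python
# Naive utilitarian harm minimization: pick the action with the
# fewest expected deaths. All numbers here are hypothetical.

def expected_deaths(option):
    """Expected deaths = sum of (probability of death * people at risk)."""
    return sum(p * n for p, n in option["risks"])

def choose_action(options):
    """Return the option with the lowest expected deaths."""
    return min(options, key=expected_deaths)

options = [
    # Stopping: near-certain death for the four passengers,
    # small residual risk to the one occupant.
    {"name": "stop", "risks": [(0.95, 4), (0.05, 1)]},
    # Swerving off the cliff: near-certain death for the one occupant.
    {"name": "swerve", "risks": [(0.99, 1)]},
]

best = choose_action(options)
print(best["name"])  # the calculation sacrifices the occupant: "swerve"
```

Note what the sketch leaves out, which is exactly what questions 1-4 probe: it weighs only body counts, ignoring special obligations (feminist ethics of care), the impermissibility of using a person merely as a means (Kant), and the occupant's own interests (ethical egoism).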
