How accountable am I in an accident?

In an SAE Level 3 vehicle, the driver is responsible for an accident, since the driver is expected to remain aware and to intervene when necessary1. This leaves companies free to develop the technology without much pressure or accountability: competition between companies is preserved and the technology tends to evolve quickly.

At this stage, the question is whether the driver will react fast enough when required, since reliance on the vehicle's autonomy may lead to overconfidence and reduced attention to the road.

However, driver accountability becomes a complicated concept when we are talking about an SAE Level 5 vehicle. At that stage, where the advantages of automation can be fully exploited, people who are unable to drive will be able to travel in a vehicle with no actual driver. In cases like these, how can we hold the driver accountable in the event of an accident?

To approach this, we should split responsibility into moral and criminal accountability. Regarding moral accountability, there are two ideas24 that deserve attention:

The first holds that, since these vehicles pose a risk to others, owners/drivers should retain accountability for the vehicle. This would require the owner to pay some kind of tax or insurance.

The second blames the driver for an accident only when the driver directly intervened, on the grounds that the outcome is otherwise beyond the driver's control: one can only be blamed for a direct intervention. For example, if an accident results from a child running in front of the vehicle, the driver is accountable only if they actually intervened. On its own, this second approach does not hold up.

In addition to the driver's accountability, there is the manufacturer's accountability, which is needed to incentivise the optimisation of AVs. In the end, the manufacturer will be "the ultimate responsible for the final product"25.

The dark side

An autonomous vehicle (AV) is not something that will be easily introduced; there will be several obstacles before we can use such an advanced technology. We're not just talking about hardware or software issues, but about broader ones such as the passenger's confidential data, the safety of the vehicle's data, insurance, and the ethical issues around life-or-death decisions.

Imagine yourself inside an AV facing a hazardous situation caused by a package falling from a lorry in front of you. The vehicle has to avoid the package and choose one of the following options:

a) kill a person on a zebra crossing on the left; or
b) kill a dog on the right.

Which option would you choose? There's no room for "none of the above". Even if the answer seems easy in this case, from our point of view, we can add some complexity. Imagine that you had to choose between:

a) kill a cyclist without a helmet who is not respecting the safety rules; or
b) seriously injure a cyclist with a helmet who puts safety first.

Things get tricky, don't they? More comparisons can be made. On MIT's Moral Machine website6, you can find a kind of game where, faced with a hazardous situation, you have to pick one of several options, each of which challenges your ethics, morals and common sense. You have to choose between killing an old lady or a young, sporty man; a thief or your bank manager; or, the most difficult one from our point of view, three people on a zebra crossing or the person inside the vehicle. That last option leaves you less comfortable, doesn't it? So, would you buy a vehicle that can kill you?

When facing these questions, the reader starts to realise that the AV will make decisions for them as a passenger, and a question arises: would you trust the AV to make the "right" decision in a life-or-death situation?

A quick look at the MIT Moral Machine website6 allows us to conclude that respondents prioritise saving more lives and protecting humans over animals. It also shows that they tend to favour people of higher social value, the young and females, as well as those who respect the law, and that they prefer the AV to avoid direct intervention when facing the decision.
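As a thought experiment only, preferences of this kind could be encoded as a crude scoring rule. The sketch below is a minimal illustration in Python, assuming a hypothetical `Outcome` structure and made-up weights; it is not how any real AV decides, and every name and number in it is our own assumption rather than anything reported by the Moral Machine project.

```python
# Purely illustrative sketch: encoding aggregated "Moral Machine"-style
# preferences as a simple scoring rule. The Outcome fields and all weights
# are hypothetical assumptions, not part of any real AV decision system.
from dataclasses import dataclass


@dataclass
class Outcome:
    humans_spared: int           # human lives saved by this choice
    animals_spared: int          # animal lives saved by this choice
    young_spared: int            # how many of the spared humans are young
    lawful_spared: int           # how many of the spared humans obeyed the law
    requires_intervention: bool  # does this choice require the AV to swerve?


def preference_score(o: Outcome) -> float:
    """Higher score = more preferred under the hypothetical weights below."""
    score = 0.0
    score += 10.0 * o.humans_spared   # humans weighted far above animals
    score += 1.0 * o.animals_spared
    score += 2.0 * o.young_spared     # mild preference for sparing the young
    score += 1.5 * o.lawful_spared    # mild preference for the law-abiding
    if o.requires_intervention:
        score -= 0.5                  # slight penalty for active intervention
    return score


# Toy comparison of the two options from the falling-package example:
# a) swerve left and kill the pedestrian (the dog is spared);
# b) swerve right and kill the dog (the pedestrian is spared).
option_a = Outcome(humans_spared=0, animals_spared=1, young_spared=0,
                   lawful_spared=0, requires_intervention=True)
option_b = Outcome(humans_spared=1, animals_spared=0, young_spared=0,
                   lawful_spared=1, requires_intervention=True)

best = max([option_a, option_b], key=preference_score)
print(best)  # under these weights, option_b (sparing the human) wins
```

Even this toy version makes the underlying problem visible: someone still has to choose the weights, and that choice is exactly the ethical decision the essay is questioning.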