Who is responsible for Autonomous Vehicles?

Legal-ese
3 min read · Feb 11, 2022
Source: CNN

The second issue concerns where accountability lies for the actions of autonomous vehicles (AVs), especially where a vehicle has caused injury or death, or has performed an action considered morally objectionable by human standards (say, colliding with a pedestrian rather than with a lamppost). Unlike with human-operated vehicles, it does not follow that the individual in the car may be held liable. The matter is further complicated when it comes to partially autonomous vehicles, but for simplicity's sake it may be better to examine the issue exclusively using vehicles with full autonomy.

Similarly, the AV, or the artificial intelligence (AI) controlling it, is not considered a legal entity, so it cannot be prosecuted either. Nevertheless, the ongoing discussion about whether AI should be granted legal personality will naturally have wider implications for where blame falls. For instance, the extent to which a product can be considered to utilise artificial intelligence will likely be a sticking point for future legislation. Then there is the question of whether AI should be classed as having legal personality at all: corporations qualify because they are at least operated by individuals, so AI would need to be shown to act in a sufficiently human-like way. In the case of AVs, this is arguably unfeasible, given the expectation that they be at least as safe as, if not safer than, human drivers. In any case, the obvious question is how AI would be punished for its actions in this scenario, given its non-physical form and its lack of knowledge of human ethics.

The role of AV companies also needs to be considered: if manufacturers are to be held responsible for the actions of AI, then it would follow that they can be sued. One concern raised, however, is that most accidents will result from software malfunctions rather than mechanical errors. In these cases, manufacturers can be sued under the UK's Consumer Protection Act, but only on the basis that the software driving the car is considered a product. The idea of a "faulty" product also bears consideration: as per the previous factor discussed on this blog, if a car performs an action that is considered morally objectionable, such as fatally colliding with a woman pushing a bicycle, would the company that made it be considered negligent? On the one hand, the manufacturer could not reasonably have expected that the AV would fail to register the woman's shape as human; on the other, it could reasonably have foreseen that failure to identify unusually shaped objects on the road may cause potentially fatal accidents. It can be argued that as AVs become more commonplace, manufacturers will become more attuned to their vehicles' software failings, and higher standards for their duties of care will be implemented with this knowledge.

Current regulations under the Automated and Electric Vehicles Act 2018 put in place a system of no-fault compensation for persons injured by AVs, where compensation is paid by the insurer, who can in turn sue the manufacturer for the losses incurred. Given that AVs are not yet allowed on British roads, there is no pressing need to reconsider the existing legislation, but perhaps it is better to allow all the legal issues surrounding AVs to be ironed out before something truly goes wrong.

Tune in next week for the final part of this mini-series on AVs! Thanks for reading!
