The idea of artificial intelligence (AI)-powered autonomous cars has seized the imagination of a lot of people.
Almost all the large automobile makers are now pitted against each other to come up with their own variants of safe and efficient self-driving vehicles.
General Motors (a company still working to put its government bailout behind it) is one of the foremost innovators in this technology. GM wants to achieve a competitive advantage by getting its autonomous cars on the roads first.
However, a few months ago, an autonomous Chevrolet Bolt, on a test run on the streets of San Francisco, was involved in a crash with a motorcyclist.
This incident triggered a debate about the safety of these vehicles. It has also brought an important legal question to the fore: who is accountable for accidents involving self-driving vehicles?
The Crash – Sequence of Events
- On December 7th, 2017, an autonomous Chevrolet Bolt test vehicle attempted a lane change on a San Francisco road.
- It failed in its maneuver and rammed into commercial photographer Oscar Nilsson, who was on his motorcycle.
- A standby driver was present in the GM vehicle but the automobile was in a self-driving mode when the accident occurred.
- On January 22nd, 2018, Oscar Nilsson sued GM for damages.
Critics Question Safety
Three things happened in the aftermath of this incident:
- The spotlight has shifted back to critics' concerns about self-driving vehicles: autonomous vehicles (AVs) may behave on the road differently than human drivers do.
- Apportionment of the blame for accidents now appears tricky. Experts believe that as the number of AVs on the roads increases, so will the number of accidents involving regular automobiles. There will also be a spike in complex litigation.
- Legal and safety matters concerning self-driving vehicles are likely to be the talk of the town during the coming months and years.
Bryant Walker Smith, a researcher at Stanford and Professor of Law at the University of South Carolina, believes that automakers have two options for handling lawsuits over driverless-car accidents:
- If the automobile or its technology caused the crash, the automakers would prefer a quick settlement.
- If they were confident that the crash was the victim’s fault, they could pursue a full-fledged legal battle.
In most such cases, investigators should have access to a large cache of relevant data, from CCTV footage to the driving records maintained by the AV's software. This data should help pin the blame where it belongs.
Need for Transparency and Stronger Regulation
A number of safety advocates and consumer organizations are now voicing their opinion on the need for greater transparency, sharing of accident data, and the formation of a robust regulatory framework.
- Consumer Watchdog’s John Simpson insists that accident data for driverless vehicles should be available in the public domain.
- Advocates for Highway and Auto Safety is pressing for a stronger regulatory framework for AVs.
- Opinion polls have revealed that most Americans, across age groups, political leanings, and locations, agree that greater safeguards for driverless cars are warranted.
- In one public survey, 64% of respondents were worried about sharing the road with AVs. (The telephone survey polled 1,005 adults across the continental US.)
- Seventy-three percent of respondents favored the Department of Transportation (DOT) developing safety regulations for the new features built into self-driving vehicles.
- Joan Claybrook, a former head of the National Highway Traffic Safety Administration (NHTSA), is of the opinion that the DOT ought to put in place statutory norms and safety guidelines meant exclusively for driverless automobiles.
- Experts say the public expects the DOT to intervene and enforce safety standards, rather than give automakers the freedom to manufacture and sell vehicles whose unregulated new technology is not bound by adequate safety compliance norms.
Criminal Acts Involving Driverless Vehicles
Some analysts believe that AVs could transform the way certain crimes are carried out. Dangerous situations could arise from the self-driving car’s biggest weakness: its vulnerability to hacking. (Films like Blackhat and shows like 24 have dramatized such scenarios.)
Here’s a glimpse of what the experts foresee and fear:
- Dr. Mary Cummings, director of the Humans and Autonomy Laboratory at Duke University and a former Navy pilot, said that few people realize how easy it is to hack into a driverless vehicle and change its route. (She insisted that she could not trust an AV enough to let her child ride in one.)
- Some experts consider hackers a greater threat to self-driving vehicles than ordinary criminals, or even terrorists.
- In 2014, The Guardian published an unclassified FBI document highlighting the agency’s fear that, in the future, AVs could become deadly weapons in the hands of criminals adept at hacking.
- Experts at the FBI are worried about the number of ways in which AVs could be deployed with criminal intent.
- An AV alters course at any threat of collision. Fleeing criminals could exploit this behavior to escape law enforcement officials chasing them.
- The FBI admits that hackers taking control of a driverless vehicle, trapping its passengers, and holding them for ransom is a real concern.
- Human drivers of vehicles used in criminal acts have historically been the easiest links for investigators to crack. By eliminating the need for a driver, AVs could make such crimes harder to solve.
More Questions than Answers
The industry and regulators have yet to reach a consensus on how much intrusive technology is permissible in the name of safety.
- Should AVs be equipped with drug or explosive detection sensors?
- What about placing cameras to record events inside the vehicles?
- Should there be geo-tagging of every AV?
- What is the right tradeoff between security and privacy? (How much AV data should be accessible to law enforcement without compromising citizens’ privacy?)
Unless these genuine concerns are addressed, they could eclipse the euphoria surrounding AVs and their promise of transforming our lives.
Liability in AV accidents would primarily depend on the cause of the crash:
- If the incident occurred due to a software glitch, then the manufacturer’s insurers might have to foot the bill.
- If both parties are at fault, then both their insurers could be liable.
However, even in straightforward accident cases, insurers could launch into a blame game, and pinpointing and proving liability may not be easy.
It is, therefore, time to establish the necessary legal and regulatory safeguards in order to ensure society’s safe and smooth transition into the era of self-driving cars.