Tesla's Autopilot system has once again been thrust into the spotlight.
Recently, U.S. prosecutors brought a criminal case in which a driver was charged with involuntary manslaughter after misusing Tesla's L2 driver-assistance system in a fatal crash.
In the case, a Tesla Model S driver ran a red light and crashed into another car, killing two people.
The driver was using Tesla's Autopilot assisted driving system at the time of the accident.
This is believed to be the first case worldwide in which a driver was charged with a felony over a fatal crash involving an L2 assisted driving system.
The case has once again sparked widespread public concern about the safety of autonomous driving, and calls to regulate the technology have grown louder.
In fact, autonomous driving, as an emerging product of the convergence of technology and automobile manufacturing, has been elevated to the level of national strategy in many countries, which have gradually introduced regulations to govern it.
Two Deaths: Tesla on Autopilot Causes Major Crash
The crash occurred in 2019.
The driver had Tesla's Autopilot (AP) engaged at the time but failed to take over the wheel as the vehicle approached a red light (Tesla's AP feature could not recognize red lights and brake for them), and the car ran the light, causing a two-vehicle collision.
The two people in the Honda Civic that was hit died, and the driver and passenger in the Tesla were injured.
After the accident, the families of the deceased sued both Tesla and the driver.
They allege that the driver was negligent and accuse Tesla of selling a defective vehicle that accelerated suddenly and whose automatic emergency braking failed to engage, ultimately causing the serious accident.
In response to the case, the National Highway Traffic Safety Administration (NHTSA) said that while automated systems can help drivers avoid crashes, drivers must use the technology responsibly.
Notably, in 2019, the year of the crash, Tesla's AP feature could not yet recognize and react to traffic lights on the road, so drivers were expected to take over the wheel promptly and proactively when passing through intersections.
In October 2021, California prosecutors filed two counts of involuntary manslaughter against the Tesla driver in the case, but the driver pleaded not guilty.
A hearing in the case will be held on Feb. 23 of this year, court documents show.
At Least 26 Accidents Have Involved Tesla's Autopilot
In fact, there have been previous cases of self-driving test vehicles being criminally charged in accidents in the United States.
In 2020, Arizona prosecutors filed manslaughter charges against a safety driver hired by Uber to monitor a fully automated driving trial, after the test SUV struck and killed a pedestrian.
Unlike Uber's system, which was in a trial state, assisted driving systems such as Autopilot, which was involved in the California crash case, are already in mass use around the world. It is estimated that more than 760,000 Tesla vehicles are equipped with the system in the United States alone.
U.S. regulators have also paid close attention to Tesla's Autopilot assisted driving system: NHTSA has long been investigating its shortcomings.
Since 2016, the agency has dispatched investigation teams to 26 accidents involving Tesla's Autopilot system, accidents in which at least 11 people died.
A 21st Century Business Herald reporter who reviewed the cases found that the first fatal Autopilot-related accident in the United States occurred in 2016, when the driver of a Tesla Model S was killed in a crash on a state highway. According to reports, the driver was traveling with Autopilot on when a tractor-trailer turned left in front of him, and the Tesla did not apply the brakes, causing the crash. Tesla said Autopilot failed to recognize the truck because its white trailer was set against a brightly lit sky.
Regulations are gradually being improved
Given the immaturity of autonomous driving technology and the repeated occurrence of accidents, countries have begun to legislate for autonomous driving.
At the UN level, the Vienna Convention on Road Traffic and the regulation on Automated Lane Keeping Systems (ALKS) are the two main instruments; the ALKS regulation is the first binding international regulation covering L3 vehicle automation.
In the U.S., policies and regulations have been introduced at both the federal and state levels. In 2017, the SELF DRIVE Act proposed consolidating the work of 38 federal departments, independent agencies, commissions, and the Executive Office of the President on autonomous driving, and providing guidance to state and local governments on their oversight of self-driving vehicles. Legislative efforts in individual states continue to advance, with bills introduced in Nevada, California, and Washington.
In addition, many countries require vehicle manufacturers or sellers to inform consumers and users of vehicle-related usage information, and drivers testing smart cars must undergo professional training. In Japan, the "Self-Driving Vehicle Safety Technical Guidelines" require that information be provided to users of self-driving vehicles, and that vehicle manufacturers, dealers, and mobility service providers take steps to make users aware of the relevant information.