Francesco Biondi, University of Windsor
December 21, 2023
On Dec. 12, the U.S. Department of Transportation issued a recall regarding Autosteer, a feature included in Tesla’s semi-autonomous suite Autopilot, because “there may be an increased risk of a collision.”
The recall, which affects over two million vehicles in the United States, is a watershed moment in modern automotive history, as it affects nearly every Tesla on the road in the U.S.
Transport Canada extended the recall to 193,000 Tesla vehicles in Canada.
Tesla says only vehicles in the U.S. and Canada are affected by the recall.
Unlike fully autonomous technologies, such as elevators where a user steps in and pushes a button, Autosteer is not an autonomous system, despite what drivers may think.
A 2018 study found that 40 per cent of drivers believed Tesla vehicles are capable of being fully self-driving. A similar study concluded that participants “rated [Autopilot] as entailing less responsibility for the human for steering than ‘high automation,’ and it was not different from ‘autonomous’ or ‘self-driving’.”
Instead, Tesla Autopilot falls into the category of level 2, or semi-autonomous, systems. These systems can handle vehicle steering and acceleration, but the human driver must stay vigilant at all times.
Confusing communication
In human factors research, believing that a system can do something it can’t is referred to as mode confusion. Mode confusion not only misleads the user, but also has direct safety implications, as in the 1992 Air Inter Flight 148 plane crash in France. That crash was the direct result of the pilots operating the aircraft’s automation in a different mode from the one they intended.
Safety researchers have sounded the alarm about risks inherent to semi-autonomous systems. In fully manual and fully autonomous modes, it is clear who’s responsible for driving: the human and the robot driver, respectively.
Semi-autonomous systems represent a grey area. The human driver believes the system is responsible for driving but, as lawyers representing Tesla have already successfully argued, it is not.
A second important factor is misleading information. The automotive industry as a whole has, for years, tiptoed around the actual capabilities of autonomous vehicle technology. In 2016, Mercedes-Benz pulled a TV commercial off the air after criticism that it portrayed unrealistic self-driving capabilities.
More recently, Ashok Elluswamy, director of Autopilot software at Tesla, said the 2016 video promoting its self-driving technology was faked.
False sense of security
Thinking that a system is fully autonomous creates a false sense of security that drivers may act on by losing vigilance or disengaging from the task of supervising the system’s functioning. Investigations into prior accidents involving Tesla Autopilot showed that drivers’ overreliance on the semi-autonomous system indeed contributed to some reported crashes.
The recall is a logical, albeit long-awaited, effort by transportation agencies to regulate a problem that researchers have attempted to draw attention to for years.
In her 2016 study, Mica Endsley, a pioneer in research on human interaction with automation, highlighted some potential safety risks of these systems. A more recent study published by my research group also shows the dangers that operating semi-autonomous systems pose to drivers’ attention.
With the recall, Tesla will be releasing over-the-air software updates that are meant to “further encourage the driver to adhere to their continuous supervisory responsibility whenever Autosteer is engaged.” These may include additional “visual alerts” and other additions to the system to help drivers stay vigilant while Autosteer is engaged.
In all, although this may be the first time regulators strike a direct, concrete blow at Tesla and its marketing, it won’t be the last.
Francesco Biondi, Associate Professor, Human Systems Labs, University of Windsor
This article is republished from The Conversation under a Creative Commons license. Read the original article.