Tesla Self-Driving Software Under Fire: Consumer Reports Concerns

July 20, 2021
Tesla’s Full Self-Driving Beta Under Scrutiny

Recent footage shows a Tesla operating in Full Self-Driving mode making a left turn from a center lane on a busy San Francisco street, then entering a bus lane, an unauthorized maneuver. In another moment, the car narrowly avoids parked vehicles while completing a turn, forcing the driver to take immediate control. These incidents, documented by car reviewer AI Addict and others, are circulating widely on YouTube.

FSD Beta 9 and Reliance on Cameras

Earlier this month, Tesla began rolling out over-the-air updates for version 9 of its Full Self-Driving (FSD) beta, an advanced driver-assistance system (ADAS) that relies on camera vision alone, unlike earlier Tesla driver-assistance systems, which combined cameras with radar.

In response to circulating videos showing potentially unsafe driving behaviors, such as unprotected left turns, and to additional reports from Tesla owners, Consumer Reports said in a statement on Tuesday that the software upgrade does not appear safe enough for use on public roads. The organization plans to independently evaluate the update on its Model Y SUV once it receives the required software.

Concerns Over Testing Practices

The consumer advocacy group voiced apprehension that Tesla may be utilizing its customer base and their vehicles as test subjects for novel features. Reinforcing this perspective, Tesla CEO Elon Musk cautioned drivers against complacency while operating the system, acknowledging the potential for “unknown issues” and urging a “paranoid” level of attentiveness.

Many Tesla owners participating in the Early Access Program are aware of the inherent risks, having voluntarily enrolled to provide feedback on beta software. However, other individuals sharing the road have not consented to participate in these trials.

Regulatory Landscape and Accountability

Tesla’s software updates are being deployed to drivers nationwide. The electric vehicle manufacturer did not respond when asked whether it had considered state-specific self-driving regulations: 29 states currently have legislation pertaining to autonomous driving, though these laws vary considerably.

Other companies developing self-driving technology, including Cruise, Waymo, and Argo AI, informed Consumer Reports that their software testing is conducted either on private test tracks or with the oversight of trained safety drivers.

“Automotive technology is evolving rapidly, and automation holds considerable promise, but policymakers must establish robust and sensible safety regulations,” said William Wallace, manager of safety policy at Consumer Reports. “Without such measures, some companies may treat public roadways as private testing grounds, with limited accountability for safety.”

NHTSA and Crash Reporting

In June, the National Highway Traffic Safety Administration (NHTSA) issued a standing general order mandating that manufacturers and operators of vehicles equipped with SAE Level 2 ADAS or SAE levels 3, 4, or 5 automated driving systems report all crashes.

“NHTSA’s primary focus is safety. By requiring crash reporting, the agency will gain access to vital data that will facilitate the swift identification of safety concerns within these automated systems,” explained Dr. Steven Cliff, NHTSA’s acting administrator. “Data collection will also help foster public trust in the federal government’s oversight of automated vehicle safety.”

Distraction and Driver Monitoring

The FSD beta 9 software incorporates features that automate a greater range of driving tasks, such as navigating intersections and urban streets under the driver’s supervision. However, the detailed graphics illustrating the vehicle’s surroundings, including pedestrians and cyclists, may inadvertently distract drivers from the road at critical moments.

“Simply asking drivers to pay attention is insufficient – the system must actively ensure driver engagement while operational,” asserted Jake Fisher, senior director of Consumer Reports’ Auto Test Center. “Evidence indicates that testing developing self-driving systems without adequate driver support can, and will, result in fatalities.”

Fisher recommended that Tesla implement an in-car driver monitoring system to verify that drivers are keeping their eyes on the road, helping to prevent accidents like the 2018 incident in which Uber’s self-driving test vehicle struck and killed a pedestrian in Tempe, Arizona.

Tags: Tesla, self-driving, Autopilot, Consumer Reports, safety, software