
New York Times Ad Warns Against Tesla’s “Full Self-Driving”

January 17, 2022

Tesla’s “Full Self-Driving” Software Under Fire

A full-page advertisement in Sunday’s New York Times directly criticized Tesla’s Full Self-Driving software, labeling it “the most deficient software offered by any Fortune 500 company.” The ad offered a $10,000 reward to the first person who could identify another commercial product from a Fortune 500 firm that exhibits a critical malfunction every 8 minutes.

The Dawn Project’s Campaign

The advertisement was sponsored by The Dawn Project, a newly established organization dedicated to prohibiting the use of potentially unsafe software in safety-critical applications susceptible to military-grade hacking. Their objective is to remove Tesla’s Full Self-Driving (FSD) from public roadways until it demonstrates “1,000 times fewer critical malfunctions.”

Dan O’Dowd, the founder of the advocacy group, also serves as the CEO of Green Hills Software, a company that develops operating systems and programming tools for embedded safety and security systems. At CES, Green Hills Software announced that BMW’s iX vehicle uses its real-time OS and other safety-focused software.

Scrutiny and Potential Bias

Despite the potential for competitive bias stemming from the founder’s position, Tesla’s FSD beta software, an advanced driver-assistance system (ADAS) available to Tesla owners for limited driving automation, has faced increasing scrutiny. That scrutiny intensified after several YouTube videos showcasing the system’s flaws went viral.

The New York Times advertisement appeared shortly after the California Department of Motor Vehicles indicated it would “re-evaluate” its stance on Tesla’s testing program. This program utilizes consumer drivers rather than professional safety operators, raising questions about compliance with autonomous vehicle regulations.

Regulatory Concerns and Tesla’s Response

The California DMV oversees autonomous driving tests within the state, mandating that companies like Waymo and Cruise submit reports detailing crashes and system failures, known as “disengagements.” Tesla has not historically provided these reports.

Elon Musk, CEO of Tesla, offered a brief response on Twitter, asserting that Tesla’s FSD has not been linked to any accidents or injuries since its release. However, the U.S. National Highway Traffic Safety Administration (NHTSA) is currently investigating a report concerning a Tesla Model Y. The owner reported the vehicle unexpectedly veered into the wrong lane during a left turn while in FSD mode, resulting in a collision.

Autopilot Incidents and Safety Analysis

While this may be the first reported FSD-related crash, Tesla’s Autopilot, a standard ADAS feature, has been implicated in approximately a dozen incidents.

In conjunction with the NYT ad, The Dawn Project released a fact check of its assertions, referencing its own FSD safety analysis. This analysis examined data extracted from 21 YouTube videos, totaling seven hours of driving footage.

Methodology of the Safety Analysis

The analyzed videos featured beta versions 8 (released December 2020) and 10 (released September 2021). The study deliberately avoided videos with overtly positive or negative titles to minimize bias. Each video was assessed according to the California DMV’s Driver Performance Evaluation criteria, the standard used to evaluate human drivers for licensing.

To pass the driving test, California drivers must commit 15 or fewer scoring maneuver errors, such as failing to signal a lane change or to maintain a safe following distance, and zero critical driving errors, such as causing a crash or running a red light.

Findings of the Study

The study found that FSD v10 averaged 16 scoring maneuver errors per hour and committed a critical driving error roughly every 8 minutes. Although improvements were observed between versions 8 and 10, the analysis estimated that it would take another 7.8 to 8.8 years for the software to reach a human-level accident rate.
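To make the arithmetic behind these figures concrete, the short Python sketch below encodes the DMV pass criteria described above and applies them to the reported v10 rates. The raw error counts are not published in this article, so the totals computed here are inferred from the stated rates over roughly seven hours of footage and are illustrative only.

```python
# Illustrative arithmetic only: error counts are inferred from the rates quoted
# in the article (16 scoring errors per hour, one critical error roughly every
# 8 minutes over ~7 hours of footage), not taken from the Dawn Project report.

FOOTAGE_HOURS = 7.0                # total driving footage analyzed
SCORING_ERRORS_PER_HOUR = 16       # reported average for FSD v10
CRITICAL_ERROR_INTERVAL_MIN = 8    # reported: roughly one critical error per 8 minutes


def passes_dmv_drive_test(scoring_errors: int, critical_errors: int) -> bool:
    """California DMV criteria cited by the study: at most 15 scoring maneuver
    errors and zero critical driving errors during the test drive."""
    return scoring_errors <= 15 and critical_errors == 0


# Implied totals over the whole seven-hour sample (rough, back-of-the-envelope).
implied_scoring_errors = SCORING_ERRORS_PER_HOUR * FOOTAGE_HOURS            # ~112
implied_critical_errors = FOOTAGE_HOURS * 60 / CRITICAL_ERROR_INTERVAL_MIN  # ~52

# Even a single hour of driving at the reported rates would fail the test.
print(passes_dmv_drive_test(SCORING_ERRORS_PER_HOUR,
                            round(60 / CRITICAL_ERROR_INTERVAL_MIN)))  # False
print(f"Implied totals over {FOOTAGE_HOURS:.0f} hours: "
      f"~{implied_scoring_errors:.0f} scoring errors, "
      f"~{implied_critical_errors:.0f} critical errors")
```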

Statistical Considerations and Public Road Testing

The Dawn Project’s claims, while assertive, should be treated cautiously: the sample is small, so the results may not be statistically significant. Nevertheless, if the seven hours of footage is representative of typical FSD performance, the findings would point to a substantial problem with Tesla’s FSD software, raising the question of whether continued testing on public roads is appropriate without stricter regulation.
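One way to see what the small-sample caveat means in practice is to attach a confidence interval to the observed critical-error rate. The sketch below is a minimal, illustrative Python calculation that assumes critical errors follow a Poisson process and uses the event count inferred from the figures above; it quantifies only sampling noise and says nothing about whether the 21 videos are representative of typical FSD driving.

```python
# Minimal sketch: exact (Garwood) 95% confidence interval for a Poisson count.
# The event count is inferred from the article's figures (~one critical error
# every 8 minutes over ~7 hours) and is illustrative, not an official number.
from scipy.stats import chi2

observed_events = 52      # ~ 7 h * 60 min / 8 min per critical error
exposure_hours = 7.0
alpha = 0.05              # 95% confidence level

# Garwood interval for the underlying Poisson mean, then convert to a rate.
lower = 0.5 * chi2.ppf(alpha / 2, 2 * observed_events)
upper = 0.5 * chi2.ppf(1 - alpha / 2, 2 * (observed_events + 1))

rate = observed_events / exposure_hours
print(f"Critical errors per hour: {rate:.1f} "
      f"(95% CI roughly {lower / exposure_hours:.1f} to {upper / exposure_hours:.1f})")
```

Even with a relatively tight interval, the larger open question remains the one noted above: whether seven hours of YouTube footage reflects typical FSD performance at all.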

The advertisement stated, “We did not consent to having our families serve as crash test subjects for the thousands of Tesla vehicles operating on public roads…”

Regulatory Action and Consumer Reports

Federal regulators have begun to address concerns regarding Tesla’s Autopilot and FSD beta software systems.

In October, NHTSA issued two letters to Tesla concerning the company’s use of non-disclosure agreements for early FSD beta access recipients and its decision to employ over-the-air software updates to rectify a problem in the standard Autopilot system that should have triggered a recall. Consumer Reports also released a statement indicating that FSD version 9 did not appear safe for public use and announced independent testing of the software.

Recent test results from Consumer Reports revealed that “Tesla’s camera-based driver monitoring system fails to maintain a driver’s attention on the road.” In contrast, Ford’s BlueCruise system provides alerts when a driver’s gaze is diverted.

Software Updates and User Feedback

Tesla has since released numerous iterations of its v10 software, with version 10.9 anticipated soon. Version 11, featuring a “single city/highway software stack” and “numerous other architectural enhancements,” is scheduled for release in February, according to Elon Musk.

User reviews of the latest version, 10.8, are mixed: some report smoother performance, while others express a lack of confidence in the technology. A discussion thread on the Tesla Motors subreddit shows owners sharing complaints, with one commenter stating, “Definitely not ready for the general public yet…”

Specific User Experiences

One commenter described a delayed right turn onto an empty road, followed by hesitation that blocked the lane during a left turn, and then sudden acceleration and braking as the system repeatedly changed its estimate of the appropriate speed. The driver ultimately disengaged the system when it failed to recognize an upcoming left turn at a well-marked intersection.

Concluding Remarks

The Dawn Project’s campaign underscores Tesla’s own warning that its FSD “may perform incorrectly at the most critical moment.”

The advocacy group stated, “How can anyone accept a safety-critical product on the market that may malfunction at the worst possible time? Isn’t that the very definition of a defect? Full Self-Driving must be removed from our roads immediately.”

Neither Tesla nor The Dawn Project responded to requests for comment.

Tags: tesla, full self-driving, new york times, advertisement, safety, autonomous driving