NHTSA challenges Tesla over misleading Full Self-Driving social media posts

The National Highway Traffic Safety Administration (NHTSA) has escalated its scrutiny of Tesla's communications strategy, expressing serious concerns about social media posts that could mislead consumers about the capabilities of its Full Self-Driving (FSD) technology. The federal safety agency argues that these posts may encourage dangerous misuse of the driver assistance system.

In a detailed email dated May 14, released publicly on Friday, NHTSA directly challenged Tesla's social media messaging, particularly posts that could lead consumers to view FSD as an autonomous robotaxi service rather than what it actually is: a partial automation system requiring constant driver attention and occasional intervention.

The timing of the email is significant: it comes amid an ongoing NHTSA investigation, opened in October, into 2.4 million Tesla vehicles equipped with FSD software. That investigation was prompted by four reported collisions, including a fatal 2023 crash, that occurred in reduced-visibility conditions such as sun glare, fog, and airborne dust.

Of particular concern to NHTSA were specific posts on X (formerly Twitter) that the agency believes could promote unsafe use of the FSD system. These included Tesla's repost of an account of a driver who used FSD to make a 13-mile trip to an emergency room during a heart attack, and another post showcasing a 50-minute FSD drive home from a sporting event.

"We believe that Tesla's postings conflict with its stated messaging that the driver is to maintain continued control over the dynamic driving task," NHTSA wrote in its communication with the automaker. The agency has requested Tesla to reconsider its communication strategy regarding FSD capabilities.

Responding to these concerns at a May meeting, Tesla defended its position by pointing to language in its owner's manual and other documentation stating that the vehicle is not autonomous and requires driver vigilance. That response, however, has not resolved NHTSA's concern about the potential disconnect between official documentation and social media messaging.

The investigation has taken on added urgency following a fatal incident in Rimrock, Arizona, in which a 71-year-old woman was struck and killed by a Tesla operating in FSD mode. The driver, who was contending with sun glare at the time, was not charged. The case has become a focal point in discussions about FSD's ability to handle challenging environmental conditions.

NHTSA has set a December 18 deadline for Tesla to respond to a series of detailed questions about the investigation, particularly focusing on the system's "potential failure to perform, including detecting and responding appropriately in specific situations where there is reduced roadway visibility that may limit FSD's ability to safely operate."

The agency's investigation will also examine whether the system provides adequate feedback to drivers when operating beyond its capabilities, a crucial safety consideration for partial automation systems.

This latest development follows Tesla's December 2023 agreement to recall over 2 million vehicles in the United States to install new safeguards in its Autopilot system, a decision made under pressure from NHTSA. The agency continues to evaluate whether these safeguards adequately address safety concerns.

The situation is particularly complex given that Elon Musk serves as both Tesla's CEO and the owner of X, the platform where many of the controversial posts appeared. This dual role raises questions about the alignment of corporate messaging across different platforms and its impact on public safety.