Ralph Nader calls on NHTSA to recall Tesla’s ‘dangerous’ FSD

Ralph Nader, a political and consumer advocate and former presidential candidate, has issued a statement calling Tesla’s “so-called” full self-driving (FSD) technology “one of the most dangerous and irresponsible actions by a car company in decades.”

Nader is calling on the National Highway Traffic Safety Administration (NHTSA) to use its safety recall authority to order that FSD technology be removed from every Tesla. Per CEO Elon Musk’s recent statements, that’s about 100,000 vehicles.

The author of the bestselling book “Unsafe at Any Speed,” which criticized the American auto industry, cited research finding that FSD malfunctions every eight minutes. That research, published in January by The Dawn Project, an organization aiming to ban unsafe software from safety-critical systems, analyzed data from 21 YouTube videos of Tesla owners using FSD beta software.

“This nation should not allow this malfunctioning software which Tesla itself warns may do the ‘wrong thing at the worst time’ on the same streets where children walk to school,” wrote Nader. “Together we need to send an urgent message to the casualty-minded regulators that Americans must not be test dummies for a powerful, high-profile corporation and its celebrity CEO. No one is above the laws of manslaughter.”

Nader’s callout comes as Tesla is gearing up to release the next version of its FSD software, version 10.69, on August 20. Musk tweeted out the announcement, saying nothing about the next iteration’s capabilities other than: “This release will be big.” During Tesla’s Q2 earnings call, Musk also said Tesla would increase the price of the software and that the automaker was hoping to “solve full self-driving” this year.

New Autopilot features are demonstrated in a Tesla Model S during a Tesla event in Palo Alto, California, October 14, 2015. (REUTERS/Beck Diefenbach)

Arguably, Nader should be targeting Tesla’s Autopilot as well. Tesla and Musk have been adamant in the past that FSD has not been responsible for any crashes or deaths. (However, a recent YouTube video from AI Addict shows a Tesla in FSD mode colliding with a bike lane barrier post.) Autopilot, on the other hand, has likely been the cause of several crashes. NHTSA is currently investigating 16 crashes in which Tesla vehicles, potentially operating on Autopilot, crashed into stationary emergency vehicles, resulting in 15 injuries and one fatality. Since 2016, the agency has opened 38 special investigations into crashes involving Tesla vehicles, 18 of which were fatal.

Other automakers have come out with similar ADAS technology, and based on NHTSA’s…