American Authorities Find Tesla’s Autopilot System to Be Misleading

US federal authorities have opened an investigation into Tesla's Autopilot, claiming that the driver-assistance program gives drivers a false sense of security.

American authorities opened a new investigation into Tesla's Autopilot driver-assistance system on the 26th. The federal agency, the National Highway Traffic Safety Administration (NHTSA), says it has identified several problems linked to accidents that occurred despite the program's most recent update last December.


Autopilot is designed to help drivers steer, accelerate, and brake, even allowing them to take their hands off the wheel at times. However, it still requires the driver's intervention and attention. In its report, the agency concludes that Autopilot is not designed to keep the driver's attention on the driving task. Tesla does warn its customers to stay attentive when using Autopilot, which includes keeping their hands on the wheel and their eyes on the road.

However, in 59 accidents examined by the NHTSA, the agency found that Tesla drivers had sufficient time, "five seconds or more," to react before hitting an object. In 19 of those cases, the hazard was visible for 10 seconds or more before the collision. By reviewing crash logs and data provided by Tesla, the NHTSA found that in the majority of the crashes analyzed, drivers neither braked nor turned the steering wheel to avoid the danger.

Autopilot in the Model X: Tesla's Autopilot gives a false sense of security, according to American authorities. // Source: Raphaelle

Driver assistance rather than Autopilot

The NHTSA also criticized Autopilot's name, calling it misleading and suggesting it leads drivers to believe the software has complete control of the vehicle. By comparison, competing companies tend to use more modest terms such as "driver assistance." The California Attorney General and the state's Department of Motor Vehicles are also investigating Tesla for deceptive advertising and marketing.

Tesla, for its part, says it warns customers to remain attentive when using Autopilot and FSD, according to The Verge. The company says the software displays regular indicators reminding drivers to keep their hands on the wheel and their eyes on the road. The NHTSA and other safety groups counter that those warnings don't go far enough and are "insufficient to prevent misuse." Despite these statements from safety groups, CEO Elon Musk recently promised that the company will continue to go all in on autonomy.
