Tesla sued by deceased driver’s family over 'fraudulent misrepresentation' of Autopilot safety


The family of a Tesla driver who died in a crash while using Autopilot last year is suing the company.

Mendoza's attorneys alleged that Tesla and Musk have for years exaggerated or made false claims about the Autopilot system in order to "generate excitement about the company's vehicles and thereby improve its financial condition." At least 15 other active cases center on similar claims involving Tesla incidents in which Autopilot or FSD, Full Self-Driving (Supervised), had been in use just before a fatal or injurious crash. The National Highway Traffic Safety Administration has also opened a second, ongoing probe evaluating whether Tesla's "recall remedy," meant to resolve issues with Autopilot's behavior around stationary first responder vehicles, has been effective.
