Tesla’s Full Self-Driving Software Is a Mess. Should It Be Legal?


Elon Musk hypes the AI-enabled system, and getting more people to buy it is key to his new pay package. But in a recent test, it ignored some street signs and squashed a mannequin child.

Turns out, there’s a simple answer: “Driving-assist systems are unregulated, so there are no concerns about legality,” said Missy Cummings, a George Mason University professor and AI expert who has advised the National Highway Traffic Safety Administration on autonomous vehicles.

The risks are not merely theoretical. About 60 miles after setting out from San Diego, a new Model Y failed to avoid a large metal object in the middle of its highway lane, causing severe damage to the vehicle’s underbody, according to a video posted by Bearded Tesla Guy, one of the influencers making the trip.

Mark Rosekind, former chief safety officer for robotaxi developer Zoox and the NHTSA administrator in 2016, when the first fatal Tesla Autopilot crash occurred, thinks a combination of new regulations for technology like FSD and validation by expert outside entities is needed.


Read more on: Tesla

Related news:

Europe's EV sales surge 26% in 2025 while Tesla faces decline

Tesla factory technician sues for $51 million after assembly-line robot knocks him unconscious

Three crashes in the first day? Tesla’s robotaxi test in Austin. Tesla’s crash rate is orders of magnitude worse than Waymo’s.