Tesla’s Full Self-Driving Software Is a Mess. Should It Be Legal?
Elon Musk hypes the AI-enabled system, and getting more people to buy it is key to his new pay package. But in a recent test, it ignored some street signs and squashed a mannequin child.
Turns out, there’s a simple answer: “Driving-assist systems are unregulated, so there are no concerns about legality,” said Missy Cummings, a George Mason University professor and AI expert who has advised the National Highway Traffic Safety Administration on autonomous vehicles.

In one recent test by Tesla influencers, their new Model Y failed to avoid a large metal object in the middle of its highway lane about 60 miles after setting out from San Diego, causing severe damage to the vehicle’s underbody, according to a video posted by one of the influencers, Bearded Tesla Guy.

Mark Rosekind, former chief safety officer for robotaxi developer Zoox and the NHTSA administrator in 2016 when the first fatal Tesla Autopilot crash occurred, thinks a combination of new regulations for technology like FSD and validation by expert outside entities is needed.