I Took a Ride in a "Self-Driving" Tesla and Never Once Felt Safe
I took a ride in a Tesla with the 'Full Self-Driving' feature engaged — and it almost caused several accidents.
When the "Full Self-Driving" setting is enabled in a Tesla, according to the automaker's own description, the car "attempts to drive to your destination by following curves in the road, stopping at and negotiating intersections, making left and right turns, navigating roundabouts, and entering/exiting highways."

In my ride, the Tesla attempted to run a stop sign at an on-ramp for the 110, a notoriously hazardous freeway that requires you to come to a complete halt, twist your head far around to check for oncoming traffic, and accelerate rapidly to merge with motorists traveling around 70 mph. In its ongoing tests of this scenario, the Dawn Project has produced this same result time and again, with each new version of FSD, and has been sounding the alarm about it since before it became a horrific reality: one of the accidents examined in the NHTSA report released this year involved a Tesla Model Y driver who had Autopilot enabled when they struck a 17-year-old student getting off a school bus in Halifax County, North Carolina, in March 2023.