According to a report submitted to the National Highway Traffic Safety Administration (NHTSA), the first reported crash of a Tesla with the company’s Full Self-Driving (FSD) beta software in operation happened on November 3. While we’ve seen some impressive examples of Tesla’s beta semi-automated driving system in action, we’ve also seen plenty of alarming videos of FSD Beta making some genuinely bad driving decisions, including ones similar to what was cited as the cause of this wreck.
Now, I should make it clear that this is just a report that has yet to be fully investigated or proven, and there are definitely some strange things mentioned in its text:
The Vehicle was in FSD Beta mode and while taking a left turn the car went into the wrong lane and I was hit by another driver in the lane next to my lane. the car gave an alert 1/2 way through the turn so I tried to turn the wheel to avoid it from going into the wrong lane but the car by itself took control and forced itself into the incorrect lane creating an unsafe maneuver putting everyone involved at risk. car is severely damaged on the driver side.
The part about FSD Beta guiding the car into the wrong lane of traffic isn’t the odd part; we’ve seen that before, with some recorded incidents of the software guiding the car across double-yellow lines into oncoming traffic lanes:
Other times, we’ve seen situations similar to what the complaint mentioned, going into the wrong lane during a turn:
Those FSD errors don’t sound uncommon; what is very odd is the assertion that “the car by itself took control and forced itself into the incorrect lane.” That technically shouldn’t be possible, as the driver-assist system is supposed to disengage when the driver moves the wheel significantly; there are even discussions on Tesla forums of FSD jerking the wheel so suddenly that it disengages itself just from the motion of the wheel.
I’ll be curious to learn more about what happened in this situation. The kind of error FSD made is well-documented, but if there actually is an issue with driver-initiated disengagements failing to happen, that’s a whole other, much bigger problem.
If this turns out to be driver error, that’s important, too, because the interaction between the human driver and the system controlling the car is a huge issue with all Level 2 semi-automated systems, and why the human may have had trouble taking over is absolutely worth investigating.
Thankfully, no one was hurt in this incident, though the car was described as being “severely damaged.”
The NHTSA is already investigating Tesla’s semi-automated driver-assist system, Autopilot, over its alleged tendency to crash into emergency vehicles.
All of this raises the question of whether it’s a good idea to test unfinished software that takes control of a 4,000-pound car on public roads, surrounded by people, in and out of cars, who never agreed to take part in any test at all.