This past Sunday, Tesla rolled back Full Self-Driving (FSD, and, just a reminder, it does not fully self-drive, it’s a Level 2 driver-assist system) from the just-released FSD Beta 10.3 back to FSD Beta 10.2, with Elon Musk, the little-known and reclusive CEO of the company, noting that there were “some issues,” and that “this is to be expected with beta software. It is impossible to test all hardware configs in all conditions with internal QA, hence public beta.” I suppose one could argue that the “public” in that sentence doesn’t make that much sense, but, well, here we are.
Here’s the tweets with the deets:
The specific issues involved weren’t enumerated, though users reported a variety of problems, including Autosteer failing to activate, the system resetting itself, disappearing icons, and lots of phantom Automatic Emergency Braking (AEB) incidents:
Some people did manage to keep the 10.3 version long enough to make some videos, which show a mix of good and bad behaviors, though not necessarily anything more alarming than previous versions:
What is kind of alarming, and what supports my main issue with all Level 2 semi-automated systems, is a set of tweets from people suggesting that their own driving abilities have been eroded by their long-time reliance on the FSD system:
If you’re not a little concerned by what’s being expressed in these tweets, you maybe should be. A person—very likely not the only one—is suggesting that they “would really have to get my mindset back on track” to drive the car manually, and this is in a system that could demand they take over in mere seconds in an emergency situation.
There were several examples of this kind of thing in the videos I posted in this very article, and, again, it’s a reminder why all L2 systems—not just Tesla’s—are flawed: if something is doing most of the driving for you, you’re just not always going to be paying attention well enough to take over if needed.
Level 2 systems ask a human to both drive and not drive, simultaneously, and it’s not a space people thrive in. Hell, this is exactly the kind of paradox used to make evil computers explode in sci-fi movies.
The rollback also seems to have removed FSD Beta entirely from some users’ cars:
The good news is that Tesla did get a fix out quite promptly:
…the less good news is the eternal question about all of this: should beta software really be deployed onto public roads for testing? While the Tesla owners have opted in, everyone else sharing the road with them has not; there are no streets reserved for people who have agreed to be part of Tesla’s beta testing program.
Those who have agreed to be part of the beta program certainly have worked for it; to get in, Tesla owners have had to meet a “Safety Score” threshold that demands a strange sort of artificially sedate driving that doesn’t necessarily equate to safety. People even undertook hours and hours of needless driving to game the system and up their safety scores.
A Verge article about this quoted tweets from well-known Tesla hacker GreenTheOnly, which provide an interesting and sober counter to the many tweets from excited Elonians who crave the FSD system with the fire of burning lithium. Here’s the tweets:
These issues of driving automation are by no means easy, and Tesla’s work is impressive in many ways. But there’s still so far to go, and I’m still not convinced free, unfettered use of public roads is the way to do it.