If there was any doubt that Tesla CEO Elon Musk knew the company’s much-watched 2016 self-driving demo was staged, emails obtained by Bloomberg should lay that to rest. “Just want to be absolutely clear that everyone’s top priority is achieving an amazing Autopilot demo drive,” Musk wrote in an email. “Since this is a demo, it is fine to hardcode some of it, since we will backfill with production code later in an OTA update.”
Musk saw little wrong with this strategy. “I will be telling the world that this is what the car *will* be able to do, not that it can do this upon receipt,” he wrote. But instead of making this clear, the video, released to the world via Musk’s Twitter account, opens with white text on a black background telling the viewer that “the person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.”
Musk took to Twitter on the day of the video’s release to tell his followers that the car could read parking signs, and it knew not to park in a disabled spot. He also claimed that someone could use the “Summon” function on a car parked on the other side of the country.
But Summon was only released to Tesla drivers three years later. And the result was underwhelming: the system struggled to navigate low-speed parking lots, which makes the suggestion that it could drive 3,000 miles on public roads unaided seem ludicrous.
As we now know from Tesla’s head of Autopilot software, Ashok Elluswamy, the parking demo actually saw the Model X SUV crash into a fence. A 2021 New York Times article—now mostly confirmed by Elluswamy’s testimony in a lawsuit over the death of Walter Huang—also alleged that the car drove over a curb and through some bushes before finding the fence.
This is not the first time Tesla has had trouble with the facts. In 2019, we discovered that the company’s repeated claims that Autopilot reduced crashes by 40 percent were bogus; in fact, the system may have increased crashes by 59 percent.
That same year, the National Highway Traffic Safety Administration had to tell Tesla it was misleading customers by claiming that NHTSA had labeled the Tesla Model 3 the safest car it had ever tested.
Once more, with feeling
According to Bloomberg, the video that Tesla released on October 20, 2016, went through extensive revision. Musk’s chaotic management style—laid bare to the world following his recent purchase of Twitter—was on display back then.
On October 11, 2016, Musk told staff that everyone would be required to write a daily log detailing their contributions to the demo. (At Twitter, Musk similarly demanded that staff print out their most recent lines of code for review, an order that was quietly rescinded sometime later, presumably once reality set in.) Days after Musk issued his daily log demand, a fourth draft was shared with him. This time, the CEO thought there were too many cuts and that the demo should appear “like one continuous take.”
In real-world conditions, the performance of Autopilot and the newer, even more controversial “Full Self-Driving” system remains poor. NHTSA has multiple open investigations into whether Tesla’s driver assistance systems are safe, including one following hundreds of reports of phantom braking behavior, another to determine whether Tesla cars are able to detect the presence of motorcyclists after at least two riders were killed when they were hit by Teslas, and a third into the propensity of Teslas to crash into emergency vehicles.
Criminal charges are a possibility, too. Intentionally deceiving one’s investors or customers remains a crime in the United States, and federal prosecutors have been looking into whether Tesla’s and Musk’s claims about the company’s driver assistance systems meet that bar. Elluswamy’s testimony surely isn’t helping Tesla’s case.