Tesla's "AI Day" on Thursday ended with a bang. The company unveiled the Tesla Bot, a humanoid robot intended to leverage the company's AI to navigate the world and "eliminate dangerous, repetitive, and boring tasks," as Elon Musk put it. Musk said he hoped it would help bring about a future in which "physical work will be a choice."
But while the Tesla Bot certainly stole the show, the vast majority of AI Day focused on self-driving systems. Tesla executives went deep into the weeds on simulation, data labeling and the proprietary Dojo chip (which promises breakthroughs in training neural networks). The event was meant as a recruiting exercise for AI experts, and the overall message was clear: Tesla is as bullish as ever on its ability to deliver full self-driving technology, and to do it fast.
That confidence comes as a bit of a surprise, though, given the mounting regulatory scrutiny surrounding Tesla.
- The National Highway Traffic Safety Administration announced Monday that it would examine whether Tesla's Autopilot system has a tendency to malfunction at first responder sites, potentially due to lights, flares and reflective equipment. The investigation will center on eleven documented crashes. If the NHTSA finds fault in Tesla's systems, it could demand a recall or impose limits on driverless features for an estimated 765,000 Tesla vehicles sold in the U.S.
- Two days later, Tesla's week went from bad to worse: Democratic Sens. Richard Blumenthal and Ed Markey sent a letter to FTC Chair Lina Khan, asking her to investigate Tesla for calling its system "Full Self-Driving" when, it turns out, it's not full self-driving.
- Sens. Blumenthal and Markey also seemingly anticipated the bold claims at Tesla's AI Day event. They wrote in the letter: "As Tesla makes widely available its FSD and Autopilot technology and doubles down on its inflated promises, we are alarmed by the prospect of more drivers relying more frequently on systems that do not nearly deliver the expected level of safety."
The new regulatory push suggests the Wild West days of self-driving are nearing an end.
- Until recently, federal agencies have held off on imposing strict self-driving regulations, likely out of fear that doing so would limit innovation. But in a February 2021 letter, the National Transportation Safety Board's then-chairman Robert Sumwalt expressed concerns over what he saw as the NHTSA's "willingness to let manufacturers and operational entities define safety."
- Tesla has long taken an "ask forgiveness, not permission" stance toward regulators. Sometimes a lack of regulation allows for innovation: Tesla rolled out a Smart Summon feature in 2019, for example, that lets passengerless cars navigate parking lots at slow speeds. Smart Summon is pretty much the NHTSA's worst nightmare, but it's also fun and hasn't yet caused an epidemic of violent parking lot crashes (ok, maybe a few, but still).
- But there are also cases in which underregulation has enabled reckless behavior. For instance, there aren't yet strict standards for making sure drivers pay attention while self-driving features are engaged. Some cars use eye-tracking to enforce driver compliance, while others (including most Tesla vehicles) simply monitor the steering wheel for occasional driver input. Critics argue that Tesla's approach has been too lax, and their case could be bolstered by several high-profile incidents in which drivers appear to have been completely disengaged.
As self-driving technology becomes more prevalent, the NHTSA can no longer afford to take a hands-off approach. Instead, the agency will likely soon set safety standards for things like driver engagement checks, hazardous conditions tests and backup crash avoidance systems.
For Tesla, complying with new regulations would also mean changing the company's culture.
- Tesla is a polarizing company, in part because its CEO so brazenly flaunts his disdain for rules and regulations. (Musk has several ongoing Twitter beefs with regulators, and a few weeks after settling SEC fraud charges called the agency the "Shortseller Enrichment Commission.")
- Sumwalt even once claimed that Musk hung up on him after he called to request that Musk stop disclosing information about an Autopilot crash investigation.
The renewed push for regulation sends a clear message: It's time for Tesla and Musk to get serious about safety or risk paying a hefty price. But there's no sign that Tesla's going to slow down anytime soon. Full Self-Driving is "clearly headed to way better than a human, without question," Musk said toward the end of AI Day. And that apparently goes both for cars and robots.
A version of this story will appear in tomorrow's Source Code newsletter.