Ex-fighter pilot, now researcher, challenges autonomous vehicle safety (Business Insider)

In 2021, an engineer named Missy Cummings drew the ire of Elon Musk on the social network then called Twitter. A professor at Duke University, Cummings had conducted research on the safety of self-driving cars, and the findings led her to issue some stark warnings about Tesla’s driver-assistance tech. The cars, she wrote, had “variable and often unsafe behaviors” that required more testing “before such technology is allowed to operate without humans in direct control.” On the strength of her research, Cummings was appointed to the National Highway Traffic Safety Administration — to help with regulation of robot cars.

Tesla fans reacted with their usual equanimity and sense of perspective, by which I mean they absolutely lost it. Their insistence that Cummings would attempt to unfairly regulate their boy Elon soon prompted Musk himself to join the thread. “Objectively,” he tweeted, “her track record is extremely biased against Tesla.” In response, Musk’s stans unleashed their full fury on Cummings — her work, her appearance, her motives. They accused her of conflicts of interest, signed petitions demanding her removal, and emailed death threats.

But the thing is, Musk’s bros of war were messing with the wrong engineer. As one of the Navy’s first female fighter pilots, Cummings used to fly F/A-18s. (Call sign: Shrew.) She wasn’t intimidated by the dick-wagging behavior of a few people on Twitter with anime profile pics. She posted the worst threats on LinkedIn, hired some personal security, and kept right on fighting. “I’m like, are you really going to do this?” she recalls thinking. “I double down. The fighter pilot in me comes out. I love a good fight.”

She didn’t exactly win that particular engagement. A lot of whinging from Tesla pushed NHTSA to force Cummings to recuse herself from anything involving the company. But you know what they say about any landing you can walk away from. Cummings took a new gig at George Mason University and broadened her research from Tesla to the wider world of all self-driving vehicles. With companies like Cruise and Waymo unleashing fully roboticized taxis on the streets of San Francisco and other cities, the rise of the machines has begun — and Cummings is on the front lines of the resistance. In a controversial new paper, she concludes that the new robot taxis are four to eight times as likely as a human-driven car to get into a crash. And that doesn’t count the way self-driving vehicles are causing weird traffic jams, blocking emergency vehicles, and even stopping on top of a person who had already been hit by a human-driven car.

“In the paper that really pissed all the Tesla trolls off, I actually say that this is not just a Tesla problem — that Tesla is the first one to experience the problems,” Cummings tells me. “For years I have been telling people this was going to happen, that these problems would show up in self-driving. And indeed they are. If anyone in the self-driving car community is surprised, that’s on them.”


It turns out that serving in the Navy is a very good way to train for inbound ire from Muskovites. In her 1999 memoir, “Hornet’s Nest,” Cummings recalls how she loved flying jets, and says the excitement of getting catapulted off an aircraft carrier — or landing on one — never got old. But the environment was far from welcoming. Sexual harassment in the Navy was routine, and male colleagues repeatedly told Cummings she wasn’t qualified to fly fighters simply because she was a woman. When she and another female officer showed up at a golf tournament on base, they were told to put on Hooters uniforms and drive the beer carts. Cummings declined. 

Flying tactical engines of destruction also provided Cummings with a firsthand lesson in the hidden dangers of machines, automation, and user interfaces. On her first day of training, two pilots were killed. On her last day, the Navy experienced the worst training disaster that had ever taken place aboard a carrier. In all, during the three years that Cummings flew, 36 people died in accidents.

[Photo: a self-driving Waymo taxi blocks a line of cars waiting behind it on a San Francisco street.]

In 2011, while conducting research on robot helicopters for the Navy, Cummings had an epiphany. Even surrounded by nothing but air, those helos were far from perfect — and they relied on the same sensors that self-driving cars do while operating right next to cars and people. “When I got in deep on the capabilities of those sensors,” Cummings says, “that’s when I woke up and said, whoa, we have a serious problem in cars.”

Some of the dangers are technical. People get distracted, self-driving systems get confused in complicated environments, and so on. But other dangers, Cummings says, are more subtle — “sociotechnical,” as she puts it. What she calls the “hypermasculine culture in Silicon Valley” intertwines with Big Tech’s mission statement to “move fast and break things.” Both bro culture and a disruptive mindset, as she sees it, incentivize companies to gloss over safety risks. 

All of which makes it even tougher for women when they level the kind of critiques that Cummings has. “When Elon Musk sicced his minions on me, the misogyny about me as a woman, my name — it got very dark very quickly,” she recalls. “I think the military has made a lot of strides, but I do think that what’s happening in these Silicon Valley companies is just a reminder that we haven’t come as far in our society as I thought we would have.”

An example: Last month, the head of safety at Waymo touted a new study from his company on LinkedIn. The research was unpublished and had not undergone peer review. But Waymo used the study to argue that its robot cars were actually much less likely to get into crashes than cars driven by biological organisms like you and me.

Cummings wasn’t having it. She had her new results — also still in preprint — which showed self-driving taxis to be way more crash-prone. So she went on LinkedIn, too, and said so.

The response was familiar to her from her days in the Navy. Kyle Vogt, the CEO of Cruise, slid into the comments. “I’d love to help you with this analysis,” he wrote to Cummings, questioning her number-crunching. “Would be great to connect and discuss this further.”

Cummings responded in kind. “I’d love to help you with your understanding of basic statistics, use of computer vision, and what it means to be a safe and responsible CEO of a company,” she wrote. “Call anytime.”

Women, she figures, caught her vibe. “Every woman who read that was like: Mmm-hmm, you go,” Cummings says. But men — friends in Silicon Valley — did not. They thought she had been too mean to Vogt. “He was just trying to help you,” they told her.

“All the guys read it like: She’s such a shrew!” Cummings says. But, ever the fighter pilot, she was unfazed. “That’s how I got my call sign,” she says. “So I live with it.”


So who’s right: Cummings, or the self-driven men of Waymo and Cruise and Tesla? It’s hard to tell, for a simple reason: The data on the safety of robot cars sucks. 

Take Cummings’ approach in her new paper. First she had to wrestle with NHTSA’s nationwide data for nonfatal crashes by human drivers, to get numbers she could compare to California, the only place where the robot cars run free. Then she had to figure out comparable nonfatal crash numbers and miles traveled for Waymo and Cruise, tracked by divergent sources. Her conclusion: Cruise has eight nonfatal crashes for every human one, and Waymo has four — comparable to the crash rates of the fatigued and overworked drivers at ride-hail services like Uber and Lyft.

The purveyors of robot taxis argue that Cummings is wrong for a bunch of reasons. Chiefly, they say, the numbers for human crashes are actually undercounts. (Lots of fender benders, for instance, go unreported.) Plus, crash numbers for the whole country, or even just California, can’t be compared to those for San Francisco, which is way denser and hillier than the state as a whole. Looked at that way, Cruise argued in a recent blog post, its taxis have been involved in 54% fewer crashes than cars driven by humans. The company also maintains that ride-hail drivers get into one nonfatal crash for every 85,027 miles of driving — 74% more collisions than Cruise’s robots.
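The dispute above comes down to normalization: a raw crash count only becomes a rate once you divide by miles driven, and the ratio flips depending on which human baseline you pick. A minimal sketch, using made-up numbers purely for illustration (none of these figures come from Cummings’ paper or from Cruise):

```python
# Illustrative only: the same robot-taxi crash count looks better or worse
# depending on the human-driving baseline it is normalized against.

def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Normalize a raw crash count to crashes per million vehicle miles."""
    return crashes / miles * 1_000_000

# Hypothetical robot-taxi figures
robot_rate = crashes_per_million_miles(crashes=30, miles=2_000_000)

# Two hypothetical human baselines: statewide driving (mostly highway)
# vs. dense urban driving like San Francisco ride-hail trips
statewide_rate = crashes_per_million_miles(crashes=200, miles=50_000_000)
urban_rate = crashes_per_million_miles(crashes=120, miles=4_000_000)

print(f"Robot taxis:   {robot_rate:.1f} crashes per million miles")
print(f"vs. statewide: {robot_rate / statewide_rate:.2f}x the human rate")
print(f"vs. urban:     {robot_rate / urban_rate:.2f}x the human rate")
```

With these invented inputs the robot fleet looks nearly four times worse than the statewide baseline but half as crash-prone as the urban one — which is roughly the shape of the disagreement between Cummings and the robotaxi companies.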
