The cars took over San Gabriel Street shortly after midnight. The road, which runs through a densely populated Austin neighborhood near the University of Texas, was already crowded. The Longhorns game against Wyoming had concluded just a few hours earlier, and students were out celebrating in the West Campus area. Pedestrians crowded the sidewalks, heading to the next bar or house party. Drivers, accustomed to slowly navigating the packed neighborhood, met an unexpected obstacle: at least twenty cars, bearing no drivers or passengers, had converged on San Gabriel. 

The empty cars were all Chevy Bolts with camera rigs affixed to the roofs, operated by driverless car company Cruise, a subsidiary of General Motors. Those cars were stuck, blocking one another along with other traffic. Motorists honked their horns, but inside the vehicles responsible for the traffic jam, there was no one to hear this cacophony of frustration. Students gawked, snapping photos and posting videos of the clot of stopped vehicles with no one behind the wheel.

This isn’t how the future we were promised was supposed to go, but we’re not in the future just yet. Austin was among a small handful of cities nationally where autonomous “robotaxis” were briefly available for consumers to hail. The cars in Austin were operated by Cruise, one of the largest companies operating in the driverless vehicle industry (another is the Alphabet subsidiary Waymo). Earlier in 2023, similar scenes of confusion and congestion unfolded in Phoenix and San Francisco.

In San Francisco, the hometown of Cruise, the company’s cars were blamed for blocking an ambulance carrying a patient who later died and for colliding with a fire truck. The day after the August crash into the fire engine, Cruise agreed to cut the size of its San Francisco fleet in half at the request of California regulators. On October 2, a pedestrian was struck by a human-driven car and subsequently thrown into the path of a Cruise vehicle, which dragged her twenty feet; she had to be freed with rescue equipment. Later in the month, after more details emerged about the incident, California regulators stopped the company from providing rides in the Golden State altogether. California’s Department of Motor Vehicles declared that the cars were “not safe for the public’s operation.” By the end of the month, Cruise announced that it would be ceasing all passenger services nationwide, saying in a statement that “the most important thing for us right now is to take steps to rebuild public trust.” 

Cruise will still test its cars without picking up passengers, but the cars will no longer be empty—a safety driver will sit behind the wheel. This is an existential moment for the driverless car industry. Cruise’s ambitions are lofty. Megan Prichard, the company’s vice president of ridehail, told me in late summer—before Cruise’s very bad October—that the ultimate goal is no less than ending car ownership, and building a zippy future in which the most affordable and efficient way to get around will be to summon a robot-driven car to come pick you up and take you where you want to go. 

Driverless cars, when working as intended, feel like they’ve just driven off the screen of a science fiction movie, but when they’re amassing in your neighborhood, the scene is more like something directed by Hitchcock. There’s an old line that no matter what year it is, driverless cars are five years away from widespread adoption, but for a stretch this summer in Austin, autonomous vehicles were here. (Cruise launched in Houston on October 12 before pausing operations two weeks later.) During that time, these cars were dropping passengers off at dinner, taking over city streets, and raising questions not just of safety, but of how cars shape labor, accessibility, and the very understanding of what it means to be sentient. 

Those questions came up when I watched videos of the scene on West Campus. (Cruise, in a statement, blamed the incident on “a crowded, challenging environment” and noted that it caused “no pedestrian, vehicle or property damage.”) It was downright creepy to watch as a swarm of identical cars, with no humans inside, converged in the same spot. “We rely on mental shortcuts to make sense of the world, and cars are strictly in the machine category,” Jaime Banks, a professor of public communication and technology at Syracuse University, told me when I asked her why it felt so eerie to see a bunch of empty cars take over a city street. “And then all of a sudden, it looks like they’re gathering—that’s a social behavior we attribute to animals or people. When something doesn’t fit its category, we don’t like that sort of thing.” 

California may pride itself on watchful regulation of tech companies, but Texas is famously hands-off. A 2017 state law licensed driverless cars to roam Texas streets with no oversight and stripped cities of the power to regulate operators such as Cruise. Notably, Texas officials didn’t intervene to rescind Cruise’s ability to operate in the state—that decision came from the company itself. (A spokesperson for the Texas Department of Transportation didn’t respond to an interview request.)

When I spoke to Prichard in September, she was bullish on the short-term future of driverless cars. “These aren’t five years away,” she told me. “This is happening right now.” A month later, the cars’ widespread use was back to being a little more science fiction than science fact. What does our driverless future look like now? 


Americans don’t love the idea of autonomous vehicles. In March, a poll from the American Automobile Association found that 68 percent of us were afraid of them. Incidents in which the cars behave strangely don’t help. Neither does the bravado that those developing the technology display. Employees of an autonomous vehicle company that was acquired by Uber reportedly distributed stickers that read “Safety Third” as an unofficial company motto. Elon Musk has boasted of Tesla’s “full self-driving” capabilities, which have led to more than seven hundred crashes since 2019, at least seventeen of which were fatal. Famously, he’s argued that if human drivers are capable of operating a car safely (at least most of the time) with only a pair of eyes to take in visual information, then an autonomous car doesn’t need radar or lidar, ranging technologies that let the artificial intelligence pinpoint the distances of objects more precisely and keep the cars from being confused by, say, white paint. (Experts generally disagree with him.) But automation, at least in theory, isn’t that difficult—or that new.

“Airplanes have been autonomous for decades,” said Robert Brydia, a senior research scientist at the Texas A&M Transportation Institute. “Any plane that’s less than forty years old can . . . land by itself without any pilot interference whatsoever.” We successfully sent an autonomous vehicle to Mars back in 1997—letting vehicles drive themselves isn’t inherently difficult. The challenge comes when you introduce complications.

“You can just put in a destination [on an airplane] and sit back and enjoy your Netflix, because airspace is not really a very complicated environment,” Brydia said. A neighborhood like West Campus, with its pedestrians and nonautonomous vehicles and college students playing beer pong on the lawn, is rather more complex. Humans make countless calculations instinctively and within fractions of a second while driving—if we see a soccer ball roll out into the road, for example, we know that the real hazard isn’t the ball, but the child who may end up running into the street after it. We run those scenarios in real time constantly, without registering them. “A computer, even one that’s been trained by millions and millions and millions of occurrences, if it hasn’t encountered a great deal of those types of situations, is going to be confused,” Brydia explained.

Commuters may be skeptical of driverless cars, but big investors are not. It’s hard to know exactly how much has been spent on developing the technology, which has been studied for decades, but a December 2020 analysis by consulting firm McKinsey estimated that investors had by that point poured roughly $100 billion into autonomous vehicles. 

Cruise, before it paused its operations, operated around four hundred vehicles nationwide, split primarily among its Austin, Phoenix, and San Francisco markets. No one knows quite how many drivers Uber has on the roads in any given city, but when it last released U.S. driver numbers, in 2019, it had more than a million nationwide. Prichard said in September that she expected Cruise to reach that scale “in the next few years.”

Cruise did not provide an update about the company’s plans following the announcement that it would be pausing its passenger service, but Prichard’s bullish vision has long had skeptics. Kara Kockelman is a professor of transportation engineering at UT-Austin and one of the leading experts on autonomous vehicles in the U.S. She told me that Cruise and Waymo don’t have permission from the National Highway Traffic Safety Administration to manufacture cars in the numbers they’d need to put that many vehicles on the road that quickly. (The agency also opened an investigation into Cruise in mid-October, focused on the risks the cars present to pedestrians.) The current fleet of 278 million privately owned commercial and personal vehicles in the U.S.—that is, your car and mine—will take more than twenty years to be replaced by newer models, she said. And autonomous cars are expensive, costing north of $40,000 apiece to manufacture, according to Kockelman—more than double the estimated cost to produce the average gas-powered sedan. So how soon might they be as common as ordinary, human-driven vehicles? Not soon, said Kockelman. “You have to get the permission, you have to get the price down, you have to retool the assembly lines,” she told me. “All these things take time, I’m afraid.” Brydia agreed.


Whenever the subject of autonomous vehicles comes up, a skeptic invariably asks: “Who asked for this?” I have an answer to that: I did. My wife lives with a degenerative eye condition. A botched operation in her early twenties left her right eye able to do little more than make out light; while her left is usable, it’s not correctable much beyond the point of legal blindness, and it’s at constant risk of worsening with age. She hasn’t had a driver’s license in nearly twenty years, which makes getting around in Texas a challenge. Public transportation is slow, irregular, and constrained to predetermined routes, and it has been limited by decades of underinvestment. Ride-hailing services such as Uber and Lyft, while plagued by well-documented issues, are a boon to her independence—but they’re also pricey. It’s not uncommon for a single ride to the airport to cost almost as much as an entire tank of gas—and that’s with the ride-hailing companies subsidizing those rides as they continue to operate at massive deficits. (In 2022 Uber posted a $9 billion net loss.) It turns out promising everyone a personal driver to take them where they need to go is an expensive proposition.

But what if you take the driver out of the equation? “If you get an Uber ride, it’s loosely three or four dollars per mile,” said Prichard, who spent four years at Uber before joining Cruise in 2021. “The cost of personal car ownership is dramatically increasing right now, but we’re still only talking about sixty cents to a dollar-twenty per mile, depending on what type of metro area you’re in. . . . But what we’re hoping to do is bring down the cost of that mile to the customer to be on par, if not less expensive, than driving a personal car.” That sounds great to me, and it’d be downright transformative for my family. 

But we’re also in a privileged position. Not everyone can afford a car, not everyone lives in a city, and not everyone has the relatively short commute that my wife does. I asked Anna Zivartz, who works on mobility issues for Disability Rights Washington, a nonprofit based in Seattle, whether robotaxis offer the same promise for other people with disabilities that keep them from driving. She offered a reality check. 

“Most of the folks who have the most limited mobility aren’t going to be able to afford it unless it’s almost free. It’s not going to serve the people who are most limited,” Zivartz told me. Severe disabilities can make it hard to find high-paying work, and the average monthly Social Security benefit for people with disabilities is $1,487. “Also, those folks often need some assistance—say, with tie-downs if they’re a wheelchair user, or finding the vehicle, if you’re blind.” 

Cruise has worked to build relationships with those in the disability community. The company commissioned a study that touts the prospective benefits of autonomous vehicle adoption for Americans with disabilities, but it leaves some gaps (it doesn’t define how widespread adoption would need to be, for example, and none of the interview subjects reported living in rural communities).

“They definitely hired really terrific people from the disability community, and I don’t doubt that the people who are working there believe that what they’re doing is going to help,” Zivartz said. But she’s skeptical: a Cruise employee reached out to her organization during the summer to pitch her on the benefits autonomous vehicles offered for the disability community in her state. “That person who reached out to me was actually the same person who, four or five years ago, was working for Lyft, and now she works for Cruise. And she was making all of the same arguments she did then about how it was going to solve all of our problems. So it’s hard to believe it’s really going to be different.” 

There would obviously be benefits to the world Cruise says it’s building, in which a city dweller with low vision who was previously using underfunded public transit could instead tap their phone screen and be chauffeured to work by a robot and pay only a slight premium over the price of bus fare for the ride. But it’s hard to pin down the details of how many lives would be transformed, at what cost, and how quickly.


The cost-benefit analysis gets more optimistic for autonomous vehicle companies when you look beyond cities—and beyond cars. Urban streets are extremely complex. Highways are far less so. “The nice thing about freeways is that they’re simple—there’s not a bunch of cyclists and scooter users and people driving the wrong way,” Kockelman said. Virtually everyone I talked with while reporting this story—with the notable exception of Prichard, at Cruise—told me that robotaxis and local, consumer-use driverless cars might get most of the attention, but that if I was looking for the real action, I should look to autonomous freight trucks.

Those trucks are already on the roads. Aurora, an autonomous freight company founded in 2017 by former executives from Google, Tesla, and Uber’s self-driving car division, is running thirty trucks a day through Texas, along two routes—one from Dallas to Houston, the other from Fort Worth to El Paso—for major trucking companies including FedEx and Uber Freight. You might not notice that you’re driving next to an autonomous truck as you speed down Interstate 45; at the moment, all test runs have human drivers collecting data, ready to take the wheel should anything go wrong. But full autonomy is coming, and probably far sooner than the widespread use of robotaxis.

“If you ask me how long until we see autonomous trucks on the roadway, operating to the point where people don’t even give them much thought anymore? Five years,” Brydia told me. 

Aurora declined to grant an interview. But independent experts such as Brydia and Kockelman listed the potential benefits of automating much of the trucking fleet. Brydia said that, to the best of his knowledge, even as the number of road miles logged by autonomous trucks continues to grow—he estimates it’s in the millions by now—there’s never been a crash between a human motorist and a self-driving truck. “That would be all over the news,” he noted. (In 2022, though, an autonomous truck did collide with a concrete barrier.)

“Texas hasn’t had a day without somebody being killed on a [roadway] for over twenty years,” Brydia reminded me. “If the technology can replace the driver and be safer, then ultimately we can cut down on that number.” An autonomous truck will never be sleepy; it’ll never gobble up pills it bought off a guy behind a Buc-ee’s to stay awake; it’ll never drive drunk or secretly watch The Real Housewives of Atlanta on an iPad propped next to the gearshift. Autonomous vehicles sound scary until you consider all the things that can—and frequently do—go wrong when a human is operating a truck. In 2022, Texas roads saw 53,127 commercial truck collisions, of which 686 were fatal and 1,579 caused suspected serious injuries. (Thus all those Thomas J. Henry ads.) 

There are also, of course, economic benefits to autonomous trucking, which will—initially, at least—be enjoyed by the freight companies. Human-driven trucking cost around $2.25 per mile last year; a 2022 white paper from Uber Freight estimated that autonomous trucks could do the same job for $1.06. Given that the trucking industry chalks up around 327 billion road miles each year, that’s a lot of savings: if every one of those miles were someday driven autonomously, the $1.19-per-mile difference would work out to nearly $390 billion a year.

Of course, the current system, in which people drive trucks, offers obvious benefits to workers. The trucking industry is one of the largest employers in the United States, with more than two million drivers. If you talk to the companies testing this technology, their officials often say that if you’re a truck driver today, you’ll be able to retire as a truck driver, and they remind you of the truck driver shortages the country has faced for years, which accelerated during the COVID-19 pandemic. Peter Finn, vice president of the Teamsters union in the Western Region, doesn’t buy it. “The sole purpose of this technology is to eliminate jobs,” he said. “How does it benefit workers in our communities to have autonomous trucking?” he asked. “Why is that better? It’s just better for the corporations that don’t have to pay a driver.”

But the potential good, if autonomous vehicles are as much of a safety upgrade over human drivers as the experts think they’ll be, is obvious: fewer deaths, fewer traumatized motorists, fewer folks suffering from injuries sustained in collisions. Currently, with human drivers, more than four thousand Texans are killed each year on our roads. So why do Americans distrust this technology by a two-to-one margin? 

When a human driver is at his or her best, putting a person behind the wheel might well be the safest way for a car to operate. But often, we’re distracted or tired. There’s tension around autonomous vehicles—they make us uncomfortable, and some of the incidents, such as the one in San Francisco that preceded Cruise’s decision to pause its robotaxi service nationwide, are disturbing. But humans drive poorly all the time.

In 2010, 4,302 pedestrians were killed on American streets. By 2022, after increasingly enormous pickup trucks outsold cars in 2020 and ownership of distracting smartphones went from barely a third of Americans to the vast majority, the number of pedestrians killed had grown to 7,508, a forty-year high.

Robotaxis, at least, are smaller than the ever-growing heavy-duty trucks Americans (and especially Texans) enjoy driving. “We’re driving loaded weapons,” Kockelman said. “I’d much rather have a computer helping me out.”

I asked Banks, the communication and technology professor at Syracuse, why being presented with statistics about how driverless cars are actually safer than human drivers doesn’t seem to make us more comfortable with them. She suggested that it’s a matter of blame. When a human driver blocks traffic, causes a crash, or strikes a cyclist, we know whom to be mad at. With a robot, the question becomes more abstract. “We don’t tend to think that machines are responsible for their behavior. So when a machine goes wrong, who do we blame? The machine, the developer of the machine, the owner of the machine, the user of the machine? That’s a question that we haven’t sorted out yet,” she told me.  

How, then, might we assign blame in our ever-more-roboticized future? Banks’s answer surprised me. “If we treat machines as persons in the future, as a way of dealing with this question, it might make us feel better if the machine dies when it kills somebody,” she said. “That might be the moral foundation of the system, if we find it acceptable.” I imagined someone I love being killed in a collision with an empty truck and then myself watching as that truck was crushed in a car compactor; it wouldn’t, I think, feel like justice, but it’d be better than nothing. 

With Cruise’s service on hold, that’s a question for the future. For now, at least, the chaos in West Campus is back to being fully man-made. But the future is driving toward us, and if we don’t have our hands on the wheel, it’ll keep steering and accelerating anyway.