
If a Driverless Car Wrecks, Who Is to Blame?

The technology is coming with a lot of questions.

By Greg Winfree and Ginger Goodin

Self-driving cars are coming. Are we ready? The short answer: Not yet.

Perhaps no other transportation innovation in modern history has generated more fascination than the prospect of self-driving cars. But before we start thinking that the next car we buy might allow us to safely read or nap during a four-hour road trip, let’s consider a few realities.

What if a self-driving car is faced with a judgment call?  How would it react if a cyclist swerved out of an adjacent bike lane?  How would the car choose between hitting a child who chased a ball into a neighborhood street or swerving into a couple walking their dog on the sidewalk?

It’s important to remember that automated vehicle technology is still very much in the development and testing phase. Much of that testing needs to happen safely in real-world environments so that the cars can address an almost countless range of situations. To that end, the Texas A&M Transportation Institute in December joined with the University of Texas at Austin and the Southwest Research Institute to form the Texas Automated Proving Ground Partnership. When we combine our collective research facilities and the varied roadway networks in our state’s major cities, there’s simply no better place than Texas – with its favorable business climate and geographic diversity – to test self-driving cars.

Texas has a vast array of traffic laws based on the presence of a human in the driver’s seat. Emerging disruptive technologies create ambiguities in how those laws may be applied, and they raise questions for lawmakers.  Should we anticipate the end of driver education as we know it?  Should driver age restrictions be raised or lowered based on how the driving task might change?  If a self-driving car is involved in a crash, who’s at fault? The “driver” or the car’s manufacturer?

Many envision self-driving cars as a safety solution, since more than 90 percent of crashes are caused by human error. Minimize the human’s involvement and you reduce the likelihood of a crash, right? Although it’s reasonable to hope for improvement, we don’t yet have evidence to support that hope. One thing we do know from our research so far is that we may need a new definition of “vehicle operator” in our state traffic laws. We also know that we may need to change our current crash reporting practices, modify how we capture and use data from those crashes, and rethink how we can most effectively dispatch first responders when a crash happens. All that—and much more—translates into a lot of complex work for those who meet in the State Capitol every other year.

There is also the issue of consumer confidence in self-driving cars. We know the idea will be transformative. What we don’t yet know is the extent to which consumers will embrace it. In a transportation institute study of drivers in one Texas city last year, half the people we surveyed said they could see a self-driving car in their future. The other half said they couldn’t envision ever using one. The reasons for their answers say a lot about how divided consumers are. The “yes” group had confidence in the technology, believed the cars will be safer than those driven by people, and wanted to avoid the stress of driving. The “no” group didn’t trust the technology, had doubts about safety, and preferred to have control of the car.

The next five years will bring more advancements in vehicle technology than we’ve seen in several decades. And yes, someday relatively soon, we’ll be able to kick back and leave the driving task (or at least most of it) to a trusted machine. But to fully realize the tremendous potential of self-driving cars, there’s a need for safe, real-world testing to help us all understand the opportunities and risks, and a need to support the best approaches for navigating the policy maze.  The end goal is safe operation of these vehicles on our public roadways, which will support economic growth for new industries and a better quality of life for Texans.

Greg Winfree is the agency director of the Texas A&M Transportation Institute (TTI).  Ginger Goodin, P.E., is a senior research engineer and director of TTI’s Transportation Policy Research Center.


Comments
    Ultimate responsibility must remain with the person(s) who made the decision to allow the machine to operate on its own. AND/OR the firm, etc. who programmed it. A device cannot make a moral decision. Some person had to make a decision if only in the programming. The human cannot be removed from the equation. At least not yet.

  • pwt7925

    I’m all for driver assist technology. I’m very leery of not being able to drive the car at all if I’m the only one in it. Tell me why this is a good idea.

  • Kozmo

    I vehemently object (for all the good that does) to being treated as a human guinea pig for these driverless car experiments designed solely for the benefit of big corporations and techno-geeks and heedless of the risk to flesh and blood people on the roads. Why isn’t there public debate on this FIRST? And a vote?!

  • Kozmo

    So we’ll go from 90% human error to 90% mechanical error? This is progress how?

    The same computers and digital technology that screw up your online banking, arbitrarily refuse to play your DVDs or CDs, and allow hackers entry into your own space will now be screwing around with your life on the streets. Count me OUT.

    • BCinBCS

      Wow, a conservative opposed to new technology.
      Who’d a thunk it?

    • oblate spheroid

      It’s progress because when you eliminate the human element the raw number of crashes goes down drastically. Would you rather have ten thousand crashes, of which nine thousand are human error, or one thousand crashes, nine hundred of which are mechanical error? Additionally, who ever said that once the human element is removed, 90% of crashes will be mechanical error? I’d bet any amount of money that the bulk of those future crashes will be acts of God and not a computer glitch.

      • WUSRPH

        How often per week does your computer lock up or otherwise not perform like it should… and you would trust it to drive your children to school… on fixed rails, maybe.

        • 0 times per week. And a better analogy is how many times does your phone lock up. How many times does a human driver lock up or crash? Multiple times a day, every day.

          • WUSRPH

            Then you obviously have a better computer than I do……

  • I suspect small-time operators will somehow end up responsible for these vehicles while big businesses will of course escape responsibility entirely.

  • What do you base this on? “Although it’s reasonable to hope for improvement, we don’t yet have any evidence to support that hope.” Does Google’s thousands of hours of test drive data in real traffic not support that hope?