
What Happens When Self-Driving Cars Crash? The Legal Ramifications of Automation

Technology is developed to improve our lives, and many traditional industries are turning to automation as a result. Among the most notable is the automobile industry. As new practices develop around these technological advancements, how safe are we with this technology, and what happens when things don't work as designed?

By Hank Stout Edited by Micah Zimmerman

Opinions expressed by Entrepreneur contributors are their own.

Drivers commuting along I-45 between Dallas and Houston will likely notice a new presence on the road: driverless semi-trucks.

The Houston Chronicle reported that Waymo and Daimler have joined together to test the viability of computer-controlled 18-wheelers. The companies had previously run smaller pilots but have agreed to expand the fleet to 60 trucks, with additional trucks from the partnership being tested elsewhere in Texas and in Arizona. The move is part of the state's broader recent economic development push.

If you see one of these trucks on the road, don't worry: a human driver is on board, ready to take over at any moment. The trucks are currently being tested for overall effectiveness, including their ability to respond to changing road conditions and make unplanned stops.

As our technological capabilities advance, we will continue to see innovation across every industry. The thought of computer-driven semi-trucks raises a serious question: how safe are these trucks?


In 2020, 12.6% of crashes on U.S. roadways were caused by large trucks, and many safety advocates have questions and concerns about the increased prevalence of these vehicles on the road. From July 2021 to May 2022, the National Highway Traffic Safety Administration reported 400 separate crashes related to or caused by vehicles with at least partially automated control systems.

Beyond automated trucks, driverless vehicles in general are becoming increasingly common. Globally, the automated vehicle market is estimated to be worth $54 billion, and the personal automated vehicle segment is growing along a similar trajectory.

Audi recently revealed plans to spend $16 billion on self-driving cars by 2023. Regardless of how individual drivers feel about the technology, large corporations are clearly investing heavily. How quickly the market will accept this growing trend, however, remains to be seen.

Many drivers find the increasing prevalence of self-driving vehicles unsettling. According to PolicyAdvice, 43% of Americans say they would not be comfortable inside a driverless car, citing safety as their chief concern, and many remain uneasy about the United States moving toward this phase of automation.


Despite concerns about automated vehicles, car manufacturers continue to try to demonstrate their safety. Although accurate and relevant safety data on automated vehicles is hard to come by, concerned drivers can look to the state of California for some guidance. Since 2014, the state has reported 88 accidents involving driverless vehicles. Of those incidents, 81 were caused by other vehicles, and 62 of the 88 driverless vehicles were operating in fully autonomous mode.

Even as manufacturers try to showcase the safety of driverless vehicles through data, public perception may be hard to move. Public opinion is notoriously difficult to shift, and even if autonomous vehicles cause only fender benders, the outcry will likely be fierce.

Operating a motor vehicle is already a dangerous activity. In 2020 alone, 38,824 people were killed in car accidents on U.S. roadways. Given the frequency with which fatal accidents occur, it's natural to be skeptical and hesitant about turning over control of a vehicle. Even setting fatal accidents aside, the U.S. averages 17,000 or more car accidents a day. These accidents can result in substantial financial damages to the at-fault party, emotional trauma and lingering injuries.


The Centre for Data Ethics and Innovation in England surveyed drivers in the UK and found that, as predicted, most aren't ready to hand over control so easily. Until trust is built between consumers and manufacturers, the survey suggests, most drivers are unlikely to accept automated vehicles on roadways.

Who ultimately takes the blame if a driver is struck by a driverless vehicle? The owner? Or the manufacturer, for selling a product that couldn't protect the user, in this case the passenger in the automated vehicle? For now and the foreseeable future, owners of automated vehicles in the U.S. are held to the same standard as their traditional counterparts.

These legal questions will likely become more and more relevant. In particular, product liability lawsuits may rise in the future, and some U.S. states are already beginning to adopt laws under which the manufacturer could be treated as the "driver" in certain situations. Ultimately, anyone looking to prove fault on the part of an artificial intelligence or computer program faces a lengthy and difficult battle. Limited precedent makes this a difficult theory to argue in a court of law, but not an impossible one.


Product liability law traditionally covers products that cause bodily harm or other injury to the user. A plaintiff must establish proof of injury, proof of a defect, proof of appropriate use and, ultimately, a clear connection between the defect and the injury. A product that simply fails to work as advertised does not, by itself, create liability for the manufacturer; the product must cause injury, illness or other harm to the user or another affected party for there to be a case.

As more and more autonomous vehicles take to the roads, these issues will become more common, and the legal impact will be made clearer.

Hank Stout

Entrepreneur Leadership Network® Contributor

Attorney at Law

Hank Stout co-founded Sutliff & Stout, Injury & Accident Law Firm because he wanted to help real people with real problems. Raised in a small West Texas town, Hank was taught the value of hard work, determination, fairness and helping others.

