
Are Autonomous Vehicle Tests a Public Hazard? Unlike other technologies, testing of self-driving cars on public roads could cause safety concerns and have an immediate negative impact on society.

By Doug Newcomb

This story originally appeared on PCMag


At the dawn of the Internet, could anyone have expected trolling and cyberbullying -- or that a future POTUS would engage with the nation 140 characters at a time? It can be difficult to anticipate the downsides of revolutionary new technologies.

Among the potential drawbacks of autonomous vehicles -- including massive job losses among commercial drivers and possibly even more traffic from robo-taxis -- is what's known as the "trolley problem." This refers to the ethical decisions a self-driving car will have to make when confronted with no-win situations, such as whether to run over an individual who darts into its path on a narrow, crowded city street or swerve to either side and plow into pedestrians on the sidewalk.

Even though California Polytechnic State University philosophy professor Patrick Lin has been conducting real-world tests on this dilemma in conjunction with the Center for Automotive Research at Stanford, he acknowledges that "something feels dishonest about the moral panic over self-driving cars" and the trolley problem, since the scenario is so far-fetched. But as Lin points out, unlike other technologies, the testing of autonomous vehicles on public roads could have an immediate negative impact on society.

It gets complicated

Lin contends that while technology developers typically beta-test new products and features with few restrictions and laws, "it gets complicated when their software interacts directly with the larger physical world" -- in this case, when controlling a multi-ton machine in public. In the case of autonomous vehicle testing on public roads, he notes, "the products don't just directly affect users alone, as is the case with most other gadgets and services."

As an example, Lin points to the crowd-sourced navigation app Waze. It's "giving rise to complaints about 'flocking' behavior: swarms of cars sent by its algorithms through quiet neighborhoods not designed for heavy traffic," he says.

This "could increase risk to children playing on these streets, lower property values if road noise is louder and create other externalities. This means navigation apps are making risk-decisions that users might be unaware of but arguably should be."

Self-driving cars similarly "will need to autonomously select their routes … and there's often not just one correct way to go," Lin adds. But he believes that the larger problem is determining who will be held accountable for the negative impact on traffic or public safety that navigation apps such as Waze cause -- and by extension self-driving cars testing on public roads.

"It's a tragedy of the commons: No one is in the driver seat on how navigation algorithms should be harmonized with society," he says. "So if there is a negative impact on traffic or public safety, it's hard to pin down responsibility for these effects."

The question of who will be responsible in a self-driving car crash -- the automaker, the software developer, the sensor maker or the map maker -- is one of the issues holding back the technology's widespread adoption, although solutions are being developed. Given that self-driving car tests are already being conducted on public roads, Lin believes that what he calls "human-subject research" should perhaps obtain prior clearance from an ethics board.

Cities have begun responding by designating roads as "no thru traffic" zones accessible only to residents and requiring Waze to mark them as off-limits to passing motorists. Some angry and enterprising residents are even going rogue, hacking nav apps like Waze with phantom wrecks to divert traffic away from their neighborhoods.

Lin stresses that his purpose isn't to create "an argument against technology, but only to seriously reflect on its growing power and implications, as test-cases move through the courts of law and public opinion" -- and as more self-driving cars take to public roads.
