
My Car Broke the Law

As automated vehicles start to appear on the streets, with many more right around the corner, what happens if and when these vehicles break the law? Are there situations where they should be allowed to break the law?

Most people envision that they will be able to get in a car, enter their destination, and take a nap. No driving. No driver. No worries.

While automation in transportation is approaching this point, first with automated safety features and advanced driver-assistance systems, and now with the deployment of vehicles offering some level of automation on the market, many hard questions remain.

One of these questions is whether automated vehicles should follow every traffic regulation at all times. Although it seems like common sense that traffic regulations exist to protect drivers, and that automated vehicles should therefore follow these rules, social norms dictate that driving today is not as simple as blindly following the law. Deviations from traffic laws occur in many common situations. In scenarios like speeding to merge into a narrow gap on the highway or crossing a solid lane marker to avoid a collision, people habitually break the law without fear of enforcement. These implicit, common-sense behaviors have become norms in the world of driving. Early testing of automated vehicles shows that failing to abide by these norms can actually cause more accidents, because other drivers do not expect such strict adherence from other vehicles on the road.

Car crashes caused almost 40,000 deaths in 2016, and human error caused a large share of these accidents. Automated vehicles could potentially cut these deaths significantly, and early data shows that these vehicles are already safer than human drivers in many ways. But it will take time to saturate the roads with these vehicles. One major hurdle to getting more automated vehicles on the road is that people do not like how they drive: too passively and timidly compared to how people usually drive.

To combat these situations and minimize accidents by making automated vehicles behave in safe and predictable ways, it may be necessary to program them to break the law. One potential problem is whether doing so creates tort liability for manufacturers. For example, if an automated vehicle were merging onto a crowded highway, and the only safe gap required exceeding the speed limit by one mile per hour, the manufacturer would likely be found negligent per se under today’s tort law for violating the statute. This is problematic if it pushes vehicle manufacturers to place the highest priority on following the letter of the law. Imagine the same scenario, but now the vehicle is programmed never to exceed the speed limit. Rather than merging onto the road, it may be forced to slam on its brakes in a way human drivers would not expect, even though a slight acceleration would have avoided the problem entirely.

Automated vehicles must prioritize safety first and foremost. Manufacturers should thus not prioritize strict adherence to traffic laws when that adherence creates dangerous situations and accidents. One way to avoid these outcomes is to codify how automated vehicles should handle such situations, or at a minimum to give manufacturers some leniency to develop what they believe will work best. Addressing situations where laws can give way to social norms and safety would alleviate much of the uncertain liability, and would guide manufacturers toward products that meet both the highest possible safety standards and consumer expectations. The Law Commission is seeking commentary to help develop a “Digital Highway Code,” the purpose of which is to begin building a machine-readable collection of traffic laws and conventions. The commentary focuses on three scenarios where automated vehicles should potentially be allowed to violate the law: mounting the curb, exceeding the speed limit, and edging through pedestrians.
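To make the idea of a machine-readable rule more concrete, here is a minimal sketch in Python of what one entry in such a code might look like. No official schema has been adopted, and the class, field names, and example values below are purely hypothetical.

```python
# Hypothetical sketch of one entry in a machine-readable "Digital Highway Code".
# No official format exists; every name and value here is illustrative only.
from dataclasses import dataclass, field
from typing import List


@dataclass
class TrafficRule:
    rule_id: str                  # identifier for the underlying legal rule
    description: str              # human-readable summary of the prohibition
    permitted_deviations: List[str] = field(default_factory=list)  # narrow, codified exceptions


speed_limit_rule = TrafficRule(
    rule_id="exceed-speed-limit",
    description="Do not exceed the posted speed limit.",
    permitted_deviations=[
        "May briefly exceed the limit, within the enforcement tolerance, "
        "to complete a safe merge or to avoid a collision.",
    ],
)
```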

To return to the speeding scenario, if regulation gave auto manufacturers a permissible deviation from speed limits, the vehicle could complete the merge without either braking hard or exposing the manufacturer to liability. The Law Commission paper recommends setting that deviation at a level similar to police tolerance for speeding. The National Police Chiefs’ Council found that enforcement generally follows a “10% + 2” rule: drivers are typically not prosecuted unless they exceed the speed limit by more than 10% of the limit plus two miles per hour. Matching these tolerances to how human drivers actually behave would help minimize the unpredictability of autonomous vehicles.
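As a rough arithmetic illustration of that tolerance, the “10% + 2” threshold can be computed as below; the function name and the 70 mph example are mine, not the Law Commission’s.

```python
def speeding_tolerance_mph(speed_limit_mph: float) -> float:
    """Enforcement threshold under the informal "10% + 2" rule:
    the posted limit, plus 10% of the limit, plus 2 mph."""
    return round(speed_limit_mph * 1.10 + 2, 1)


# Example: on a 70 mph motorway the threshold works out to 79 mph, so a vehicle
# given this tolerance could briefly reach 79 mph to complete a merge.
print(speeding_tolerance_mph(70))  # 79.0
```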

Adoption of this scheme will not be quick. Yet it is likely more efficient to proactively address which violations of traffic laws are appropriate than to rely on the courts to decide the issue and then require manufacturers to redesign, retest, and update their products. Attempting to codify how and when automated vehicles can violate the rules of the road should help ease the transition to their widespread adoption.*

*Philip Brown is an associate editor on the Michigan Technology Law Review.
