Self-Driving Cars and the (Non)Race to Regulation

Transportation has come a long way over the past decade. We are now on the verge of having fully self-driving cars on the roads. Whether or not we should have such vehicles, of course, is an entirely different matter.

There are many dangers associated with self-driving cars and autonomous tech which could be reduced (if not eliminated) by proper regulation. Unfortunately, despite a lot of chest thumping and gnashing of teeth in the halls of Congress, we still have not seen much regulation at all – and manufacturers are not helping the problem.

The problem with Tesla

Recently, a video went viral of a Tesla being driven with no one in the driver’s seat; the owner was seated in the back. (Although the video was only six seconds long, law enforcement was able to make an arrest.) The video highlights a broader concern with Tesla’s Autopilot features: it is unclear to what extent the system can actually drive itself. The general public seems to have expectations of what the feature can do that do not always match what the company says it does.

Another source of confusion is the bold claims Elon Musk has made to the public. For roughly five years, Musk has repeatedly promised that Teslas would be fully autonomous by the end of the year. That has yet to happen, though it clearly has not stopped people from believing it has.

In truth, Teslas are not as advanced as other self-driving cars on the market; Waymo, for example, uses more sophisticated technology than Tesla does. In many instances, Tesla’s Autopilot and self-driving features are sold as upgrades to the vehicle, which adds to the confusion about what the differences are and how the technology is meant to be used.

Lawmakers struggling to keep up

When it comes to the legal side of self-driving cars, things get even more complex. Each state sets its own rules and regulations for testing self-driving vehicles, while the National Highway Traffic Safety Administration (NHTSA) makes the rules at the federal level. It seems, however, that not enough is being done at either level. This is why Congress keeps pushing the Transportation Department to issue regulations governing autonomous features.

Although this is the right first step, lawmakers are having a difficult time keeping up with the ever-advancing technology that car companies continue to introduce. That gap leaves drivers free to make their own rules while the law catches up, which can only make dangerous situations more and more common.

How does self-driving tech work in cars?

Currently on the market are semi-autonomous features that enhance the driving experience but do not replace the need for a driver. (There are no fully autonomous cars on the market right now, but they are in development.) These enhancements essentially use different sensors around the car to observe hazards and movements on the road.

The vehicle then uses this sensor data to make “decisions” for the driver: when to slow down or stop, how to avoid a collision, and when to course-correct within a lane. More and more features are rolled out each year, requiring the driver to do less and less.

The NHTSA uses a tiered system, based on the SAE levels of driving automation, to classify autonomous features. Vox’s Emily Stewart described those levels this way:

Right now, the automation systems that are on the road from companies such as Tesla, Mercedes, GM, and Volvo, are Level 2, meaning the car controls steering and speed on a well-marked highway, but a driver still has to supervise. By comparison, a Honda vehicle equipped with its “Sensing” suite of technologies, including adaptive cruise control, lane-keeping assistance, and emergency braking detection, is a Level 1.

This should give you an idea of where car companies are in the race to autonomy. Many drivers feel that their car is near autonomy when in reality it is only at Level 1 on the scale; Tesla’s system sits at Level 2.

Are there any standards for autopilot features?

There is currently no guide or outline for the rollout of autopilot features. Congress is calling on the NHTSA to further develop guidelines and rules for carmakers to follow. There are some testing protocols in place for specific features, such as collision warning and emergency braking, but more needs to be done.

Tesla’s autopilot and accidents

While the backseat driver in the viral video suffered no harm (and thankfully, neither did anyone else), that is not always the case. In the six years since Tesla’s Autopilot was introduced, it has been implicated in 11 deaths in the U.S. and another nine abroad. The most recent deaths involved two passengers in a 2019 Tesla Model S, which failed to negotiate a curve and crashed into a tree before bursting into flames. Elon Musk and other executives disputed the claim that the accident was partially caused by the Autopilot feature, but investigators found that “one of Autopilot’s features was active” at the time of the crash, per CNN Business. The NHTSA is currently investigating 23 crashes involving the use of Tesla Autopilot.

The Minot car accident attorneys at Larson Law Firm P.C. represent clients who have sustained injuries when their cars’ safety tech fails, and when other drivers’ negligence causes them harm. Car companies should not get away with marketing and promoting features that are not completely tested and safe for the general public. They should be held liable for their role in car accidents. Negligent drivers should also be held responsible for the misuse of these features. Contact us today at 701-484-HURT, or via our online contact form to discuss your accident. We operate offices in Minot and Bismarck to better serve you.