NTSB to Determine Cause of Fatal Tesla Crash


The U.S. National Transportation Safety Board (NTSB) will hold a hearing on February 25 to determine the cause of a fatal 2018 crash involving a Tesla in Mountain View, California, according to Reuters.

  • The NTSB is meeting to determine the cause of a fatal Tesla crash in 2018.
  • The driver of the vehicle was using Autopilot at the time of the crash.
  • This is just one of several investigations involving Tesla crashes with Autopilot engaged.

The crash in question involved a 2017 Tesla Model X driven by 38-year-old Walter Huang. He was using Autopilot at the time of the crash, but it’s not yet clear whether the semi-autonomous driving system was in some way at fault.

Several Tesla crashes are currently being investigated. (Photo: Getty Images)

Multiple Investigations

The NTSB and the National Highway Traffic Safety Administration (NHTSA) are investigating several crashes involving Teslas that were using Autopilot, due to concerns about whether the system is safe. Self-driving technology of varying degrees is available in vehicles from many automakers, but Tesla’s Autopilot receives extra attention.

A Wired story from 2018 noted that every crash involving Tesla’s Autopilot makes headlines, making the public fearful of trusting any semi-autonomous features and leaving safety agencies scrambling to reassure drivers and to make sure the technology is truly safe.

One of the concerns with Autopilot is its ability to detect hazards, especially stationary objects. NHTSA is launching an investigation into a December accident in which a Tesla Model 3 collided with a parked fire truck, killing a passenger.

The agency is also investigating a second fatal crash from December, in which a Tesla Model S ran a red light and hit another vehicle, killing both of that car’s occupants. These types of crashes are why Autopilot requires a driver who is paying attention, hands on the wheel, ready to take over at any time.

The need for humans to supervise today’s autonomous driving systems is an additional source of concern. Although drivers are supposed to pay attention, there’s no guarantee they will. There are also questions about whether using something like Autopilot for a long period of time lets drivers mentally disengage from the driving process, making them more prone to missing a situation that requires action.

Today’s self-driving technology requires an attentive driver, but not everyone is paying attention. (Photo: Getty Images)

Are You Paying Attention?

Both Tesla and NHTSA instruct drivers to keep their hands on the wheel and pay attention because Autopilot is semi-autonomous, not fully autonomous. The same is true of systems like Nissan’s ProPilot Assist, which warns drivers to take the wheel if it detects hands-free driving. Cadillac’s Super Cruise takes a slightly different approach: it allows hands-free driving but monitors the driver’s head position to ensure he’s looking at the road.

Testing by the Insurance Institute for Highway Safety found not only that these systems are not all created equal, but that they don’t always drive as well as human drivers. The institute evaluated features including adaptive cruise control and active lane-keeping assist in controlled conditions, and the results were mixed.

These semi-autonomous systems were sometimes too cautious and braked unnecessarily. Other times the technology had trouble reading the edges of the roadway and sent cars toward the curb for no reason. Even automatic emergency braking, which is a proven life-saving feature, sometimes doesn’t work correctly, according to USA Today.

This is why these systems are all billed as semi-autonomous. No current system allows the driver to take his hands off the wheel and read a book or go to sleep. Agencies and automakers alike have repeatedly warned consumers, but accidents involving systems like Autopilot suggest those warnings aren’t enough.

That leaves safety agencies in a tough spot. They need to determine whether the technology is at fault, what exactly went wrong, and how to keep it from going wrong again. That’s the challenge facing agencies and automakers if they hope to keep semi-autonomous features in their vehicles rather than being told the features are no longer allowed.

WHY THIS MATTERS

Semi-autonomous technologies make driving safer and less stressful, but only if they work properly and are used correctly. Automakers and safety agencies need to find a way to educate the public and ensure compliance in order to prevent potentially fatal crashes.

