Tesla’s Navigate Breaks Driving Laws, Consumer Reports Finds


Is Tesla’s new Navigate feature for Autopilot worse than a first-time teenage driver? According to Consumer Reports, it is.

  • Tesla recently released its Navigate feature as an update to Autopilot.
  • After real-world testing, Consumer Reports found the system not only inadequate but also prone to performing illegal driving maneuvers.
  • Consumer Reports has taken the position that systems such as Autopilot need to show evidence of sufficient testing before being released to the public.

The consumer advocacy organization recently tested Tesla’s new Navigate feature, which is part of an update to the brand’s Autopilot semi-autonomous driving system. Essentially, Navigate is designed to handle all freeway driving duties, from autonomous lane changes to passing slower vehicles outright. The results of Consumer Reports’ tests are worrying, to say the least.

Consumer Reports’ reviewers found that Navigate not only executed illegal right-lane passing maneuvers but also failed to leave enough space between the Tesla and other vehicles during automatic merges and lane changes. This resulted in testers intervening to prevent the Navigate system from, as they put it, “making poor decisions.”

The report gets more damning from there.

“The system’s role should be to help the driver, but the way this technology is deployed, it’s the other way around,” Jake Fisher, Consumer Reports’ senior director of auto testing, is quoted as saying in the article. “It’s incredibly nearsighted. It doesn’t appear to react to brake lights or turn signals, it can’t anticipate what other drivers will do, and as a result, you constantly have to be one step ahead of it.”

Consumer Reports concluded that letting Navigate change lanes autonomously wasn’t any easier than simply doing it yourself. In fact, the reviewers found letting the Tesla do the driving more stressful than driving themselves, likening the experience to monitoring a first-time driver behind the wheel.

Engaging Tesla Autopilot
Tesla’s new Navigate feature uses navigation data to determine when the vehicle is on a freeway. When engaged, it automatically signals and changes lanes, or takes off-ramps, to stay on the navigation route. | Photo: Tesla

Funnily enough, because reviewers found Navigate’s driving style so unsafe, Consumer Reports is now espousing a position similar to one I’ve been preaching for years: “Before selling these systems, automakers should be required to give the public validated evidence of that system’s safety—backed by rigorous simulations, track testing, and the use of safety drivers in real-world conditions,” the report read.

Given that a) Tesla owners have died behind the wheel while using Autopilot, and b) Navigate is apparently a woefully deficient system, it’s painfully obvious to me that there is no way Tesla will achieve Level 5 autonomy and put over a million robotaxis on the road by the end of 2020, as company CEO Elon Musk recently claimed.

Thankfully, I am not the only one who thinks Musk is spewing nonsense. Uber’s CEO Dara Khosrowshahi doesn’t think Tesla can hit that lofty timeline either.

Irrespective of the feasibility of Tesla’s autonomous timeline, the fact remains that the company designed and launched an automated driving feature that breaks the law. That’s just insane.

Although I have long advocated that legislators stay out of the way of automated driving tech, for fear they’d hinder its development with unforeseen legal barriers, I believe it’s time they step in. If Tesla can create a system that is not only markedly unsafe but programmed to disregard driving laws, the market (i.e. Tesla) has clearly been left to its own devices for too long. It needs to be regulated.

Remember, the ultimate goal of automated driving tech is to save lives — not endanger them. If Autopilot or any other system flies in the face of that, it needs to either be shut down or highly regulated.


About the Author

  • Nick Jaynes has worked for more than a decade in the automotive media industry. In that time, he's done it all—from public relations for Chevrolet to new-car reviews for Mashable. Nick now lives in Portland, Oregon and spends his weekends traversing off-road trails in his 100 Series Toyota Land Cruiser.

Nick can be reached at nickjaynes@gmail.com