With all the current talk about self-driving cars, as multiple carmakers and technology suppliers build excitement over such technologies, we must understand one absolute, irrefutable truth: There is no car on the market today that can drive itself. Period.
Once more, just in case you missed that bold print: The self-driving (autonomous) car does not yet exist.
Several models feature what some people call “semi-autonomous” capability, but this term raises concerns about how carmakers name their features and technologies, especially in the realm of self-driving systems. At the forefront of that concern is Tesla’s Autopilot system.
Tesla Autopilot does the same things that other technically advanced driving-assistance systems do in cars from brands as wide-ranging as Chevrolet and Nissan to Cadillac and Mercedes-Benz. It blends adaptive cruise control that maintains your pace and distance relative to the car you’re following with a lane-keeping system that centers the car between the paint stripes on either side of your vehicle.
For Tesla’s Autopilot to work, the driver’s hands must be on the wheel. The system detects hands on the wheel even when the driver is not actively making steering inputs. Remove your hands, and after a period of time (depending on driving circumstances and speed) the car will prompt you to take the wheel. If you don’t, the car will bring itself to a stop and turn on the hazard lights, assuming there’s a serious problem. Regardless, motorists retain full legal responsibility for the vehicle’s behavior on the road, even when Autopilot is engaged.
In the interval between letting go of the wheel and the car beginning to slow itself, a lot can still happen. A quick search on YouTube turns up clips of drivers using Autopilot as a fully self-driving system, with arguably entertaining yet dangerous results.
To be clear, Tesla Autopilot requires your hands to be on the wheel. Cadillac Super Cruise, however, does not, and is the only hands-free steering technology available to consumers at the start of 2019. With that said, the system is available only on the Cadillac CT6, can only be used on limited-access highways, and if you’re not looking straight ahead at the road, it will stop working.
Understanding Autopilot’s Limitations
In 2018, a nasty crash in Northern California involving a Tesla Model X resulted in a fatality; the driver had engaged Autopilot when the SUV struck a highway barrier. Tesla was subsequently removed as a party to the National Transportation Safety Board (NTSB) investigation after it publicly divulged information about the circumstances immediately before the crash and seemingly shifted blame onto the driver.
There have been other fatal crashes involving Tesla’s Autopilot system, including the 2016 death of Joshua Brown in Florida, whose Model S crashed with Autopilot engaged. Granted, Brown had abdicated his responsibility to actually drive the car (he was allegedly watching a movie on an iPad instead), but this illustrates the overarching point precisely.
Naming the system “Autopilot” conjures up the notion of fully self-driving capability in some people’s minds. Commercial aircraft fly over whole continents using “Autopilot,” after all. And even if Tesla owners read all the legalese connected with Autopilot and read their owner’s manual cover-to-cover, non-owners can and will drive Autopilot-equipped Teslas without the benefit of having read all the warnings.
After the Brown case in 2016, Tesla updated the Autopilot software, increasing the frequency of warnings to drivers not paying enough attention. But the system still operates essentially as it did before, and much like other systems now on the market. With Autopilot engaged, the car can operate steering, braking, and throttle, but the driver must take over when needed.
Though the NTSB did not find Tesla directly at fault in the 2016 Brown case, it stated that in this incident, where Autopilot was engaged, “Tesla allowed the driver to use the system outside of the environment for which it was designed, and the system gave far too much leeway to the driver to divert his attention.”
More to the point on terminology, the Center for Auto Safety has stated that Tesla’s marketing efforts violate Section 5 of the Federal Trade Commission Act, citing that the marketing is “likely to deceive even diligent consumers, who would act reasonably in believing them, and are likely to use Autopilot differently than they would if Tesla employed more honest and transparent marketing and advertising strategies.”
And this criticism is global. In October 2016, Germany’s Federal Motor Transport Authority wrote to Elon Musk, chief executive officer at Tesla, asking that the company no longer use “Autopilot,” calling it a “misleading term” for its driver-assistance system.
Let’s review some facts:
- Tesla vehicles with Autopilot are not self-driving cars. Contrary to what the name implies, you cannot set a Tesla’s Autopilot system to drive the car from your home to Saskatchewan or even the local supermarket.
- With Tesla Autopilot engaged, you must still drive. The system will prompt you to take the wheel after a varying length of time since you last gripped it, depending on speed and driving circumstances.
- All driver-assistance systems are still developing. Every automaker, assuredly including Tesla, is improving and expanding its cars’ capabilities to steer, brake, and maintain speed on the highway. This is an area of rapid change that will eventually lead to a truly self-driving car. Tesla’s ability to update software over the air, without the car visiting a workshop, provides an advantage in this regard.
- You cannot buy a truly self-driving car anywhere in the world. Yet.