Does Tesla Autopilot Mode Enable Irresponsible Driving Behavior? [Op/Ed]

Tesla introduced advanced Autopilot features to the Model S with the version 7.0 software update at the beginning of October. Autopilot mode — when enabled — can keep the high-performance, all-electric sedan in its lane, maintain proper distance and speed relative to traffic ahead, and avoid highway barriers and other obstacles. It is not capable of reading stop signs or interpreting traffic signals. Elon Musk, CEO of Tesla, has stated that Autopilot is a “hands-on” feature, meaning the system expects steering input from the driver roughly every five minutes. Unfortunately, some drivers have been abusing the semi-autonomous mode with reckless stunts like climbing into the back seat with no one at the wheel, shaving, and even speeding on public roads with Autopilot enabled.

Instructions to owners say: “Autosteer is a hands-on feature. You must keep your hands on the steering wheel at all times.”

Musk describes Autopilot as “beta” software, which generally means the software is in its second phase of testing, where a sample of the intended audience uses the product with the understanding that it is a pre-release version. I suspect most Tesla owners using this new feature are not software engineers, do not understand the full meaning of beta testing, and are not aware of the complexity of getting a car to drive itself.

Google has spent years developing its self-driving car technology, logging thousands upon thousands of miles, collecting data along the way, and programming an insane number of anticipated traffic scenarios and obstacles into an artificial intelligence system that collectively adds up to 40 years of driving experience. Yet Google expects it will be another four years before its self-driving car is ready to sell on the open market. How is it, then, that Tesla distributes “beta” software to customers actively driving on public roads everywhere, when Google’s own studies showed that people quickly become dangerously oblivious to their surroundings if they are not actively engaged in driving?


Musk maintains that ultimate responsibility for maintaining control rests with the driver at all times. Actually, that should be true of every single driver on the road today. But if we can’t trust people to put down their smartphones while driving, what is preventing Tesla drivers from using Autopilot improperly? Yes, Musk and Tesla can point the blame at anyone except themselves should an accident occur through the use of Autopilot, but that hypothetical accident might have been prevented altogether if the driver had kept his or her hands on the wheel in the first place and stayed focused on the task of driving.

Musk has said he believes fully autonomous cars are inevitable. Even though Tesla builds conventionally driven cars today, he appears committed to gradually adding more and more autonomy while relying on humans to stay alert and sensible — a path Google says is too dangerous.

On a November 2 call with investors, Musk acknowledged the potential for misuse and said additional constraints will be added to Autopilot. He did not elaborate on what those restrictions will be or when they will appear in the automatic updates the Model S receives weekly. One can only hope it occurs in the very near future.

How do you feel about Tesla distributing a beta version of Autopilot to its customers before thorough testing was done? Does Tesla’s Autopilot feature – with the current restrictions and disclaimers in place – permit too much inattentive driving and enable irresponsible behavior?

Note: The 7.0 software update with the “beta” version of Autopilot applies only to the Model S produced in October 2014 or later, and to the Model X.

