Tesla needs safeguards to prevent drivers from sleeping on 'Autopilot': U.S. senator


© Reuters. The Tesla Model S version 7.0 software update containing Autopilot features is demonstrated during a Tesla event in Palo Alto

By David Shepardson

(Reuters) – Democratic U.S. Senator Ed Markey asked Tesla Inc (NASDAQ:TSLA) on Wednesday to disable its “Autopilot” driver-assistance system until it installs new safeguards to prevent drivers from evading system limits, a practice that could let them fall asleep at the wheel.

“Tesla should disable Autopilot until it fixes the problem,” Markey said at a Senate Commerce Committee hearing on advanced vehicle technologies.

Markey, who wrote to Tesla about the issue earlier this week, cited YouTube videos and press reports suggesting that drivers could travel long distances without touching the steering wheel by using an object to defeat the requirement that they regularly touch the wheel, “even if they are literally asleep.”

Markey cited a local news report that said a driver had fallen asleep behind the wheel as a Tesla drove 14 miles on Autopilot. Other unconfirmed videos on social media appear to show drivers sleeping behind the wheel of Tesla vehicles.

“That’s not safe. Somebody is going to die because they can go to YouTube as a driver – find a way to (get around safety requirements),” Markey said. “We can’t entrust the lives of our drivers and everyone else on the road to a water bottle.”

Acting National Highway Traffic Safety Administration (NHTSA) chief James Owens told Markey at the hearing that the agency would be in touch with Tesla about the issue.

Tesla says drivers must keep their hands on the wheel at all times, but many owners say they can use the driver-assistance system to conduct other tasks behind the wheel.

Tesla did not immediately comment but said in September that since 2018, it has “made updates to our system, including adjusting the time intervals between hands-on warnings and the conditions under which they’re activated.”

A series of crashes involving Autopilot has prompted U.S. investigations and criticism from the National Transportation Safety Board (NTSB).

In September, the NTSB said the Autopilot design was a key factor in a January 2018 crash of a Model S into a parked fire truck on a highway in California. The system’s design “permitted the driver to disengage from the driving task” in the 2018 crash and allowed him to remove his hands from the wheel for nearly all of the last 14 minutes of the trip, it said.

Tesla’s Autopilot was engaged during at least three fatal U.S. crashes, and two remain under investigation by NHTSA and NTSB.

