Third Deadly Tesla Autopilot Crash Renews Questions About System

Tesla’s Autopilot system was engaged during a deadly March 1 crash of a 2018 Model 3 in Delray Beach, Florida, at least the third fatal US crash reported involving the driver-assistance system, the National Transportation Safety Board said on Thursday.

The NTSB’s preliminary report said the driver engaged Autopilot about 10 seconds before crashing into a semitrailer, and that from less than eight seconds before the crash until impact, the system did not detect the driver’s hands on the wheel. The crash sheared off the roof as the Tesla travelled beneath the semitrailer.

The crash renews questions about the driver-assistance system’s ability to detect hazards and has stoked concerns about the safety of systems that can perform driving tasks for extended stretches with little or no human intervention, but which cannot completely replace human drivers.

Tesla said in a statement that after the driver engaged the system he “immediately removed his hands from the wheel. Autopilot had not been used at any other time during that drive.”

The car was travelling at about 68mph (109kph) on a highway with a 55-mph (89-kph) speed limit, and neither the system nor the driver made any evasive manoeuvres, the agency said.

The company added that “Tesla drivers have logged more than one billion miles with Autopilot engaged, and our data shows that, when used properly by an attentive driver who is prepared to take control at all times, drivers supported by Autopilot are safer than those operating without assistance.”

While some Tesla drivers say they are able to avoid holding the steering wheel for extended periods while using Autopilot, Tesla advises drivers to keep their hands on the wheel and pay attention while using the system.

David Friedman, vice president of advocacy at Consumer Reports and a former acting NHTSA administrator, said the incident raises serious questions about the system and the lack of restrictions on its use.

“Either Autopilot can’t see the broad side of an 18-wheeler, or it can’t react safely to it,” Friedman said. “This system can’t dependably navigate common road situations on its own and fails to keep the driver engaged exactly when needed most.”

He said Tesla “must restrict Autopilot to conditions where it can be used safely and install a far more effective system to verify driver engagement.”