Teslas with Autopilot Move Closer to Being Recalled

DETROIT — Teslas with partially automated driving systems are a step closer to being recalled after the U.S. elevated its investigation into a series of collisions with parked emergency vehicles or trucks with warning signs.

The National Highway Traffic Safety Administration announced Thursday that it is upgrading the Tesla probe to an engineering analysis, a sign of increased scrutiny of the electric-vehicle maker and of automated systems that perform at least some driving tasks.

An engineering analysis is the final stage of an investigation; in most cases, NHTSA decides within a year whether to seek a recall or close the probe.

Documents posted Thursday by the agency raise some serious issues about Tesla’s Autopilot system. The agency found that it’s being used in areas where its capabilities are limited, and that many drivers aren’t taking action to avoid crashes despite warnings from the vehicle.

The probe covers nearly every vehicle the Austin, Texas-based carmaker has sold in the U.S. since 2014.

NHTSA said it has found 16 crashes into parked emergency vehicles or trucks with warning signs, causing 15 injuries and one death.

Investigators will evaluate additional data, vehicle performance and “explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks undermining the effectiveness of the driver’s supervision,” the agency said.

A message was left with Tesla on Thursday seeking comment.

In most of the 16 crashes, the Teslas issued forward collision alerts to the drivers just before impact. In about half the cases, automatic emergency braking intervened to at least slow the vehicles. According to NHTSA documents, Autopilot gave up control of the cars less than a second before impact.

NHTSA also said it’s looking into crashes involving similar patterns that did not include emergency vehicles or trucks with warning signs.

In many cases, the agency found, drivers had their hands on the steering wheel as Tesla requires, yet failed to take action to avoid a crash. This suggests drivers are complying with Tesla's monitoring system, the agency wrote, but that system doesn't ensure they are paying attention.

In crashes where video was available, drivers should have been able to see first responders at least eight seconds before impact, the agency wrote.

Before it can seek a recall, the agency must determine whether Autopilot poses a safety defect.

Investigators also wrote that a driver’s use or misuse of the driver monitoring system “or operation of a vehicle in an unintended manner does not necessarily preclude a system defect.”

The agency's documents all but say Tesla's method of making sure drivers pay attention isn't good enough, and that the system is defective and should be recalled, said Bryant Walker Smith, a University of South Carolina law professor who studies automated vehicles.

“It is really easy to have a hand on the wheel and be completely disengaged from driving,” he said. Monitoring a driver’s hand position is not effective because it only measures a physical position, he said. “It is not concerned with their mental capacity, their engagement or their ability to respond.”

Similar systems from other companies, such as General Motors’ Super Cruise, use infrared cameras to make sure a driver is looking forward. But even those systems may still allow a driver to zone out, Walker Smith said.

“This is confirmed in study after study,” he said. “This is established fact that people can look engaged and not be engaged. You can have your hand on the wheel and you can be looking forward and not have the situational awareness that’s required.”

In total, the agency looked at 191 crashes but dropped 85 of them because other drivers were involved or there wasn’t enough information to make a definite assessment. In about a quarter of the remaining crashes, the main factor appears to be Autopilot being used in places where its capabilities are limited, or in conditions that can interfere with its operation. “For example, operation on roadways other than limited access highways, or operation in low traction or visibility environments such as rain, snow or ice,” the agency wrote.

Some automakers limit use of their systems to divided, limited-access highways.

The National Transportation Safety Board, which has also investigated some of the Tesla crashes dating to 2016, has recommended that NHTSA and Tesla limit Autopilot’s use to areas where it can safely operate, and that NHTSA require Tesla to adopt a better system to make sure drivers are paying attention. NHTSA has yet to act on the recommendations. The NTSB has no enforcement power and can only make recommendations to other federal agencies.

In 2020 the NTSB blamed Tesla, drivers and lax regulation by NHTSA for two crashes in which Teslas on Autopilot collided with tractor-trailers. The board accused NHTSA of contributing to the crashes by failing to make sure automakers put safeguards in place to limit the use of electronic driving systems.

The NTSB reached those conclusions after investigating a fatal 2019 crash in Delray Beach, Florida, in which the 50-year-old driver of a Tesla Model 3 was killed. The car was running on Autopilot and failed to stop or to avoid a tractor-trailer crossing its path.

In a statement, NHTSA said there aren’t any vehicles available for purchase today that can drive themselves. “Every available vehicle requires the human driver to be in control at all times, and all state laws hold the human driver responsible for operation of their vehicles,” the agency said.

Driver-assist technology can help avoid crashes, the agency said, but it must be used responsibly.

NHTSA began its inquiry in August of last year after a string of crashes since 2018 in which Teslas using the company’s Autopilot or Traffic Aware Cruise Control systems hit vehicles at scenes where first responders used flashing lights, flares, an illuminated arrow board, or cones warning of hazards.

Reach out to us at letters@time.com.
