There are some scary similarities between Tesla’s deadly crashes linked to Autopilot

Human guinea pigs.
Image: Reuters/Beck Diefenbach

A second fatality allegedly involving Tesla’s Autopilot feature was recently reported in China, and it has some striking similarities to Tesla’s deadly Florida crash.

On Sept. 14, a Chinese state media broadcast from CCTV (link in Chinese) reported a deadly accident in Hebei province that took place on Jan. 20. A Tesla driver on the highway had activated the Autopilot feature, CCTV reported, only to crash straight into the back of a road-sweeping truck. The collision killed the driver, 23-year-old Gao Yuning.

Footage broadcast by CCTV from the crash, collected from what appears to be a dash cam, shows just how abruptly the accident occurred:

Gao’s family filed a lawsuit against Tesla in July asking for 10,000 yuan ($1,499) and legal costs, as well as the company’s admission that Autopilot was faulty. The family isn’t seeking a big financial settlement, its lawyer told CCTV; it just wants people to know Autopilot is flawed.

In a statement to Quartz, Tesla acknowledged the accident but said it can’t confirm whether Autopilot played any role.

“Because of the damage caused by the collision, the car was physically incapable of transmitting log data to our servers and we therefore have no way of knowing whether or not Autopilot was engaged at the time of the crash,” a company spokesperson wrote. Tesla has tried to “work” with Gao’s father, the car owner, to learn more about the circumstances leading to the crash, but claims he “has not provided us with any information that would allow us to do so,” she wrote.

Even in the absence of this data, though, the crash has some striking similarities to the one that killed Tesla Model S driver Joshua Brown in May in Florida.

According to Tesla, the Florida crash happened after a white tractor trailer made a left turn across the highway in front of the car, and Autopilot apparently mistook the truck for open sky:

Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.

If Gao’s vehicle was indeed using Autopilot during the crash, it appears to have made similar errors, including mistaking an obstacle straight ahead for the sky beyond it, and failing to react as the obstacle grew closer. The China crash happened on a hazy day, and the orange of the truck ahead blends into both the haze in the sky beyond and an orange road sign up ahead.

After analyzing the China crash video, Wade Trappe, professor of electrical and computer engineering at Rutgers University and IEEE Signal Processing Society member, told Quartz that “environmental conditions” like the thick haze in the video may have prevented Autopilot from working properly.

“Weather and lighting conditions definitely have an impact on camera-based systems and in this case the lighting is not an ideal ‘crisp’ daylight,” Trappe said. Autopilot also might have confused the truck with a road sign of a similar color.

“There was also an orange sign that was just beyond the orange truck, this could also have thrown things off. Often lighting conditions make it more difficult for an algorithm to lock onto the presence of an object,” he added. In those conditions, the algorithm needs more data, in the form of additional frames of video, to work.

Just as in Florida, the dash-cam video shows no sign of the vehicle slowing down or of any attempt to steer away as the obstacle approached, and police told CCTV the car hadn’t braked before impact. In its lawsuit, Gao’s family claims “the autopilot programme’s slow response failed to accurately gauge the road conditions ahead and provide instructions,” Reuters reports.

It’s possible that the car was driving too fast for Autopilot to work effectively. The collision takes place in “less than two seconds,” Trappe noted. “The lighting conditions could have been such that more video would have been needed to have detected the sweeping truck—and this would have needed the vehicle to travel slower so that more frames of video would have been provided to the algorithm,” he wrote.
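To make Trappe’s point concrete, here is a back-of-the-envelope sketch in Python. The frame rate and detection distance below are hypothetical illustrations, not figures from the crash report; the point is only that, at a fixed camera frame rate, halving the vehicle’s speed doubles the number of frames a vision algorithm sees before impact.

```python
# Sketch of Trappe's reasoning: a slower vehicle gives a camera-based
# detection algorithm more frames to work with before impact.
# Both constants below are HYPOTHETICAL values chosen for illustration,
# not data from the Hebei crash.

CAMERA_FPS = 30            # assumed frame rate of a forward-facing camera
DETECTION_RANGE_M = 60.0   # assumed distance at which the truck first
                           # becomes distinguishable through the haze

def frames_before_impact(speed_kmh: float) -> int:
    """Frames captured between first possible detection and impact."""
    speed_ms = speed_kmh / 3.6                      # km/h -> m/s
    seconds_to_impact = DETECTION_RANGE_M / speed_ms
    return int(seconds_to_impact * CAMERA_FPS)

for speed in (120, 100, 80, 60):
    print(f"{speed} km/h -> ~{frames_before_impact(speed)} frames")

# 120 km/h -> ~54 frames (about 1.8 seconds, matching Trappe's
# "less than two seconds"); 60 km/h -> ~108 frames. Halving the
# speed doubles the evidence available to the vision algorithm.
```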

Tesla did not answer questions from Quartz about possible similarities between the two crashes or the circumstances leading up to them. Despite the name, the company considers Autopilot “semi-autonomous” and asks that drivers be ready to take over from it at any time. The company is rolling out an update to Autopilot that will rely on radar, rather than cameras, to sense objects around its cars. CEO Elon Musk said on Sept. 11, when the update was announced, that it would “very likely” have prevented the Florida accident.

Tesla has relied heavily on statistics suggesting that its Autopilot is safer than human drivers. But as its car customers double as human guinea pigs, the company’s bold experiment in self-driving vehicles is looking risky.