
The US National Highway Traffic Safety Administration (NHTSA) has opened an investigation into Tesla Motors' 2015 Model S after a driver died in a crash with Autopilot engaged. The crash happened on May 7 in Williston, Florida. According to the NHTSA's preliminary investigation, the crash occurred when a tractor-trailer made a left turn in front of the Model S at an intersection. The agency stated that the incident "calls for an examination of the design and performance of any driving aids in use at the time of the crash." The investigation is a necessary step for the agency to determine whether the vehicles should be recalled for safety reasons.

In a blog post on the incident, Tesla stated that this is the first known fatality in a Tesla vehicle with Autopilot engaged. Tesla also said the NHTSA investigation is only a preliminary action to determine whether the system was working as intended. The post also contains an account of the crash. It states:

What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.

Tesla also states that the Autopilot feature is imperfect and still in beta testing. Because of this, drivers are instructed to keep their hands on the wheel and remain ready to take control of the vehicle whenever Autopilot is engaged. The company says the Autopilot logic continually improves as more miles are accumulated, but that it still requires the driver to remain alert. Tesla concluded that "when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving."

Is this evidence that Tesla’s Autopilot mode is unsafe, or just an unfortunate accident? Leave your comments below.

Max Michael

Senior Writer

I’m a technology reporter located near the Innovation District of Kitchener-Waterloo, Ontario.

  • BurntToShreds

    I wonder how many former oil company and classic auto industry execs are working at the NHTSA right now? Seriously, this seems like a one-time freak accident brought on by a bizarre situation rather than some system-ingrained error.

  • I’d consider that one a freak accident

  • It’s only evidence that something happened at one point involving a self-driving vehicle. In fact, this incident will probably give Tesla (and others) some much-needed data to feed their AI so that crashes like this can be prevented in the future.

  • Clairity

    Every time we hear of a crash, accident, or safety issue involving self-driving cars, people seem to freak out. Yet they accept as a fact of life that traditional cars get into accidents all the time and have nasty safety issues every so often.

  • morzinbo

    Didn’t even realize Tesla had autopilot.

  • Riosine

    So a big truck managed to fool all four sensors of a Tesla Model S: camera, radar, ultrasonics, and GPS.

    OK, let’s speculate about what could have gone wrong:
    - the camera was blinded by light saturation
    - the GPS failed to update because of the car’s high speed and latency
    - the radar and ultrasonic scans were diffracted

    A simple red laser sensor would have been enough to detect whether something was directly in front of the car.

  • Cytos Lpagtr

    It was bound to happen eventually. The big benefit of self-driving cars is that an accident need only happen once; then all self-driving cars can be updated with a patch so it doesn’t happen again. You know, unlike humans.

    Still, I think it’s right that they are being investigated; after all, they might have missed something that someone on the outside can point to and say, “this needs to change.” Tesla might be the only car I would buy new if I had the money today, but that doesn’t make them infallible.

  • noun

    This is a joke and these “intelligent people” are idiotic