What does the fatal Tesla crash mean for Autopilot?


On June 30th Tesla posted “A Tragic Loss” on their blog regarding a fatal Tesla crash. This was the terribly sad news that a long-standing member of the Tesla community had been killed while driving his Model S with Autopilot engaged.

Why did the Tesla crash?

At the time of the accident, the Model S was operating in Autopilot mode on a highway, meaning the driver was not directly controlling the vehicle and steering and braking were instead being managed by on-board computers. As the car approached a junction, a truck turned across its path. The white side of the trailer blended into the brightly lit sky, leaving both the Autopilot system and the driver blind to the impending collision. As a result, the brakes were never applied and the collision occurred.

Had the collision been front-on, the car's crash safety systems would likely have reduced the severity of the impact. But because the truck was turning across the road, the Model S sensors were aligned with the gap under the trailer, and the car drove underneath, its roof colliding with the trailer. It has also been claimed that the driver was not fully paying attention at the time of the accident, which breaches Tesla's guidelines for Autopilot operation. The driver of the truck has stated that he believed the Tesla driver was watching a Harry Potter movie and didn't see the danger. Playing video is not possible on Tesla's in-car screens, but a third-party DVD player was found in the Model S.

Is this bad for Tesla?

Yes, but first of all, everyone needs to take a step back and remember that a life has been lost here. The driver, Joshua Brown, was an avid Tesla supporter who, just months earlier, had posted a video on YouTube demonstrating the benefits of Tesla's collision avoidance system.

While we can appreciate the tough position this accident puts Tesla in, their blog post was decidedly cold and written with the clear goal of protecting the company and its project.

While Tesla's choice of words in the blog post could have been a little more human, it's not without good reason that they are coming out all guns blazing to protect the brand. It's likely the driver was not following correct procedures for operating the car, which certainly didn't help matters. It's also worth remembering that people have been driving cars for decades, causing accidents and doing huge amounts of damage to others on the road. This is the first fatality with Autopilot active in some 130 million miles of use, a figure Tesla was quick to highlight against an average across all US vehicles of roughly one fatality every 94 million miles, and about one every 60 million miles worldwide.
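For anyone who wants to see that comparison as raw numbers, here is a minimal sketch of the arithmetic, using only the figures quoted above (and keeping in mind that a single fatality is a very small sample to draw conclusions from):

```python
# Back-of-the-envelope comparison of the miles-per-fatality figures
# quoted in Tesla's blog post, expressed as fatalities per 100 million miles.
miles_per_fatality = {
    "Tesla Autopilot (1 fatality so far)": 130_000_000,
    "US average, all vehicles": 94_000_000,
    "Worldwide average": 60_000_000,
}

for label, miles in miles_per_fatality.items():
    rate = 100_000_000 / miles  # fatalities per 100 million miles driven
    print(f"{label}: {rate:.2f} fatalities per 100 million miles")
```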

Automation is certainly better

While this Tesla crash has made headlines, and it is certainly no less tragic than any other car accident, it should be highlighted that if we automate driving, fatalities on our roads should fall. Put this one fatality in the context of total accidents and it becomes blatantly obvious that we shouldn't be trusted to drive on our own, but to develop an alternative we must learn from mistakes like this one.
