What does the fatal Tesla crash mean for Autopilot?

On June 30th Tesla posted “A Tragic Loss” on their blog regarding a fatal Tesla crash. This was the terribly sad news that a long-standing member of the Tesla community had been killed while driving his Tesla Model S with Autopilot engaged.

Why did the Tesla crash?

At the time of the accident, the Model S was operating in Autopilot mode on a highway. This means the driver was not directly in control of the vehicle; input commands were instead being managed by on-board computers. As the car approached a junction on the highway, the white trailer of a truck blended into the brightly lit sky, leaving both the Tesla Autopilot system and the driver blind to the impending collision. As a result, the brakes were never applied and the collision occurred.

The car’s crash safety systems would likely have reduced the severity of the impact had the collision been head-on, but as the truck was turning, the Model S’s sensors lined up with the gap under the trailer. The car passed underneath, with its roof striking the trailer. It has also been claimed that the driver was not fully paying attention at the time of the accident, which would breach Tesla’s guidelines for Autopilot operation. The driver of the truck has stated that he believed the Tesla driver was watching a Harry Potter movie and didn’t see the danger. Playing video is not possible on Tesla’s in-car systems, but a third-party DVD player was found in the Model S.

Is this bad for Tesla?

Yes, but first of all everyone needs to take a step back and remember that a life has been lost here. The driver, Joshua Brown, was an avid Tesla supporter who, just months before the accident, posted a video on YouTube demonstrating the benefits of the Tesla collision avoidance system.

While we can appreciate the tough position this accident puts Tesla in, their blog post was decidedly cold and was written with a clear goal to protect the company and their project.

While Tesla’s choice of words in this blog post could have been a little more human, it’s not without good reason that they are coming out all guns blazing to protect the brand. It’s likely the driver was not following correct procedures for operating the car, which certainly didn’t help matters. It’s also worth remembering that humans have been driving cars for decades, causing accidents and doing huge amounts of damage to others on the road. This is the first known fatality in over 130 million miles of driving with Autopilot active. Tesla were quick to highlight this figure against the wider US average of a fatality every 94 million miles, and roughly every 60 million miles worldwide.

Automation is certainly better

While this Tesla crash has made headlines, and it is certainly no less tragic than any other car accident, it should be highlighted that automating driving should reduce fatalities on our roads. Putting this fatality into the context of total road accidents, it becomes blatantly obvious that we shouldn’t be trusted with driving on our own, but to develop an alternative we must learn from our mistakes.

Written by

Marty
