Taking the foot off: Self-driving Cars

It is something people have envisioned since they began riding in horse-drawn carriages: the ability to get from A to B with as little effort as possible. The rich had drivers controlling the horses while they sat in the back of a cosy carriage, relaxing until they reached their destination. The poorer people rode their own horses, more than likely without a carriage. Not much has changed these days. The rich still have drivers taking them from A to B in their fancy cars while the rest of us drive ourselves like degenerates…apparently.

Now car companies want to put an end to driving altogether as they look to bring out self-driving cars. Soon we could eat, drink, sleep, text or watch a film safely in the car while going from point to point. Eventually we could travel with our backs to the road, conversing with the other passengers in a relaxed manner.


Business forecasters reckon there could be as many as 10 million self-driving cars on the roads by 2020, but do we need self-driving cars? Let's face it, humans are terrible drivers. In America, over 20,000 people die in car accidents every year, along with hundreds of Irish road users, so yes, we do. Do we want self-driving cars? Humans are also quite lazy. We want things to take as little effort as possible, like Samsung's new smart fridge, which lets you check what's inside from your couch via an app on your smartphone and a camera in the fridge. Wouldn't you want to travel without keeping your eyes on the road? Again, that's a yes. But would you trust the car enough to stop paying attention to where it's going? Let's delve a little deeper.

Reliability

While testing self-driving cars over the course of a few months, the drivers had to take over from the computer 341 times to avoid crashing. In 272 of those cases, the on-board computer detected the problem itself and willingly handed control back to the driver. You can see below just how reliable the Tesla system is.

The rest of the incidents, where “safe operation of the vehicle requires control by the driver”, may not have been the car’s fault. They could easily have been caused by a pedestrian or by bad driving from other road users. Oh, and I forgot to mention: these 341 incidents happened over nearly 425,000 miles of driving across 12 different vehicles. The cars have also driven around Lake Tahoe and across the Golden Gate Bridge, but a lot of the testing took place in the Nevada Desert, something to consider when analysing the figures.
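To put those figures in rough perspective, here's a quick back-of-the-envelope calculation in Python, assuming the incidents were spread evenly across the fleet and the miles (which real disengagement reports don't guarantee):

```python
# Rough disengagement arithmetic based on the figures quoted above.
# Assumption: incidents are spread evenly across vehicles and miles,
# which is almost certainly not exactly true in practice.

total_miles = 425_000      # approximate miles driven during testing
total_incidents = 341      # times a human had to take over
detected_by_car = 272      # hand-overs initiated by the car itself
vehicles = 12              # cars in the test fleet

miles_per_incident = total_miles / total_incidents
detection_rate = detected_by_car / total_incidents
miles_per_vehicle = total_miles / vehicles

print(f"Roughly one take-over every {miles_per_incident:,.0f} miles")
print(f"The car spotted the problem itself {detection_rate:.0%} of the time")
print(f"Each vehicle averaged about {miles_per_vehicle:,.0f} miles")
```

That works out at roughly one take-over every 1,200 miles or so, with the car flagging the problem itself about 80% of the time.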

Self-driving cars are reliable to some extent, but they’re not perfect yet. Then again, neither are regular cars: brakes fail, engines leak, electrics go wrong, and these faults can often lead to accidents.

The Future

So let’s think about it. These cars are going to be run by computers. At the moment, people are still smarter than computers. We can see something coming in the distance and anticipate what to do 30 seconds before it happens. Ordinary computers don’t exactly have that kind of intelligence yet, but could they learn it? In September last year, an artificially intelligent computer learned how to play chess in 72 hours and reached the level of international master. To do this, the computer had to anticipate which moves to make in order to beat a person. And it had happened before, back in 1997, when IBM’s Deep Blue beat world champion Garry Kasparov by spotting patterns and ‘thinking’ intuitively to take the victory.

So why is this important? Think of it this way. Your computer needs code written by people so that when you click on the Google Chrome icon, Chrome opens. Now what if computers could teach themselves to do that? Facebook recently developed a program that can recognise people’s faces with near-human accuracy. What if your car could ‘teach’ itself what you look like? What your fingerprints are like? Are we looking at an age where grand theft auto no longer exists because your car won’t allow anyone who isn’t you to turn it on? Or will cars be able to work out when they are being stolen, lock the thief inside and escort them to the nearest police station? It seems that self-driving cars will need to be able to teach themselves.

Can self-driving cars work?

In a nutshell, yes. We sent people to the moon in the ’60s. We have flown a probe past Pluto. We have discovered water on Mars. We can tell what a planet many light years away is made of. Surely we can manage to develop a car that can safely drive people around our roads without crashing into another car, a wall or a pedestrian. Google may well have been working on this for years through Google Maps; load the maps onto the car’s system and the navigation aspect is largely solved. The main issue is being able to determine what speed to travel in which zones and to anticipate people, cars, animals or objects jumping out in front of the car.

Who would a self-driving car kill? Algorithmic Morality, courtesy of the MIT Review

What happens if a woman and her child step onto the road while an elderly person is walking on the other side? How does the car avoid them? If it can’t avoid everyone, who does it choose to spare, and how does it make the decision to hit the other person? These are really important questions that Google, Tesla and the other car makers will have to think about very carefully.
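To make the dilemma concrete, here is a deliberately naive, entirely hypothetical sketch of how a car might score such a choice if it simply minimised expected harm. No real manufacturer is known to work this way; the names, numbers and scoring rule are all invented for illustration:

```python
# Purely hypothetical "algorithmic morality" sketch.
# It only illustrates why encoding these choices in code is so uncomfortable.

from dataclasses import dataclass

@dataclass
class Outcome:
    manoeuvre: str            # e.g. "swerve left", "brake straight ahead"
    people_at_risk: int       # how many pedestrians this option endangers
    crash_probability: float  # chance the car fails to avoid them anyway

def expected_harm(outcome: Outcome) -> float:
    # Naive score: expected number of people hit.
    return outcome.people_at_risk * outcome.crash_probability

def choose_manoeuvre(options: list[Outcome]) -> Outcome:
    # Pick whichever option has the lowest expected harm.
    return min(options, key=expected_harm)

options = [
    Outcome("swerve left towards one pedestrian", 1, 0.9),
    Outcome("brake hard, straight ahead, towards two pedestrians", 2, 0.4),
]

print(choose_manoeuvre(options).manoeuvre)
```

Even in this toy version, the “right” answer depends entirely on how you score harm and estimate the probabilities, which is exactly the question the manufacturers cannot dodge.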

The transition from normal cars to self-driving cars will take time. A long time: probably a couple of decades before we’re browsing DoneDeal for a cheap used self-driving car. Will we have 10 million self-driving cars on our roads in 2020? It’s highly doubtful.
