Thursday, October 27, 2016

Self Driving Truck Delivers Beer

Read this news two days ago: Otto's driverless truck delivers beer in Colorado.

An autonomous truck piloted itself 120 miles on a Colorado freeway to haul over 51,000 cans of Budweiser. This is the first time in the history of self-driving experiments that a huge rig has been allowed to drive itself. The truck was fitted with cameras, sensors, lights, radars and the technology needed to drive itself on a freeway for a good 120 miles. Now anyone who gets their hands on one of those cans of beer can brag about how it reached them.

So, are we ready for driverless vehicles yet? This has been debated a lot already. But with technology developing at a rapid rate, it's not the technology we ought to worry about but the ethical dilemmas that come with it. It is of course very tempting to have someone else do the job for us, be it a person or a program, while we sit back and relax or do something productive like answering emails or writing code.

But what about making decisions that involve morality and ethics? These are debatable and very subjective. They depend on a lot of factors like the environment we are brought up in, culture, beliefs, and social and moral responsibility, and thus judgement might differ from one person to another. I was interested in this bit and started digging up a few topics to read.

Learnt a couple of new things. The trolley problem is a thought experiment in ethics. In summary: a trolley is running down a track. Ahead of it, five people are tied to the track, unable to move, and the trolley is headed straight for them. You are standing some distance away next to a lever; if you pull it, the trolley will switch to another track, where one person is tied. What do you do? Do nothing and let five people die, or pull the lever and let the one person die? Which is the more ethical choice? It poses a life-or-death situation where one action ensures the death of certain people and the alternative ensures the death of others.
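The strictly utilitarian reading of the problem can be reduced to a trivial count of lives lost per choice; the whole point of the experiment is that this count alone does not settle the moral question. A minimal sketch, with all names and numbers invented purely for illustration:

```python
# Toy utilitarian tally of the trolley problem -- purely illustrative.
# Each choice maps to the number of deaths it causes.
outcomes = {"do_nothing": 5, "pull_lever": 1}

# A pure utilitarian picks the choice with the fewest deaths.
utilitarian_choice = min(outcomes, key=outcomes.get)
print(utilitarian_choice)  # -> pull_lever
```

A deontologist would object that pulling the lever makes you the active cause of a death, which is exactly the disagreement the arithmetic cannot capture.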

There are so many arguments and discussions around it, and the puzzle probes human psychology. The trolley problem has so far remained a philosophical conundrum, but it has now become a real head-breaker for autonomous-vehicle software designers. Driving is a social responsibility; the onus is completely on the driver. The driver is solely responsible for what happens, for what the vehicle does. If you hit a pedestrian because you were texting while driving, you are responsible.

But with software algorithms taking over the job of "driving" for you, you are also cleared of such social responsibilities. These algorithms respond based on pre-written rules; there is no cognitive process involved. This is where it gets tricky, and where ethical and moral reasoning come into the picture. Say you are driving a vehicle and a kid jumps onto the road: would you go straight ahead, killing the kid, or swerve and hit a barricade, which would get you killed?
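To make "pre-written rules, no cognitive process" concrete, here is a minimal sketch of what such a hard-coded policy could look like. This is not any vendor's real logic; every function name, parameter and rule below is an assumption invented for illustration:

```python
# Hypothetical fixed decision rule for a self-driving vehicle.
# The priority is baked in at design time; nothing is "reasoned"
# at the moment of the emergency.

def choose_maneuver(pedestrian_ahead, swerve_harms_occupants,
                    occupant_priority=True):
    """Return a maneuver string based on a hard-coded priority rule.

    occupant_priority=True mimics a policy that always protects the
    vehicle's passengers above everyone else.
    """
    if not pedestrian_ahead:
        return "continue"
    if swerve_harms_occupants and occupant_priority:
        # The rule book says: never trade occupant safety away.
        return "brake_straight"
    return "swerve"

# The ethical choice was made by whoever wrote the rule, long before
# the kid ever stepped onto the road.
print(choose_maneuver(pedestrian_ahead=True, swerve_harms_occupants=True))
# -> brake_straight
```

The point of the sketch is that the "dilemma" is resolved once, by the designers, and then replayed identically in every emergency.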

Every major automaker is experimenting with autonomous vehicles, and a few vendors seem to have taken a different approach: in the scenario above, the vehicle will plow right into the kid. The reasoning is that, as a brand, the product has meant safety, security and other privileges for more than a century, so its automated software will choose to protect its passengers above all others. Of course, who wants to buy a vehicle that might choose to kill you in a given situation, right? And it's not difficult to see every other vendor opting for the same approach. So we might end up with a fleet of self-driven killers that choose to save their passengers/customers over the good of society.

These scenarios are not far-fetched. But on the flip side, can it get worse than humans? Distraction has always been an issue with us humans, and these autonomous vehicles might well end up with a better driving record. There may also be counter-technologies to save us from a disastrous situation: limiting speed, airbag-like padding around the vehicle, or a scoop on the front hood that picks pedestrians up instead of hitting them. We don't know yet. While the technology matures, it is really amazing to see it being used to drive efficiency and assist human beings.


  1. If a kid suddenly jumps onto the road, it does not matter if the automobile is driverless or driver-driven. There is no time to think and analyze; the reaction would be automatic, most probably a sudden brake. Maybe the SW engineers have already programmed for this situation in a driverless auto.

    Also, for your example, a driverless auto is more suitable for the USA. I am sure the SW engineers have taken all rules and regulations into consideration. The most important traffic rule in the USA is "the road belongs to pedestrians".

    1. SG,

      Thank you for your comments. So the algos are written based on what humans would, or rather are supposed to, do in such situations. Exactly! A sudden brake is still more "considerate" than running over. And the algo design doesn't depend on individual SW engineers but on many factors above them; SW engineers just write code based on the design given to them. How the algos are designed depends on a lot of factors. The vendor I didn't name is a trusted European automobile company that's been around for over a century with a high reputation. If they decide on an algo, most other vendors would follow them.

      And yes, I am not even considering autonomous vehicles in Asia or Africa or Latin America; I am talking only about North America and Europe. No, the sad thing is it's too early for all rules and regulations to be considered. That's my point. The USA is one of the biggest importers of the cars of the vendor I've mentioned. And if they don't follow "the road belongs to pedestrians", just imagine the outcome.


I'd love to know what you thought :-) Please shoot!