When Isaac Asimov wrote about the Three Laws of Robotics back in 1942, he was no doubt imagining a world where humanoid robots walked the earth, doing household chores and generally making themselves useful.
He probably hadn’t anticipated these robots being one-tonne metal boxes designed to carry humans around at high speed.
The three laws
Intended to protect us humans from an inevitable crisis in which the robots identify us as a threat to our planet and ourselves (hmm, why would that be…), the laws exist to safeguard human wellbeing at all costs. They state:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Source: Isaac Asimov’s “Three Laws of Robotics”
Now that driverless cars are on the horizon, they present us with an ethical dilemma. Our roads are shared by a multitude of different users: human-controlled cars, trucks and vans. Crucially, they are also shared by less predictable users, such as pedestrians and those of us on our bicycles.
Nissan’s Carlos Ghosn worries that driverless cars have a cycle-shaped hurdle to leap: “One of the biggest problems is people with bicycles. The car is confused by [cyclists] because from time to time they behave like pedestrians and from time to time they behave like cars.”
Source: Nissan driverless car guilty of “close pass” overtake of UK cyclist | Bicycle Business | BikeBiz
Whilst it is true that we are unpredictable (after all, some of us do behave like two-wheeled pedestrians, others like regular traffic), this is only one side of the story.
Ghosts in the machine
A driverless car, for all its sophistication, is bound by the code its designers created. There are circumstances a driverless car will encounter that will challenge its interpretation of the three laws, assuming it has been programmed to adhere to them.
It is entirely possible, even likely, that a driverless car will be coded to protect the wellbeing of the person sitting inside the vehicle over and above the vulnerable road users outside it.
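To make that concrete, here is a deliberately crude sketch of how such a priority could hide inside a planner’s cost function. Everything in it (the names, the weights, the scenario) is invented for illustration; it is not taken from any real vehicle’s software:

```python
# Hypothetical sketch only: how an occupant-first bias could be encoded
# in a planner's cost function. No real vendor's code is represented here.
from dataclasses import dataclass

@dataclass
class Outcome:
    """Estimated harm (0.0 = none, 1.0 = severe) for one candidate manoeuvre."""
    occupant_harm: float
    vru_harm: float  # harm to a vulnerable road user (pedestrian, cyclist)

# The entire ethical argument collapses into two constants.
OCCUPANT_WEIGHT = 10.0  # assumed: the person inside is valued ten times higher
VRU_WEIGHT = 1.0

def cost(outcome: Outcome) -> float:
    """Lower is 'better' to the planner; the weights decide whose safety wins."""
    return OCCUPANT_WEIGHT * outcome.occupant_harm + VRU_WEIGHT * outcome.vru_harm

def choose(manoeuvres: dict) -> str:
    """Pick the manoeuvre with the lowest weighted cost."""
    return min(manoeuvres, key=lambda name: cost(manoeuvres[name]))

# Swerving risks the occupant slightly; braking late risks the cyclist badly.
options = {
    "swerve_into_verge": Outcome(occupant_harm=0.2, vru_harm=0.0),
    "brake_late":        Outcome(occupant_harm=0.0, vru_harm=0.6),
}
print(choose(options))  # 'brake_late': the weights favour the person inside
```

Nobody would write it this nakedly, of course. The same bias can be smeared across thousands of lines of perception and planning code, which is rather the point: you would never spot it in a code review.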
Currently we are seeing a number of so-called “semi-autonomous” cars on the market. This seems to be a sort of autopilot where you are still required to be there “in control”, but the car will do a lot of the work for you. Having said that, early signs from BMW are not encouraging.
Put simply, in “semi-autonomous” mode, if the road ahead of the new 640i GT is occupied by a solitary cyclist, then unless the driver uses the turn signal (yes, this is a BMW we are talking about here…) that cyclist is in for a close pass. It is a similar story with the Nissan linked above.
As we’ve talked about before, a close pass is not a collision, so it doesn’t count from a cold, hard, callous, resolutely unsympathetic statistical perspective. To a machine, squeezing through a small gap with inches to spare is probably a calculated risk. It doesn’t stop it being scary as hell though.
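Purely as an illustration, the difference between “we fit” and “we pass safely” can come down to a single number. The car width, the gap, and the 1.5 metre clearance (the figure long campaigned for as a minimum passing distance) are all assumptions here, not any manufacturer’s specification:

```python
# Hypothetical sketch: why a machine might call a close pass a "calculated
# risk". If the only test is "do we physically fit?", any positive clearance
# looks acceptable. All figures below are illustrative assumptions.

CAR_WIDTH_M = 1.9  # assumed width of the overtaking car, in metres

def fits(gap_m: float) -> bool:
    """Naive check: overtake whenever the car physically fits through the gap."""
    return gap_m > CAR_WIDTH_M

def safe_pass(gap_m: float, min_clearance_m: float = 1.5) -> bool:
    """Humane check: overtake only with proper clearance from the cyclist."""
    return gap_m > CAR_WIDTH_M + min_clearance_m

gap = 2.1  # metres of road available beside the cyclist
print(fits(gap))       # True: inches to spare, terrifying but "fine" to the code
print(safe_pass(gap))  # False: a human-centred rule would hold back and wait
```

A system tuned to the first check will make close passes all day long and never log a single incident.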
Liability
With the “driver” not actually driving the vehicle, the question arises of who gets the blame when things go wrong. We’ve become accustomed to the gutter press apportioning blame to the vehicle already, despite the driver being responsible. We’re also used to motorists not being held to account for the deaths they’ve caused.
Perversely, with autonomous cars on the roads all will be right with the world again. It finally will be the car at fault, and the “driver” will probably not be responsible. A new “Automated and Electric Vehicles Bill” proposed by the UK Government goes some way towards clarifying the situation, at least from an insurance perspective.
According to the new Automated and Electric Vehicles Bill, insurers would generally be liable for damage stemming from an accident caused by an automated vehicle “when driving itself” where the vehicle is insured and “an insured person or any other person suffers damage as a result of the accident”.
Source: UK legislates for a future of driverless and electric cars
However, if you don’t keep up with your software updates, or you alter the software in any way, it’s the owner at fault. Just pray you never share the road with an enterprise IT engineer; you know what they are like at updating Windows…
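For the flowchart-minded, the rules as reported boil down to something like the sketch below. This is a toy paraphrase, not legal advice; the real Bill contains conditions and carve-outs that this ignores, and none of the names come from the legislation itself:

```python
# Toy paraphrase of the liability rules reported for the UK's Automated and
# Electric Vehicles Bill. Grossly simplified; the actual Bill has exceptions
# and definitions that this sketch does not attempt to capture.

def liable_party(self_driving: bool, insured: bool,
                 software_current: bool, software_unmodified: bool) -> str:
    """Guess who carries the can after a collision (illustrative only)."""
    if not self_driving:
        return "driver"   # a human was driving, so the ordinary rules apply
    if not (software_current and software_unmodified):
        return "owner"    # skipped updates or tampered software shift the blame
    if insured:
        return "insurer"  # the insurer is liable for damage "when driving itself"
    return "owner"        # an uninsured automated vehicle is the owner's problem

# The enterprise IT engineer who never patches anything:
print(liable_party(self_driving=True, insured=True,
                   software_current=False, software_unmodified=True))  # 'owner'
```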
They’re still cars…
One fundamental issue we’ve skirted around here is the march of automation. Driverless vehicles of all shapes and sizes have the potential to make everyone who drives for a living redundant.
Add to this the workers who will continue to lose their jobs to automation, particularly in manufacturing and even in some office and legal roles, and you have to wonder what humans will actually do with their lives in the long run. What is the point of a driverless car if you have nowhere to go?
Also, when all is said and done, these are still motorised vehicles. They still remove the need to use your body. The particles from tyre and brake dust will still pollute the air. They will still isolate us and damage social cohesion, and they will still kill people when they hit them.
We’re likely to become a society of overweight, wheezing, lonely, angry people divided between those who can afford a driverless car and those who have been automated out of society. Indeed, some would argue that we are halfway there already.
Perhaps we do need to be saved from ourselves after all.
I have always comforted myself that with the advent of driverless vehicles they will at least be forced to drive at the speed limit (i.e. 20mph in a 20mph zone rather than the currently prevailing 30-40mph…), but these close passes are scary. Why aren’t driverless vehicles simply designed to avoid ANYTHING on the road at a safe distance and speed? As ever, better safe than sorry…