Self-driving vehicles, once a futuristic concept confined to science fiction, are very much upon us. Autonomous cars are already on our roads and driving our kids to school, but are we really ready to put our lives in the hands of machines?
“Self-driving” is a rather ambiguous term in today’s mobility market. As manufacturers race to build the cars of the future, each company is doing it slightly differently, diversifying the technology. We’re also seeing the arrival of semi-autonomous cars, such as Tesla’s latest Autopilot, unveiled last week.
It’s not just traditional car manufacturers getting involved, either. Apple is working on self-driving cars, as is Google. Uber suspended its self-driving vehicle programs in Phoenix, Pittsburgh, San Francisco and Toronto after one of its cars was involved in the first reported fatal accident involving a self-driving vehicle, in Arizona earlier this year.
In Florida, Transdev’s Babcock Ranch pilot runs an autonomous school shuttle that transports children to and from school. Although the shuttle, a fully electric EasyMile EZ10 Gen II, is capped at 12mph for now, are people really ready to put the lives of their children in the hands of an intelligent machine?
Machine morality will have to play a part in safety
Just this weekend, a study published in Nature found that participants believed AI should prioritize younger lives over older ones. When confronted with a scenario in which a self-driving vehicle could swerve to avoid three elderly pedestrians, taking the lives of a young family on board in the process, participants chose to save the passengers.
Almost 40 million decisions were evaluated, and the experiment found several consistent preferences: humans should be spared over pets and law-abiding pedestrians over jaywalkers, alongside more worrying tendencies, such as sparing women over men and those of higher social status over those of lower.
Of course, in the real world the information available at the moment of decision is not the same. In the experiment, participants were told the outcome of each accident in advance, including who would die, something that is impossible in reality. Sure, it’s just a social experiment, but self-driving vehicles will have to make these decisions to some extent, just as humans do when behind the wheel.
The issue of safety is one that can subside over time. I suspect that, as with all modern technology, initial fear will fade once the technology becomes mainstream. What may take longer for society to accept are the moral decisions that self-driving vehicles will have to make once they populate our roads and highways.
It feels like 2018 is a year in which autonomous vehicle innovation is speeding up, but I worry that the wider world is moving towards acceptance in a lower gear. I fear that resistance to this new technology is just around the corner.
Is the world ready for self-driving vehicles? Let us know in the comments below.