When Things Go Wrong In An Automated World, Would We Still Know What To Do?
Peter Fisher, 27 Mar 17

Are we losing our skills as we hand more tasks to automated systems? Shutterstock/Michal Staniewski

We live in a world that is both increasingly complex and automated. So just as we face ever more complex problems, automation is leading to an atrophy of human skills that may leave us more vulnerable when responding to unexpected situations or when things go wrong.

Consider the final minutes of Air France Flight 447, which crashed into the Atlantic in May 2009 after leaving Rio de Janeiro, Brazil, for Paris, France.

Its flight recorder revealed utter confusion in the cockpit. The aircraft’s nose pitched up 15°, while an automated voice repeatedly called “stall, stall”. Yet the pilots were at a loss, one exclaiming: “[…] we don’t understand anything.”

This is not the place to go into the ins and outs of that ill-fated flight, other than to note that any system designed to deal with contingencies automatically most of the time leaves a degraded human skill base for the minority of situations its designers couldn’t foresee.

Speaking to Vanity Fair, Nadine Sarter, an industrial engineer at the University of Michigan, recalls a conversation with five engineers involved in building a particular aircraft.


    I started asking, ‘Well, how does this or that work?’ And they could not agree on the answers. So I was thinking, if these five engineers cannot agree, the poor pilot, if he ever encounters that particular situation … well, good luck.


In effect, the skilled judgment needed to fly highly intricate, high-tech airliners has been outsourced to a robot, and flight engineers have, to all intents and purposes, disappeared from cockpits. Only older pilots and ex-air force pilots retain those detailed skills.

Back on terra firma, in an autonomous driving world there could be entire future generations with no practical experience whatsoever in driving and navigating a vehicle.

We’re already seeing an indication of what can go wrong when humans leave control to autonomous systems.

An investigation into the fatal crash of a Tesla Model S driving on autopilot noted that the company had provided drivers with information about the “system limitations”. In that case, it was still up to the driver to pay attention.

But what chance would a person have of taking over the controls should things start to go wrong in a future fully autonomous vehicle? Would they even know how to spot the early signs of impending disaster?

Losing our way?

Driving this is a technological determinism that holds that any and all innovation is intrinsically good. While emerging technologies may yet define what it is to be human, the challenge is to recognise the risks and work out what to do to make sure things don’t go wrong.

That’s getting harder as we keep adding complexity, especially with the autonomous operation of suburban trains, air taxis and delivery drones.

System designers have been building bigger and more intertwined systems to share computer processing load, even though this makes their creations prime candidates for breakdown. They overlook the fact that once everything is connected, problems can spread as readily as solutions, sometimes more so.

The growing and immense complexity of an automated world poses similar risks.
