Sunday, March 20, 2016

Automation complacency: a cuneate facet

Raoul Hausmann, Mechanischer Kopf (Der Geist Unserer Zeit)

Over the last couple of years I've followed Nick Carr's blog Rough Type fairly closely. Your appraisal of Carr's positions as those of either a thoughtful tech skeptic or an atavistic Luddite might well serve as a litmus test for digital nativism. (I prefer the former term, but then you'd probably file me under the latter.)

A recurring theme in Carr's writing is the overlooked or understated hazards of handing over an increasing volume of (formerly) human tasks to computers; this was the subject of his 2014 book, The Glass Cage: Automation and Us. A year prior to its publication, he wrote a piece for The Atlantic ("All Can Be Lost: The Risk of Putting Our Knowledge in the Hands of Machines") which expounds in some detail what he calls "automation complacency":
Automation complacency occurs when a computer lulls us into a false sense of security. Confident that the machine will work flawlessly and handle any problem that crops up, we allow our attention to drift. We become disengaged from our work, and our awareness of what’s going on around us fades...

Examples of complacency ... have been well documented in high-risk situations—on flight decks and battlefields, in factory control rooms—but recent studies suggest that the problems can bedevil anyone working with a computer. Many radiologists today use analytical software to highlight suspicious areas on mammograms. Usually, the highlights aid in the discovery of disease. But they can also have the opposite effect. Biased by the software’s suggestions, radiologists may give cursory attention to the areas of an image that haven’t been highlighted, sometimes overlooking an early-stage tumor. Most of us have experienced complacency when at a computer. In using e-mail or word-processing software, we become less proficient proofreaders when we know that a spell-checker is at work....
In a classic 1983 article in the journal Automatica, Lisanne Bainbridge, an engineering psychologist at University College London, described a conundrum of computer automation. Because many system designers assume that human operators are “unreliable and inefficient,” at least when compared with a computer, they strive to give the operators as small a role as possible. People end up functioning as mere monitors, passive watchers of screens. That’s a job that humans, with our notoriously wandering minds, are especially bad at. Research on vigilance, dating back to studies of radar operators during World War II, shows that people have trouble maintaining their attention on a stable display of information for more than half an hour. “This means,” Bainbridge observed, “that it is humanly impossible to carry out the basic function of monitoring for unlikely abnormalities.” And because a person’s skills “deteriorate when they are not used,” even an experienced operator will eventually begin to act like an inexperienced one if restricted to just watching. The lack of awareness and the degradation of know-how raise the odds that when something goes wrong, the operator will react ineptly. The assumption that the human will be the weakest link in the system becomes self-fulfilling.
I often work a cash register. I'm not proud of it; I have a bachelor's in English and bills to pay.

Math was always my poorest subject in primary school; when I took my first retail job at the age of seventeen I was more than happy to entrust all arithmetical responsibility to the register. The operation proceeded splendidly until (for instance) I'd punch "20.00" into the CASH TENDERED field when handed a twenty-dollar bill for a $16.21 payment, only to be handed an additional $1.30 the customer had discovered in his jacket pocket.

"Uh," I'd say, eyes darting from the money in the customer's hand to the pocket calculator under the countertop to the line behind the customer to the computer screen (CHANGE TENDERED: $3.79) to my own hand in the open till to the customer's face. "Sorry. I already rang it in."

Then I'd turn a little red while I trusted the dude's word that I should just give him $5.09 and send him on his way.

At some point—maybe it was at the trendy supermarket in Maryland or the coffee shop in the Caribbean, who knows—I got in the habit of mentally calculating the difference between cost and cash and checking my result against the machine's (probably out of boredom, initially). These days I ring up every cash transaction as exact change and just do the arithmetic myself. It's faster. And if a customer surprises me with some extra coins and bills from their pocket, I can amend my arithmetic simply enough, while my coworkers who rely entirely on the software are often flabbergasted. Even though I've been in the same position myself, I can't help feeling a little embarrassed for them.
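The mental arithmetic here is trivial to state, which is rather the point: the register isn't doing anything a clerk can't amend in their head. A minimal sketch (function names are my own, and I work in integer cents to sidestep floating-point rounding on currency):

```python
# Change arithmetic from the anecdote above, in integer cents.
# change_due is a hypothetical helper, not any real register's API.

def change_due(total_cents: int, tendered_cents: int) -> int:
    """Change owed for a cash payment."""
    return tendered_cents - total_cents

# The register's original computation: $20.00 tendered on a $16.21 sale.
original = change_due(1621, 2000)        # 379 cents, i.e. $3.79

# The customer adds $1.30 after the sale is rung in; amending the
# tendered amount is simpler than re-ringing the transaction.
amended = change_due(1621, 2000 + 130)   # 509 cents, i.e. $5.09
```

The software's CASH TENDERED field only breaks down because it freezes one input at the moment of entry; the arithmetic itself never stopped being a one-line subtraction.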

I don't wish for any of this to be read as a boast; being able to do basic addition/subtraction without a calculator isn't something I should (or do) take especial pride in. But I'm not saying any of my coworkers were/are stupid, either—they're just not in the habit of performing arithmetic on the fly, and are unprepared and confounded when the need arises.

But where learned helplessness is the norm, basic competence becomes a feat.
