If Google Maps has made you less able to find your way around a printed map, consider that some people believe advanced medical technology has led to some epically stupid human medical errors.
Megan McArdle, a columnist for Bloomberg, has looked into this advanced-technology-equals-degraded-thinking scenario in health care. She recalled the story of a 16-year-old who was hospitalized in 2013 for a routine colonoscopy to monitor his congenital gastrointestinal condition.
First he experienced numbness and tingling all over his body, and soon he was having seizures. Why? Because he had been given 39 times the intended dose of an antibiotic. The teenager’s tale was told on the website Medium in its story “How Medical Tech Gave a Patient a Massive Overdose.”
“[I]f I had to condense its five parts’ worth of fascinating insights into one sentence,” McArdle wrote, “here’s how it would read: ‘Machines make us stupid.'”
McArdle noted that after having spent three months traveling recently, with only a cellphone, she had forgotten the telephone number of her landline. “To be sure, we don’t use it very often,” she wrote. “Still. We’ve had that number for five years. I forgot it in less than one football season.”
And just as cellphones keep wonderful records of the phone numbers we use, our computers are swift researchers with an excellent memory. Although that capability provides us with more knowledge faster, McArdle conceded that “[I]f I’m cut off from these tools, I am suddenly a moron. And if something gets entered into the computer wrong, I’m totally helpless. A few months back, people gently emailed to inquire where I was, as the panel I was on was about to begin. Turned out I’d put it into the calendar on the wrong day, …”
But that’s a common, human mistake with few real consequences. McArdle’s point was that techno errors can have serious consequences in health care, such as when software that was supposed to prevent a medical error instead contributed to a teenager receiving nearly 40 times the desired dose of his medication.
That horrific occurrence brought to light two contributing human factors: alert blindness and excessive trust in the automated system. In fact, the software did issue a warning to both the doctor and the pharmacist that something was wrong with the prescription. But it also had tried to warn them that something was wrong with a whole lot of the other drug orders entered into the computer. Most of those alerts flagged minor issues, so hospital staffers had come to feel less alerted than annoyed – they’d glance at the warnings and dismiss them.
If they hadn’t, McArdle wrote, “the hospital would have ground to a halt as everyone devoted their days to reading software alerts. So when a message came along that wasn’t trivial, they didn’t read it. Or didn’t even truly see it.”
The second human failure was creepier. When the system delivered the huge number of pills to the nurse on the floor where the teenager was to receive his procedure, she suspected something was wrong. But she administered them anyway, because the bar-code system confirmed that they were what the doctor had prescribed.
McArdle said that although what the nurse did was idiotic, we’re all prone to that kind of idiocy. “There’s an eerie authority to an automated system,” she wrote. “… After all, computers are smarter than we are … It’s easy to turn off our judgment and hand the decision over to the machine.”
But computers are machines, and machines aren’t perfect either. They excel at the specific tasks they’ve been programmed to handle, but humans fare better in unfamiliar situations because we can exercise judgment; we have common sense.
“So when a computer tells us to do something obviously wacky, where is our common sense?” McArdle wondered.
What happened to the part of our brain that used to “get it” when our mother demonstrated the idiocy of peer group pressure by asking “If Johnny also told you to jump off a bridge, would you do it?”
When medical records were kept strictly on paper, errors were also fairly common. But we knew that, so we were more vigilant, McArdle suggested. Automated systems make fewer mistakes, so we’re less likely to be watching out for them. Which means that when one occurs, it’s more likely to be a whopper. “The old system gave a lot of people the wrong medication, in the wrong dose, but it probably never gave anyone a 39-fold overdose of antibiotics,” McArdle said. “Unfortunately, judgment atrophies just like a muscle.”
Use it or lose it.
The author of the Medium article and a related book, Dr. Robert Wachter, a patient safety advocate whose name is familiar to regular readers of this blog, noted that the aviation industry has refined its alert systems so pilots get the warnings they need without being overwhelmed by insignificant information. So why not the medical industry?
But Wachter also observed that pilot training experts worry about higher accident rates when automated systems fail, and that pilots, who spend far less time actually flying the plane than they used to, might lack the judgment to assume control.
“This is a problem that is also bound to afflict driverless cars,” McArdle noted, “… so let’s hope that the experts crack it [before they become commonplace].”
If we don’t have a solution to the machines-make-us-dumb problem, we do have the ability to design better machines that make fewer of these big errors. But, McArdle concluded, we must never forget that “human brains are better than computers at a lot of things – so when the computer’s instructions seem crazy, you should trust your judgment, not the monomaniacal machine.”