Last week, the New York Times summed up pretty well what a lot of people have been thinking: “Poorly designed, hard-to-use computerized health records are a threat to patient safety, and an independent agency should be set up to investigate injuries and deaths linked to health information technology, according to a federal study…”
The paper was referring to “Health IT and Patient Safety: Building Safer Systems for Better Care,” a report by the Institute of Medicine (IOM) calling for greater oversight of health-care technologies.
Concerned that some such products posed safety risks for patients, the U.S. Department of Health and Human Services (HHS) asked the IOM to evaluate electronic health records in the first place. Practitioners were wondering whether the boom in digital record-keeping was fostering a rash of medical errors caused by balky, difficult or malfunctioning technology.
The report doesn’t decry the move from paper to electronic records, which promises gains in both cost and care efficiency; it simply emphasizes that oversight must be part of the deal: “To achieve better health care, a robust infrastructure that supports learning and improving the safety of health IT is essential. Proactive steps must be taken to ensure that health IT is developed and implemented with safety as a primary focus. If appropriately implemented, health IT can help improve health care providers’ performance, better communication between patients and providers, and enhance patient safety, which ultimately may lead to better care for Americans.”
The IOM said an investigative agency should be established for health-care technology, modeled on the National Transportation Safety Board, which investigates airline accidents and examines safety issues, and that its mandate should include tracking the safety performance of electronic health records.
So far, such efforts have yielded mixed results: There are tales of success, such as hospitals that use computerized, bar-coded prescription systems, but also tales of patient harm, such as delayed treatment due to lost data and/or problems with human-computer communication.
The IOM advised HHS to devise a plan within 12 months to monitor patient safety risks associated with health IT, and to report on its progress every year. If, within a year, that progress proves insufficient, the scientists’ group said the Food and Drug Administration (FDA) should regulate these technologies, and that the agency should start planning for that now.
The IOM report is big on transparency. It is the government’s job to ensure that the private sector demonstrates concern for consumers by freely exchanging information about product use, “including details relating to patient safety.” You can’t establish a body of knowledge and develop a functioning market of safe products if you don’t share details of their risks.
This is thwarted today by the common practice of including nondisclosure clauses in contracts with vendors of health IT products. Such provisions impede efforts to improve safety by discouraging users from sharing information.
The report notes that clauses limiting vendor liability (known as “hold harmless” clauses) also undermine safety by shifting responsibility “from the vendor to the users when an adverse event occurs.” As the story in The Times said, “Such language often limits the freedom of doctors and hospitals to publicly raise questions about software errors or defects.”
No one wants to stifle technological developments or the will to manage health care more efficiently. The key, as the IOM says, is to foster innovation without compromising safety.