Study finds "uneven" effort by hospitals in implementing clinical decision support in CPOE
Although a new study finds "uneven" effort being applied to implementing clinical decision support in CPOE, some hospitals are setting the bar for its use, and that good news should not be overlooked, says one of the study's authors.
The study of 62 hospitals, published in Health Affairs, found that individual hospitals scored anywhere from 10 percent to 82 percent on their decision support's ability to detect and warn physicians about test orders that would have caused an adverse drug event had the medication reached the patient.
The top-performing hospitals, the top 10 percent or six hospitals, had performance scores ranging from 71 percent to 82 percent. The six lowest-scoring hospitals had performance scores from 10 percent to 18 percent.
Healthcare watchdog organization The Leapfrog Group has established a standard requiring hospitals' physicians and other licensed providers to enter at least 75 percent of medication orders using computerized entry, and requiring hospitals to demonstrate that their clinical decision support can alert physicians to at least 50 percent of common, serious prescribing errors.
"You might say that The Leapfrog Group's standards were the beginnings of meaningful use [for CPOE]," says Jane B. Metzger, principal, emerging practices at Falls Church, Va.-based CSC. "Setting expectations for how the EHR is used through meaningful use is a good thing. Meaningful use guides the hospital to what the implementation should look like. This study shows that those expectations should include active use of clinical decision support to help ensure that progress continues in achieving national goals such as improving patient safety."
According to pooled hospital scores, the adverse drug event category hospitals detected most reliably was the drug-allergy contraindication. Drug-diagnosis contraindications were detected only 15 percent of the time. One explanation for the variability among hospitals is that, "for drug-to-diagnosis contraindications, for example, many hospitals are likely not yet applying decision support because there is no constantly updated electronic problem list maintained by physicians for patients during their hospital stay," the study reports.
The researchers found evidence that a hospital's choice of vendor can affect performance measures, but vendor choice explained only 27 percent of the variation among hospitals using different EHRs.
This study shows that it is "definitely more about the implementation than about the specific product, and that is quite encouraging," says Metzger.
A hospital's teaching status also had some effect on performance, but it accounted for only 10 percent of the variation within the study. Other factors the authors suggest could contribute to variability in hospital performance include:
- Completeness and ease of applying decision support in the EHR
- Relevant knowledge and experience
- Availability and commitment of staff resources
- How long decision support has been used
How the hospitals implemented CPOE explains most of the variation, says Metzger. "Implementing CPOE is a complicated undertaking. It affects every order written for every patient," she says. "Part of the work is deciding how to employ decision support and configuring the software to do that."