Predictors of Clinical Decision Support Success

This is a review of the paper by Bates et al. from 2003, Ten Commandments for Effective Clinical Decision Support: Making the Practice of Evidence-based Medicine a Reality. [1]

Introduction

This paper collects the wisdom from years of work optimizing clinical decision support (CDS) at Brigham and Women's Hospital in Boston, MA. The clinicians recognized a discrepancy between optimal patient care and actual practice, and the paper cites several examples of suboptimal care: only 50% of eligible patients received beta blockers; only 27% of antiepileptic drug levels drawn were clearly indicated, and half of those were drawn at incorrect times; and 68% of vancomycin orders did not follow Centers for Disease Control (CDC) guidelines. These instances provide the impetus for improvement. As stated in the paper, "We believe that decision support delivered using information systems, ideally with the electronic medical record as the platform, will finally provide decision makers with tools making it possible to achieve large gains in performance, narrow gaps between knowledge and practice, and improve safety." The authors have refined their clinical decision support tools through both successes and failures.

The Ten Commandments

1. Speed Is Everything

Speed is a very important priority for clinicians and should be considered "a primary goal." The results from surveys support this. It is important to keep in mind this may differ from the priority of operations and administrative staff.

2. Anticipate Needs and Deliver in Real Time.

The decision support must be available and delivered at the right time. Furthermore, it should try to predict future care needs. When these "latent needs" were addressed by the EHR, there was increased likelihood of a desired action.

3. Fit into the User's Workflow.

It is important to apply decision support that is integrated into the user's workflow. A great suggestion for care will have little impact if users don't know to look for it or if it takes too much time to find.

4. Little Things Can Make a Big Difference.

One doesn't need huge interventions to create change in outcomes. The authors cite an example involving the input field type for a diagnosis: a change that turned a structured data element into a variable, free-text one. Even though this was an isolated input change in the EHR, it had a big impact downstream when it came to later providing feedback about the diagnosis.
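A minimal sketch of why this matters (the codes, rule table, and function names below are illustrative, not from the paper): a structured diagnosis field can be matched exactly by a downstream feedback rule, while free-text entries for the same condition vary and silently fail to match.

```python
# Hypothetical example: structured (coded) diagnosis entry vs. free text.
# With a coded field, a downstream rule can match the entry exactly.
REMINDER_RULES = {"I50.9": "Consider beta blocker therapy"}  # ICD-10: heart failure

def reminder_for(entry):
    """Look up a downstream feedback reminder keyed on the coded diagnosis."""
    return REMINDER_RULES.get(entry.get("diagnosis_code"))

coded_entry = {"diagnosis_code": "I50.9"}
print(reminder_for(coded_entry))  # matches reliably

# The same condition entered as free text varies between clinicians,
# so a simple lookup finds nothing and the feedback never fires.
for text in ["CHF", "heart failure", "hrt failure, chronic"]:
    print(text, "->", REMINDER_RULES.get(text))  # no match for any variant
```

The point is not the lookup itself but the data element: once the field becomes free text, every downstream consumer inherits the matching problem.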

5. Recognize that Physicians Will Strongly Resist Stopping.

One must be aware when designing support systems that there is a strong desire to continue with the original plan. This is especially true when no alternative is provided. The authors recommend allowing providers to override CDS reminders. They cite an example of clinicians finding ways to “game the system” even in the case of mandatory stops.

6. Changing Direction Is Easier than Stopping.

In contrast to commandment #5, the ability to alter orders is significantly easier to implement. This is especially true if the providers don’t feel strongly about the suggested element being changed. The paper cites examples such as the dose, route, or frequency of medications, or the number of views needed for certain imaging studies.

7. Simple Interventions Work Best.

Try to keep CDS simple. The authors suggest that guidelines should easily fit on a single screen. Additionally, any tool that requires multiple inputs from the provider may result in aborting the tool prior to delivery of the support information.

8. Ask for Additional Information Only When You Really Need It.

This is related to rule #7 in that CDS should strive to limit the amount of inputted information from clinicians. The authors state, “our experience has been that the likelihood of success in implementing a computerized guideline is inversely proportional to the number of extra data elements needed.”

9. Monitor Impact, Get Feedback, and Respond.

It is important to be cognizant of the number and quality of reminders clinicians encounter. Poor quality or frequent interventions may lead to an increase in reminder dismissals, even when they happen to be very important. CDS designers should collect feedback from end users regarding low quality alerts and suggestions for improvement. Of note, the authors mention that they have an empiric threshold of about 60% for positive responses to reminders.
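The monitoring the authors describe can be sketched as a simple audit over alert logs. This is a hypothetical illustration (the reminder names, counts, and function are made up); it only assumes the ~60% positive-response threshold mentioned above.

```python
# Hypothetical audit sketch: flag reminders whose acceptance rate falls
# below the ~60% empiric threshold the authors mention, so CDS designers
# can review them with end users.
ACCEPT_THRESHOLD = 0.60

# (reminder id, times shown, times accepted) -- illustrative data only
alert_log = [
    ("renal-dosing",    200, 150),   # 75% accepted
    ("drug-allergy",    500, 410),   # 82% accepted
    ("duplicate-order", 300,  90),   # 30% accepted -> candidate for review
]

def low_value_reminders(log, threshold=ACCEPT_THRESHOLD):
    """Return reminder ids accepted less often than the threshold."""
    return [rid for rid, shown, accepted in log
            if shown and accepted / shown < threshold]

print(low_value_reminders(alert_log))  # ['duplicate-order']
```

In practice the flagged reminders would then be reviewed with clinician feedback rather than removed automatically.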

10. Manage and Maintain Your Knowledge-based Systems.

Regular monitoring of CDS reminders is imperative to providing optimal care. Action may need to be taken if an unusual spike in alerts appears. Such a spike may represent new medical knowledge that hasn’t yet made its way into the CDS program, or support tools that have become outdated.
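One simple way to notice such a spike, sketched here as an assumption rather than anything the paper specifies, is a basic statistical check of daily alert volume against its recent baseline.

```python
# Hypothetical sketch: flag a day whose alert count sits far above the
# historical baseline, prompting a review of the underlying CDS content.
from statistics import mean, stdev

def is_spike(history, today, n_sd=3.0):
    """Return True if today's alert count exceeds the historical mean
    by more than n_sd standard deviations."""
    return today > mean(history) + n_sd * stdev(history)

daily_alert_counts = [120, 115, 130, 125, 118, 122, 127]  # made-up baseline
print(is_spike(daily_alert_counts, 410))  # True -> investigate the rules
print(is_spike(daily_alert_counts, 131))  # False -> normal variation
```

A real system would likely use a longer baseline and account for weekday effects, but the idea is the same: treat an anomalous alert volume as a signal to review the knowledge base.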

Discussion

The authors again emphasize the importance of improving care and reducing costs through CDS. There is a gradient representing the degree to which computing can assist in decision making: on one end, the computer doesn’t help at all; on the other, the computer makes all decisions without human help. While clinicians may fear loss of autonomy, there is room for improvement over the current state of CDS in health care. The point is that neither extreme is appropriate; somewhere in between is best. The level of computer intervention may not even be the same for all types of encounters.

Summary

Bates et al. provide concise recommendations for those interested in implementing CDS in their practice. The problems of suboptimal care in medicine are well documented, and CDS may be a tool by which they can be mitigated. Ultimately, the “ten commandments” presented are good guidelines, but the research done by the authors to support much of the recommendations was carried out within a single healthcare system. Also, the authors point out that the majority of the users were residents in training.

Further Discussion

This article on general CDS was published in 2003. In 2007, Dr David Bates, along with Dr Peter Gross, published more on CDS in computerized provider order entry (CPOE) systems: A pragmatic Approach to Implementing Best Practices for Clinical Decision Support Systems in Computerized Provider Order Entry Systems [2]

References

  1. Bates DW, Kuperman GJ, Wang S, et al. Ten Commandments for Effective Clinical Decision Support: Making the Practice of Evidence-based Medicine a Reality. Journal of the American Medical Informatics Association: JAMIA. 2003;10(6):523-530. doi:10.1197/jamia.M1370. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC264429/
  2. Gross PA, Bates DW. A Pragmatic Approach to Implementing Best Practices for Clinical Decision Support Systems in Computerized Provider Order Entry Systems. Journal of the American Medical Informatics Association: JAMIA. 2007;14(1):25-28. doi:10.1197/jamia.M2173.

Submitted by Marc Tobias