Evaluation of Medication Alerts in Electronic Health Records for Compliance with Human Factors Principles

Introduction

Alert fatigue has been defined as “declining clinician responsiveness to a particular type of alert as the clinician is repeatedly exposed to that alert over a period of time, gradually becoming ‘fatigued’ or desensitized to it.”[1] Poorly designed alerting systems generate far too many false-positive alerts, which in turn produces alert fatigue.[1] In 2014, JAMIA published a study by Phansalkar et al. entitled Evaluation of Medication Alerts in Electronic Health Records for Compliance with Human Factors Principles.[2] The authors assert that the propensity of clinicians to override clinical decision support (CDS) alerts significantly diminishes the potential benefits of CDS, and by extension of EHRs in general. The direct clinical consequences of overalerting clinicians in a pediatric setting have not been well demonstrated, and incorporating clinical evidence into electronic drug allergy alerting systems remains challenging, especially in pediatric settings.[1] The authors hypothesize that adherence to principles of human factors design will improve the acceptance rates of these alerts.

The evaluation had two aims:

  • Compare the drug-drug interaction alerts of different EHRs for compliance with human factors principles, using I-MeDeSA.
  • Provide recommendations for appropriate alert design.

Methods

Using a tool previously developed by the authors and descriptively named the Instrument for Evaluating Human-Factors Principles in Medication-Related Decision Support Alerts (I-MeDeSA), the drug-drug interaction alerts of 14 EHRs were evaluated for compliance with nine human factors principles, listed below. The results were used to produce recommendations for appropriate alert design.

Human Factors Principles

  • Alarm philosophy
  • Placement
  • Visibility
  • Prioritization
  • Color learnability
  • Confusability
  • Text-based information
  • Proximity of task components being displayed
  • Corrective action
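
I-MeDeSA yields a numeric score against a 26-point scale (see Results below). Purely as an illustration, and not the actual instrument, a checklist-style evaluation against principles like these could be modeled in Python as follows; the per-principle point values are hypothetical, chosen only so the maximum matches the paper's 26-point scale:

 # Hypothetical sketch of a checklist-style instrument in the spirit of
 # I-MeDeSA. The real instrument's items and weights are defined by
 # Phansalkar et al.; the point values below are illustrative only,
 # chosen so the maximum total matches the paper's 26-point scale.
 PRINCIPLES = {
     "alarm_philosophy": 3,
     "placement": 2,
     "visibility": 3,
     "prioritization": 4,
     "color_learnability": 3,
     "confusability": 2,
     "text_based_information": 3,
     "proximity_of_task_components": 3,
     "corrective_action": 3,
 }

 def score_alert(compliance):
     """Sum the points for every principle the alert design satisfies."""
     return sum(points for name, points in PRINCIPLES.items()
                if compliance.get(name, False))

 # Example: an alert satisfying only placement, visibility, and
 # text-based information scores 2 + 3 + 3 = 8 of 26 points.
 print(score_alert({"placement": True, "visibility": True,
                    "text_based_information": True}))  # prints 8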


Results & Conclusion

The 14 EHRs received scores ranging from 8 to 18.33 on a 26-point scale and were ranked accordingly; the paper also offers example recommendations for appropriate alert design. Inter-rater reliability between the two evaluators was high (Cohen’s kappa, κ = 0.86).
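
For context, Cohen’s kappa quantifies agreement between two raters beyond what chance alone would produce: κ = (p_o − p_e) / (1 − p_e), where p_o is the observed proportion of agreement and p_e the proportion expected by chance. The following minimal Python sketch uses made-up ratings, not data from the study:

 from collections import Counter

 def cohens_kappa(rater_a, rater_b):
     """Cohen's kappa: (p_o - p_e) / (1 - p_e), where p_o is observed
     agreement and p_e is the agreement expected by chance."""
     n = len(rater_a)
     # Observed agreement: fraction of items both raters scored the same.
     p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
     # Chance agreement from each rater's marginal category frequencies.
     freq_a, freq_b = Counter(rater_a), Counter(rater_b)
     p_e = sum((freq_a[c] / n) * (freq_b[c] / n)
               for c in set(rater_a) | set(rater_b))
     return (p_o - p_e) / (1 - p_e)

 # Illustrative ratings only, not data from the study (1 = compliant,
 # 0 = non-compliant with a given item).
 a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
 b = [1, 1, 0, 1, 1, 1, 1, 0, 1, 1]
 print(round(cohens_kappa(a, b), 2))  # 0.74

By the commonly cited Landis and Koch benchmarks, the reported κ = 0.86 indicates almost perfect agreement between raters.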

The authors conclude that, based on the EHRs in the study, developers are inconsistent in applying human factors principles. Further, they state that all 14 EHRs “fell short in meeting the principles of good alert design”.[2]

Comments

These results are consistent with the notion that, for an EHR used by clinicians at the point of care (and specifically when ordering medications), the importance of the interface far exceeds conventional notions of “user friendly”. The study lends empirical support to the importance of interface design as a matter of patient safety and quality of care.

References

  1. A Clinical Case of Electronic Health Record Drug Alert Fatigue: Consequences for Patient Outcome. Pediatrics. http://pediatrics.aappublications.org/content/131/6/e1970.full
  2. Phansalkar S, et al. Evaluation of Medication Alerts in Electronic Health Records for Compliance with Human Factors Principles. JAMIA, 2014. http://www.ncbi.nlm.nih.gov/pubmed/24780721