JOURNAL CLUB

Do interruptive alerts in the EHR work?

Key Takeaways:

  • Interruptive alerts are more effective at driving desired behaviors than non-interruptive alerts
  • Anecdotal reports point to the possibility of unintended consequences, so maintain vigilance through monitoring and end-user engagement when implementing new interruptive alerts
  • The overall quality of interruptive alert studies is relatively poor, which makes the domain difficult to assess

INTRODUCTION

In their paper “Efficacy and unintended consequences of hard-stop alerts in electronic health record systems: a systematic review”, Powers et al. examine an array of published studies on the effectiveness of pop-up alerts. Their systematic review provides a robust overview of interruptive alert effectiveness. The paper aims to answer the following questions: “(1) To what extent are hard-stop CDS alerts in EHRs effective in improving patient health and healthcare delivery outcomes? (2) What are the unintended consequences and adverse events of computerized hard-stop alerts? (3) How do hard-stop alerts compare to soft-stop alerts in efficacy and unintended consequences?”

APPROACH

Starting from studies related to the keyword phrase “hard stop”, the team identified 16 publications that involved interruptive alerting in the workflow. The team then used these studies and their associated references to build a comprehensive search strategy. This strategy cast a wide net, capturing 826 unique studies, which the researchers eventually whittled down to the 32 studies deemed appropriate for inclusion in the final review.

Given the challenge of conducting randomized controlled trials (RCTs) in clinical informatics, it’s not surprising that only three RCTs were found. On the other hand, 18 of the studies were pre-post evaluations. Most of the included studies were performed in an inpatient medicine setting in the United States. It’s important to note that no studies were deemed “excellent” quality and only eight were found to be of “good” quality. A full list of the studies can be found here.

RESULTS

The study authors found that interruptive alerts can, on the whole, influence end-user behavior. More specifically, the authors found that most studies on the topic focused on a process measure of interest as the intended target of the alert implementation. Fewer than half of the studies reported observed unintended consequences, and only three papers explicitly reported that no unintended consequences occurred.

Effectiveness

  • 11 of 15 (73%) showed improvement in a patient or clinical measure
  • 8 of 10 (80%) showed improvement in health system operational measures

Interruptive and Non-interruptive

  • 4 of 32 (13%) compared interruptive and non-interruptive alerts (3 of the 4 showed interruptive superiority)

User Impacts

  • 10 of 32 (31%) referenced user experience
  • 11 of 32 (34%) referenced unintended consequences

DISCUSSION

This review succeeds in capturing the relevant studies on interruptive alerting within the EHR. The primary takeaway is that interruptive alerts can be effective, but adverse unintended consequences can occur. It is also clear that the quality of the included studies is relatively poor when compared to the experimental rigor found in other areas of clinical medicine.

The included papers largely targeted narrow process measures, likely due to the expensive and time-consuming nature of running controlled experiments using native EHR tools. Measuring a change in a process measure is much easier and more direct than measuring downstream outcomes, which can be clouded by a variety of confounding factors. For example, it is easier to know that an alert improved ordering of the “right” antibiotic than to know whether that treatment choice ultimately reduced an outcome like mortality.

In general, identifying an alert’s impact can be difficult. The review’s evaluation of unintended consequences and user experience is a nod to the fact that these interventions can be effective yet harmful at the same time. The documented examples of adverse events reflect the importance of remaining vigilant in monitoring and engaging end-users for feedback.

Despite the relatively poor quality of the available informatics research, this review demonstrates the effectiveness of interruptive alerts in driving proximal measures of interest. We also know that these tools, especially when implemented poorly, can lead to alert fatigue and annoyance. As a result, it’s important to implement interruptive alerts thoughtfully and judiciously, with close monitoring to evaluate their impact.
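
As a rough illustration of what such monitoring could look like in practice, the sketch below assumes a hypothetical export of interruptive-alert firings (columns for the alert, the clinical department, and whether the user accepted or overrode it) and summarizes override rates to surface high-burden alerts. It is an illustrative example only, not a method from the review or a description of any particular product.

```python
# A minimal, hypothetical sketch of ongoing alert monitoring. It assumes a CSV
# export of interruptive-alert firings with columns: alert_id, department, user,
# and action ("accepted" or "overridden"). Column names and thresholds are
# illustrative assumptions, not part of the reviewed studies or any vendor API.
import pandas as pd

firings = pd.read_csv("alert_firings.csv")

# Override rate per alert and department -- a rough proxy for alert burden.
summary = (
    firings.assign(overridden=firings["action"].eq("overridden"))
    .groupby(["alert_id", "department"])
    .agg(n_firings=("action", "size"), override_rate=("overridden", "mean"))
    .reset_index()
    .sort_values("override_rate", ascending=False)
)

# Flag alerts that fire frequently but are almost always overridden: candidates
# for redesign, retirement, or follow-up conversations with end-users.
flagged = summary[(summary["n_firings"] >= 100) & (summary["override_rate"] >= 0.90)]
print(flagged.head(10))
```

A persistently high override rate does not prove an alert is wrong, but it is a useful trigger for exactly the kind of end-user follow-up the review’s findings encourage.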

EVALUATING YOUR CDS USING THESE METHODS

Phrase Health provides analytics dashboards that focus on the evaluation of alerts. Specifically, interruptive alerts can be explored by clinical department, provider type, individual user, and more. By combining this insight with other analytics and management tools, teams can effectively iterate on interventions to drive improvement in clinical care while decreasing the burden on clinicians.

You can start exploring your data quickly without any direct EHR integration or transfer of personal health information (PHI). Reach out to learn more.

Phrase helps you drive quality improvement and clinical outcomes through our advanced healthcare technology.