Live Usability Testing of Two Complex Clinical Decision Support Tools: Observational Study

The study aimed to further understand the barriers and facilitators of meaningful CDS usage within a real clinical context.

  • Authors
  • David A. Feldstein
  • Devin M. Mann
  • Lauren McCullagh
  • Linda Park
  • Paul Smith
  • Rachel Hess
  • Rebecca Mishuris
  • Safiya Richardson
  • Sundas Khan
  • Thomas McGinn
  • Published
  • JMIR Human Factors

Abstract

Background

The potential of electronic health records (EHRs) and clinical decision support (CDS) systems to improve the practice of medicine has been tempered by poor design and the resulting burden they place on providers. CDS is rarely tested in the real clinical environment. As a result, many tools are hard to use, placing strain on providers and resulting in low adoption rates. The existing CDS usability literature relies primarily on expert opinion and provider feedback via survey. This is the first study to evaluate CDS usability and the provider-computer-patient interaction with complex CDS in the real clinical environment.

Objective

This study aimed to further understand the barriers and facilitators of meaningful CDS usage within a real clinical context.

Methods

This qualitative observational study was conducted with 3 primary care providers during 6 patient care sessions. In patients with a chief complaint of sore throat, a CDS tool built on the Centor Score was used to stratify the risk of group A Streptococcus pharyngitis. In patients with a chief complaint of cough or upper respiratory tract infection, a CDS tool built on the Heckerling Rule was used to stratify the risk of pneumonia. During usability testing, all human-computer interactions, including audio and continuous screen capture, were recorded using Camtasia software. Participants’ comments and interactions with the tool during clinical sessions, along with their comments during a brief postsession interview, were placed into coding categories and analyzed for generalizable themes.
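The Centor Score mentioned above is a simple additive rule: one point each for a history of fever, absence of cough, tender anterior cervical lymphadenopathy, and tonsillar exudates, giving a score of 0-4 that maps to a risk band. The sketch below illustrates this kind of point-based stratification in Python; the risk-band cutoffs shown are common illustrative thresholds, and the exact scoring logic and cutoffs used by the study's CDS tool are not specified in the abstract.

```python
def centor_score(fever_history: bool,
                 absent_cough: bool,
                 tender_anterior_nodes: bool,
                 tonsillar_exudates: bool) -> int:
    """Classic 4-item Centor Score: one point per positive criterion."""
    return sum([fever_history, absent_cough,
                tender_anterior_nodes, tonsillar_exudates])

def strep_risk_band(score: int) -> str:
    """Map a Centor Score (0-4) to a risk band.

    Illustrative cutoffs only; the study tool's thresholds may differ.
    """
    if score <= 1:
        return "low"
    if score <= 3:
        return "intermediate"
    return "high"

# Example: fever and tonsillar exudates present, cough present, no adenopathy
score = centor_score(fever_history=True, absent_cough=False,
                     tender_anterior_nodes=False, tonsillar_exudates=True)
print(score, strep_risk_band(score))  # 2 intermediate
```

The Heckerling Rule for pneumonia works the same way, summing five bedside findings into a 0-5 score, which is why a single short calculator form can drive both tools.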

Results

In the 6 encounters observed, primary care providers toggled between addressing either the computer or the patient during the visit. Minimal time was spent listening to the patient without engaging the EHR. Participants mostly used the CDS tool with the patient, asking questions to populate the calculator and discussing the results of the risk assessment; they reported the ability to do this as the major benefit of the tool. All providers were interrupted during their use of the CDS tool by the need to refer to other sections of the chart. In half of the visits, patients’ clinical symptoms challenged the applicability of the tool to calculate the risk of bacterial infection. Primary care providers rarely used the incorporated incentives for CDS usage, including progress notes and patient instructions.

Conclusions

Live usability testing of these CDS tools generated insights about their role in the patient-provider interaction. CDS may contribute to that interaction by being viewed simultaneously by the provider and patient. CDS can improve usability and lessen the strain it places on providers by being short, flexible, and customizable to each provider's unique workflow. To be useful, CDS should be as widely applicable as possible, and its functions should represent the fastest way to perform a given task.

  • Keywords
  • Clinical Decision Rule
  • Clinical Decision Support
  • Health Informatics
  • Live Usability
  • Provider Adoption
  • Usability
  • Usability Testing
  • User Experience
  • Workflow