Pilot-testing an adverse drug event reporting form prior to its implementation in an electronic health record

Our objective was to pilot-test a paper-based version of a newly designed ADE reporting form in three clinical settings prior to integrating it into an EMR. Our work highlights the utility of pilot-testing health technology interventions by intended end-users within clinical settings in order to maximize user-friendliness, utility and relevance, even when end-users have been involved in earlier design stages. While there are differences between electronic and paper data collection forms, the two approaches can produce comparable results (Boyer et al. 2002; Huang 2006). Although not all functionalities of an electronic form can be mimicked by a paper-based form, crucial design elements required for a successful electronic implementation became apparent to end-users during paper-based testing and will influence our future electronic build. Our fieldwork helped end-users and researchers anticipate how the ADE form’s functionality could be improved to assist clinicians in communicating relevant ADE information between care providers on different wards and across healthcare sectors, and in serving as a handover tool. This enabled us to anticipate the need for electronic linkages between different components of the EMR being implemented, ideally including a bidirectional link with drug plan data.

One of our concerns at the outset was that the form would be too lengthy and require too much time to complete, distracting users from other work duties. Surprisingly, our fieldwork did not confirm this: most users completed the form within 5 min and generally approved of its length and level of detail. Although the paper-based version did not allow us to display planned functionalities (e.g., pop-up windows, the ability to revise ADE reports), end-users were able to identify preferences when different design options were proposed. An important caveat is that additional features added in an electronic build may increase functionality, but may also add complexity and require more time, necessitating further refinements.

ADEs are vastly underreported using current ADE reporting systems (Hohl et al. 2013; Wiktorowicz et al. 2010). Our fieldwork identified important avenues for improving reporting that may be addressed in a future electronic ADE documentation and communication form integrated into an EMR. These include addressing uncertainty about which ADE types should be documented (possibly through pop-up instructions), allowing providers to document uncertainty in the ADE diagnosis, enabling reports to be removed or modified after follow-up, providing space for alternative diagnoses, and enabling inter-professional communication across handovers and between inpatient and outpatient settings (possibly via patient-specific safety alerts). In our study, the majority of reported ADEs were adverse drug reactions. Other kinds of ADEs, such as non-adherence and sub- or supra-therapeutic doses, were seen as more complicated, as the implications of reporting were less clear. We used an extended definition of an ADE, which included non-compliance and improper dosing regimens. While all these events fall under the scope of medication-related problems (MRPs), our form purposely avoided this term to increase the signal-to-noise ratio and to prevent reporting of multiple non-clinically significant events per patient, as our overarching goal was to prevent the recurrence of serious ADEs while avoiding alert fatigue and keeping documentation feasible. In previous workshops held with end-users in advance of pilot-testing, pharmacists insisted on retaining the option to record ADEs other than adverse drug reactions (unpublished data). This conundrum might be addressed by educating users about the various kinds of ADEs encountered and the need for communication across providers, and by supporting a common approach to preventing future ADEs.
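
To make these design requirements concrete, the sketch below is a minimal illustration in Python, using hypothetical class and field names that are not part of our form or EMR. It shows one way an electronic ADE record could capture diagnostic uncertainty, alternative diagnoses, provider notes for communication across handovers, and the ability to revise a report after follow-up rather than leaving it permanent.

```python
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum
from typing import List, Optional


class Certainty(Enum):
    """Degree of diagnostic certainty the reporter attaches to a suspected ADE."""
    SUSPECTED = "suspected"
    PROBABLE = "probable"
    CONFIRMED = "confirmed"
    RULED_OUT = "ruled_out"  # set after follow-up; the report is retracted, not deleted


@dataclass
class ProviderNote:
    """Free-text note from any care provider, supporting communication across handovers."""
    author_role: str          # e.g. "pharmacist", "physician", "nurse"
    timestamp: datetime
    text: str


@dataclass
class ADEReport:
    """Hypothetical electronic ADE documentation record (illustrative field names only)."""
    patient_id: str
    suspect_drug: str
    symptoms: List[str]
    certainty: Certainty = Certainty.SUSPECTED
    diagnosis: Optional[str] = None                       # may stay blank pending confirmatory testing
    alternative_diagnoses: List[str] = field(default_factory=list)
    notes: List[ProviderNote] = field(default_factory=list)

    def revise(self, new_certainty: Certainty, note: ProviderNote) -> None:
        """Update the report after follow-up instead of creating a new, immutable one."""
        self.certainty = new_certainty
        self.notes.append(note)
```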

This study confirms previous observational work by our group suggesting that ADE diagnosis is a complex and multi-step process (unpublished data). If ADE reporting is to succeed, electronic forms created for this process must reflect this complexity and enable reporting as a multi-step process. Multiple care providers, including those who provide insight into alternative diagnoses for suspected events or follow up on patient outcomes, must be able to access and update the information. The immediate implication for the design of electronic reporting systems is that they must enable communication between providers and across healthcare sectors. While we piloted the form with clinical pharmacists, doctors and nurses in hospital and community settings are likely to use the form as well. Thus, we anticipate further piloting and design adjustments as the form is implemented in other healthcare environments and for other provider groups.

During our fieldwork, we referred to the ADE form as a reporting tool. However, “reporting” was specifically associated with the communication of a subset of events to Health Canada through the MedEffect form, as opposed to their documentation within an electronic record. One implication of “reporting” was an assumed permanency of the record created, as none of the currently available reporting mechanisms allow updates or modification after a report has been generated. As the overarching objective of our project is to develop a documentation tool that supports communication between care providers (rather than to communicate events to external agencies), we changed the name of our form to “Adverse Drug Event Communication and Documentation Form” to highlight its intended purpose. We hope that our findings highlight the need for a culture shift around ADE communication, from an approach that serves to generate health data for external agencies, as implied by “reporting”, to a patient-safety oriented approach that focuses on communication and documentation to prevent repeat events.

Low completion rates can indicate problems with the availability of information needed to complete a section of the form, or content problems with the section itself. Among the sections with the highest non-completion rates were those for ADE symptoms and diagnosis. ADEs are notoriously difficult to diagnose, and our prior observational work and workshops with stakeholders suggested that providing a record allowing a subsequent care provider to re-trace the evidence upon which an ADE diagnosis was based is an important aspect of ADE documentation. Pharmacists often listed presumed ADE symptoms; however, clustering them into a diagnosis could be challenging, or required communication with physicians to rule out alternative diagnoses or awaiting the results of confirmatory testing, leading them to skip this field. This finding may also in part explain some of the uncertainty expressed by pharmacists about reporting more complex, or less traditional, ADEs.

Pharmacists were often unclear about which treatment recommendations to list within the ADE documentation form (e.g., whether to document the medication used to treat the ADE, or a medication used to replace the culprit drug). As a result, users would often leave this field empty. We were unable to capture the full functionality of our electronic form, which will enable pharmacists to recommend changes to a patient’s medication regimen through the EMR to a physician, who can then approve or alter them. While the electronic build may contain sufficient contextual information to address the ambiguities that existed in the paper-based version of the form, this is likely to require electronic piloting.
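
As an illustration of one way this ambiguity could be resolved in an electronic build, the hypothetical sketch below (Python; the class and field names are assumptions, not part of our form or EMR) records the two recommendation types separately and carries a status field for the physician’s approval or alteration of the pharmacist’s proposal.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class RecommendationStatus(Enum):
    PROPOSED = "proposed"    # entered by the pharmacist
    APPROVED = "approved"    # accepted unchanged by the physician
    MODIFIED = "modified"    # physician substituted an alternative order
    DECLINED = "declined"


@dataclass
class TreatmentRecommendation:
    """Hypothetical structure separating the two cases pharmacists found ambiguous."""
    ade_report_id: str
    culprit_drug: str
    replacement_for_culprit: Optional[str] = None   # medication replacing the culprit drug
    treatment_of_ade: Optional[str] = None          # medication used to treat the ADE itself
    status: RecommendationStatus = RecommendationStatus.PROPOSED
    physician_comment: Optional[str] = None
```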

Our findings demonstrate the value of completing pilot studies of electronic health information technology implementations with paper-based forms. While paper-based forms cannot mimic the full functionality of an electronic interface, they provided vital feedback for subsequent design and pre-implementation user education that we might otherwise have overlooked, including questions regarding systems architecture.

Limitations

Lightweight ethnography, although time- and resource-efficient, carries a risk of skimming the surface and providing only partial explanations. By engaging only briefly in the work environment, observers risk missing less common routines and events that could otherwise have been captured. We relied on volunteers as the subjects of our observations, and therefore included a limited number of participants. This study is also limited by the sole inclusion of clinical pharmacists, who were identified in our healthcare settings as the care providers most likely to encounter and document ADEs. We might have uncovered other aspects of the ADE form requiring modification had we been able to recruit more participants from other clinical backgrounds or settings, and we anticipate further design adjustments as use of the form is expanded. In addition, our findings are susceptible to the Hawthorne effect, whereby participants’ awareness of being observed and of the study objectives may have influenced their behavior. Finally, paper-based field evaluation of software designs has limitations, and our findings must therefore be evaluated in relation to data collected through other means.