Stephen M. Salerno, MD, MPH; Patrick C. Alguire, MD; Herbert S. Waxman, MD
Drs. Alguire and Salerno dedicate this article to Herbert Waxman, MD, a friend and colleague whose vision, leadership, and friendship will be missed. Dr. Waxman, who served as Senior Vice President for Medical Knowledge and Education at the American College of Physicians, died on 15 February 2003.
Disclaimer: The opinions or assertions presented are the private views of the authors and are not to be construed as official or as reflecting those of the Department of the Army or Department of Defense.
Potential Financial Conflicts of Interest: None disclosed.
Requests for Single Reprints: Customer Service, American College of Physicians, 190 N. Independence Mall West, Philadelphia, PA 19106; e-mail, firstname.lastname@example.org.
Current Author Addresses: Dr. Salerno: Department of Medicine (MCHK-DM), Tripler Army Medical Center, 1 Jarrett White Road, Honolulu, HI 96859.
Dr. Alguire: Medical Knowledge and Education Division, American College of Physicians, 190 N. Independence Mall West, Philadelphia, PA 19106.
Salerno SM, Alguire PC, Waxman HS. Competency in Interpretation of 12-Lead Electrocardiograms: A Summary and Appraisal of Published Evidence. Ann Intern Med. 2003;138(9):751-760. doi: 10.7326/0003-4819-138-9-200305060-00013
Background: There have been many proposals for objective standards designed to optimize training, testing, and maintaining competency in interpretation of electrocardiograms (ECGs). However, most of these recommendations are consensus based and are not derived from clinical trials that include patient outcomes.
Purpose: To critically review the available data on training, accuracy, and outcomes of computer and physician interpretation of 12-lead resting ECGs.
Data Sources: English-language articles were retrieved by searching MEDLINE (1966 to 2002), EMBASE (1974 to 2002), and the Cochrane Controlled Trials Register (1975 to 2002). The references in articles selected for analysis were also reviewed for relevance.
Study Selection: All articles on training, accuracy, and outcomes of ECG interpretations were analyzed.
Data Extraction: Study design and results were summarized in evidence tables. Information on physician interpretation compared with a gold standard, typically a consensus panel of expert electrocardiographers, was extracted. The clinical context of and outcomes related to the ECG interpretation were obtained whenever possible.
Data Synthesis: Physicians of all specialties and levels of training, as well as computer software for interpreting ECGs, frequently made errors in interpreting ECGs when compared with expert electrocardiographers. There was also substantial disagreement on interpretations among cardiologists. Adverse patient outcomes occurred infrequently when ECGs were incorrectly interpreted.
Conclusions: There is no evidence-based minimum number of ECG interpretations that is ideal for attaining or maintaining competency in ECG interpretation skills. Further research is needed to clarify the optimal way to build and maintain ECG interpretation skills based on patient outcomes.
Interpretations of 12-lead resting electrocardiograms (ECGs) are often required in both ambulatory and inpatient settings. Organizations, including the American College of Cardiology (ACC) and the American Heart Association (AHA), have published consensus-based competency standards suggesting the optimal number of ECGs necessary to obtain and maintain competency in ECG interpretation skills. The ACC/AHA guidelines recommend interpreting a minimum of 500 supervised ECGs during initial training, using standardized testing in ECG interpretation to confirm initial competency, and interpreting 100 ECGs yearly to maintain competency (1). The consensus statement asserts that these standards should apply to all practice settings and situations; however, the statement is controversial because of the lack of evidence-based literature on the optimal techniques for learning, maintaining, and testing competency in ECG interpretation. This systematic review attempts to synthesize the literature that may prove useful for physicians, program directors, and organizations seeking alternative evidence-based information on physician ECG interpretation skills.
We retrieved information through systematic searches and ongoing surveillance of MEDLINE (1966 to 2002), EMBASE (1974 to 2002), and the Cochrane Controlled Trials Register (1975 to 2002). We used the following index terms and text words: electrocardiogram interpretation, electrocardiogram competency, electrocardiogram training, electrocardiogram errors, and computer electrocardiogram interpretation. Our search was limited to English-language articles that studied adult participants. References of the full-length articles were analyzed for additional citations. The search revealed 419 articles of potential interest. After we analyzed the abstracts of these articles, we eliminated 378 because ECG interpretation was not the main study focus. Thirty-nine articles and 2 letters to the editor contained research data directly related to ECG interpretation.
We divided the articles into the following broad categories: studies that included clinical outcomes (Table 1), studies that included discussion of the accuracy of computer ECG interpretation (Table 2), and all other studies that compared noncardiologists to a cardiologist reference standard (Table 3). The studies were not graded by quality. Although criteria exist for grading the quality of studies of diagnostic tests, the focus of our study was the users of the test rather than the usefulness of electrocardiography for diagnosing specific diseases. We also reviewed the recommendations of the American Board of Internal Medicine, American College of Physicians, AHA, ACC, and Accreditation Council on Graduate Medical Education Residency Review Committees for Internal Medicine and Cardiovascular Diseases (1, 43-45).
One common feature of most ECG interpretation studies is the use of an expert electrocardiographer “gold standard,” typically a consensus panel of cardiologists. This may be problematic because interpretations by several cardiologists reading the same ECG often vary substantially (14, 17, 29, 42). Even one cardiologist reading the same ECG on separate occasions may have substantially different interpretations (14, 29, 42).
Most studies on ECG interpretation by cardiologists report the proportion of abnormal diagnoses that are correctly identified, as determined by a consensus panel. These studies report that the participating cardiologists correctly determined 53% to 96% of the abnormalities identified by the reference standard (4, 17, 21, 23, 24, 28, 30). Two recent studies used κ statistics, which adjust for agreement expected on the basis of chance alone, to examine whether cardiologists agreed with their colleagues and with their own previous interpretations (14, 29). Holmvang and colleagues reviewed 502 ECGs examined by both local cardiologists and an expert electrocardiography consensus panel (29). Agreement was poor to moderate on identification of ST-segment elevation (κ = 0.05), ST-segment depression (κ = 0.38), and a normal ECG (κ = 0.42). Interrater agreement on detecting T-wave inversion was very good (κ = 0.63). In contrast, intrarater agreement by the expert electrocardiographic consensus panel was good (κ = 0.58 to 0.67).
Levels of agreement may be higher for serious abnormalities, such as ST-segment elevation criteria for use of thrombolytic therapy. Massel and colleagues reported substantial agreement (κ = 0.78) among three cardiologists examining whether 75 ECGs met criteria for thrombolytic therapy (14). The study also showed good intrarater agreement (κ = 0.67 to 0.71) when the three cardiologists determined criteria for thrombolytic therapy on two separate occasions.
Because cardiologists do not agree on many aspects of ECG interpretation, future studies examining noncardiologist or computer interpretation should include κ statistics and control groups of cardiologists. Readers can then draw more sophisticated conclusions about the importance of disagreements in ECG analysis. More accurate measures of agreement, such as weighted κ statistics, should be included because disagreement may relate to only some aspects of the ECG interpretation.
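To make the chance-corrected agreement statistic used throughout these studies concrete, the following sketch computes Cohen's κ for two readers from a square matrix of classification counts. The counts here are hypothetical for illustration only, not data from any cited study.

```python
# Illustrative sketch: Cohen's kappa, the chance-corrected agreement
# statistic reported in the studies discussed above. The example
# counts below are hypothetical, not taken from any cited report.

def cohens_kappa(matrix):
    """Compute Cohen's kappa from a square confusion matrix of counts.

    Rows are reader A's categories, columns are reader B's categories.
    """
    k = len(matrix)
    n = sum(sum(row) for row in matrix)
    # Observed agreement: proportion of cases on the diagonal.
    p_o = sum(matrix[i][i] for i in range(k)) / n
    # Expected agreement by chance: product of the two readers' marginals.
    row_marg = [sum(row) / n for row in matrix]
    col_marg = [sum(matrix[i][j] for i in range(k)) / n for j in range(k)]
    p_e = sum(r * c for r, c in zip(row_marg, col_marg))
    return (p_o - p_e) / (1 - p_e)

# Two readers classify 100 ECGs as "ST elevation present" or "absent".
counts = [[40, 10],   # reader A "present": B agrees on 40, disagrees on 10
          [5, 45]]    # reader A "absent":  B disagrees on 5, agrees on 45
print(round(cohens_kappa(counts), 2))  # prints 0.7
```

Note that although the two readers agree on 85% of tracings, κ is only 0.7, because half of that raw agreement would be expected by chance given the readers' marginal rates; this is why κ values in the cited studies can look low despite high percentage agreement.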
Numerous trials have compared the ECG interpretation skills of cardiologists and noncardiologists. Most studies measured the proportion of ECG diagnoses determined by an expert consensus panel “gold standard” that noncardiologist physicians could identify. Seven studies measuring comprehensive ECG analysis found that the proportion of ECG diagnoses correctly identified by noncardiologist physicians ranged from 36% to 96% (4-6, 8, 11, 13, 33). In several studies that focused on a particular aspect of ECG interpretation, noncardiologists identified 87% to 100% of ECGs showing acute myocardial ischemia (6, 27), correctly classified 72% to 94% of ECGs as meeting inclusion criteria for thrombolytic therapy (3, 26, 30, 32, 37, 40), diagnosed 57% to 95% of ST-segment abnormalities (2, 9, 26), and correctly measured about 25% of PR and QT intervals (36).
Although most studies combined resident and staff physicians in their analyses, some provided data allowing subgroup analysis. Twelve articles included information specifically on resident physician interpretation skills. Resident physicians detected 96% of abnormal ECGs (15), correctly identified inclusion criteria for thrombolytic therapy 73% to 84% of the time (3, 26, 30, 32), identified 36% to 80% of ECG diagnoses determined by expert electrocardiographers (5, 6, 11, 13, 21, 28, 41), and detected 38% of technical ECG abnormalities (41). Only two articles provided sufficient information for subgroup analysis on noncardiologist staff physicians. The articles did not provide information on specialty background or board certification. Staff physicians correctly identified inclusion criteria for thrombolytic therapy 77% of the time (30) and diagnosed abnormal ST-segment and T-wave abnormalities 57% to 97% of the time (16).
Four studies provided information on the ECG interpretation skills of nonphysicians. In two studies, nurses correctly identified ECG criteria for thrombolytic therapy 84% to 94% of the time (30, 37). The other studies examined the interpretation skills of medical students (28, 38), who identified 17% to 63% of ECG abnormalities identified by an expert reference standard.
Trainees often gain experience in ECG interpretation when they use an ECG to assist in the clinical management of a patient. Not surprisingly, research has shown that clinical history may affect interpretation of ECGs (28, 31, 38). Providers with less training are influenced by the history to a greater extent than are more experienced electrocardiographers. For example, when given a misleading history, diagnostic accuracy was reduced by 5% for cardiologists but up to 25% for residents in one recent study (28). Cardiologists performed better than other interpreters in all settings and were 90% accurate in their diagnoses, even when no history was provided. Another study also demonstrated that interpretations by cardiologists were minimally dependent on the presence or absence of history (46). This information suggests that noncardiologist interpreters may make more accurate interpretations when they know the clinical context of the ECG.
False-positive ECG interpretations could lead to unnecessary patient treatment. Six studies examined this aspect of interpretation (9, 15-17, 37, 40). Cardiologists typically had fewer false-positive interpretations than did noncardiologists. Cardiologists demonstrated a specificity of 93% to 100% for diagnosis of substantial ECG abnormalities, such as myocardial ischemia; specificity for noncardiologist physicians was 73% to 100%. In simpler determinations, such as differentiating normal from abnormal ECGs, the specificity of noncardiologists also approached that of cardiologists (15).
Eleven studies examined whether ECG interpretation errors could affect patient management, and seven measured patient outcomes (Table 1). Thirteen studies assessing the severity of interpretation errors reported that 4% to 33% of interpretations contained errors of major importance (2-13, 21). Expert consensus panels studied the charts of patients with ECG interpretation errors and determined whether a correct ECG interpretation would have changed patient management. This more detailed analysis revealed inappropriate management as a result of interpretation errors in 0% to 11% of cases (Table 1).
Six studies that documented patient follow-up showed a less than 1% incidence of adverse outcomes and potentially preventable death as a result of inaccurate ECG interpretation (4, 6, 8, 9, 11, 12). Therefore, although noncardiologist physicians have a high incidence of interpretation errors when compared to an expert reference standard, these errors seem to have minimal impact on morbidity and mortality. One important limitation of current studies on outcomes of ECG interpretation errors is that most were performed in emergency and inpatient settings. Because the probability of life-threatening disease is reduced in primary care settings, the impact of interpretation errors on patient outcomes may be overestimated by extrapolating from the results of existing research.
Computer software providing automated ECG interpretation is a common feature of most modern acquisition devices. Computer software can correctly identify 58% to 94% of various nonrhythm electrocardiographic abnormalities when compared to expert electrocardiographers (Table 2). Accuracy is considerably reduced when computers interpret arrhythmias, such as different types of second-degree atrioventricular block (22). In addition, reliability may be a problem; several studies show substantial differences in ECG interpretations obtained minutes apart in clinically stable patients (47-49).
Despite the limitations, evidence suggests that computer interpretation software is a useful adjunct to physician interpretation. In some reports, computers have detected abnormalities missed by physicians. A study of resident physicians showed that using computer interpretation software reduced the incidence of serious ECG interpretation errors from 22% to 18% (13). Another study that paired a computer and a cardiologist showed a reduction in false-positive ECG readings from 7% to 4.5% and 8% greater agreement with an expert electrocardiographer consensus panel (17). However, other research has indicated that preliminary computer interpretations may both benefit and mislead primary care physicians (50).
In addition to potentially reducing interpretation errors, computers may have other advantages. Computer interpretation can decrease the time necessary to interpret ECGs by up to 28% (17, 50). Therefore, although computer ECG analysis is still inferior to physician interpretation, it may be a useful adjunct to save physician time. Further study is needed to determine whether computer interpretations can consistently reduce medical errors.
A 2001 statement by the ACC/AHA recommended interpretation of 500 supervised ECGs, and an earlier 1995 edition of the same guideline recommended 800 interpretations to attain initial competency in ECG interpretation (1, 43). Both guidelines were created through expert consensus, and neither provided evidence-based data to support their conclusions. The Accreditation Council for Graduate Medical Education Residency Review Committee for Internal Medicine states that all residents should be given an opportunity to develop competency in interpretation of ECGs but has not specified how to achieve this goal (44). The American Board of Internal Medicine has not listed a minimum number of supervised ECGs required to sit for internal medicine board certification (45). Other professional organizations and surveys of clinicians lack consensus regarding the minimum number of supervised ECG interpretations needed to achieve competence during internal medicine residency training. One survey of program directors reported that a median of 100 ECGs was necessary to obtain initial competency (51). The number of supervised ECG interpretations recommended to obtain competency during cardiology fellowship is also derived by consensus from the Accreditation Council for Graduate Medical Education Residency Review Committee for Cardiovascular Diseases, which recommends the interpretation of 3500 ECGs (44). The number of ECGs interpreted by a physician has not been linked to either standardized test performance or patient outcomes.
The most recent ACC/AHA guidelines also recommend that physicians pass an ECG certification examination as part of credentialing (1). Reports on such examinations suggest that cardiologists may perform better than other physicians (52). This may be especially true in situations where detailed clinical information is not available. However, it is unclear how standardized test scores are related to physician performance in ECG interpretation or patient outcomes in a clinical setting.
Evidence-based literature that addresses whether competency in ECG interpretation changes after initial residency or fellowship training is not available. For certain medical skills, such as interventional cardiology procedures, increased procedure volume is associated with improved patient outcomes (53). However, this relationship is less clear for cognitive skills. Observational data suggest that higher procedure volume improves resident confidence in numerous technical skills, such as thoracocentesis, paracentesis, and central line placement (54). No studies have examined whether physicians who are more confident in their skills have better patient outcomes.
It is also unclear whether continuing medical education, such as didactic courses, hands-on ECG interpretation seminars, or self-assessment programs, can improve ECG interpretation skills after initial residency or fellowship training. Several uncontrolled studies of residents and students showed improvement in ECG interpretation skills after structured ECG interpretation seminars (5, 55, 56). Further research is needed to better understand whether the yearly volume of ECGs interpreted and continuing medical education on ECG interpretation improve analytic accuracy and patient outcomes.
Electrocardiogram interpretation is a common cognitive skill that is acquired by internal medicine physicians during initial residency training; it is used by most primary care and subspecialty physicians. Little evidence-based literature supports strict quantitative standards for attaining and maintaining competency in interpreting ECGs. Expert electrocardiographers and other physicians frequently disagree, but adverse outcomes do not often result from these disagreements. The use of a cardiologist “gold standard” to determine ECG interpretation accuracy in future research may be problematic because cardiologists often do not agree on interpretations of the same ECG. Computers may be useful adjuncts in detecting missed electrocardiographic diagnoses and reducing time needed for ECG interpretation. Further high-quality research is needed to determine optimal standards for attaining competency in ECG interpretation. These standards must integrate physician skills to appropriately order ECGs; to recognize common normal and abnormal tracings; and to understand criteria for therapy decisions based on ECG interpretation, such as the administration of thrombolytic agents. Research demonstrating effective training methods that will decrease interpretation variability for high-stakes diagnoses, such as acute myocardial infarction, is needed to reduce medical errors and improve patient outcomes.
Copyright © 2016 American College of Physicians. All Rights Reserved.
Print ISSN: 0003-4819 | Online ISSN: 1539-3704