Author Information
- Received April 7, 2010
- Revision received June 18, 2010
- Accepted June 25, 2010
- Published online September 1, 2010.
- Allen J. Taylor, MD*,*
- Jonathan Patrick, MD*,
- Suhny Abbara, MD†,
- Daniel S. Berman, MD‡,
- Sandra S. Halliburton, PhD§,
- Jerome L. Hines, MD∥,
- John McB. Hodgson, MD¶,
- John R. Lesser, MD#,
- L. Samuel Wann, MD**,
- Kim A. Williams, MD††,
- Jack A. Ziffer, PhD, MD‡‡,
- Lorraine J. Lennon, BA§§,
- Dawn M. Edgerton, MA§§ and
- Manuel D. Cerqueira, MD§
- *Reprint requests and correspondence: Dr. Allen J. Taylor, Advanced Cardiovascular Imaging, Department of Medicine, Section of Cardiology, Washington Hospital Center, 110 Irving Street, Northwest, Room 1E12, Washington, DC 20010-2975.
Examinees of the first Certification Examination in Cardiovascular Computed Tomography were surveyed regarding their training and experience in cardiac computed tomography. The results support the current training pathways within the American College of Cardiology/American Heart Association competency criteria, which include either experience-based training or a formal training program in cardiovascular computed tomography. Longer duration in clinical practice, a greater number of scans clinically interpreted in practice, and level 3 competency were associated with higher passing rates.
Cardiac computed tomography (CCT) has undergone rapid technological advancements and has become a widely used modality in cardiovascular imaging over the past decade. To help assess individual physicians' expertise in the application and interpretation of CCT, the American College of Cardiology (ACC) and several partnering organizations developed clinical competency criteria for CCT in 2005 (1). The official document included a general description of the knowledge and cognitive skills identified by experts as necessary for competency in CCT. In addition, the requisite duration of training and numbers of cases performed and interpreted for level 1, 2, and 3 competencies were identified. Notably, these training and experience requirements were empirically selected based on expert opinion.
To further the tools documenting proficiency in CCT, the ACC, the American Society of Nuclear Cardiology, the Society for Cardiovascular Angiography and Interventions, and the Society of Cardiovascular Computed Tomography established the Certification Board of Cardiovascular Computed Tomography (CBCCT). In May 2007, the CBCCT embarked on a comprehensive and inclusive effort to develop the Certification Examination in Cardiovascular Computed Tomography (CECCT) (2). The process included detailed surveying of >600 CCT practitioners from various practice settings and medical specialties. From this comprehensive international effort, the scope of current clinical application of CCT was documented, and the precise knowledge and skills needed to perform these tasks were identified. Test content was focused on these essential skills and was developed according to the best psychometric standards. The final product was a practice-based exam designed for physicians to document their proficiency in the clinical application of CCT.
The requirements for cardiologists and nuclear medicine physicians to achieve eligibility for the exam were in part adapted from the ACC/American Heart Association level 2 competency criteria, with requirements including 50 clinical studies performed and 150 contrast-enhanced exams interpreted. Those who completed their training before the 2005 publication of these competency criteria (legacy examinees) required letters from colleagues attesting to their CCT experience within the same case-number requirements. The eligibility requirements for radiologists included demonstration of competency with thoracic CT and at least 50 contrast-enhanced CCTs. The first CECCT was administered in September 2008 and provided a unique opportunity to survey a spectrum of skill and experience in CCT and its relationship to success on the examination. The objective of the current study was to assess the relationship between prior training and experience in CCT and the results on the initial CECCT and to use this evidence to evaluate the current ACC competency criteria.
By approval of the Board of Directors of the CBCCT, an anonymous online survey was presented to all examinees of the 2008 CECCT. It included 10 multiple-choice questions covering the physicians' formal training in CCT, experience reading exams (clinically and/or in workshops), and current practice environment. The survey was conducted during the period between the completion of the examination and the announcement of the examination results. Examinees were informed that results of the survey would not impact pass/fail decisions. Responses to these questions were compared with respondents' test results (including pass/fail rates and total number of questions answered correctly). Descriptive statistics are provided for the survey responses. Comparisons among categorical variables were performed with the chi-square test. An exploratory logistic model (not pre-specified) was constructed evaluating the independent variables associated with passing the CECCT. All statistical analysis was performed using SPSS version 16.0 (SPSS Inc., Chicago, Illinois). Statistically significant differences were defined as a 2-tailed p value <0.05.
All examinees in 2008 (n = 872) were sent surveys, and 451 responded (51.7%). The pass rate among respondents (85%) was higher than that of nonrespondents (327 of 421, 77.7%; p = 0.01).
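The respondent-versus-nonrespondent comparison above can be reproduced approximately with a 2 × 2 chi-square test. A minimal sketch: the respondent pass count is reconstructed from the reported 85% rate (the paper gives only the percentage), so the exact chi-square value and p value are illustrative rather than a reanalysis.

```python
from scipy.stats import chi2_contingency

# Approximate 2x2 contingency table reconstructed from the reported figures:
# 451 respondents with ~85% passing; 327 of 421 nonrespondents passing.
# The respondent pass count is an estimate, so the result is illustrative.
passed_resp = round(0.85 * 451)        # ~383 estimated passes among respondents
table = [
    [passed_resp, 451 - passed_resp],  # respondents: pass, fail
    [327, 421 - 327],                  # nonrespondents: pass, fail
]

# chi2_contingency applies Yates' continuity correction for 2x2 tables
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.3f}")
```

With these reconstructed counts the p value comes out on the order of the reported p = 0.01; small rounding in the 85% figure shifts it slightly.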
Responses regarding the nature of the examinees' formal training in CCT are displayed in Table 1. Examinees were instructed to select all of the methods of training that applied to them. A majority of examinees had participated in a CCT training course at some point in their training (66.3%). Also, 22.8% of respondents were “legacy” examinees—physicians who self-trained before the 2005 publication of the ACC competency criteria. These individuals reported generally high experience levels and had an examination pass rate of 93.2%. Relatively few examinees had trained in CCT during their clinical fellowship (requiring ≥2 months of CCT; 9.8%) or a dedicated CCT fellowship (6.0%). Pass rates for these pathways were 100% and 96.3%, respectively. Examinees who reported ≤1 month of CCT training during their formal clinical fellowships had the lowest pass rate (71.4%).
Examinees were asked to describe their level of expertise in CCT, as defined by the ACC competency criteria (Table 2). ACC level 2 and 3 examinees were nearly equally represented (49.9% and 45.2%, respectively). ACC level 2 examinees passed approximately 80% of the time. ACC level 3 examinees performed better, passing at 91.7% (p = 0.014). The small number of examinees who described themselves as qualified by the American College of Radiology (1.6%, n = 7) all passed the exam (100% pass). Examinees with primary specialties of cardiology, radiology, and nuclear imaging all performed well. Overall, radiologists passed at the highest rate (96.4%), compared with 83% to 84% for the other 2 specialties (p = 0.006). However, after controlling for ACC competency level, there was no difference between specialties, as more radiologists met ACC level 3 criteria.
Experience interpreting CCTs
The number of contrast-enhanced CCT exams for which an examinee was responsible for the official clinical interpretation and the associated examination pass rate are shown in Figure 1. There was a relationship between the number of clinical CCT interpretations performed and the likelihood of passing the examination (chi-square, p = 0.001) (Fig. 1) and a bivariate relationship with the examination score (r = 0.23; p = 0.001). Similar relationships were observed with the duration of clinical CCT experience for the likelihood of passing the exam (chi-square, p < 0.001) (Fig. 2) and the examination score (r = 0.25; p = 0.001). An inverse relationship was seen between the number of cases interpreted during workstation training courses and the pass rate (p = 0.05) (Fig. 3), with no overall relationship with the examination score (r = 0.001; p = NS).
We used logistic regression to evaluate the independent correlates of passing the examination. The model controlled for the duration of time in the clinical practice of cardiovascular CT, the number of scans clinically interpreted, the number of scans interpreted during workstation courses, and examinee subspecialty. Duration of time in the clinical practice of cardiovascular CT was significantly related to passing the examination (odds ratio 1.29 per category increment as shown in Fig. 2; 95% CI: 1.02 to 1.64; p = 0.03). The other variables were not independently associated with passing the CECCT.
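In the logistic model above, the odds ratio of 1.29 applies per one-category increment of practice duration, so the odds of passing multiply across increments. A minimal sketch of that compounding arithmetic (the category boundaries themselves are defined in Fig. 2; only the standard relationship OR = exp(k·β) is shown here):

```python
import math

# Reported effect: OR = 1.29 per one-category increment in duration of
# clinical CCT practice (95% CI: 1.02 to 1.64; p = 0.03).
or_per_category = 1.29

# In logistic regression, odds ratios multiply: moving up k categories
# corresponds to exp(k * beta), where beta = ln(OR per category).
beta = math.log(or_per_category)
for k in range(1, 4):
    print(f"{k} category increment(s): OR = {math.exp(k * beta):.2f}")
```

For example, an examinee three duration categories above another has roughly 2.1 times the odds of passing, under the model's linearity-in-categories assumption.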
Examinees currently working in private practice settings had pass rates of 84% (office-based practices) and 83% (hospital-based practices). Examinees from university hospitals had a pass rate of 94%. Examinees exclusively involved in clinical work had a pass rate of 82.4%, and those with combined clinical and research activities had a pass rate of 93.2%. Activities associated with a high pass rate included publishing original research in CCT (96.1% pass), lecturing on CCT (94.2% pass), and mentoring/training other clinicians in CCT (93.9% pass). Those involved in none of these pursuits passed 74.6% of the time.
Initial recommendations regarding the training and experience defining levels of competency in CCT were established through expert consensus in 2005. Subsequently, the CBCCT established a certifying examination in CCT, providing the opportunity to evaluate the relationship between the ACC criteria and success on the examination. The results suggest that individuals meeting the competency criteria perform well on the CECCT, although a relationship does exist between CCT experience and success on the examination.
The main findings of the present study include a pass rate of approximately 81% and a relationship between experience and success on the examination. In general, level 3 examinees performed better than level 2 examinees, and a relationship existed between the number of scans interpreted by the examinee, or their time in practice, and the pass rate. Nonetheless, survey respondents possessing the minimum criteria for CBCCT candidacy performed well. Interestingly, there was no relationship between the number of exams interpreted in workstation educational courses and exam results. This finding supports the ACC competency criteria recommendation that no more than 50 of the required 150 cases for level 2 competency may be obtained solely through workstation training courses. Lastly, individuals performed well on the exam regardless of specialty or practice setting (office, hospital, or university practice).
The literature on the optimal methods of training to develop expertise in CCT is limited. In a study of 4 CCT trainees, accuracy for computed tomography angiography interpretations showed little improvement over 1 year of training involving nearly 600 CCT cases (3). In contrast, a different study of 2 learners showed progressive improvement and proficiency (relative to the invasive coronary angiography gold standard diagnosis) over approximately 150 CCT cases (4). However, neither of these small studies replicates the training and experience recommended by the ACC competency criteria, which include specific content areas, a minimum of 20 hours of certified continuing medical education, the requirement for expert mentoring, and exposure to 150 CCT cases. Thus, the present study extends our understanding of the ACC competency criteria, using performance on the CECCT as a surrogate for proficiency in CCT.
The finding of an inverse relationship between the number of cases interpreted during workstation training courses and the pass rate deserves specific mention. The CECCT is not centered on workstation software skills, as often taught in these courses, and thus we speculate that overemphasis of this type of training may prepare clinicians well for day-to-day clinical practice but be less effective for the purposes of the CECCT. Alternatively, workstation training may have been emphasized by examinees who faced barriers to acquiring real-life experience. Dedicated study of optimal methods of learning CCT is needed, particularly as training experiences relate to both the CECCT and practical clinical proficiency.
There are limitations in using these data to assess the effectiveness of different methods of exam preparation and the appropriateness of various CCT competency criteria. First, there is uncertainty regarding how well performance on the CECCT truly represents an individual's proficiency in clinically reading CCTs or relates to any other external quality or outcome measure. This is true for any type of medical board examination, and, even though the CECCT was developed according to best practices for such examinations, this limitation must be noted. Second, although it is clear that examinees reporting each of the methods of formal training did well on the exam, it is difficult to dissect out the relative contribution of each method to examinees' pass rates. An exception is the particularly strong performance of those applicants who had dedicated CT training for ≥8 weeks (including longer-term dedicated CT fellowships). Although the survey response rate was modest and respondents had a pass rate similar to that of nonrespondents, selection bias is possible, including self-selection to undertake the CECCT.
The results of the first CECCT support the current training pathways within the ACC/American Heart Association competency criteria, which include either experience-based training or a formal training program in cardiovascular CT.
Dr. Taylor is supported by Abbott. Dr. Berman receives research grant support from GE Healthcare & Sciences. Dr. Lesser is on the Scientific Advisory Board for Vital Images (moderate) and Speakers' Bureau for Siemens Medical (moderate). All other authors report that they have no relationships to disclose.
- American College of Cardiology Foundation
- Budoff M.J., Cohen M.C., Garcia M.J., et al.
- Ovrehus K.A., Munkholm H., Bottcher M., Botker H.E., Norgaard B.L.