
Evaluating applicants to a new emergency medicine residency program: subjective assessment of applicant characteristics

Abstract

Background

Because of the Accreditation Council for Graduate Medical Education (ACGME) and Residency Review Committee (RRC) approval timelines, new residency programs cannot use the Electronic Residency Application Service (ERAS) during their first application cycle.

Aim

We sought to identify differences between program directors’ subjective ratings of applicants in an emergency medicine (EM) residency program’s first year, in which ERAS was not used, and their ratings of applicants the following year, in which ERAS was used.

Method

The University of Utah Emergency Medicine Residency Program received approval from the ACGME in 2004. Applicants for the entering class of 2005 (year 1) did not use ERAS, submitting a separate application, while those applying for the following year (year 2) used ERAS. Residency program directors rated applicants using subjective components of their applications, assigning scores on scales from 0–10 or 0–5 (10 or 5 = highest score) for select components of the application. We retrospectively reviewed and compared these ratings between the 2 years of applicants.

Results

A total of 130 and 458 prospective residents applied during year 1 and year 2, respectively. Applicants were similar in average scores for research (1.65 vs. 1.81, scale 0–5, p = 0.329) and volunteer work (5.31 vs. 5.56, scale 0–10, p = 0.357). Year 1 applicants received higher scores for their personal statement (3.21 vs. 2.22, scale 0–5, p < 0.001), letters of recommendation (7.0 vs. 5.94, scale 0–10, p < 0.001), dean’s letter (3.5 vs. 2.7, scale 0–5, p < 0.001), and potential contribution to class characteristics (4.64 vs. 3.34, scale 0–10, p < 0.001).

Conclusion

While the number of applicants increased, the use of ERAS in a new residency program did not improve the overall subjective ratings of residency applicants. Year 1 applicants received higher scores for the written components of their applications and in their potential contributions to class characteristics.

Introduction

As emergency medicine (EM) expands as a specialty, the number of emergency medicine residency programs has increased to meet the need for EM-trained physicians [1]. There are currently 154 residency programs nationwide, with 21 of these created and approved within the past 5 years [2].

Fig. 1 Written components

Medical students and prospective residents typically apply to a residency program using the Electronic Residency Application Service (ERAS), an online service that allows applicants to distribute their applications to multiple residency programs during the application cycle rather than completing individual applications for each program. For new emergency medicine residency programs, the ACGME/RRC approval timeline is such that these programs typically are not approved in time to participate in ERAS during the application cycle for their first class of applicants. As such, applicants must submit a separate application file directly to the program [3–5].

Because a newly created emergency medicine residency program is typically unable to participate in ERAS during its first application cycle, these new programs appear to face a dual challenge in selecting residents for their first class: attracting residents to a new, unproven program, and drawing from a potentially smaller applicant pool because the convenient ERAS service is not available.

In this study we compared applicant characteristics during the first 2 years of a new emergency medicine residency program: in the first year applicants did not apply through ERAS, and in the second year they did. Using evaluator-assigned scores for the subjective components of their applications, we hypothesized that applicants in the first year of a new residency program, while drawn from a more limited applicant pool, would not have lower average scores in these areas.

Methods

The University of Utah Emergency Medicine Residency Program accepted its first class to begin in the summer of 2005. The program is a 3-year residency program in Salt Lake City, Utah, and is approved for eight residents per class. The program received ACGME approval in the summer of 2004. Given the approval timeline, applicants to our program’s first entering class were unable to use ERAS during the fall application process and instead were required to submit a separate application file to our residency office. The second class of emergency medicine applicants applied in the fall of 2005 for the entering class of 2006. As the residency program had already been approved by the ACGME the previous year, applicants to this class used ERAS.

Our study was a retrospective comparison of the evaluators’ subjective ratings of applicants during these 2 consecutive years: 2004–2005 (‘year 1’), prior to our program’s participation in ERAS, and 2005–2006 (‘year 2’), our program’s first year participating in ERAS. We reviewed the files of all applicants for these two application cycles and recorded applicant characteristics and credentials as represented in their applications. The first year class, while unable to participate in ERAS, was required to submit the same information as that collected through ERAS, ensuring uniformity in the data provided by both applicant pools. The objective of our study was to compare the characteristics of all applicants across these 2 years. As such, we reviewed the files of every applicant, without distinguishing between those who were interviewed for the program or those who matched. This study received approval from our institutional review board.

In reviewing applications prior to inviting potential candidates for interviews, the program director and associate program director utilized a standard form to assign scores on scales from 0–5 or 0–10 (5 or 10 = highest score) for select subjective components of the application. These components included: research, volunteer/work experience, letters of recommendation, personal statement, dean’s letter, and the applicant’s potential contribution to their class characteristics (Fig. 1).

Program directors agreed on basic guidelines on how to evaluate the different components. In evaluating applicants’ research accomplishments, the rater’s assigned score was based on a pre-established list of criteria reflecting the type and number of research studies and publications in which the applicant was involved. The program directors assigned increasing points based on the number of publications, whether the applicant had a first author publication, and whether these publications were in an emergency medicine journal. Additional grading was based on the program directors’ subjective assessments of the strength of the applicants’ dean’s letters, letters of recommendation, etc., relative to other applications.
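To illustrate how such a criteria-based research score might be tallied, the sketch below (in Python) encodes the three factors described above: publication count, first authorship, and publication in an EM journal. The specific point values and the function name are hypothetical, chosen only to keep the total within the 0–5 scale; they are not the program’s actual rubric.

```python
# Hypothetical illustration of a criteria-based research score (0-5 scale).
# The point values below are NOT the program's actual rubric; they simply
# encode the factors described in the text: publication count, first
# authorship, and publication in an emergency medicine journal.

def research_score(n_publications: int,
                   first_author: bool,
                   em_journal: bool) -> float:
    score = 0.0
    score += min(n_publications, 3)        # up to 3 points for publication count
    score += 1 if first_author else 0      # 1 point for a first-author paper
    score += 1 if em_journal else 0        # 1 point for an EM-journal paper
    return min(score, 5.0)                 # cap at the top of the 0-5 scale

# Example: two publications, one first-authored, none in an EM journal -> 3.0
print(research_score(2, first_author=True, em_journal=False))
```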

Program directors scored applicants’ potential contribution to class characteristics based on how they felt the applicant would later perform in a new residency program. They paid particular attention to their perception of the applicant’s ability to take a leadership role in both developing the program as well as building the program’s reputation throughout the hospital’s other specialties and departments.

Given several factors unique to the residency program’s setting, we considered the additional influence of geography on numbers of applicants and applicant characteristics. Salt Lake City is the home of the Church of Jesus Christ of Latter-day Saints, or LDS or “Mormon” Church. As such, it carries with it geographic appeal to members of the church. Additionally, Salt Lake City’s relative isolation from other emergency medicine programs and its recreational offerings introduce additional factors that may influence emergency medicine residency applicants.

To characterize the potential influence of geography on numbers of residency applicants, we developed a surrogate measure to evaluate possible geographic ties to the state of Utah. Residency applications do not include religious preference, nor do they contain information that would allow us to accurately and consistently determine an applicant’s preference for the region’s recreational offerings. We felt, however, that applicants who had previously lived in the state of Utah may be those most likely to be influenced by these geographic ties. We reviewed all applicants and categorized them as having “geographic ties” to the state if their birthplace or undergraduate college/university was in the state of Utah. We used these two markers of geography as these are the two areas of the application for which we could consistently identify prior applicant residence in the state of Utah.

Applications that were not complete (i.e., missing personal statements, letters of recommendation, etc.) were still considered by the program directors and evaluated based on the information submitted. To determine differences between the study years, chi-square and t-test statistics were used, with p < 0.05 considered statistically significant (SPSS v. 16.0).
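As a rough illustration of this analysis, the sketch below performs the same two kinds of tests in Python with SciPy rather than the SPSS v. 16.0 used in the study; the score arrays and counts are placeholders, not study data.

```python
# Minimal sketch of the study's statistical comparisons, using SciPy rather
# than SPSS v. 16.0. All values below are placeholders, not study data.
import numpy as np
from scipy import stats

# Continuous ratings (e.g., personal statement scores): two-sample t-test
year1_scores = np.array([3.0, 4.5, 2.5, 3.5, 4.0])
year2_scores = np.array([2.0, 2.5, 3.0, 1.5, 2.0])
t_stat, p_val = stats.ttest_ind(year1_scores, year2_scores)
print(f"t = {t_stat:.2f}, p = {p_val:.3f}")      # significant if p < 0.05

# Categorical characteristics (e.g., sex of applicants): chi-square test
#                  male  female
counts = np.array([[100,   30],    # year 1 (placeholder counts)
                   [310,  148]])   # year 2 (placeholder counts)
chi2, p_val, dof, expected = stats.chi2_contingency(counts)
print(f"chi-square = {chi2:.2f}, p = {p_val:.3f}")
```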

Results

One hundred thirty applicants applied during year 1 and submitted a separate application outside of ERAS. Four hundred fifty-eight applicants applied to the residency program the following year and used ERAS to apply to multiple residency programs, including our program. Applicants during year 1 had an average age of 30.9 years, which was comparable to the average age of applicants the following year (30.3 years, p = 0.225). The year 1 applicant pool had a higher percentage of male applicants (77.3% vs. 67.2%, p = 0.028) (Table 1).

Table 1 First year vs. second year applicant pool characteristics

Program directors assessed applicants’ research accomplishments using the criteria listed previously. They found that applicants in year 1 and year 2 were comparable in their research efforts. Using a five-point scale, year 1 applicants had an average research score of 1.65, while those in year 2 had an average score of 1.81 (p = 0.329). Similarly, year 1 and year 2 applicants were similar in the program directors’ subjective assessments of their volunteer work, as represented on their applications (5.31 vs. 5.56, scale 0–10, p = 0.357).

In the additional areas assessed through the program directors’ subjective evaluation, year 1 applicants received higher scores than those in the year 2 applicant class. Year 1 applicants received an average score of 3.5 for their dean’s letters, compared to 2.7 for year 2 applicants (scale 0–5, p < 0.001). In the assessments of the strength of their letters of recommendation, year 1 applicants again scored higher (7.0 vs. 5.94, scale 0–10, p < 0.001). Year 1 applicants also received higher scores for their personal statements (3.21 vs. 2.22, scale 0–5, p < 0.001).

We compared the program directors’ subjective assessment of applicants’ potential contribution to class characteristics. This area focused specifically on the perception of applicants’ ability to lead both in the creation of a new residency program and in establishing a strong reputation for the program throughout the hospital. Again, year 1 applicants received higher scores compared to those applying for the second year class (4.64 vs. 3.34, scale 0–10, p < 0.001) (Table 1).

Finally, we evaluated the potential contribution of applicants’ geographic ties to their decision to apply to the residency program. Of 588 total applicants over the 2 years, 67 (11.4%) had ties to the region, defined as having been born in Utah or having attended an undergraduate college/university in the state. The proportions of applicants in years 1 and 2 with geographic ties were similar (12.3% vs. 11.1%, p = 0.710).
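As a sketch of this last comparison, the counts below are inferred from the reported percentages and the stated total of 67 applicants with geographic ties (roughly 16 of 130 in year 1 and 51 of 458 in year 2); a chi-square test without continuity correction on these reconstructed counts gives a p value close to the reported 0.710.

```python
# Geographic-ties comparison. Counts are inferred from the reported
# percentages (12.3% of 130 and 11.1% of 458) and the stated total of 67,
# so treat them as approximate reconstructions, not the raw study data.
import numpy as np
from scipy.stats import chi2_contingency

#                  ties   no ties
table = np.array([[16, 130 - 16],     # year 1
                  [51, 458 - 51]])    # year 2

# correction=False (no Yates correction) corresponds to the reported p = 0.710
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi-square = {chi2:.3f}, p = {p:.3f}")    # p comes out near 0.71
```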

Discussion

In this study, to better assess applicant quality and preparation for EM residency training, we present the first evaluation of subjective ratings of applicant characteristics in a new residency program, comparing applicants to our program before and after its adoption of ERAS. In doing so, we have attempted to evaluate areas of the application file that were previously difficult to quantify and evaluate during the application scoring process.

One previous study has evaluated the characteristics of applicants to an established emergency medicine residency program, comparing a year in which applicants used ERAS to a year in which they did not, looking at both objective and subjective applicant characteristics [6]. Similar to what Houry and Shockley observed, our program’s participation in ERAS significantly increased the number of applicants to our program in year 2. In contrast to their findings, however, we identified differences in subjective characteristics between the ERAS and non-ERAS application years. This difference likely reflects the key distinction between our study and the Houry data: we also evaluated the effect of a new residency program on applicant characteristics.

These results may be useful and applicable for institutions preparing to implement a new emergency medicine residency program. Such programs, if unable to use the convenient ERAS service for their first year of applicants, may be apprehensive as they enter that first application cycle, assuming that their applicant pool will not be as competitive as the following year’s, when more, and possibly better qualified, candidates will be submitting applications through ERAS. Based on our study results, such programs can take some comfort in knowing that year 1 applicants to our program were not only statistically similar to year 2 applicants in the areas of research and volunteer service but, in fact, received statistically higher subjective scores for the remaining subjective components. Of particular interest, subjective scores for the dean’s letter and letters of recommendation were found to be higher in year 1 applicants, factors that have previously been studied and reported to correlate with performance as a first year EM resident [7–10].

Limitations

As a retrospective review of applicant files, this study carries with it several limitations. Errors in the applicants’ files or missing data, while presumably limited given the attention applicants typically devote to these applications, could result in misrepresentation of the data gathered. Additionally, this study was conducted at a single residency program, the University of Utah. Applicant interest in programs may vary based on geographic ties to a region and personal concerns, perhaps even more than interest in the quality or reputation of the program. Thus, these results may not apply to every new program, given the relative geographic isolation of Salt Lake City. We found that a number of applicants (11.4%) had geographic ties to the residency program and may have been influenced by this geographic consideration. New programs in close geographic proximity to more established residency programs may not have the advantage of appeal based primarily on geographic considerations. In terms of comparing the characteristics between the 2 years of applicants, however, we did not find a significant difference in the proportion of applicants with geographic ties and feel that this consideration should not have affected the class characteristics relative to one another.

The use of program directors’ subjective assessments presents an additional limitation and raises the question of external validity. We chose to use the program directors’ assessments of applicants in order to get a sense of what program directors in a new residency program may expect in their perceptions of applicant pool characteristics. These assessments were completed in real time as the program directors evaluated the applicants’ files prior to inviting select applicants for interviews. In this sense, these results at least provide a guide for what new program directors may anticipate in the first 2 years of applicants to a residency program.

Conclusion

In a new emergency medicine residency program, applicants in the first-year cycle, in which ERAS was not used, were statistically similar to second-year applicants in research and volunteer service. While fewer in number compared to the second year of applicants, first year applicants received statistically higher scores for the written components of their applications (personal statement, dean’s letter, and letters of recommendation) as well as for their potential contributions to class characteristics. These results may assist program directors in a new residency program in anticipating applicant pool characteristics for the first 2 years of a new EM residency program.

References

  1. Task Force on Residency Training Information, Perina DG, Collier RE, Thomas HA, Witt EA (2009) Report of the task force on residency training information (2008-2009), American Board of Emergency Medicine. Ann Emerg Med 53(5):653–661


  2. SAEM Residency Catalog. <http://www.saem.org/saemdnn/ResidencyCatalog/tabid/680/Default.aspx/>.

  3. ACGME Program Requirements for Graduate Medical Education in Emergency Medicine. <http://www.acgme.org/acWebsite/downloads/RRC_progReq/110emergencymed07012007.pdf>.

  4. ACGME EM RRC: How to Apply for Accreditation in Seven Easy Steps. <http://www.acgme.org/acWebsite/home/accreditation_application_process.asp>.

  5. ERAS Timelines and Deadlines. <http://www.acgme.org/acWebsite/home/accreditation_application_process.asp>.

  6. Houry D, Shockley L (2001) Does participation in the electronic residency application service (ERAS) affect the quality of applications to a residency program? Acad Med 76(1):72–75


  7. Crane JT, Ferraro CM (2000) Selection criteria for emergency medicine residency applicants. Acad Emerg Med 7:54–60


  8. Hayden SR, Hayden M, Gamst A (2005) What characteristics of applicants to emergency medicine residency programs predict future success as an emergency medicine resident? Acad Emerg Med 12(3):206–210


  9. Balentine J, Gaeta T, Spevack T (1999) Evaluating applications to emergency medicine residency programs. J Emerg Med 17:131–134


  10. Dirschl DR, Dahners LE, Adams GL, Crouch JH, Wilson FC (2002) Correlating selection criteria with subsequent performance as residents. Clin Orthop 399:265–271



Acknowledgements

Special thanks to all of the co-authors, in particular the physician faculty for their guidance and mentorship.

Funding/Conflict of Interest

The authors declare that they have no conflicts of interest or disclosures.

Author information

Corresponding author

Correspondence to Troy E. Madsen.

Additional information

The views expressed in this paper are those of the author(s) and not those of the editors, editorial board or publisher.

Rights and permissions

Open Access This is an open access article distributed under the terms of the Creative Commons Attribution Noncommercial License ( https://creativecommons.org/licenses/by-nc/2.0 ), which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.


About this article

Cite this article

Groke, S.F., Madsen, T.E., Strate, L. et al. Evaluating applicants to a new emergency medicine residency program: subjective assessment of applicant characteristics. Int J Emerg Med 3, 265–269 (2010). https://doi.org/10.1007/s12245-010-0209-5
