In this prospective, pre- and post-intervention survey study, we found that residents in an emergency medicine program perceived greater engagement, motivation, and challenge with a gamified curricular model than with traditional, non-gamified, lecture-based didactics. Residents overwhelmingly agreed with positive statements about the implementation of gamification in their educational experiences, but our study was not powered to detect a significant difference in in-training examination scores before and after the intervention. Furthermore, a multitude of confounding variables affecting examination scores could not be controlled for in this study.
For over half a century, educational researchers have recognized the need for change in our approach to adult learning [18], and traditional pedagogical methods have long been shown to fall short. This shortfall is compounded by an incoming generation of adult learners who grew up in the technological age, never knowing life (or education) without computers, cellphones, and the internet. Emergency medicine educators have led the way in developing innovative strategies to improve adult learning for this new generation; gamification is one such approach.
Over the past decade, the implementation of gamification models into educational curricula has met with considerable success in a variety of settings [4, 5]. Systematic reviews of gamification in health professions education confirm that it has attracted the attention of educational researchers in recent years, but evidence remains scarce and further theory-driven research is needed [19,20,21]. In emergency medicine, gamification has seen success in various forms: game show-style quizzing [7,8,9], simulation competitions [10,11,12], escape rooms [13,14,15], and more. To our knowledge, however, this is the first study to examine the impacts of gamification in a longitudinal fashion.
The results supported our primary hypothesis that the game would increase learner motivation, engagement, and challenge. We selected motivation as a measurable variable because we believe it is a necessary precursor to self-directed learning, and the motivation fostered by gamification is the likely reason for its perceived success.
Perhaps the most broadly researched macro-theory of motivation is self-determination theory (SDT), which holds that the core drivers of human behavior are the needs for competence, relatedness, and autonomy [22]. Making learning into a game allows residents the autonomy to set goals and earn achievements, the opportunity to relate and socialize with their colleagues, and the chance to demonstrate competence through progression in the game. In addition to addressing these intrinsic needs, our game added extrinsic motivators in the form of gift cards and other prizes awarded for achievement. This was a strategic addition to solicit buy-in and further increase attention, engagement, and motivation. Although extrinsic motivators such as rewards and prizes must be used cautiously, they can increase engagement, which helps learners identify the value of an educational activity as it applies to their circumstances, leading to autonomous motivation, or “motivation arising out of genuine interest or personal endorsement or valuing of an activity” [23].
Although our study was not powered to detect a significant difference in in-training examination scores, SDT suggests that increased motivation, engagement, and challenge will lead to improved learning and outcomes. Further prospective, theory-based research is needed to determine whether gamification in emergency medicine residency programs can improve board examination scores and pass rates as well as patient outcomes.
Limitations
Medical education research in general suffers from many methodological limitations [24], and our study was no exception. It was conducted at a single, community-based residency program with a very small sample size of only 18 residents, even with 100% participation. Furthermore, survey data, especially data obtained from residents, are prone to observer bias and social desirability bias: residents know they are being studied and may select the responses they believe the investigators desire [25, 26].
Our main limitation, however, was the lack of power to detect a significant difference in performance on the in-training exam, likely due to the small sample size. Even so, multiple confounding variables affect examination scores and would be difficult to control across the 1-year period separating subsequent in-training exams: other curricular components, quality of conferences, clinical experiences, self-directed learning, and more. Furthermore, we obtained only percentile rankings for each respondent (not raw scores), which vary from year to year across administration periods and are normed against peers at the same level of training across the country. A future study could use raw in-training exam scores or, more broadly, pass rates on the ABEM qualifying exam, as the latter is already used in program evaluation. This would likely require a much larger sample size to yield statistically significant differences across exam administration periods.
Another factor potentially limiting the generalizability of our results is the intensive faculty time required to set up and conduct the game throughout the year. We estimate that a minimum of 30–50 faculty hours were volunteered over the game-play period to tabulate team scores, update leaderboards, and award prizes, in addition to the time spent designing the game and performing the research component. Implementing a large-scale game like ours would likely require program support in the form of protected faculty time and other resources. Despite these limitations, our study can serve as a basis for designing future gamified curricular content and for prospective studies of its effectiveness.