
Knowledge translation of the American College of Emergency Physicians’ clinical policy on syncope using computerized clinical decision support

Abstract

Aims

To assess whether physician practice behavior changed after implementation of a computerized clinical decision support system (CDSS) based upon the recommendations of the 2007 American College of Emergency Physicians (ACEP) Clinical Policy on Syncope.

Methods

This was a pre-post intervention study with a prospective cohort and retrospective controls. We conducted a medical chart review of consecutive adult patients with syncope. A computerized CDSS prompting physicians to explain their decision-making regarding imaging and admission in syncope patients, based upon ACEP Clinical Policy recommendations, was embedded into the emergency department information system (EDIS). The medical records of 410 consecutive adult patients presenting with syncope were reviewed prior to implementation, and 301 records were reviewed after implementation. Primary outcomes were physician practice behavior as measured by the admission rate and the rate of head computed tomography (CT) imaging before and after implementation.

Results

There was a significant difference in admission rate pre- and post-intervention (68.1% vs. 60.5%, respectively; p = 0.036). There was no significant difference in the head CT imaging rate pre- and post-intervention (39.8% vs. 43.2%, p = 0.358). Seven physicians saw ten or more patients in both the pre- and post-intervention periods. Subset analysis of these seven physicians’ practice behavior revealed a borderline significant difference in the admission rate pre- and post-intervention (74.3% vs. 63.9%, p = 0.0495) and no significant difference in the head CT scan rate pre- and post-intervention (42.9% vs. 45.4%, p = 0.660).

Conclusions

The introduction of an evidence-based CDSS based upon ACEP Clinical Policy recommendations on syncope correlated with a change in physician practice behavior in an urban academic emergency department. This change suggests emergency medicine clinical practice guideline recommendations can be incorporated into the physician workflow of an EDIS to enhance the quality of practice.

Introduction

A gap exists between evidence-based knowledge and the care that is actually delivered to our patients [1, 2]. Knowledge translation is the process of bringing evidence from research to clinical practice. Practice guideline development, a pivotal step in this process, has had limited effect on changing physician practice behavior [3–6]. This holds true in emergency medicine as well [7–9]. The American College of Emergency Physicians (ACEP) Clinical Policies have been shown to be safe and effective, and are even cited by other specialties [10, 11]. In spite of these benefits, implementation of these clinical practice guidelines into physician practice continues to be a challenge. Even when physicians are aware of the evidence, they may not adhere to it [3, 12]. Lehrmann et al. [13] found that knowledge of the ACEP Clinical Policy on Hypertension did not translate into changes in physician practice.

Clinical decision support systems (CDSSs) are systems “designed to aid directly in clinical decision-making, in which characteristics of individual patients are used to generate patient-specific assessments or recommendations that are then presented to clinicians for consideration.” CDSSs can significantly improve clinical practice [14–16]. Kawamoto et al. [16] found that clinical practice improved when CDSSs were provided: (1) as part of clinician workflow, (2) with recommendations rather than assessments, (3) at the time and location of decision-making, and (4) in computer-based form. Recognizing the potential efficacy of CDSSs as emergency department documentation increasingly becomes computerized, Napoli and Jagoda [17] and Gallagher [8] concluded that future practice guideline implementation research should focus on using CDSSs.

This study aimed to improve knowledge translation from evidence-based emergency medicine practice guidelines by creating a CDSS to implement the recommendations of an ACEP Clinical Policy in an urban academic emergency department. We specifically chose the 2007 ACEP Clinical Policy on Syncope [18] because it included newly published recommendations on a frequently encountered diagnosis, and we hypothesized that there would be room for change in previously accepted physician practice behavior for the reasons outlined below. We sought to identify a change in physician ordering of cranial imaging and in admission practices attributable to implementation of a CDSS embedded in the history of present illness (HPI) template for patients with a final diagnosis of syncope. In our retrospective control population, the baseline admission rate for syncope patients was 68%. Given the high cost of admission for syncope (estimated at $2 billion annually), there may be potential for cost savings if syncope admission guidelines are more closely followed [19]. The 2007 ACEP Clinical Policy on Syncope provides new recommendations on decision-making regarding the need for head computed tomography (CT) imaging and hospital admission in adult patients presenting to the emergency department with syncope.

We hypothesized that incorporating these recommendations into a computerized CDSS embedded in the physician workflow of an emergency department information system (EDIS) would influence physician behavior. A change in behavior would suggest that this point-of-care decision support tool helps improve practitioner awareness, adoption, and adherence to ACEP practice guidelines and bridge the gap from evidence to practice.

Methods

Study design

This study used a pre-post intervention design with a prospective cohort and retrospective controls. We conducted a medical chart review with a 6-month retrospective baseline analysis and prospective data collection after a CDSS based on the 2007 ACEP Clinical Policy on Syncope was implemented in the EDIS.

The Mount Sinai School of Medicine Institutional Review Board reviewed and approved this study.

Study setting

The Mount Sinai Medical Center (New York, NY) is a 1,171-bed tertiary care academic medical center located in Manhattan, bordering the Upper East Side and East Harlem. Mount Sinai has a 61-bed ED with a volume of 88,140 visits in 2007. The ED adopted a comprehensive EDIS in 2004 (Picis ED PulseCheck, Wakefield, MA, formerly IBEX) that provides triage, patient tracking, physician and nurse documentation, retrieval of charts from prior ED encounters and inpatient data, computerized provider order entry, results review, discharge instructions, and prescription writing. Attending and/or resident physicians completed charts for every ED patient, including an HPI and a final emergency department diagnosis (chosen from a drop-down list of International Classification of Diseases, 9th Revision approved diagnoses or entered as free text).

Study population

The intervention assessed the behavior of physicians caring for consecutive syncope patients aged 18 years or older in the emergency department. The medical records of 410 patients were reviewed prior to implementation, and 301 records were reviewed after implementation. Physicians included 34 attending physicians board-certified or board-eligible in emergency medicine, with additional information-gathering and decision-making provided by residents, primarily in emergency medicine (EM), as well as occasional rotators from the departments of internal medicine, psychiatry, and obstetrics and gynecology.

Study protocol

A three-item module based on the 2007 ACEP Clinical Policy on Syncope [18] was added to the Syncope HPI template, for completion by attending or mid-level providers (see Fig. 1, Table 1). Each item served a dual role in: (1) reminding physicians of the strength of the policy’s recommendations and (2) prompting physicians to document their clinical decision-making.

Fig. 1 A screen capture of the CDSS module as it appeared in the Syncope HPI template

Table 1 Drop-down menu options for physicians charting syncope presentations. Menu options are adapted from the following recommendations in the 2007 ACEP Clinical Policy on Syncope [18]: “Cranial CT scanning need not be routinely performed unless guided by specific findings in the history or physical examination.” “Consider older age, structural heart disease, or a history of coronary artery disease as risk factors for adverse outcome.” “Admit patients with syncope and evidence of heart failure or structural heart disease.” “Admit patients with…older age and associated comorbidities, abnormal ECG, Hct <30 (if obtained), history or presence of heart failure, coronary artery disease, or structural heart disease”

The items included in the module prompted physicians to risk-stratify adult syncope patients based on EKG findings, and to explain the rationale for head CT and admission. Each item included a drop-down list based on recommendations and phrasings from the ACEP Clinical Policy, whose guidelines state that head CTs should not be ordered in patients presenting with syncope unless suggested by specific findings in the patient’s history or physical, and that hospital admission be reserved for specific high-risk patients.

Coinciding with its appearance in the electronic chart, the module’s availability as a decision support tool was announced via e-mail to the EM residents and faculty by a non-investigator. The announcement only informed physicians that the tool had become available and did not include details of this study. There was no additional marketing strategy, as the objective of the study was to assess the impact of the CDSS alone.

Measurements

A retrospective chart review was employed to quantify baseline rates of ordering head CT and admission for adult patients with syncope presenting to the ED. Patients aged 18 years or older were identified by searching the electronic archive of all ED visits during the 6-month pre-intervention and 20-week post-intervention periods for final diagnoses containing the word “syncope” or “syncopal.” Our EDIS stores final diagnoses in text format. Two abstractors (NC and NG, who were not blinded to the study purpose) then de-identified these records and exported them to Excel (Microsoft, Redmond, WA) for analysis.
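
As an illustration of this case-finding step, the sketch below filters an exported visit log for adult patients whose free-text final diagnosis contains “syncope” or “syncopal.” This is not the authors’ extraction code (the original work used the EDIS archive and Excel); the file name and column names are hypothetical.

```python
# Hypothetical re-creation of the chart-identification step described above.
# The file name and column names (final_diagnosis, age, arrival_time) are assumptions.
import pandas as pd

visits = pd.read_csv("ed_visits_export.csv", parse_dates=["arrival_time"])

# Final diagnoses are stored as free text, so match "syncope" or "syncopal"
# case-insensitively; the stem "syncop" covers both word forms.
is_syncope = visits["final_diagnosis"].str.contains("syncop", case=False, na=False)
is_adult = visits["age"] >= 18

# Study periods reported in the Results section.
pre_period = visits["arrival_time"].between("2007-06-01", "2007-11-30")
post_period = visits["arrival_time"].between("2008-02-04", "2008-06-22")

pre_cohort = visits[is_syncope & is_adult & pre_period]
post_cohort = visits[is_syncope & is_adult & post_period]
print(len(pre_cohort), len(post_cohort))  # the paper reports 410 and 301 records
```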

Patient disposition (discharge from ED or admission to any inpatient service) was tabulated. Additionally, patient records were cross-matched against a radiology requisition record to determine which syncope patients had a head CT ordered and performed during their ED course.

Data analysis

Using SAS (version 9.2; SAS Institute Inc., Cary, NC), descriptive statistics were calculated (mean and standard deviation for continuous variables and proportions with the corresponding two-sided 95% confidence interval for categorical variables). Comparability of the pre-intervention group and the post-intervention group was analyzed using the two-sample t-test for age and the chi-square test for categorical variables, such as gender, admission rate, and head CT scan rate. Following the initial analysis, subset analyses for admission rate and head CT imaging rate were performed: (1) comparing the behavior of the seven physicians who cared for ten or more patients in both the pre- and post-intervention groups and (2) comparing groups completing and bypassing the CDSS in the post-intervention group using the two-tailed Z-test.
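
For illustration, the chi-square comparisons of the primary outcomes can be reproduced approximately in Python; the analysis was actually performed in SAS, and the 2 × 2 counts below are back-calculated from the reported percentages rather than taken from the raw data.

```python
# Illustrative chi-square comparison of pre- vs. post-intervention rates.
# Counts are approximated from the reported percentages (admission: 68.1% of 410
# and 60.5% of 301; head CT: 39.8% of 410 and 43.2% of 301); the original
# analysis was run in SAS 9.2.
from scipy.stats import chi2_contingency

admission = [[279, 410 - 279],   # pre-intervention: admitted, not admitted
             [182, 301 - 182]]   # post-intervention
chi2, p, dof, _ = chi2_contingency(admission, correction=False)
print(f"admission: chi2 = {chi2:.2f}, p = {p:.3f}")  # ~0.04, consistent with the reported p = 0.036

head_ct = [[163, 410 - 163],     # pre-intervention: head CT, no head CT
           [130, 301 - 130]]     # post-intervention
chi2, p, dof, _ = chi2_contingency(head_ct, correction=False)
print(f"head CT:   chi2 = {chi2:.2f}, p = {p:.3f}")  # ~0.36, consistent with the reported p = 0.358
```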

In the patient population studied in the San Francisco Syncope Rule derivation [20], application of the rule might have reduced admissions by 10%. Based on this finding, a 10% absolute reduction in admissions and head CT scans was used for the sample size justification. A one-group χ2 test with a 0.05 two-sided significance level has 80% power to detect a difference between a pre-intervention admission rate of 0.68 and a post-intervention rate of 0.58 when the sample size is 177, and 80% power to detect a difference between a pre-intervention head CT scan rate of 0.40 and a post-intervention rate of 0.30 when the sample size is 182. Because age showed a possible confounding effect on head CT ordering and admission in the univariate screen, it was included in a multivariable analysis.
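
The sample-size figures are consistent with a standard normal-approximation formula for a one-group test of a proportion against a fixed baseline rate. The sketch below is an assumption about how the calculation could be done rather than the authors’ actual procedure; under these assumptions it returns the reported 177 and 182.

```python
# Sample size for a one-group test of a proportion (normal approximation):
#   n = [ z_(1-alpha/2)*sqrt(p0*(1-p0)) + z_power*sqrt(p1*(1-p1)) ]^2 / (p1 - p0)^2
# This is a common textbook formula and an assumption about the method used here.
import math
from scipy.stats import norm

def one_group_proportion_n(p0, p1, alpha=0.05, power=0.80):
    z_alpha = norm.ppf(1 - alpha / 2)   # two-sided significance level
    z_power = norm.ppf(power)
    num = (z_alpha * math.sqrt(p0 * (1 - p0)) + z_power * math.sqrt(p1 * (1 - p1))) ** 2
    return math.ceil(num / (p1 - p0) ** 2)

print(one_group_proportion_n(0.68, 0.58))  # 177 (admission rate, 0.68 -> 0.58)
print(one_group_proportion_n(0.40, 0.30))  # 182 (head CT rate, 0.40 -> 0.30)
```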

Results

In the 6-month pre-implementation period between 1 June and 30 November 2007, a total of 410 patients aged 18 years or older presented to the Mount Sinai ED with a final diagnosis of syncope (see Table 2). Two hundred forty-four of these 410 patients were female (59.5%), and the average age was 64.9 ± 21.3 years.

Table 2 Patient demographics, pre- and post-CDSS intervention

The decision support module was added to the Mount Sinai EDIS on 4 February 2008, and between that date and 22 June 2008, a total of 301 patients aged 18 years or older presented to the Mount Sinai ED with a final diagnosis of syncope (see Table 2). One hundred eighty-five of these 301 patients were female (61%), and the average age was 61.9 ± 22.5 years. There was no significant difference in gender or age between the pre- and post-intervention groups. There was no significant interaction between intervention and age for admission rate (p = 0.258) or for head CT rate (p = 0.420).

In the pre-intervention cohort, 68.1% of patients were admitted [95% CI: (63.3 to 72.5)] (see Table 3 and Fig. 2). The post-intervention admission rate was 60.5% [95% CI: (54.7 to 66.0)]. There was a significant difference in admission rate pre- and post-intervention (p = 0.036). The pre-intervention rate of obtaining a head CT was 39.8% [95% CI: (35.0 to 44.7)] compared to 43.2% [95% CI: (37.5 to 49.0)] post-intervention (see Table 3 and Fig. 2). There was no significant difference in the head CT scan rate pre- and post-intervention (p = 0.358).

Table 3 Percentage of syncope patients admitted and receiving head CT in the pre- and post-CDSS intervention periods
Fig. 2 Percentage of syncope patients admitted and receiving head CT in the pre- and post-CDSS intervention periods

Seven physicians saw ten or more patients in both the pre- and post-intervention periods, and a subset analysis of the admission rate and head CT scan rate was performed for these physicians. In this subset, there was a borderline significant difference in the admission rate pre- and post-intervention (74.3% vs. 63.9%, respectively, p = 0.0495; see Table 4 and Fig. 3) and no significant difference in the head CT scan rate (42.9% vs. 45.4%, respectively, p = 0.660).

Table 4 Subset analysis of the percentage of syncope patients admitted and receiving head CT in the pre- and post-CDSS intervention periods among the seven physicians who saw ten or more patients in both periods
Fig. 3 Subset analysis: percentage of syncope patients admitted and receiving head CT in the pre- and post-CDSS intervention periods among the seven physicians who saw ten or more patients in both periods

In the post-intervention group, subset analysis compared admission and head CT rates according to whether the CDSS was completed. The admission rate was 51.7% when the CDSS was completed compared to 64.0% when it was not (Z = 1.96, statistically significant at the 95.0% confidence level), and the head CT rate was 43.7% when the CDSS was completed compared to 43.0% when it was not (Z = 0.03, not statistically significant; see Table 5). A parallel analysis compared rates according to whether the CDSS was visible. The admission rate was 54.0% when the CDSS was visible compared to 73.3% when it was not (Z = 3.06, statistically significant at the 99.8% confidence level), and the head CT rate was 38.5% when the CDSS was visible compared to 52.5% when it was not (Z = 2.19, statistically significant at the 97.2% confidence level; see Table 6).

Table 5 Subset analysis in the post-CDSS intervention group comparing admission and head CT rates when the CDSS was completed versus when it was not completed. A two-tailed Z-value is provided along with the confidence level (CL%) at which the two rates are deemed to be statistically significant
Table 6 Subset analysis in the post-CDSS intervention group comparing admission and head CT rates when the CDSS was visible versus when it was not visible. A two-tailed Z-value is provided along with the confidence level (CL%) at which the two rates are deemed to be statistically significant
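
The Z-values in Tables 5 and 6 are of the kind produced by a standard two-proportion Z-test with a pooled variance estimate. Because the subgroup denominators are not reported in the text, the sketch below is generic; the counts in the example call are invented for illustration and do not reproduce the tabulated values.

```python
# Generic two-proportion Z-test of the form used for the completed-vs.-bypassed
# and visible-vs.-not-visible comparisons (Tables 5 and 6).
import math
from scipy.stats import norm

def two_proportion_z(x1, n1, x2, n2):
    """Return (z, two-sided p) for H0: p1 == p2, using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, 2 * (1 - norm.cdf(abs(z)))

# Hypothetical counts (not from the study): 52/100 admitted in one subgroup
# vs. 128/200 in the other.
z, p = two_proportion_z(52, 100, 128, 200)
print(f"z = {z:.2f}, two-sided p = {p:.3f}")  # z = -2.00, p ≈ 0.046 for these made-up counts
```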

Discussion

This study assessed how physician management of syncope patients changed after a CDSS based upon the 2007 ACEP Clinical Policy on Syncope was implemented in an EDIS.

The admission rate for syncope patients was significantly lower in the post-intervention period than in the pre-intervention period, while the head CT imaging rate did not differ significantly between the two periods. Because there was no significant interaction between intervention and age for admission rate (p = 0.258) or head CT rate (p = 0.420), we conclude that age did not modify the effect of the intervention on the admission rate or on head CT ordering in our cohort.

Subset analysis of physicians seeing ten or more patients in both the pre- and post-intervention periods showed similar changes and did not suggest a cluster effect among individual physicians. The observed change in admission rates for adult syncope patients may indicate improved awareness, adoption, and adherence to ACEP practice guidelines.

Subset analysis of physician behavior when the CDSS was completed versus not completed, and visible versus not visible, revealed significant differences in admission and head CT rates when the CDSS was visible and a significant difference in admission rate when the CDSS was completed. These findings suggest that, although the intervention was a passive one, the CDSS’s presence may have had a significant effect on physician practice behavior. An alternative interpretation is that physicians who are likely to ignore a CDSS may also be less likely to adhere to evidence-based clinical practice guidelines.

It is well established in the literature that even when physicians are aware of evidence, they may not adhere to it [3, 12]. Lehrmann et al. [13] found that increased knowledge of the ACEP Clinical Policy on Hypertension following distribution of the guidelines did not translate into changes in physician practice. Cabana et al. [3] identified knowledge, attitudes, and behavior as barriers to physician adherence to clinical practice guidelines. Kirkpatrick’s hierarchy of levels of evaluation proposes that “complexity of behavioral change increases as evaluation of intervention ascends the hierarchy” [21]. As evaluation ascends from (1) reactions to (2) learning to (3) behavior to (4) results, the impact of the intervention strengthens from (1) learner satisfaction to (2) knowledge to (3) transfer of learning to the workplace to (4) impact on society, respectively. Because acquired knowledge of the ACEP Clinical Policy on Hypertension did not translate into changes in physician practice in Lehrmann’s trial, we approached the problem of adopting evidence-based guidelines in clinical practice at the next level of the hierarchy, namely, transfer of learning to the workplace via evaluation of behavior.

While developing the CDSS, we focused on following the provisions for improved clinical practice outlined by Kawamoto et al. [16]. Namely, the CDSS was included in the clinician workflow in our computer-based EDIS, and it used recommendations rather than assessments. Instead of explicitly assessing compliance with ACEP Clinical Policy recommendations, we sought a measurable change in physician behavior. We chose admission as one outcome because, although our population demographics were similar to those of previously studied populations, our baseline admission rate of 68% is considerably higher [10, 20, 22]. We chose head CT imaging as our other outcome because, in the absence of focal neurologic findings, head CT imaging is of low yield in determining the etiology of syncope [23–25]; we suspected that clinicians were ordering more cranial imaging in syncope patients than necessary.

To our knowledge, this is the first study to demonstrate a significant change in physician behavior after implementation of a CDSS based upon an ACEP Clinical Policy. While the body of medical research and literature grows rapidly, practice guidelines provide a means to educate, summarize, and distill evidence-based medicine for the practicing physician. However, implementation and utilization of the ACEP Clinical Policies have historically been a challenge. Given the increasing adoption of robust EDISs, these results should encourage further experimentation with, and implementation of, CDSSs based upon evidence-based clinical practice guidelines in EDISs.

The next step for research and development of such CDSSs could include: (1) assessment of our CDSS closer to the point of decision-making or in a different practice environment, (2) integration of other ACEP Clinical Policies into similar CDSSs in an effort to create an EDIS with comprehensive decision support, or (3) focus on more complex outcome measures such as compliance with guidelines or patient outcome.

Limitations

We have identified several limitations to our study. First, instead of explicitly assessing compliance with ACEP Clinical Policy recommendations, we assessed for a measurable change in physician behavior. Our EDIS could not track specific use of the CDSS. Consequently, admissions or head CTs could have increased in one type of patient and decreased in another, leaving the global rate unchanged.

Second, the EDIS in our institution did not have the capability to incorporate decision support at the precise time and location of decision-making in the physician workflow. Specifically, our EDIS provides decision support within the documentation template for the history of present illness (HPI). The CDSS was therefore more passive than active, meaning that completing the CDSS did not itself generate an order for a head CT or an admission.

Although our chart extraction selected patients with a diagnosis of or including the term “syncope,” the physician documenting on such a patient had the option to choose the patient’s HPI template independently of the patient’s diagnosis. Therefore, a patient diagnosed with syncope might not have the Syncope HPI template. In such a scenario, the physician would not encounter the decision support tool. Furthermore, as is common in unpredictable environments like the ED, HPI documentation may occur before or after the decision to obtain imaging or admit the patient. Since this study was completed at a teaching hospital, the physician documenting the HPI was not always the physician deciding the patient’s diagnosis, need for imaging, or admission.

Next, to prevent delaying or disrupting physician workflow, we decided not to make completion of the CDSS a requirement for HPI documentation, so the CDSS could be bypassed without being read or completed. To minimize interruption of workflow, we abbreviated the language of the recommendations from the ACEP Clinical Policy on Syncope to fit within the format of the drop-down menus shown in Fig. 1. We assume that the practice guideline was not originally authored with the intention of implementation in an abbreviated form; thus, it is possible that rewording the recommendations creates a simplified or modified recommendation for clinical decision-making. In a separate performance improvement project, however, we had markedly improved documentation of aspirin and beta-blocker use in chest pain patients using a CDSS with a single reminder phrase just above the “ENTER” button of a given HPI template. We found that the structure and placement of the chest pain CDSS provided a quick and accessible, yet not prohibitive, reminder.

It is important to note that the 2007 version of the ACEP Clinical Policy on Syncope was revised to include new recommendations based on the derivation and validation of the San Francisco Syncope Rule [20]. Two subsequent studies have had difficulty validating the San Francisco Syncope Rule [26, 27].

A final limitation of this study was that the CDSS was trialed at the practice site of the CDSS’s creators. Garg et al. [15] found that “studies in which authors also created the CDSS reported better performance compared with those in which the trialists were independent of the CDSS development process.” We attempted to minimize this “Hawthorne effect” bias by having a third party announce the implementation of the CDSS without mention of the fact that we would be gathering data on physician behavior associated with the CDSS. Additionally, within our department there was no specific notification or implementation of new ACEP Clinical Policies.

Conclusions

In conclusion, in our urban academic emergency department the introduction of an evidence-based CDSS based upon ACEP Clinical Policy recommendations on syncope was associated with a change in physician practice behavior for admission but not for head CT imaging. Even when the CDSS was visible but not necessarily completed, admission and head CT imaging rates differed significantly from when it was not visible. These findings suggest that emergency medicine clinical practice guideline recommendations can be incorporated into the physician workflow of an EDIS to enhance the quality of practice. A more active CDSS, implemented at the point of medical decision-making and whose use resulted in physician order entry, might have a greater impact on behavior.

References

  1. McGlynn EA, Asch SM, Adams J, Keesey J, Hicks J, DeCristofaro A, Kerr EA (2003) The quality of health care delivered to adults in the United States. N Engl J Med 348:2635–2645

  2. Pham JC, Kelen GD, Pronovost PJ (2007) National study on the quality of emergency department care in the treatment of acute myocardial infarction and pneumonia. Acad Emerg Med 14(10):856–863

  3. Cabana MD, Rand CS, Powe NR, Wu AW, Wilson MH, Abboud PC, Rubin HR (1999) Why don’t physicians follow clinical practice guidelines? A framework for improvement. JAMA 282:1458–1465

  4. Peterson ED, Roe MT, Mulgund J, DeLong ER, Lytle BL, Brindis RG, Smith SC et al (2006) Association between hospital process performance and outcomes among patients with acute coronary syndromes. JAMA 295:1912–1920

  5. Blomkalns AL, Roe MT, Peterson ED, Ohman EM, Fraulo ES, Gibler WB (2007) Guideline implementation research: exploring the gap between evidence and practice in the CRUSADE Quality Improvement Initiative. Acad Emerg Med 14:949–954

  6. Faul M, Wald MM, Rutland-Brown W, Sullivent EE, Sattin RW (2007) Using a cost-benefit analysis to estimate outcomes of a clinical treatment guideline: testing the Brain Trauma Foundation guidelines for the treatment of severe traumatic brain injury. J Trauma 63:1271–1278

  7. Wears RL (2002) Headaches from practice guidelines? Ann Emerg Med 39:334–337

  8. Gallagher EJ (2002) How well do clinical practice guidelines guide clinical practice? Ann Emerg Med 40:394–398

  9. Holroyd BR, Wilson D, Rowe BH, Mayes DC, Noseworthy T (2004) Uptake of validated clinical practice guidelines: experience with implementing the Ottawa Ankle Rules. Am J Emerg Med 22:149–155

  10. Elesber AA, Decker WW, Smars PA, Hodge DO, Shen WK (2005) Impact of the application of the American College of Emergency Physicians recommendations for the admission of patients with syncope on a retrospectively studied population presenting to the emergency department. Am Heart J 149:826–831

  11. Neff MJ (2003) ACEP releases clinical policy on evaluation and management of pulmonary embolism. Am Fam Physician 68:759–760

  12. Glasziou P, Haynes B (2005) The paths from research to improved health outcomes. ACP J Club 142:A8–A10

  13. Lehrmann JF, Tanabe P, Baumann BM, Jones MK, Martinovich Z, Adams JG (2007) Knowledge translation of the American College of Emergency Physicians Clinical Policy on Hypertension. Acad Emerg Med 14:1090–1096

  14. Hunt DL, Haynes RB, Hanna SE, Smith K (1998) Effects of computer-based clinical decision support systems on physician performance and patient outcomes: A systematic review. JAMA 280:1339–1346

  15. Garg AX, Adhikari NK, McDonald H, Rosas-Arellano MP, Devereaux PJ, Beyene J, Sam J et al (2005) Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: A systematic review. JAMA 293:1223–1238

  16. Kawamoto K, Houlihan CA, Balas EA et al (2005) Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ 330:765–772

  17. Napoli AM, Jagoda A (2007) Clinical policies: Their history, future, medical legal implications, and growing importance to physicians. J Emerg Med 33:425–432

  18. Huff JS, Decker WW, Quinn JV, Perron AD, Napoli AM, Peeters S, Jagoda AS (2007) Clinical policy: critical issues in the evaluation and management of adult patients presenting to the emergency department with syncope. Ann Emerg Med 49:431–444

  19. Sun BC, Emond JA, Camargo CA Jr (2005) Direct medical costs of syncope-related hospitalizations in the United States. Am J Cardiol 95:668–671

  20. Quinn JV, Stiell IG, McDermott DA, Sellers KL, Kohn MA, Wells GA (2004) Derivation of the San Francisco Syncope Rule to predict patients with short-term serious outcomes. Ann Emerg Med 43:224–232

  21. Hutchinson L (1999) Evaluating and researching the effectiveness of educational interventions. BMJ 318:1267–1269

  22. Quinn J, McDermott D, Stiell I, Kohn M, Wells G (2006) Prospective validation of the San Francisco Syncope Rule to predict patients with serious outcomes. Ann Emerg Med 47:448–454

  23. Giglio P, Bednarczyk EM, Weiss K, Bakshi R (2005) Syncope and head CT scans in the emergency department. Emerg Radiol 12:44–46

  24. Goyal N, Donnino MW, Vachhani R, Bajwa R, Ahmad T, Otero R (2006) The utility of head computed tomography in the emergency department evaluation of syncope. Intern Emerg Med 1:148–150

  25. Grossman SA, Fischer C, Bar JL, Lipsitz LA, Mottley L, Sands K, Thompson S, Zimetbaum P, Shapiro NI (2007) The yield of head CT in syncope: a pilot study. Intern Emerg Med 2:46–49

  26. Birnbaum A, Esses D, Bijur P, Wollowitz A, Gallagher EJ (2008) Failure to validate the San Francisco syncope rule in an independent emergency department population. Ann Emerg Med 52:151–159

  27. Sun BC, Mangione CM, Merchant G, Weiss T, Shlamovitz GZ et al (2007) External validation of the San Francisco syncope rule. Ann Emerg Med 49:420–427

Author information

Correspondence to Edward R. Melnick.

Additional information

Prior Presentations

Harvard Macy Institute for Educators in the Health Professions, May 2008; SAEM Annual Meeting, May 2009.

Funding Sources

None

The views expressed in this paper are those of the author(s) and not those of the editors, editorial board or publisher.

Rights and permissions

Open Access This is an open access article distributed under the terms of the Creative Commons Attribution Noncommercial License ( https://creativecommons.org/licenses/by-nc/2.0 ), which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.

About this article

Cite this article

Melnick, E.R., Genes, N.G., Chawla, N.K. et al. Knowledge translation of the American College of Emergency Physicians’ clinical policy on syncope using computerized clinical decision support. Int J Emerg Med 3, 97–104 (2010). https://doi.org/10.1007/s12245-010-0168-x
