Tracking Clinical Competency Growth: A Longitudinal Study of Medical Students in a Multidisciplinary Emergency Department Internship Program
Authors: Sung TC, Shih HI, Kawaguchi T, Chi CH, Hsu HC
Received 2 May 2025
Accepted for publication 1 July 2025
Published 7 July 2025 Volume 2025:18 Pages 3877–3890
DOI https://doi.org/10.2147/JMDH.S530887
Checked for plagiarism Yes
Review by Single anonymous peer review
Peer reviewer comments 2
Editor who approved publication: Dr David C. Mohr
Tzu-Ching Sung,1 Hsin-I Shih,2–4 Takeshi Kawaguchi,5 Chih-Hsien Chi,2,3 Hsiang-Chin Hsu3,6
1School of Medicine for International Students, College of Medicine, I-Shou University, Kaohsiung, Taiwan; 2Department of Emergency Medicine, National Cheng Kung University Hospital, National Cheng Kung University, Tainan, Taiwan; 3Department of Emergency Medicine, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan; 4Department of Public Health, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan; 5Department of Emergency and Critical Care Medicine, St. Marianna University School of Medicine, Kanagawa, Japan; 6School of Medicine, College of Medicine, National Cheng Kung University, Tainan, Taiwan
Correspondence: Hsiang-Chin Hsu, Department of Emergency Medicine, National Cheng Kung University Hospital, National Cheng Kung University, No. 138, Shengli Road, North Dist., Tainan City, 704302, Taiwan, Tel +886-6-235-3535 ext. 4381; +886-919-157-167, Email [email protected]
Background: Emergency department (ED) internships are essential for developing core clinical competencies in medical students. This study evaluates the weekly progression of key competencies during a structured three-week ED internship and examines the impact of extending the program from two to three weeks. Additionally, it considers how such training supports students’ readiness for multidisciplinary, team-based healthcare environments.
Methods: This retrospective longitudinal study analyzed a total of 2068 workplace-based assessment (WBA) forms collected from June 2018 to May 2020. Each WBA evaluated ten key clinical competencies: learning attitude, history taking, clinical judgment, medical knowledge, prescription, communication, patient education, literature appraisal, oral presentation, and documentation. A subgroup of 133 medical students who completed the full three-week internship with complete WBA records was included in the longitudinal analysis. Generalized Estimating Equations (GEE) modeling assessed performance trends and inter-cohort differences between 2018 and 2019.
Results: Significant improvements were observed across all competencies, with the most notable gains in history taking, clinical judgment, communication, prescription, and documentation. Learning attitude scores increased from 4.46 (SD = 0.66) in Week 1 to 4.58 (SD = 0.58) in Week 3. Prescription and documentation improved from 3.85 and 4.17 to 4.06 and 4.32, respectively. GEE analysis confirmed significant performance gains by Week 3 (p = 0.0064). Students in 2019 outperformed those in 2018 (p = 0.0318), suggesting that curriculum refinements enhanced educational outcomes.
Conclusion: A structured three-week ED internship significantly enhances medical students’ clinical competencies, particularly in areas fundamental to multidisciplinary care such as communication, clinical judgment, and documentation. The extended duration promotes deeper clinical immersion and fosters skill reinforcement through real-time feedback and collaborative practice. These findings underscore the importance of sustained, structured clinical experiences in preparing students to function effectively within interdisciplinary healthcare teams. Further research should investigate optimal training durations and integrative teaching strategies to support competency-based, team-oriented medical education.
Keywords: clinical competency progression, emergency training, medical internships, diagnostic reasoning, multidisciplinary care, medical education
Background
Emergency medicine (EM) offers a distinctive and dynamic learning environment for undergraduate medical students, providing opportunities to gain hands-on experience in managing acute and often life-threatening conditions. In this high-pressure setting, students are expected to assess and treat patients with undifferentiated complaints under the guidance of experienced physicians. Recognizing the increasing complexity of emergency care, the Josiah Macy Jr. Foundation’s 1994 report emphasized the necessity for medical trainees to acquire comprehensive clinical competencies across multiple disciplines.1–3 Furthermore, emergency medicine encompasses skill sets and knowledge areas that are frequently underrepresented in traditional undergraduate medical curricula, prompting EM educators to advocate for greater integration of EM training into the core curriculum.4
On June 6, 2016, the Ministry of Education in Taiwan released the New Guidelines for Clinical Practice of Medical Students in Schools of Medicine, mandating a transition from a seven-year to a six-year medical education program. This change applied to students admitted into traditional programs starting from the 2013 academic year and to post-baccalaureate medical students beginning in 2015.5 Under the revised curriculum, the fifth and sixth years are designated for intensive clinical clerkships, placing greater emphasis on experiential learning and direct patient care. Compared to the previous model that relied heavily on classroom-based instruction, the new framework aims to cultivate core clinical competencies by immersing students in real-world healthcare settings. This shift is intended to better prepare future physicians for collaborative, multidisciplinary healthcare environments.
Based on the Guidelines for Undergraduate Education in Emergency Medicine by the American College of Emergency Physicians (ACEP), medical students should develop key clinical competencies to recognize and manage urgent and emergent conditions.6 These include focused history-taking, physical examination, diagnostic reasoning, critical thinking, teamwork, and basic procedural skills. ACEP emphasizes that these core skills should be fostered through structured clinical exposure in emergency departments and integrated throughout the preclinical and clinical years, regardless of a student’s future specialty. These core skills, also recognized as milestones, must be assessed in fourth-year medical students by the completion of their EM clerkships.7 Such mandatory EM clerkships have been shown to increase medical students’ perceived self-efficacy.8 In Taiwan, the Assessment Standards for Clinical Skills of Medical Students in Six-Year Medical Programs, revised by the Taiwan Medical Accreditation Council (TMAC) in 2019, outlines 80 required competencies for graduation.9 Of these, 11 are particularly well suited for training in emergency medicine, such as triage, basic and advanced life support, airway management, communication, prescription writing, patient education, literature appraisal, presentation, and documentation. These competencies align with ACEP’s framework, highlighting emergency medicine as a vital environment for developing essential clinical skills in line with both national and international standards.
In alignment with this reform, the ED clerkship was significantly restructured.10 Table 1 compares the former two-week and the new three-week program for sixth-year medical students. The updated curriculum integrates didactic teaching, procedural workshops, simulations, and expanded clinical rotations. Twenty-three hours are now dedicated to hands-on training in triage, BLS, ALS, PLS, communication, and intoxication management—shifting from a lecture-heavy model to one emphasizing interactive, skill-based learning. A major improvement is the 99 hours of clinical rotations across day, evening, and night shifts, offering broader exposure to emergency care scenarios. Previously, students had only 6 hours of day-shift experience, limiting their understanding of ED operations. Evaluation methods have also evolved to support a more competency-based approach. In the new program, 40% of the grade comes from clinical performance, 30% from skills assessments (eg, BLS, PLS, simulations), 20% from written exams, and 10% from oral presentations—ensuring a balanced assessment of knowledge, practical ability, and communication. The revised structure thus reflects a deliberate shift in pedagogical priorities: from task-specific competency toward integrated clinical capability, aimed at producing well-rounded, adaptable physicians who can function effectively in high-stakes, time-sensitive environments like the ED.
Table 1 Comparison of ED Rotation Courses – Current Three-Week vs Previous Two-Week Clerkship Training
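The revised grade weighting reduces to a simple weighted sum. The sketch below illustrates the 40/30/20/10 scheme described above; component scores are assumed to be on a common 0-100 scale, and all identifiers are illustrative rather than drawn from the course's actual grading system.

```python
# Sketch of the revised grade weighting described above. Component scores are
# assumed to be on a common 0-100 scale; all names are illustrative.
WEIGHTS = {
    "clinical_performance": 0.40,
    "skills_assessments": 0.30,  # eg, BLS, PLS, simulations
    "written_exam": 0.20,
    "oral_presentation": 0.10,
}

def final_grade(scores: dict) -> float:
    """Return the weighted sum of the four assessment components."""
    return sum(weight * scores[part] for part, weight in WEIGHTS.items())

# Example: a student scoring 85/90/78/88 earns a final grade of 85.4.
print(final_grade({
    "clinical_performance": 85,
    "skills_assessments": 90,
    "written_exam": 78,
    "oral_presentation": 88,
}))
```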
Within this revised training and assessment context, the cultivation of clinical reasoning has emerged as a central yet challenging educational goal. Clinical reasoning is broadly defined as “a process that operates toward the purpose of arriving at a diagnosis, treatment, and/or management plan”.11 Mastery of clinical reasoning is essential for undergraduate medical students, as it reduces diagnostic errors, promotes effective decision-making, and ultimately enhances patient safety—particularly in emergency care where time and clarity are critical.12,13 The unpredictable, high-acuity environment of the ED—characterized by diverse patient presentations, shifting priorities, and limited information—offers a powerful setting for trainees to develop these diagnostic competencies under real-time pressure.
Despite the growing emphasis on experiential learning and competency-based education, limited research has investigated how medical students’ clinical competencies evolve within a structured emergency medicine clerkship—particularly in light of curricular reforms such as extended training duration. To address this gap, the present study aims to evaluate the weekly progression of essential clinical competencies—including clinical reasoning, history taking, physical examination, and communication skills—throughout a structured three-week ED internship. Furthermore, the study examines the educational impact of transitioning from a two-week to a three-week program, assessing how this extended duration enhances students’ clinical development in a high-acuity, fast-paced environment. Given that the ED represents a dynamic, multidisciplinary care setting where teamwork, interprofessional collaboration, and adaptability are crucial, understanding the trajectory of competence acquisition in this context is particularly important. By providing empirical evidence on students’ longitudinal growth, this study contributes valuable insights into the optimization of emergency medicine training within undergraduate medical education. It supports the development of future physicians equipped to work within complex, team-based healthcare systems.
Methods and Materials
Study Design
This retrospective longitudinal study was conducted at the College of Medicine, National Cheng Kung University, Taiwan, over a 24-month period from June 2018 to May 2020. The 2018–2019 period was selected for focused analysis because it marked the first implementation of a structured, three-week ED internship using standardized WBAs. The study aimed to evaluate the short-term progression of core clinical competencies among sixth-year medical students participating in the “Emergency Medicine & Clerkship” course, a mandatory, two-credit component of the undergraduate medical curriculum.
This course was designed as a structured, three-week immersive clerkship situated within the ED, providing students with hands-on, experiential training in a real-world, high-pressure clinical setting. The curriculum was intentionally structured to simulate the unpredictability and temporal demands of emergency medicine practice. To that end, each student completed four clinical shifts per week, comprising two day shifts (08:00–17:00), one evening shift (17:00–22:00), and one overnight shift (22:00–08:00). This shift-based rotation model was developed to ensure that students were exposed to the full spectrum of emergency care—ranging from routine cases during daytime hours to high-acuity, resource-limited scenarios during nights and weekends. It also provided insight into the physiological stress and cognitive load associated with non-traditional working hours, a core element of professional readiness in emergency medicine.
A central component of the course was active, supervised clinical engagement, grounded in the national competency-based medical education framework. Under the supervision of senior ED physicians, students were expected to manage a wide variety of patient cases, perform focused history taking, physical examinations, initial assessments, and critical interventions, and contribute to clinical decision-making discussions. This breadth of exposure facilitated the development of key competencies such as diagnostic reasoning, teamwork, and communication under time-sensitive and uncertain conditions.
Participants
This study included sixth-year medical students enrolled in the Emergency Medicine & Clerkship course at the College of Medicine, National Cheng Kung University, between June 2018 and May 2020. As a mandatory component of the final-year undergraduate medical curriculum, all students were required to complete a structured three-week ED rotation as part of their clinical training. During the study period, a total of 133 students were eligible for inclusion.
To ensure consistency in clinical exposure and the reliability of performance evaluation data, the final analytic sample consisted of 133 students who met predefined inclusion and exclusion criteria. All students were enrolled in the mandatory ED clerkship between June 2018 and May 2020. Only those who voluntarily completed clinical performance assessments for each of the three weeks and adhered strictly to the standard training schedule were included. Eligibility required full completion of the three-week ED rotation, including participation in a predefined set of clinical shifts: a minimum of six day shifts (08:00–17:00), three evening shifts (17:00–22:00), and three overnight shifts (22:00–08:00). Students attended the clerkship five days per week with a workload capped at 40 hours per week. Clinical performance was evaluated by supervising physicians or senior residents at least once daily, ensuring comprehensive and consistent documentation of each student’s competency development throughout the rotation.
In addition, inclusion required the availability of complete weekly assessment records across all three weeks, encompassing ten core clinical competency domains outlined in the institution’s standardized evaluation system. These domains included, but were not limited to, clinical reasoning, procedural skills, communication, and professionalism, all assessed by supervising faculty as part of routine clerkship evaluation.
Students were excluded from the analysis if they missed more than one scheduled shift for any reason (eg, illness, personal leave, emergencies), had incomplete or missing assessment records, or participated in modified training schedules that could introduce inconsistencies in clinical exposure.
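For readers who want to operationalize these criteria, the following pandas sketch shows one way the inclusion screen could be applied to de-identified records. The data layout and column names are assumptions, not the study's actual database schema.

```python
import pandas as pd

# Minimum shift counts required for inclusion, per the criteria above.
REQUIRED_SHIFTS = {"day": 6, "evening": 3, "overnight": 3}

def eligible_students(shifts: pd.DataFrame, forms: pd.DataFrame) -> pd.Index:
    """shifts: one row per completed shift (student_id, shift_type);
    forms: one row per submitted WBA form (student_id, week)."""
    # Count shifts per student and type, then require the predefined minimums.
    counts = (shifts.groupby(["student_id", "shift_type"]).size()
                    .unstack(fill_value=0)
                    .reindex(columns=list(REQUIRED_SHIFTS), fill_value=0))
    shifts_ok = counts.ge(pd.Series(REQUIRED_SHIFTS)).all(axis=1)
    # Require at least one completed assessment in each of the three weeks.
    weeks_ok = forms.groupby("student_id")["week"].nunique().eq(3)
    ok = shifts_ok.reindex(weeks_ok.index, fill_value=False) & weeks_ok
    return ok[ok].index
```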
Performance Evaluation and Feedback Mechanisms
Throughout the three-week ED clerkship, students’ clinical performance was systematically observed and assessed by supervising emergency medicine faculty. These evaluations were aligned with the nationally endorsed core competencies defined by TMAC, which include patient assessment and management, procedural skills, clinical reasoning, professionalism, communication, and adherence to ethical and patient safety standards. These assessments were integrated into routine clinical activities to ensure authenticity and relevance to real-world practice.
Following each observed clinical interaction, students received real-time formative feedback from faculty supervisors. This feedback process aimed to encourage immediate reflection on clinical decisions and behaviors, facilitate the correction of errors, and reinforce best practices. By integrating feedback into the context of actual patient care, the evaluation process not only served as a measure of performance but also functioned as an active learning tool. The iterative nature of this feedback loop was intended to support self-directed learning, promote progressive refinement of clinical skills, and foster adaptive expertise in a high-acuity setting.
Assessment Instruments
To evaluate students’ clinical performance during the ED clerkship, a structured assessment instrument was employed. This instrument was developed by the faculty of the College of Medicine at National Cheng Kung University through a consensus-driven process and was designed to align with the New Principal Practices for the Clinical Practice of Medical Students in the School of Medicine, as outlined in the 2016 national syllabus for medical students’ clerkship training in Taiwan.5 Its primary aim was to assess the development of essential clinical competencies in undergraduate medical trainees across real-world patient care settings.
The instrument encompassed ten core competency domains considered fundamental to clinical training in emergency medicine: learning attitude, history taking and physical examination skills, clinical judgment, medical knowledge, prescription and treatment planning, communication skills, patient education, literature appraisal, oral case presentation, and clinical documentation. Each domain was rated using a 7-point Likert scale, with higher scores indicating more advanced levels of proficiency. The choice of this scale provided sufficient granularity to detect incremental improvements over the three-week clerkship period while maintaining clarity and usability for clinical supervisors.
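A minimal data model for one completed WBA form might look like the following sketch; the domain keys and field names are illustrative, since the paper does not specify a machine-readable schema.

```python
from dataclasses import dataclass, field

# The ten competency domains named above, as machine-readable keys
# (identifiers are illustrative, not taken from the actual instrument).
DOMAINS = (
    "learning_attitude", "history_taking_physical_exam", "clinical_judgment",
    "medical_knowledge", "prescription_treatment_planning", "communication",
    "patient_education", "literature_appraisal", "oral_case_presentation",
    "clinical_documentation",
)

@dataclass
class WBARecord:
    student_id: str  # de-identified code
    week: int        # 1, 2, or 3
    scores: dict = field(default_factory=dict)  # domain -> 7-point rating

    def __post_init__(self):
        if self.week not in (1, 2, 3):
            raise ValueError("week must be 1, 2, or 3")
        for domain in DOMAINS:
            rating = self.scores.get(domain)
            if not isinstance(rating, int) or not 1 <= rating <= 7:
                raise ValueError(f"{domain}: expected an integer rating of 1-7")
```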
Performance assessments were conducted in situ during routine clinical shifts. A faculty member from the ED selected an appropriate clinical encounter in which to observe a student. The student was expected to conduct focused history taking and a physical examination, then formulate a differential diagnosis and propose an initial management plan. This was followed by a clinical reasoning discussion with the supervising faculty, during which feedback and guidance were provided. Faculty members served in dual roles as clinical preceptors and assessors, observing the student’s behavior and decision-making in a live patient-care context.
Immediately following the encounter, the faculty member completed the assessment using a standardized digital form—typically via a secure institutional platform such as Google Forms—to ensure timely data entry and minimize recall bias. These assessments were compiled weekly for each student and formed the basis of the study’s competency progression analysis.
Assessor Training
To ensure the reliability and consistency of clinical performance evaluations, all assessors involved in the mini-Clinical Evaluation Exercise (mini-CEX) underwent formal rater training prior to participating in the assessment process. Faculty members eligible to serve as raters were required to meet specific criteria: they had to be board-certified emergency medicine specialists with a minimum of three years of clinical experience and at least one year of active teaching experience in an accredited teaching hospital.14
Following eligibility confirmation, raters completed a comprehensive 47-hour training program specifically designed to standardize assessment practices and reduce variability in scoring. The training curriculum was grounded in the core principles of competency-based medical education, drawing from frameworks established by the Accreditation Council for Graduate Medical Education (ACGME). Key components of the program included an overview of the mini-CEX as a workplace-based assessment tool, video-based calibration exercises, and interactive sessions focused on scoring alignment and observational consistency.
During the video calibration phase, raters observed pre-recorded clinical encounters and practiced scoring them against established benchmarks. This process allowed faculty members to recognize subtle variations in clinical performance and to align their scoring decisions with a shared understanding of expected competency levels. Group discussions followed each session to address scoring discrepancies, explore the rationale behind different judgments, and foster consensus. These exercises were particularly effective in promoting inter-rater reliability and in identifying potential sources of rater bias or error.
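The paper does not report which statistic, if any, was used to quantify scoring alignment during these calibration sessions. For ordinal 7-point ratings, a quadratically weighted Cohen's kappa is one conventional choice; the sketch below shows how it could be computed with scikit-learn on two raters' scores for a set of recorded encounters (all values hypothetical).

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical scores from two raters on eight recorded encounters.
rater_a = [4, 5, 3, 6, 4, 5, 2, 4]
rater_b = [4, 4, 3, 6, 5, 5, 3, 4]

# Quadratic weighting suits ordinal scales: near-misses are penalized far
# less than large disagreements. Values near 1 indicate strong agreement.
kappa = cohen_kappa_score(rater_a, rater_b, weights="quadratic")
print(f"weighted kappa = {kappa:.2f}")
```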
Additionally, the training emphasized the practical application of the mini-CEX in both undergraduate and postgraduate clinical settings, with sessions focused on direct observation, constructive feedback delivery, and the documentation of performance across multiple competency domains. By the end of the program, faculty raters were equipped with the necessary skills to conduct valid, reliable, and formative clinical assessments, ensuring a standardized evaluation process across all participating students.
Smartphone-Based Evaluation
To facilitate efficient and standardized evaluation of student performance during the emergency medicine clerkship, a smartphone-based evaluation system was adopted. This system employed Google Forms as the primary platform for data entry and collection, allowing both students and supervising physicians to complete performance assessments in real time. The transition from traditional paper-based methods to a digital interface aimed to streamline the assessment process, reduce administrative workload, and support immediate feedback at the point of care.
To ensure the reliability and validity of data collected through this platform, a comprehensive rater training program was conducted prior to the start of the rotation. This training included both students and ED faculty members and was designed to promote shared understanding of the evaluation framework, scoring criteria, and expected performance standards. Supervising physicians received focused instruction on the use of the Google Form, with particular emphasis on the importance of consistent scoring across domains, anchoring judgments to observable clinical behaviors, and providing objective, constructive feedback. The goal was to minimize inter-rater variability and reduce subjective bias in competency assessments.
In parallel, students participated in an orientation session led by the clerkship director, which provided a detailed overview of the digital evaluation tool. This session introduced the purpose, structure, and content of the form, as well as the rationale behind each evaluation criterion. Students were encouraged to understand the significance of each item—not merely as administrative tasks, but as indicators of their growth in clinical knowledge, procedural skills, communication, and professionalism. The orientation also stressed the importance of timely and accurate submission, as the evaluations would be used to track progression over the course of the three-week rotation.
Statistical Analysis
To evaluate student performance across the three-week ED clerkship, a comprehensive statistical analysis was conducted. Descriptive statistics, including mean scores, standard deviations, ranges, and total scores, were first calculated to summarize the distribution of clinical assessment data. These summary measures provided an overview of central tendencies and variability in students’ performance, offering foundational insights into the breadth and consistency of competency development during the clerkship.
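Assuming a flat export with one row per WBA form, a `week` column, and one column per competency domain (all names hypothetical), this weekly descriptive summary could be reproduced with a short pandas script such as the following.

```python
import pandas as pd

df = pd.read_csv("wba_assessments.csv")  # hypothetical de-identified export

# Everything except the identifier columns is a competency domain.
domain_cols = [c for c in df.columns if c not in ("student_id", "week", "year")]

# Mean, SD, range, and count per domain for each of the three weeks.
weekly_summary = (df.groupby("week")[domain_cols]
                    .agg(["mean", "std", "min", "max", "count"])
                    .round(2))
print(weekly_summary)
```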
To explore potential differences in scoring patterns, additional comparative analyses were performed. These included examinations of variation in assessment scores across different types of clinical cases and among various faculty evaluator groups. This step enabled the identification of potential contextual influences—such as case complexity or evaluator variability—on student performance ratings.
Given the repeated-measures nature of the dataset, in which individual students contributed multiple assessments across three consecutive weeks, a GEE model was employed. This analytic approach was chosen for its ability to account for the hierarchical data structure and within-subject correlation, allowing for the examination of longitudinal changes in performance across the three-week period while adjusting for possible clustering effects. The GEE model further enabled the analysis of how temporal factors (eg, assessment week or academic year) influenced overall performance, while controlling for inter-individual variability.
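A hedged sketch of this model using Python's statsmodels is shown below (the paper reports using SAS 9.4). The exchangeable working correlation and the log link are assumptions: the paper does not state either choice, and the log link is suggested only by the reported intercept of 1.4038 sitting well below the raw weekly means of roughly 4.1 to 4.3.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# One row per student-week, with an overall score averaged across the ten
# domains (file and column names are hypothetical).
df = pd.read_csv("weekly_scores.csv")

# Students are the clusters; Week 1 and the 2018 cohort are the reference
# levels. The exchangeable working correlation and the log link are
# assumptions, as the paper does not report either choice.
model = smf.gee(
    "overall_score ~ C(week, Treatment(reference=1)) + C(year)",
    groups="student_id",
    data=df,
    cov_struct=sm.cov_struct.Exchangeable(),
    family=sm.families.Gaussian(link=sm.families.links.Log()),
)
result = model.fit()
print(result.summary())  # week and year effects with robust standard errors
```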
All statistical analyses were conducted using SAS software (version 9.4; SAS Institute Inc., Cary, NC). Statistical significance was determined using a two-tailed alpha level of p < 0.05. This threshold ensured that all findings were evaluated with appropriate rigor and in accordance with conventional standards in medical education research.
Ethics Approval
This study was conducted in accordance with the ethical principles outlined in the Declaration of Helsinki. Ethical approval was obtained from the Institutional Review Board of National Cheng Kung University Hospital (IRB Approval Number: A-EX-109-033). All data were de-identified prior to analysis to ensure participant confidentiality, and only aggregated, anonymized results were reported. The use of voluntarily completed educational performance assessments for research purposes was approved as part of the institutional research ethics framework.
Results
Overview of Clinical Competency Assessments
Table 2 presents a summary of clinical assessment scores across the three-week rotation, showing incremental gains in core competencies such as learning attitude, history taking, and communication skills. A total of 2068 clinical performance assessments were collected from sixth-year medical students during their three-week ED clerkship between June 2018 and May 2020. Assessments were distributed as follows: 722 (34.91%) in the first week, 692 (33.46%) in the second week, and 654 (31.62%) in the third week. Although there was a slight decline in submission rates across weeks, this variation likely reflects natural clinical workflow patterns or reduced perceived need for feedback as students gained confidence and familiarity with the ED environment.
Table 2 Assessment Scores Across Three Weeks of the ED Clerkship From June 2018 to May 2020
Each assessment evaluated ten core competencies, including learning attitude, history taking and physical examination, clinical judgment, medical knowledge, prescription, communication skills, patient education, literature appraisal, oral presentation, and documentation. Descriptive statistics indicated steady improvement in several domains. For instance, mean scores for learning attitude increased from 4.46 (SD = 0.66) in week one to 4.58 (SD = 0.58) in week three. History taking and physical examination improved from 4.15 (SD = 0.71) to 4.36 (SD = 0.62), while clinical judgment showed a more modest rise from 4.02 (SD = 0.74) to 4.16 (SD = 0.68).
Notably, communication-related competencies showed consistent gains. Communication skills improved from 4.10 (SD = 0.72) in week one to 4.25 (SD = 0.67) by week three. Similarly, patient education scores increased from 4.00 (SD = 0.74) to 4.14 (SD = 0.70), reflecting improved ability to convey clinical information clearly and empathetically. Improvements were also observed in prescription skills (from 3.85 to 4.06) and documentation (from 4.17 to 4.32), underscoring growth in competencies critical for independent clinical practice.
Subgroup Analysis of Students with Complete Three-week Assessments
To more precisely capture individual competency progression, a subgroup analysis was conducted on 133 students who voluntarily submitted complete assessments for all three weeks. As presented in Table 3, this subgroup demonstrated statistically significant improvement across several key competencies. For example, history taking and physical examination scores increased from 4.18 (SD = 0.50) to 4.34 (SD = 0.41) (p = 0.014), and prescription skills improved from 3.85 (SD = 0.50) to 4.02 (SD = 0.46) (p = 0.002). Communication skills rose significantly from 4.10 (SD = 0.51) to 4.25 (SD = 0.44) (p = 0.008), and documentation scores also showed significant improvement (p = 0.035).
Table 3 Summary of 133 Participants Who Completed Full Three-Week Assessment Forms
Although other domains such as clinical judgment, medical knowledge, and literature appraisal exhibited upward trends, these changes were not statistically significant, suggesting that certain competencies may require longer exposure or more targeted instructional strategies for measurable gains.
Modeling Weekly and Yearly Changes Using GEE
To account for the repeated-measures nature of the data and examine competency progression over time, a GEE model was employed (see Table 4). The intercept, representing the baseline level of student performance across all domains, was estimated at 1.4038 (SE = 0.0096, 95% CI: 1.3848–1.4227, p < 0.0001), indicating a stable and consistent initial level of clinical competency at the start of the clerkship.
Table 4 Parameter Estimates for Repeated Measures in Average Overall Performance Progression Using the Generalized Estimating Equations (GEE) Models
Importantly, the model identified a statistically significant improvement in performance scores by the third week compared to the first week (Estimate = 0.0263, SE = 0.0097, 95% CI: 0.0074–0.0452, p = 0.0064). This finding highlights a meaningful progression in student competencies over a relatively short timeframe, emphasizing the impact of sustained clinical exposure and active learning within the ED setting. No significant difference was observed between the first and second weeks (Estimate = 0.0059, p = 0.5212), suggesting that the most notable gains occurred during the latter half of the rotation.
In addition to weekly effects, a significant year-over-year difference was detected: students in 2019 performed better than those in 2018, with an estimated score increase of 0.0215 (SE = 0.0100, 95% CI: 0.0019–0.0412, p = 0.0318). This improvement may reflect enhancements in the clerkship curriculum, better faculty calibration, or increased student preparedness in response to curricular refinement efforts implemented after the initial year. As illustrated in Figure 1, average clinical performance showed a consistent upward trajectory over the three-week ED clerkship, with noticeable year-over-year improvement between 2018 and 2019.
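If the model indeed used a log link, as the gap between the intercept (1.4038) and the raw weekly means suggests, exponentiating the estimates recovers predicted means on the original rating scale. This back-transformation is an interpretive assumption, since the paper does not state the link function:

```python
import math

# GEE estimates from Table 4 (baseline: Week 1, 2018 cohort).
# Assumes a log link, which the paper does not explicitly report.
intercept, week3, year_2019 = 1.4038, 0.0263, 0.0215

print(round(math.exp(intercept), 2))                      # 4.07: Week 1, 2018
print(round(math.exp(intercept + week3), 2))              # 4.18: Week 3, 2018
print(round(math.exp(intercept + week3 + year_2019), 2))  # 4.27: Week 3, 2019
```

The back-transformed values, rising from about 4.07 to 4.18 across the rotation, track the raw weekly means reported in Table 2, which lends some support to this reading.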
Figure 1 Average performance progression of medical students during ED internships: a three-week comparison of 2018 and 2019.
Novel Contributions and Implications
While the observed gains in competencies such as communication, prescription, and clinical reasoning are consistent with findings from existing literature on immersive clinical education, this study’s novelty lies in its detailed week-by-week quantification of competency development and the application of the GEE model to statistically account for repeated measures. These methods allow for a more nuanced understanding of when and how clinical competencies evolve during short-term clerkships. Moreover, by identifying which skills show early gains and which require more prolonged or targeted interventions, the findings provide actionable insights for curriculum design and clinical supervision strategies in emergency medicine education.
Discussion
This study underscores the pivotal role of structured, immersive clinical training in the ED in nurturing essential competencies among undergraduate medical students. The findings indicate that extended exposure to real-world ED settings—supported by structured supervision and timely formative feedback—leads to significant gains in diagnostic reasoning, prescription accuracy, communication proficiency, and clinical documentation. Statistically significant improvements were evident by the third week of rotation, suggesting that a minimum threshold of three weeks is necessary to ensure meaningful skill development. This trend highlights the cumulative effect of sustained clinical engagement in high-pressure environments where students gradually build clinical confidence and competence.15
Importantly, the ED offers a naturally multidisciplinary environment that integrates physicians, nurses, pharmacists, case managers, and allied health professionals in time-sensitive, team-based care. Students’ participation in such interprofessional teams not only reinforces clinical decision-making but also cultivates collaborative skills essential for modern healthcare practice. Through direct interaction with diverse healthcare roles, students gain a deeper understanding of the dynamics of multidisciplinary care, enhancing their ability to function effectively within complex clinical systems.16 Notably, year-over-year comparisons revealed that students in 2019 outperformed their 2018 counterparts, suggesting that curricular revisions or pedagogical improvements—such as clearer learning objectives, enhanced feedback quality, or increased opportunities for interprofessional collaboration—contributed to improved preparedness. These findings affirm the value of iterative curriculum refinement aligned with evolving educational standards and the realities of contemporary, team-based clinical practice.
Teaching clinical reasoning is essential yet remains a persistent challenge in medical education.17 Educators must guide students in closing gaps in assessment, decision-making, and knowledge application. Traditional ward or outpatient rotations—typically before ED exposure—help build basic skills like history taking, physical exams, presentations, and documentation.18,19 However, competencies such as prescription writing, clinical judgment, and patient education are often underemphasized. The ED’s fast-paced, high-pressure environment demands rapid synthesis and decisive action, while still requiring clear communication and patient education to prevent short-term return visits.20 Effective discharge instructions are vital to ensure safe care transitions.21 Given these demands, educators must create structured, real-time learning opportunities in the ED, including decision-making practice, discharge communication, and targeted feedback. As students often struggle to identify their own gaps, active supervision and specific guidance from faculty are critical.
As medical education evolves, integrating competency-based frameworks and simulation-based experiences becomes increasingly important, especially for areas that are underrepresented in traditional rotations.22,23 By situating clinical reasoning within the dynamic context of emergency medicine and designing deliberate practice opportunities for less-developed skills, educators can more effectively support trainees’ progression toward clinical competence and independent practice.21
Prior research has consistently emphasized the educational value of early case-based teaching in fostering clinical reasoning and decision-making skills among medical students.24 By engaging with structured clinical cases early in their training, students begin to construct illness scripts—mental frameworks that integrate signs, symptoms, pathophysiology, and diagnostic cues.25 These scripts serve as foundational tools for making accurate and efficient clinical judgments in real-world scenarios. Moreover, repeated exposure to multiple cases with similar presenting symptoms helps students refine their ability to identify and contrast subtle differentiating features between conditions, thereby deepening diagnostic accuracy and reinforcing clinical reasoning pathways.26
In alignment with this pedagogical approach, the Society of Academic Emergency Medicine (SAEM) has developed a national curriculum for fourth-year emergency medicine clerkships, advocating for broad exposure to a diverse spectrum of emergent clinical conditions. These include, but are not limited to, abdominal pain, altered mental status, cardiac arrest, chest pain, gastrointestinal bleeding, headache, poisoning, respiratory distress, shock, and trauma.19 This approach ensures that students gain essential experience in managing critical scenarios, which is pivotal in their transition from medical education to clinical practice. The intent of this curriculum is to provide undergraduate trainees with comprehensive, hands-on experience in managing acute scenarios—experiences that are critical for bridging the gap between medical education and clinical practice. When paired with real-time supervision and formative feedback, this exposure not only enhances core competencies but also strengthens students’ readiness for high-stakes decision-making in their future roles as physicians.
To ensure that undergraduate trainees acquire essential emergency care competencies, the learning objectives of clinical encounters and case-based discussions are structured to enable students to: (1) systematically construct a comprehensive differential diagnosis for common emergent conditions, incorporating a broad range of possible etiologies; (2) accurately describe the hallmark clinical features associated with these conditions, highlighting key signs and symptoms that facilitate rapid recognition and prioritization; and (3) outline the initial evaluation and management steps, including pertinent diagnostic investigations, immediate therapeutic interventions, and appropriate disposition planning.19 Mastery of these competencies is vital in the ED, where timely and precise decision-making directly influences patient outcomes.27
Despite the importance of clinical reasoning, diagnostic errors remain a significant concern in ED settings, posing risks to patient safety.5,11 To address this, Taiwan implemented the Taiwan Triage and Acuity Scale (TTAS) in 2010—a five-level computerized system classifying patients by urgency from Level 1 (most urgent) to Level 5 (least urgent).28 To minimize risk, trainees typically manage Level 3–5 patients with milder conditions. While this safeguards patients, it limits trainee exposure to high-acuity cases crucial for developing advanced reasoning and decision-making. As ED presentations are unpredictable, this restricted exposure may reduce training authenticity. Thus, to ensure balanced learning, institutions should consider strategies like supervised high-acuity encounters, debriefings, and simulation-based training.29
In recent years, the COVID-19 pandemic has intensified existing challenges in clinical education by limiting direct patient contact, especially with critically ill individuals.30 This disruption reduced vital hands-on experience in high-acuity emergency care. In response, simulation-based education has become a key, evidence-supported strategy to maintain emergency medicine training quality.31 Simulation allows safe, repeated practice of complex clinical scenarios involving rapid assessment, critical decisions, teamwork, and crisis management—all without patient risk.32 Rooted in experiential learning, it offers structured opportunities for practice, feedback, and reflection. Recognizing its value, most Canadian emergency medicine residency programs have integrated simulation into their curricula, focusing on resuscitation and interprofessional crisis management.33 Repeated simulation and debriefing improve trainee confidence, preparedness, and patient safety outcomes in real clinical settings.
Assessing clinical competence in authentic clinical settings is fundamental to medical education, and direct observation by faculty plays a central role in this process. Direct observation allows faculty to witness firsthand the clinical skills and behaviors of trainees, offering an opportunity to evaluate them in real time. This approach enables faculty to transform their observations into structured judgments and ratings, providing a basis for constructive and timely feedback, which is crucial for the developmental process of trainees. Furthermore, direct observation is not only valuable for immediate assessment but also serves as a robust predictor of future performance, giving insights into the readiness and potential of medical students to handle complex clinical situations in their careers.34–36
Improving the teaching and assessment of clinical reasoning is a critical issue for educators, particularly for trainers in EDs. Clinical reasoning is a vital skill that determines how trainees integrate and apply medical knowledge to patient care. Workplace-based assessments, such as direct observation, play an instrumental role in developing clinical reasoning skills by offering tailored feedback based on real-world clinical encounters. Evidence shows that such feedback significantly boosts trainees’ motivation and confidence, enhancing their learning experiences and clinical competence over time.37 Feedback is indispensable across medical education, including clinical training. It is traditionally viewed through several conceptual frameworks: as a mere transfer of information, as a reaction to observed behaviors, or as a dynamic, cyclical process that involves a continuous exchange of information and reaction between the educator and the trainee.38 In clinical education specifically, we propose defining feedback as detailed, specific information regarding a trainee’s observed performance compared to a predefined standard. This definition underscores feedback’s role in closing the gap between current and desired performance, ultimately aiming to improve the trainee’s knowledge, skills, and professional behavior.39
A study by Johnson et al examined core medical trainees’ perspectives on their curriculum and assessment practices, highlighting the perceived value of feedback in workplace-based assessments. Most trainees reported that feedback was one of the most useful components of these assessments, suggesting that when feedback is done well, it significantly contributes to their professional development and learning outcomes.40 However, the relationship between feedback and its impact on learning outcomes is complex and not always positive. Research indicates that feedback can sometimes inadvertently lead to negative consequences, such as decreased motivation, impaired confidence, self-doubt, student burnout, or even reduced performance.41 Such outcomes often result from poorly delivered feedback, which may be perceived as overly critical, vague, or lacking in actionable recommendations. Moreover, trainers may consciously avoid providing feedback to avoid causing offense or triggering defensiveness in trainees.
To mitigate these challenges, it is crucial to maximize opportunities for training assessors to provide high-quality feedback. Assessor training is increasingly recognized as vital to ensure feedback is constructive, timely, and aligned with learning goals.42 Effective training should focus on techniques for delivering clear, specific, and constructive feedback that is aligned with the trainee’s level of experience and learning needs. Moreover, assessors should be trained to administer assessments in a way that fosters a supportive learning environment. The quality of feedback is directly linked to the effectiveness of assessments; therefore, if assessments are used merely as scoring exercises without a strong emphasis on delivering meaningful, actionable feedback, the potential educational gains will be minimal. Feedback should be integrated as a core component of the assessment process, ensuring that it is not only formative but also drives continuous improvement in clinical competence.43
While direct observation and feedback are critical for assessing clinical competence and fostering professional growth in medical education, several limitations challenge their effectiveness. These include subjectivity and bias in assessments, variability in feedback quality, and the reluctance of trainers to provide honest feedback due to concerns about causing offense or provoking defensiveness. Time constraints in busy clinical settings often limit opportunities for thorough observation and meaningful feedback, while inadequate training and calibration of assessors contribute to inconsistencies.44 Additionally, poorly delivered feedback can negatively impact trainee motivation and confidence, potentially hindering learning outcomes.45 Overemphasis on assessment scores rather than formative feedback, limited predictive validity of observations, emotional impact on trainees, and the resource-intensive nature of these processes further complicate implementation. Addressing these challenges requires a strategic approach to training, a supportive educational culture, and a focus on the developmental needs of trainees to enhance the value of workplace-based assessments.
Future research should explore the effects of extended ED rotations on enhancing diagnostic reasoning skills among medical students, with particular attention to the retention and application of these skills in real-world clinical practice. Investigations could also assess the integration of innovative educational strategies—such as simulation-based learning, interprofessional case-based discussions, and team-based clinical scenarios—to further strengthen core clinical competencies within multidisciplinary healthcare settings. Moreover, examining the influence of varying levels of supervision and the quality of formative feedback on student performance may yield critical insights for optimizing clinical training environments. Longitudinal studies tracking the progression of diagnostic abilities from undergraduate medical education through to professional practice would be instrumental in evaluating the long-term efficacy of diverse instructional approaches across different clinical disciplines.
Conclusion
This study demonstrates that a structured, three-week ED internship can significantly enhance core clinical competencies in undergraduate medical students, particularly in diagnostic reasoning, prescription accuracy, communication, and documentation. However, several limitations should be noted. The study was conducted at a single institution, with a relatively small longitudinal sample and reliance on supervisor-based assessment, which may introduce potential bias. From the student perspective, extended clinical immersion with consistent feedback promotes deeper engagement and competency development. From the institutional perspective, the findings support the integration of structured, longitudinal assessment into clinical rotations. From the hospital and healthcare perspective, embedding interprofessional learning opportunities—such as collaboration with nursing, pharmacy, and allied health—can strengthen students’ readiness for team-based practice. Future research should explore the optimal length and structure of ED internships, examine complementary teaching strategies such as simulation and reflective practice, and evaluate the long-term impact of such training on collaborative clinical performance.
Disclosure
The authors have no conflicts of interest to declare for this work.
References
1. Macy J Jr. The role of emergency medicine in the future of American medical care. Ann Emerg Med. 1995;25(2):230–233. doi:10.1016/S0196-0644(95)70329-2
2. Tews MC, Hamilton GC. Integrating emergency medicine principles and experience throughout the medical school curriculum: why and how. Acad Emerg Med. 2011;18(10):1072–1080. doi:10.1111/j.1553-2712.2011.01168.x
3. Wald DA, Lin M, Manthey DE, Rogers RL, Zun LS, Christopher T. Emergency medicine in the medical school curriculum. Acad Emerg Med. 2010;17 Suppl 2:S26–30. doi:10.1111/j.1553-2712.2010.00896.x
4. Russi CS, Hamilton GC. A case for emergency medicine in the undergraduate medical school curriculum. Acad Emerg Med. 2005;12(10):994–998. doi:10.1197/j.aem.2005.05.028
5. New Principal Practices for the Clinical Practice of Medical Students in School of Medicine. Ministry of Education. Available from: http://ctld.ncku.edu.tw/var/file/47/1047/img/3391/187887459.pdf.
6. American College of Emergency Physicians. Guidelines for undergraduate education in emergency medicine. Ann Emerg Med. 2016;68(1):150. doi:10.1016/j.annemergmed.2016.04.049
7. Santen SA, Peterson WJ, Khandelwal S, House JB, Manthey DE, Sozener CB. Medical student milestones in emergency medicine. Acad Emerg Med. 2014;21(8):905–911. doi:10.1111/acem.12443
8. Cevik AA, Cakal ED, Alao D, Elzubeir M, Shaban S, Abu-Zidan F. Self-efficacy beliefs and expectations during an emergency medicine clerkship. Int J Emerg Med. 2022;15(1):4. doi:10.1186/s12245-021-00406-0
9. Liu K-M. The core clinical competencies for medical students in Taiwan. Evaluation Bimonthly. 2011;31:39–42.
10. Hsu H-C, Kawaguchi T, Chi C-H, et al. A comparative analysis of curriculum evolution in emergency medicine between undergraduate and postgraduate levels. J Med Educ. 2024;28(1):32–48.
11. Young M, Thomas A, Lubarsky S, et al. Drawing boundaries: the difficulty in defining clinical reasoning. Acad Med. 2018;93(7):990–995. doi:10.1097/ACM.0000000000002142
12. Newman-Toker DE, Pronovost PJ. Diagnostic errors—the next frontier for patient safety. JAMA. 2009;301(10):1060–1062. doi:10.1001/jama.2009.249
13. Harden RM. AMEE Guide No. 14: outcome-based education: part 1-An introduction to outcome-based education. Med Teach. 1999;21(1):7–14. doi:10.1080/01421599979969
14. Ministry of Health and Welfare. Two-Year Physician Post-Graduation General Medical Training Program. Available from: https://www.mohw.gov.tw/dl-41004-c61fab6b-36f0-41fb-b357-e23358d7e858.html.
15. Wolff M, Hammoud M, Santen S, Deiorio N, Fix M. Coaching in undergraduate medical education: a national survey. Med Educ Online. 2020;25(1):1699765. doi:10.1080/10872981.2019.1699765
16. Gismalla M, Alawad A. Undergraduate emergency medicine education: problems and challenges. Austin Emerg Med. 2017;3(1):1049.
17. Beck AP. Assessment of first-year medical student perceptions in the development of self-directed learning skills in a single medical school course. BMC Med Educ. 2025;25(1):340. doi:10.1186/s12909-025-06890-9
18. Caretta‐Weyer HA, Park YS, Tekian A, Sebok‐Syer SS. Identifying emergency medicine program directors’ expectations of competence upon entry into residency: bridging the distance from the Association of American Medical Colleges Core Entrustable Professional Activities. AEM Educ Train. 2025;9(2):e70024. doi:10.1002/aet2.70024
19. Manthey DE, Ander DS, Gordon DC, et al. Emergency medicine clerkship curriculum: an update and revision. Acad Emerg Med. 2010;17(6):638–643. doi:10.1111/j.1553-2712.2010.00750.x
20. Germain N, Samb R, Côté É, et al. Impact of a geriatric emergency management nurse on thirty-day emergency department revisits: a propensity score matched case-control study. medRxiv. 2024;30:24319195.
21. Ng C, Chung C. An analysis of unscheduled return visits to the accident and emergency department of a general public hospital. Hong Kong J Emerg Med. 2003;10(3):153–161. doi:10.1177/102490790301000304
22. Castellanos-Ortega Á, Broch Porcar MJ, Palacios-Castañeda D, et al. Effect of a competence based medical education program on training quality in intensive care medicine. COBALIDATION TRIAL. Med Intensiva. 2025:502126. doi:10.1016/j.medine.2024.502126.
23. Fisk P, Hall AK, O’Brien M, Cheung WJ. Simulation education in the age of competency-based medical education: a study of the use of simulation-based education in Canadian emergency medicine programs. CJEM. 2025. doi:10.1007/s43678-025-00935-0
24. Parsad V, Divakaran J. Active learning in medical education: a brief overview of its benefits. 2025.
25. Hsu H-C, Lin C-H, Hsieh C-C, et al. Developing a complex simulation training program for emergency medical services paramedics using a high-fidelity simulation model. J Med Educ. 2016;20(4):296–307.
26. Bowen JL. Educational strategies to promote clinical diagnostic reasoning. N Engl J Med. 2006;355(21):2217–2225. doi:10.1056/NEJMra054782
27. Penciner R, Woods RA, McEwen J, Lee R, Langhan T; Canadian Association of Emergency Physicians Undergraduate Education Committee. Core competencies for emergency medicine clerkships: results of a Canadian consensus initiative. Can J Emerg Med. 2013;15(1):24–33. doi:10.2310/8000.2012.120686
28. Ng C-J, Yen Z-S, Tsai JC-H, et al. Validation of the Taiwan triage and acuity scale: a new computerised five-level triage system. Emerg Med J. 2011;28(12):1026–1031. doi:10.1136/emj.2010.094185
29. Sung T-C, Hsu H-C. Improving critical care teamwork: simulation-based interprofessional training for enhanced communication and safety. J Multidiscip Healthc. 2025;355–367. doi:10.2147/JMDH.S500890
30. Malhotra AK, Bernstein M. Neurosurgery During a Pandemic. Ethical Challenges for the Future of Neurosurgery. Springer; 2024:163–174.
31. Berger M, Buckanavage J, Jordan J, Lai S, Regan L. Telesimulation use in emergency medicine residency programs: national survey of residency simulation leaders. West J Emerg Med. 2024;25(6):907. doi:10.5811/WESTJEM.24863
32. Hsu H-C, Shih H-I, Chuang M-C, et al. Perceptions of an emergency medicine simulation program by undergraduate medical students with different clinical experiences. J Med Educ. 2014;18(4):139–152.
33. Ilgen JS, Sherbino J, Cook DA. Technology‐enhanced simulation in emergency medicine: a systematic review and meta‐analysis. Acad Emerg Med. 2013;20(2):117–127. doi:10.1111/acem.12076
34. Carr SE, Celenza A, Puddey IB, Lake F. Relationships between academic performance of medical students and their workplace performance as junior doctors. BMC Med Educ. 2014;14(1):1–7. doi:10.1186/1472-6920-14-157
35. Holmboe ES. Realizing the promise of competency-based medical education. Acad Med. 2015;90(4):411–413. doi:10.1097/ACM.0000000000000515
36. Ney EM, Shea JA, Kogan JR. Predictive validity of the mini-clinical evaluation exercise (mCEX): do medical students’ mCEX ratings correlate with future clinical exam performance? Acad Med. 2009;84(10):S17–S20. doi:10.1097/ACM.0b013e3181b37c94
37. Saedon H, Salleh S, Balakrishnan A, Imray CHE, Saedon M. The role of feedback in improving the effectiveness of workplace based assessments: a systematic review. BMC Med Educ. 2012;12(1):25. doi:10.1186/1472-6920-12-25
38. Cevik AA, Cakal ED, Abu-Zidan FM. Emergency medicine clerkship curriculum in a high-income developing country: methods for development and application. Int J Emerg Med. 2018;11:1–9. doi:10.1186/s12245-018-0190-y
39. Van De Ridder JM, Stokking KM, McGaghie WC, Ten Cate OTJ. What is feedback in clinical education? Med Educ. 2008;42(2):189–197. doi:10.1111/j.1365-2923.2007.02973.x
40. Johnson G, Barrett J, Jones M, Parry D, Wade W. Feedback from educational supervisors and trainees on the implementation of curricula and the assessment system for core medical training. Clin Med. 2008;8(5):484–489. doi:10.7861/clinmedicine.8-5-484
41. Beck‐Pancer D, Kryzhanovskaya IV. How to… create peer‐facilitated support groups for health professions students. Clin Teacher. 2025;22(3):e70080. doi:10.1111/tct.70080
42. Poeppelman RS, Cho J, Nachbor K, et al. “But why?”: explanatory feedback is a reliable marker of high-quality narrative assessment of surgical performance. Acad Med. 2024;2024:10–97.
43. Sudan R, Scott DJ, Campbell KK, et al. ACS-simulation in surgical education (ACS-SISE): improving simulation-based teaching skills of surgeon educators. Global Surg Educ J Assoc Surg Educ. 2025;4(1):1–8. doi:10.1007/s44186-025-00349-7
44. Pack R, Ott MC, Cristancho S, Chin M, Van Koughnett JA, Ott M. Lost in translation? How context shapes the implementation of competence by design in operative settings. Can J Surg. 2025;68(1):E49. doi:10.1503/cjs.014623
45. Nawi SNAM, Nordin MN. BF skinner’s learning theory and teaching strategies for special education students. Special Educ. 2025;3(1):e0043–e0043.