Abstract
Background: Medical students are required to undertake procedural clinical skills training before they qualify as doctors, and assessment of these skills is a critical element of their fitness to practise.

Context: Challenges facing educators include the logistics of observation (i.e. who is best placed to assess students' competence), inconclusive evidence about students' adherence to standards of practice in the workplace, and the time required for an effective assessment.

Innovation: In this article we discuss who is best placed to assess final-year medical students in the workplace. We explore the use of directly observed procedural skills (DOPS) to assess students undertaking procedural skills in a simulated workplace setting through tutor, peer and self-assessment. The DOPS tool has been used to assess foundation doctors, but can it be used effectively to assess undergraduate medical students?

Implications: The main purpose of formative assessment in the simulated setting is to support student learning through feedback and debriefing. Used in this way, the DOPS tool can provide an insightful perspective on a student's procedural clinical skills. Tutors can use the DOPS tool to guide their teaching practice by tailoring lessons towards areas in which students require more guidance. The DOPS assessment tool presents an opportunity to provide immediate and relevant feedback.
Original language | English |
---|---|
Pages (from-to) | 228-232 |
Number of pages | 5 |
Journal | Clinical Teacher |
Volume | 9 |
Issue number | 4 |
DOIs | |
Publication status | Published - 2012 |