To the joy of critics who bemoaned its cost, lack of effectiveness, and contribution to student burnout, the United States Medical Licensing Examination (USMLE) Step 2 Clinical Skills examination (CS2) was recently retired. The test had been suspended due to concerns about participant safety during the pandemic. Thus far, the USMLE has not indicated what, if anything, will replace it.
The drawbacks of the CS2 were well publicized. It cost US and Canadian test-takers $30.6 million to identify only 4.6% of students who did not pass on the first try. Students competed for limited slots, traveling to the only five locations in the country where the test was offered, all while shouldering additional costs associated with residency applications. Moreover, the CS2 lacked any meaningful feedback on performance, and its purported contribution to patient safety itself was disputed.
As imperfect as it certainly was, the CS2 required students to demonstrate that a minimum level of competence had been attained. It helped reassure the public that all licensed physicians had at least acquired basic clinical skills. Furthermore, it wasn't just a test for US students: The CS2 provided a common threshold for licensing for US and international medical graduates (IMGs). At a time when equal opportunity is at the forefront of our minds, that principle is certainly worth sustaining.
The ultimate responsibility for ensuring competence in clinical and communication skills has been returned to medical schools, as it should be. Without common content and methodology, however, how can we ensure that individual schools agree on, adopt, and attain a shared standard, one equal to that required of IMGs?
Related to those concerns is a larger question about what the dissolution of this clinical skills test signifies. The assessments we administer to our students send messages to those both within and outside the profession, messages about what we value. Without any suggested replacement, the USMLE seems to be signaling that bedside skills no longer matter. Even if some would argue that the CS2 had expressed a similar message with its low bar, at least there was a bar. One in 20 US students failed to meet that low bar. As costly as the CS2 may have been, it could be argued that the price was worth it. How much will it cost to identify improperly trained physicians now?
The implication of the USMLE's move is that this critical set of skills is not as deserving of attention in medical training as rote memorization. We know that this is untrue. Errors in diagnosis have substantial adverse consequences. Many of these mistakes stem from oversights during the physical examination, most commonly a simple failure to examine the patient at all. The elimination of the CS2, despite all of the test's flaws, may inadvertently reinforce and amplify that failing, and have a lasting effect on the practice of medicine and on patient care. It may drive us to an even further reliance on laboratory data and imaging while ignoring important clues to disease.
Better Than Nothing Is Still Something
Assessment drives learning. Systematic teaching coupled with robust assessment is a time-honored and proven way of learning physical examination skills and improving diagnostic acumen. Clearly, the CS2 never achieved this. Its standards were simply too low; however, the removal of the CS2 has now left our training system with no way to assess even a bare-minimum standard.
Part of the problem with the CS2 was its exclusive use of "simulated" patients to ensure that the assessment was as similar as possible for all takers. This meant that few real abnormal physical signs could be included. All that was tested was the ability to go through the motions. We surely want our students to know more than simply where to "correctly" place a stethoscope on the chest. The solution is not to eliminate all such testing but to refine it so that the exercise is worthwhile and indispensable.
Looking to other countries may be worthwhile. The United Kingdom has no national medical licensing assessment for its own medical students and has recently dropped plans to introduce one. However, IMGs are required to pass a clinical skills assessment — the Professional and Linguistic Assessments Board examination — as well as a separate, standalone test of English language proficiency to attain a license. Both UK and foreign graduates entering residency training and seeking specialty certification cannot complete those programs without passing a clinical skills assessment. For example, residents in internal medicine must complete all workplace (in-training) assessments at their own institutions as well as two knowledge tests — equivalent to US board exams — and an externally administered independent assessment of their bedside skills (Practical Assessment of Clinical Examination Skills). That test is not without its own critics and carries a financial cost for trainees; however, it includes patients with abnormal physical signs and allows clinicians to assess candidate performance.
We do not believe that a national test of this sort could ever be adopted in the United States; however, the principle behind separating the assessment of English proficiency from the assessment of clinical skill is worth consideration. Likewise, setting a common passing standard for domestic and international trainees and including patients with abnormal findings are ideas worthy of support.
Without the CS2, an aspiring US physician can progress from high school to independent, unsupervised practice without ever objectively demonstrating that they have met a minimum standard of competence in the skills central to the practice of medicine. That is a sobering thought. Compare this with the extensive training and certification of pilots, who are similarly responsible for human lives. Abandoning a standard altogether is not the answer.
If Not the CS2, Then What?
The time is right for American medical education to ensure that a graduating student has the skills they need. The retirement of the CS2 should not leave a void; the test should be replaced with better, more meaningful, affordable, locally delivered assessments that share common passing standards.
As an example, the California Consortium for the Assessment of Clinical Competence currently administers common Clinical Performance Examinations (CPX), based on agreed-upon objectives, to graduating medical students in the state of California. Although similar consortia exist in small numbers across the country, most medical schools relied on the CS2 to demonstrate that students were graduating with at least some minimal standard in clinical skills.
Whatever future assessment is developed, it must go further than the CS2 and evaluate students' skills in identifying actual examination findings. In clinical practice, recognizing an abnormality on physical examination and focusing that examination appropriately in response to different clinical presentations are vital to patient safety and outcomes. Every day, radiologists call treating teams to inform them about strangulated hernias that were overlooked in patients who presented with gastrointestinal symptoms, a diagnostic delay that can be consequential. At the same time, a lack of confidence in bedside examination skills results in the overordering of tests, diagnostic delays, unnecessary exposure to contrast and radiation, and surgical misadventures, not to mention increased expenses.
Ultimately, setting a high standard for students is about patient safety. A rethinking of clinical skills testing should therefore take into account the views of patients, as well as those of educators and students. The resources and time needed to develop in-person standardized examinations for clinical skills are significant. Given the stakes, that investment is also necessary.
Medical schools have high standards for admissions. These include identifying candidates with the humanistic qualities that we value in building the physician-patient relationship and with the critical thinking skills necessary to provide the highest level of patient care and safety. We do an incredible disservice to these students by not holding them to the same high standards at graduation that we did at admission.
Although the CS2 had its flaws as an assessment of clinical skills, its removal has left the medical training system without any agreed measure of competency. The time is now for medical schools to step forward, develop standard assessments, and ensure the quality of their graduates.
The authors of this piece are part of the Stanford Department of Medicine's Stanford Program in Bedside Medicine, teaching and promoting physical examination skills to trainees and practicing clinicians. Their group tweets @Stanfordmed25.
Medscape Med Students © 2021 WebMD, LLC
Any views expressed above are the author's own and do not necessarily reflect the views of WebMD or Medscape.
Cite this: Don't Clinical Skills Matter? We Need a New Step 2 Exam Now! - Medscape - Mar 04, 2021.