Myths and Misconceptions: Score Concordance – What Tables (Really) Mean

In the February edition of the Journal of Graduate Medical Education, we published an article showing concordance between scores for USMLE and COMLEX-USA.

To help students make rational test-taking decisions, those who advise DO students need to be able to interpret the results of the study.

Most residency programs understand that COMLEX-USA and USMLE are administered to make licensure decisions, and that DOs take COMLEX-USA while MDs take USMLE. Although the scores for these examinations were never meant to guide residency selection, there are ways for programs to compare the exam performance of MD and DO applicants as part of a holistic review. This study was designed to give program directors another way to understand COMLEX-USA scores.

Let’s say a program director is not very familiar with DO applicants, but they know that the average USMLE Step 2 score for residents previously accepted into their program was between 210 and 230. In their holistic review of all applicants, they can look at the Concordance Table and see, with some degree of error, what that score range would look like for COMLEX-USA.
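For advisors who prefer a concrete illustration, the sketch below shows, in Python, what "reading across the Concordance Table" amounts to: interpolating between anchor points. The anchor values in this sketch are hypothetical placeholders, not the published concordance data; always consult the actual table for real score interpretation.

# Minimal illustrative sketch of a concordance lookup.
# NOTE: the anchor points below are hypothetical placeholders, NOT the
# published COMLEX-USA/USMLE concordance values.

HYPOTHETICAL_ANCHORS = [
    # (USMLE Step 2 score, concorded COMLEX-USA Level 2 score)
    (200, 450),
    (220, 520),
    (240, 600),
    (260, 680),
]

def concorded_comlex(usmle_score: float) -> float:
    """Linearly interpolate a concorded COMLEX-USA score from a USMLE score."""
    points = HYPOTHETICAL_ANCHORS
    if usmle_score <= points[0][0]:
        return points[0][1]
    if usmle_score >= points[-1][0]:
        return points[-1][1]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= usmle_score <= x1:
            return y0 + (y1 - y0) * (usmle_score - x0) / (x1 - x0)
    raise ValueError("score outside the anchor range")

# The Step 2 range of 210-230 from the example above:
low, high = concorded_comlex(210), concorded_comlex(230)
print(f"Approximate concorded COMLEX-USA range: {low:.0f}-{high:.0f}")

Again, the exact numbers here are placeholders; the point is only that a concorded value carries the measurement error the study describes and should be read as a ballpark estimate, not a precise equivalent.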

While the COMLEX-USA and USMLE series measure overlapping constructs, the concorded scores provide some “ballpark estimates” of how DO candidates would perform on USMLE.  It should be noted, however, that DOs are required to take COMLEX-USA, and their scores reflect their performance on examinations that were specifically designed for the assessment of their competencies for osteopathic medical practice. As such, the concorded scores, and any associated pass/fail inferences, are not perfect.

Here is a run-down of some of the myths we’ve seen, along with the corresponding truths:

MYTH 1:  The fact that a COMLEX-USA score goes up when a USMLE score goes up means that the examinations are interchangeable.

It has been argued that both examinations are not necessary because concordance implies similarity, and therefore redundancy. There are a number of problems with this reasoning. First, the content of the examinations is not the same. While Level 1 and Step 1 measure overlapping constructs, as do Level 2 and Step 2, they are not the same assessments. The blueprints for USMLE and COMLEX-USA each reflect the content and competencies taught and required in the curricula leading to the separate degrees, DO and MD.

To protect the public, licensing boards want to ensure that physicians, whether DO or MD, have demonstrated the competencies necessary for safe practice in their respective professions. Similarly, patients want to know that DOs have demonstrated the competencies for practice as DOs. The same expectation applies to MDs practicing as MDs, to podiatrists practicing podiatric medicine, and to optometrists practicing optometry.

Second, one could argue that any two measures of applied knowledge and related competencies will show a positive correlation: good test-takers tend to perform well on tests. This does not mean that COMLEX-USA and USMLE are interchangeable; it simply means that they measure overlapping constructs.

MYTH 2: The Pass/Fail standards do not align in this table, so USMLE is a more “rigorous” test.

The purpose of the study was to develop more accurate concordance tables, not to predict USMLE Step pass/fail status from COMLEX-USA scores. Given that the passing rates on COMLEX-USA and USMLE are quite high (most candidates score well above the cut-scores), predicting passing status on Step 1 from Level 1 (or Step 2 from Level 2) is error-prone and not appropriate.

COMLEX-USA is designed for licensure for osteopathic medical students and graduates. Passing Levels 1 and 2 indicates that they have demonstrated the competencies needed for licensure and entry into supervised residency practice at accredited GME programs. Standards are set with reference to the expected performance and competence of the candidates who are eligible to take the respective examinations.

Osteopathic medical students are not meant to take USMLE; it was never designed for them. USMLE was introduced in the early 1990s to replace the three previous MD licensing examinations (NBME, FLEX, FMGEMS) used for MD students from US LCME-accredited medical schools and for IMGs. It is not surprising that DOs perform differently than MD students on USMLE Step 1 and Step 2. We might expect the same if MD students took COMLEX-USA.

The concordance analyses show that the passing standards for COMLEX-USA and USMLE do not map 1:1. This is not surprising, given that (1) the concordance analyses were based on a sample of osteopathic students, (2) the examinations measure different but overlapping constructs, (3) the standard-setting procedures for COMLEX-USA and USMLE differ, and (4) there are no data for US MD students taking COMLEX-USA. With this in mind, it makes sense that DO students perform better on examinations such as COMLEX-USA that are specifically constructed to measure the competencies required for osteopathic medical education and practice. More important, any concorded score is an imperfect prediction of performance on the other examination, particularly for scores at or near the respective pass/fail cut-points.

MYTH 3: This data will make DO students feel they have to take USMLE.

While we can’t predict what would make a DO applicant feel more competitive as a GME candidate, let’s talk about why they shouldn’t feel they have to take USMLE in the first place. We need to do a better job informing program directors about the way DOs are educated and trained, including what COMLEX-USA scores mean, to help collectively foster inclusion and reduce bias. This is why relying on USMLE performance for osteopathic graduates is not consistent with holistic review and not appropriate.

Similarly, to further promote a more holistic selection process, the reliance on real (or mapped) USMLE scores for DOs and MDs should wane. A passing performance on COMLEX-USA Level 1 and 2 indicates that the applicant is competent in the foundational biomedical and clinical sciences necessary for licensure at the level of entry into supervised GME training.

The results from the concordance study will allow program directors to determine a more accurate “predicted” USMLE score for those DO applicants who did not take USMLE. There have been numerous concordance studies published with the exact same purpose. The reliance on USMLE should decrease with the transition to Pass/Fail for Step 1 and Level 1, but only if DO advocates, faculty, advisors, and learners also stand up for the profession, its learners, and its distinct educational pathway to help to make that happen.

Noted author and psychologist Adam Grant wrote recently, “The simplest way to be a good friend is to be a loyal fan. You root for their happiness like you rooted for Jim & Pam, Arya Stark, or Harry, Hermione, & Ron. You cheer for their success like you cheer for your favorite sports team. And you keep showing up.”

In the single GME era, with a better understanding of osteopathic medicine and the qualifications of  DO graduates, most residency program directors have already changed their selection practices and have been more inclusive of DO applicants, using more holistic review processes. As a result, the DO match rates for residency and fellowship programs have been very strong since the first uniform NRMP Match in 2020.

MYTH 4: The results of this study won’t change program requirements so I should advise DO students to take USMLE.

As advocates for our students and our profession, we need to continue to fight for the equal treatment of DO applicants in residency and fellowship selection. The perceived need to take USMLE creates additional financial and emotional stress for DO students and, to be fair, should not weigh heavily in any selection decision. The increasing acceptance of DOs in residencies and fellowships across the US, combined with the transition to Pass/Fail for both USMLE Step 1 and COMLEX-USA Level 1, should reduce the notion that DO students must take Step 1 moving forward.

Some advisors at the COMs have told us that a passing performance on Step 1 will help some DOs in certain instances in the selection process. This opinion ignores the fact that passing performance on Level 1 provides the same screening information for program directors who use licensing exams to determine readiness to enter a GME program. COMLEX-USA is the tool to assess competencies for a DO. This is the message that needs to get out, and we are hopeful that COM advisors and faculty can help us to do that.

While licensure examinations and their associated scores will likely continue to be used as part of the residency application process, particularly in more competitive specialties, the current debate around whether DO applicants can be fairly considered with their COMLEX-USA scores provides an opportunity to develop a more holistic, and arguably fairer, review process for residency applications.

Our profession must come together to address biases against DOs in GME to foster more equity and inclusion. We should take pride in the fact that we provide a distinctive, interconnected “body, mind, and spirit” approach to partner with patients to restore and maintain their health.  When we do, it will benefit not only applicants and residency programs but perhaps most importantly, the patients we have the privilege to serve.
