This article originally appeared in The Bar Examiner print edition, Fall 2024 (Vol. 93, No. 3), pp. 36–40. By Joanne Kane, PhD; Douglas R. Ripkey, MS; and Mengyao Zhang, PhD.

Earlier this year, staff at NCBE received an email from faculty at two law schools seeking help in identifying students at risk of failing the bar exam. Questions raised within the email included

  1. whether there is a relationship between Multistate Professional Responsibility Examination (MPRE) and bar exam performance such that failure on the MPRE might be predictive of bar exam failure,
  2. whether it is worthwhile to require students to report their MPRE scores as part of a course grade in Professional Responsibility, and
  3. whether students who struggle with multiple-choice questions can be identified well before they sit for the bar exam, so that remediation can be pursued.

This article will address these three questions in turn.

1. Is there a relationship between MPRE scores and bar exam scores?

The short answer to this question is: yes.

A previous Bar Examiner article mentioned that a low score on the MPRE can serve as an early warning sign that a person might not be adequately prepared to perform well on the bar exam. The article succinctly noted, “Those with high MPRE scores tend to do well on the bar exam; those with low MPRE scores tend not to.”1 Importantly, this piece pointed to several other warning signs, including undergraduate grade point averages (GPA) under 3.0, Law School Admission Test (LSAT) scores under 140, and law school grades.

More recently, an analysis conducted by former NCBE Director of Testing and Research Mark A. Albanese, PhD, and his staff examined whether MPRE performance is correlated with examinees’ first-time bar exam performance. (The study looked only at Multistate Bar Examination [MBE] performance, but MBE and written-component performance are highly related.) Bar exam scores earned in administrations from February 2013 through February 2020 were included. This study revealed that MPRE performance is a strong predictor of MBE scores. Importantly, though, the study noted that law school grades, undergraduate GPA, and LSAT or Graduate Record Examination (GRE) scores would also be expected to be important predictors.

A longer answer, then, is that in the absence of any other information a person could use to predict bar exam performance, MPRE scores are helpful. In most cases, however, a faculty member with access to MPRE scores would likely also have access to other, and likely better, information, such as law school transcripts. What is not currently known is how much information, if any, an MPRE score can provide over and above other information such as law school grades. Research has repeatedly shown that law school grades, and perhaps the trajectory of law school grades over time,2 are the single best predictor of bar exam performance.

Our suspicion is that law school grades would predict bar exam performance better than MPRE scores, given this previous research and given that there is (or should be) stronger overlap between what law school grades measure and what the bar exam measures: combined law school GPA (LGPA) and the bar exam both reflect broad legal knowledge across several content areas. The MPRE measures a narrower and somewhat distinct knowledge area, which makes a comparatively weaker relationship between it and bar exam performance seem likely.
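
To make the “over and above” question concrete, the sketch below shows how such an incremental-validity check is commonly run: fit one regression predicting bar exam scores from LGPA alone, fit a second adding MPRE scores, and compare the variance each explains. The data are simulated purely for illustration; the sample size, coefficients, and variable names are our assumptions, not NCBE findings.

```python
# A minimal incremental-validity sketch on simulated data (illustrative
# assumptions only; not real NCBE data or results).
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical scores: MPRE and bar scores loosely tied to LGPA plus noise.
lgpa = rng.normal(3.0, 0.4, n)                        # law school GPA
mpre = 85 + 15 * (lgpa - 3.0) + rng.normal(0, 8, n)   # MPRE scaled score
bar = 135 + 20 * (lgpa - 3.0) + 0.2 * (mpre - 85) + rng.normal(0, 10, n)

def r_squared(X, y):
    """R^2 from an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return 1 - (y - X @ beta).var() / y.var()

r2_lgpa = r_squared(lgpa, bar)                           # LGPA alone
r2_both = r_squared(np.column_stack([lgpa, mpre]), bar)  # LGPA + MPRE

print(f"R^2, LGPA only:   {r2_lgpa:.3f}")
print(f"R^2, LGPA + MPRE: {r2_both:.3f}")
print(f"Incremental R^2:  {r2_both - r2_lgpa:.3f}")
```

If the incremental R² is near zero, the MPRE adds little predictive information once grades are known; a meaningfully positive value would suggest it carries unique signal.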

2. Is it worthwhile to require students to report their MPRE scores as part of their course grade in a Professional Responsibility class?

Law faculty and deans are best positioned to make decisions about curricular requirements and grading policy.

However, as noted above, the MPRE is designed to measure a narrower and distinct knowledge area—specifically, “candidates’ knowledge and understanding of established standards related to the professional conduct of lawyers.”3 To the extent that the aims of a Professional Responsibility course differ from this purpose, the MPRE might not be appropriate as part of a course grade. For example, the MPRE is explicitly “not a test to determine an individual’s personal ethical values.”4 If a law school course is more focused on promoting personal ethical values or exploring the difference between compliance and ethics, for example, the MPRE might not be a reasonable measure of learning within such a course.

Even if course objectives do closely or at least partially align with the MPRE’s purpose, it is a best practice in educational measurement, in most cases, to use test scores only for the purpose for which they were designed. The MPRE was not designed to determine course grades, in whole or in part, to evaluate faculty, or to predict bar exam performance.

Moreover, a logistical concern associated with making MPRE scores a part of a course grade relates to timing. The MPRE is currently offered three times per year: typically in March, August, and November. If there were a strong link between a course’s content and goals and the content measured via the MPRE, it would stand to reason that students would be best positioned to take the MPRE after they had completed the full course—not during or before they took the course. The law school academic calendar, particularly a semester-based one, would in most cases not be well aligned with MPRE offerings (or alignment might vary by academic year).

Another timing-related concern is that some jurisdictions consider an MPRE score valid only within a certain amount of time relative to when the MPRE was taken and/or when the applicant will sit for the bar exam.5 Most jurisdictions have fairly generous timelines, such that taking the examination a few months earlier or later is unlikely to affect eligibility. Even so, schools and students would be well advised to keep these time-based parameters in mind and to consider the unintended consequence that some students might need to retake the MPRE if the score they earned during their law school course expired by the time they needed it for licensure.

There is another logistical and equity-based concern: the MPRE currently costs $160. Requiring students to take the MPRE during a course (rather than after completing it) means they will not yet have seen all the content of a professional responsibility–related class, which might leave them less prepared for the exam, arguably hurting both their grade and their chances of earning a passing score. Furthermore, a few jurisdictions (currently Wisconsin and Puerto Rico) do not require the MPRE, and individuals not seeking a law license do not need it at all. Imposing an additional $160 cost on students for something that may be of little direct benefit to them could raise fairness questions or challenges.

3. Can students who struggle with multiple-choice questions be identified for remediation far enough in advance of the bar exam?

On the one hand, a large body of evidence suggests it is relatively unusual for students to struggle with multiple-choice questions but not with other types of exam questions or coursework; the rest of this response, then, should not be read as confirming that many such students exist. On the other hand, there could be good reasons to use multiple-choice questions within law school courses, and remediation could be among those reasons. Questions written following best practices6 can be an efficient and fair way to test core knowledge and skills. And having students experience the type of exam content, as well as the question types, they will see on the bar exam (or any other exam) is a testing best practice.

Be cautious in assuming some students are simply not good at multiple-choice questions.

NCBE’s internal research has consistently shown that most examinees who perform well on the MBE also perform well on the written component of the bar exam, and vice versa. It is less common to see examinees perform significantly better on one portion of the exam versus the other, at least among examinees who complete all the questions.

External research has shown similar patterns over time. One study, for example, found that candidates who performed well on the MBE also performed well on extremely high-fidelity performance tasks that used professional actors trained to portray clients or witnesses; these candidates successfully interviewed the clients, made an opening statement, wrote a response to an offer from opposing counsel, and so on. Candidates who performed poorly on the MBE also tended to perform poorly on the high-fidelity performance tasks.7

In another study, candidates received a case file and then watched a video of a hearing. They then had to answer questions about the content, such as whether an objection was appropriate. Each candidate completed a few cases. Once again, candidates who performed well on the MBE tended to also perform well on the questions regarding the hearings, whereas candidates who did not perform well on the MBE also tended not to perform well on these questions.8 Other studies comparing MBE performance with performance on more skills-based, simulated assessments show similar patterns of relationship between the two component types.

Results from other tests show some similar patterns. For example, on the GRE General Test, open-ended, essay-type questions “were not measuring anything beyond what is measured by the current multiple-choice version of these items.”9 In an undergraduate-level economics course, grades earned on a constructed-response test did not differ significantly from grades earned on a multiple-choice test.

If you are going to expose students to (more) multiple-choice questions, make them high-quality ones.

Any multiple-choice questions used should be of high quality.10 Poor-quality multiple-choice questions may in some cases be testing nothing more than “testwiseness”: an examinee’s ability to arrive at the right answer by applying knowledge other than the knowledge the question is intended to measure. Many critics of multiple-choice tests have low-quality questions in mind. As an extreme example, if we asked students to name the fastest animal on a multiple-choice test and gave options (a) the peregrine falcon; (b) a tomato; (c) Cleveland, Ohio; and (d) sidewalk chalk, the question would not actually require them to know that a peregrine falcon is indeed the fastest animal but rather to simply know which of the options even is an animal. This extreme example of a low-quality question is obviously problematic from the perspective of assessing whether a person knows which animal is fastest. Other examples of “cueing,” “testwiseness,” or bias can be more subtle. High-quality multiple-choice questions are desirable and achievable but take time and effort to develop.11
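
Part of that development effort is statistical screening. A common first check in classical test theory is an item analysis: compute each question’s difficulty (the proportion of examinees answering correctly) and its discrimination (how strongly success on the item correlates with performance on the rest of the test). Items with near-zero or negative discrimination are often measuring something other than the intended knowledge, testwiseness included. The sketch below runs such an analysis on simulated responses; the data and the flagging thresholds are illustrative rules of thumb, not NCBE criteria.

```python
# A minimal classical item analysis on simulated responses (illustrative
# only; data and flagging thresholds are rules of thumb, not NCBE criteria).
import numpy as np

rng = np.random.default_rng(1)
n_examinees, n_items = 200, 20

# Simulate ability-driven responses with a simple logistic item model.
ability = rng.normal(0.0, 1.0, n_examinees)
item_difficulty = rng.uniform(-1.0, 1.0, n_items)
prob_correct = 1.0 / (1.0 + np.exp(-(ability[:, None] - item_difficulty[None, :])))
responses = (rng.random((n_examinees, n_items)) < prob_correct).astype(int)

# Make the first item deliberately flawed: correctness is unrelated to
# ability, mimicking a question answerable through testwiseness alone.
responses[:, 0] = (rng.random(n_examinees) < 0.9).astype(int)

total = responses.sum(axis=1)

for j in range(n_items):
    item = responses[:, j]
    difficulty = item.mean()  # proportion correct (the item "p-value")
    # Item-rest correlation: correlate the item with the total score minus
    # the item itself, so the item is not correlated with itself.
    discrimination = np.corrcoef(item, total - item)[0, 1]
    flag = "  <- review" if discrimination < 0.15 or not 0.30 <= difficulty <= 0.90 else ""
    print(f"item {j + 1:2d}: difficulty = {difficulty:.2f}, "
          f"discrimination = {discrimination:.2f}{flag}")
```

In this simulation the deliberately flawed first item shows near-zero discrimination and gets flagged for review; on real response data, items that fail such screens are typically revised or retired before they count toward anyone’s score.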

Multiple-choice questions have their place and their advantages. They tend to be one of the most efficient ways of measuring a large body of knowledge. Relatedly, scores earned from multiple-choice tests tend to be more reliable than scores earned from other types of tests.12 Although some perceive them as unfair, viewed through another lens they are fairer than other types of assessment: scores are objective, reliable (as noted), replicable, and may be machine-scored. Also, by design, machines (or people scoring such anonymous work) do not know the personal characteristics of the examinee completing multiple-choice work (including race, gender, age, socioeconomic status, family connections, etc.). This anonymity is not possible in many other evaluative contexts. Finally, whereas it might be possible for an examinee to claim an assessment was unfair if it happened to focus on the one or two areas of law the student did not understand, it is hard to plausibly argue this about a 200-question test covering multiple areas of law designed to assess general legal knowledge.
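
The reliability advantage is partly a matter of test length, and it can be made concrete with the Spearman-Brown prophecy formula, a standard psychometric result that projects how score reliability changes as comparable questions are added. The starting values in the sketch below (a 25-question test with reliability 0.50) are invented for illustration, not statistics from any real exam.

```python
# A worked example of the Spearman-Brown prophecy formula. Starting
# values are illustrative assumptions, not real exam statistics.

def spearman_brown(reliability: float, length_factor: float) -> float:
    """Projected reliability when test length is multiplied by `length_factor`."""
    return length_factor * reliability / (1 + (length_factor - 1) * reliability)

base_items, base_reliability = 25, 0.50
for items in (25, 50, 100, 200):
    projected = spearman_brown(base_reliability, items / base_items)
    print(f"{items:3d} comparable questions -> projected reliability {projected:.2f}")
```

Under these assumptions, lengthening the hypothetical test from 25 to 200 comparable questions raises projected reliability from 0.50 to about 0.89, which is part of why a long multiple-choice exam can cover many areas of law while still yielding stable scores.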

Consider other methods of exposing students to test-like conditions.

In a previous Bar Examiner article, Professor Louis N. Schulze, Jr., in describing the success of academic support measures at Florida International University College of Law, wrote:

“Did you find the silver bullet of predicting bar failure?” No, because such a singular determinant likely does not exist. “Did your faculty start ‘teaching to the test’?” No, because teaching to the test is exactly the opposite of what is necessary.13

For some, there might be a fine line between teaching to the test and teaching what’s on the test. It is helpful for most exam takers—regardless of the exam—to go into the testing environment with a clear idea of the materials covered by the exam, the format of the exam, and time limits associated with the exam. The bar exam is no exception. NCBE provides free study aid resources and a short video highlighting bar exam study basics at www.ncbex.org/study-aids.

Conclusion

NCBE strongly supports law school faculty who work to help students as they prepare for careers in the legal profession and for the bar exam. The MPRE can be a useful datapoint. But it is far from the only one. Although MPRE and bar exam performance are related, so are law school grades and bar exam performance. Where available, grades may be more useful from a student-support perspective because they could help identify the particular content areas (criminal law, for example) or skillsets (legal writing and research, for example) that individual students find most challenging. Again, the MPRE is but one datapoint among many that could help predict readiness both for success on the bar exam and for professional practice; it is likely not the strongest.

Notes

  1. Susan M. Case, PhD, “The Testing Column: Failing the Bar Exam—Who’s at Fault?” 82(3) The Bar Examiner 33–35 (September 2013), at 33.
  2. Aaron N. Taylor, Jason M. Scott, and Joshua L. Jackson, “It’s Not Where You Start, It’s How You Finish: Predicting Law School and Bar Success,” 21(10) Journal of Higher Education Theory and Practice 103–142 (2021).
  3. “About the MPRE,” https://www.ncbex.org/exams/mpre/about-mpre.
  4. Id. Emphasis added.
  5. See the Comprehensive Guide to Bar Admission Requirements, Chart 6 Supplemental Remarks, “Are there any time parameters within which an MPRE score must be earned or achieved?,” available at https://reports.ncbex.org/comp-guide/charts/chart-6/.
  6. “How Are Questions Written for NCBE’s Exams? Part One: Two Multiple-Choice Question Drafters Share the Process,” 88(3) The Bar Examiner 25–29 (Fall 2019).
  7. Stephen P. Klein and Roger E. Bolus, “An Analysis of the Relationship between Clinical Skills and Bar Examination Results,” report prepared for the Committee of Bar Examiners of the State Bar of California and the National Conference of Bar Examiners (1982).
  8. Stephen P. Klein, “An Analysis of the Relationship between Trial Practice Skills and Bar Examination Results,” report prepared for the Committee of Bar Examiners of the State Bar of California and the National Conference of Bar Examiners (1982).
  9. Brent Bridgeman and Donald A. Rock, “Relationships Among Multiple-Choice and Open-Ended Analytical Questions,” 30(4) Journal of Educational Measurement 313–329 (December 1993), available at https://onlinelibrary.wiley.com/doi/10.1111/j.1745-3984.1993.tb00429.x.
  10. Andrew C. Butler, “Multiple-Choice Testing in Education: Are the Best Practices for Assessment Also Good for Learning?” 7(3) Journal of Applied Research in Memory and Cognition 323–331 (September 2018), available at https://www.sciencedirect.com/science/article/abs/pii/S2211368118301426.
  11. Susan M. Brookhart, “Making the Most of Multiple Choice,” 73(1) Educational Leadership 36–39 (September 2015), available at https://eric.ed.gov/?id=EJ1075062.
  12. David A. Walker, “Estimating How Many Observations Are Needed to Obtain a Required Level of Reliability,” 7(1) Journal of Modern Applied Statistical Methods 152–157 (May 2008), available at http://digitalcommons.wayne.edu/jmasm/vol7/iss1/12.
  13. Louis N. Schulze, Jr., et al., “Helping Students Pass the Bar Exam: Five Law Schools Share Their Successful Strategies,” 88(2) The Bar Examiner (Summer 2019).

Joanne Kane, PhD, is Associate Director of Psychometrics for the National Conference of Bar Examiners.

Douglas R. Ripkey, MS, is the Deputy Director of Psychometrics for the National Conference of Bar Examiners.

Mengyao Zhang, PhD, is Associate Director of Psychometrics for the National Conference of Bar Examiners.

Did You Know?

Law schools participating in NCBE’s aggregated data reporting program receive anonymized reports following each administration of the MPRE, showing how their students and graduates performed on the exam. This program helps schools meet their reporting obligations to the American Bar Association under Standards 302 and 315. Currently, 151 schools participate in this program.

From Wendy C. Perdue, Dean and Professor of Law, University of Richmond School of Law:

“We use the MPRE as one direct measure to assess our school-wide learning outcome related to the proper exercise of professional and ethical responsibilities, as required by ABA Standard 302.”
