This article originally appeared in The Bar Examiner print edition, Fall 2022 (Vol. 91, No. 3), pp. 16–18.

By Joanne E. Kane, PhD
A few weeks after each bar exam, NCBE announces the national Multistate Bar Examination (MBE) mean score through various channels, including a press release on our website at www.ncbex.org/news. The news is typically covered in the legal press, leading to speculation about bar exam pass rates across the country. MBE scores alone do not determine bar exam pass rates (total scores and jurisdictions’ passing standards do). But, because of the common and recommended practice of scaling the bar exam’s written component score to the MBE, the MBE score distribution is a clear harbinger of pass rates to come.1
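As note 1 describes, scaling is at heart a linear adjustment: a jurisdiction’s raw written scores are rescaled so that, collectively, they share the mean and standard deviation of that jurisdiction’s scaled MBE scores. A minimal sketch of that idea, using made-up scores (the operational procedure involves additional steps not shown here):

```python
import statistics

def scale_written_to_mbe(written_raw, mbe_scaled):
    """Linearly rescale raw written scores so the group has the same
    mean and standard deviation as the scaled MBE scores."""
    w_mean = statistics.mean(written_raw)
    w_sd = statistics.pstdev(written_raw)
    m_mean = statistics.mean(mbe_scaled)
    m_sd = statistics.pstdev(mbe_scaled)
    return [(w - w_mean) / w_sd * m_sd + m_mean for w in written_raw]

# Hypothetical raw essay scores and scaled MBE scores for one jurisdiction
written = [55, 60, 62, 70, 78]
mbe = [128, 134, 141, 147, 152]
scaled_written = scale_written_to_mbe(written, mbe)
```

Because the transformation is linear, each examinee’s rank order on the written component is preserved; only the scale changes.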
MBE test forms are built to relatively narrow and precise specifications2 and equated to allow for score comparability over time.3 (The term test form refers to a particular set of test items, or questions, administered at a given time.) When the mean score for a given administration differs noticeably from the previous year’s mean, that difference tends to get attention—both because it likely signals a change in pass rates (the outcome stakeholders often care most about) and because it may underscore other important changes or trends in legal education and/or in the legal profession. If the mean differs from the previous year’s by more than perhaps a point (in either direction), both NCBE staff and researchers beyond our organization tend to conduct additional research to better understand what occurred.4
These differences in the MBE mean are of interest and importance to NCBE and to examinees, but also to jurisdictions and law schools, which may be called upon to analyze why pass rates have changed within their specific examinee or graduate pools. While this analysis might lead to curricular changes, for example, it is not used by NCBE or the jurisdictions to make adjustments to exam scores, either nationally or within any jurisdiction.
When the mean is very close to previous years’ results, as was true with this July’s mean of 140.3, it can feel like there is little to investigate.5 Results for each item have been carefully checked by many individuals and groups at NCBE; each examinee’s score is calculated, checked, and double-checked across multiple systems; and, as has been true for several years, the equating process and results have been independently checked and replicated by the Center for Advanced Studies in Measurement and Assessment (CASMA). Beyond that routine work, investigations of large-scale trends can be limited when there is seemingly less to explain.
But the national mean does not tell the full story: pass rates depend on the distribution of scores relative to the passing score in a jurisdiction—not the average score relative to the passing score.6 Previous Bar Examiner articles have illustrated different types of distributions and their shapes.7
Because the MBE is not artificially curved to a standard distribution or bell curve, distributions that have the same mean can have very different shapes. Having the full distribution is necessary for predicting and calculating the pass rate. A future article will expand on this concept.
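To illustrate with hypothetical numbers: two groups of scores can share a mean of 140 yet produce very different pass rates against the same cut score, because what matters is how many scores fall at or above the cut, not where the average sits. (The passing score below is invented for illustration; actual jurisdiction standards apply to total scale scores.)

```python
import statistics

PASSING_SCORE = 135  # hypothetical cut score for illustration only

narrow = [136, 138, 140, 142, 144]  # clustered tightly around 140
wide = [125, 132, 140, 148, 155]    # same mean of 140, wider spread

def pass_rate(scores, cut):
    """Fraction of examinees scoring at or above the cut score."""
    return sum(s >= cut for s in scores) / len(scores)

print(statistics.mean(narrow), statistics.mean(wide))  # both 140
print(pass_rate(narrow, PASSING_SCORE))  # 1.0: every score clears the cut
print(pass_rate(wide, PASSING_SCORE))    # 0.6: only 3 of 5 scores clear it
```

Identical means, a 40-point difference in pass rate: this is why the full distribution, not the mean alone, is needed to anticipate pass rates.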
Toward a More Complete Understanding of Factors Affecting MBE Performance
Many factors can affect group-level examinee performance on the MBE. In past Testing Columns, we have explored some of them while seeking to put MBE performance trends into broader context. We know from several research projects, for example, that performance on the Law School Admission Test (LSAT) and law school grade point average (LGPA) are correlated both with each other and with bar exam performance, meaning that someone who does well on the LSAT and gets good grades in their law school classes is also likely to perform well on the bar exam.8
It is even possible to look at the number of people who take the LSAT nationwide in a given year and use that information to predict what might happen with the bar exam a few years down the road. A higher number of LSAT takers typically means a higher number of people subsequently applying to law schools; this, in turn, allows law schools to be more selective in their admissions, so entering 1L classes will likely have higher LSAT scores on average.9
Work by researchers from other organizations has revealed that not only does LGPA predict bar success but, more specifically, that positive growth in LGPA over the course of a student’s law school experience is the strongest predictor.10 Law schools track the ongoing impact of curricular changes and academic support measures,11 and brand-new research is examining law students’ experiences, behaviors, and attitudes during the COVID-19 pandemic.12 There is more to learn about how individuals and groups were affected by short-term and longer-term changes in remote and hybrid learning and work.
Predictions Are Just That—Predictions
Predictions are not perfect even in periods of stability. They are incomplete, and at best provide rough estimates of trends. Of course, a global pandemic will disrupt even our most carefully considered predictions. In 2018 and 2019, we saw an increase in the number of people taking the LSAT; this increase, along with other factors, suggested bar exam performance for that cohort of law school students might, three years later, also show a significant increase over previous years.13 Here we are three years later, and the actual results are somewhat different: after several years of fluctuating July MBE means just prior to the onset of the COVID-19 pandemic, the July 2021 and July 2022 means fall somewhere in the middle of recent years’ averages.
Many factors complicate our ability to predict bar exam outcomes, and the models we had used prior to the pandemic did not account for COVID-19. For students who entered law school in the fall of 2019, COVID-19’s arrival the following spring will have affected their law school careers in unprecedented ways. The global pandemic has had a profound impact on individuals, families, and communities. Law school students had to rapidly transition to remote learning and adapt to numerous other changes; to say this was challenging is an understatement, as the 2021 Law School Survey of Student Engagement reveals.14 Though we are still in the very early stages of seeking to understand all the effects the pandemic may have had on legal education so far,15 it is not implausible to think that it may have played a role in July 2022 MBE performance as well.16
Prediction Is Very Difficult—Especially About the Future
Predicting future bar passage rates based on national trends, including the number of LSATs administered, the number of 1Ls admitted, or LSAT and GRE scores for incoming students, is very challenging. So many factors could influence bar passage at the school, jurisdiction, and/or national level that accounting for all of them is likely impossible. Given the tremendous challenge of accurately modeling or predicting bar passage, why would NCBE do it at all? After all, we do not have direct access to individual examinees’ LSAT scores or grades; we do not use any of these factors to change how we score the exam; and NCBE does not set (or adjust) passing standards at all, let alone based on the number of examinees or any other factor.
One reason for doing this work is to understand as completely as we can how the legal education landscape and NCBE’s testing products work together, so that we can best comprehend changes in scores when they occur. This analysis allows NCBE and our stakeholders to best understand how the bar exam fits within the broader ecosystem of legal education and training.
Notes
1. Scaling is a procedure that statistically adjusts a jurisdiction’s raw scores on the written components of the bar exam so that collectively they have the same mean and standard deviation as the jurisdiction’s scaled MBE scores. For more about scaling, see “The Testing Column: Scaling, Revisited,” 89(1) The Bar Examiner 68–75 (Fall 2020).
2. Joanne Kane, PhD, and April Southwick, “The Testing Column: Writing, Selecting, and Placing MBE Items: A Coordinated and Collaborative Effort,” 88(1) The Bar Examiner 46–49 (Spring 2019).
3. Mark A. Albanese, PhD, “The Testing Column: Equating the MBE,” 84(3) The Bar Examiner 29–36 (September 2015).
4. See, e.g., Mark A. Albanese, PhD, “The Testing Column: The July 2014 MBE: Rogue Wave or Storm Surge?” 84(2) The Bar Examiner 35–48 (June 2015), or Mark A. Albanese, PhD, “The Testing Column: July 2019 MBE: Here Comes the Sun; August 2019 MPRE: Here Comes the Computer,” 88(3) The Bar Examiner 33–35 (Fall 2019).
5. The July 2022 mean of 140.3 was 0.1 point lower than the July 2021 mean of 140.4 and 0.8 point lower than the July 2019 mean of 141.1. (Comparability with 2020 results is of limited value due to the small number of examinees who took a full MBE in July, September, or October 2020, due to the COVID-19 pandemic.)
6. Michael T. Kane, PhD, and Joanne Kane, PhD, “Standard Setting 101: Background and Basics for the Bar Admissions Community,” 87(3) The Bar Examiner 9–17 (Fall 2018).
7. See, e.g., Mark A. Albanese, PhD, “The Testing Column: What Does the Mean Mean and How Standard Is That Deviation?,” 83(3) The Bar Examiner 37–45 (September 2014).
8. See Andrew A. Mroch, PhD, and Mark A. Albanese, PhD, “The Testing Column: Did UBE Adoption in New York Have an Impact on Bar Exam Performance?” 88(4) The Bar Examiner 34–42 (Winter 2019–2020) at 38, noting that LSAT score and LGPA each had “a relatively strong, statistically significant positive relationship with bar exam score, where a positive relationship indicates that an increase in background characteristic is associated with an increase in bar exam score.”
9. Mark A. Albanese, PhD, “The Testing Column: July 2019 MBE: Here Comes the Sun; August 2019 MPRE: Here Comes the Computer,” 88(3) The Bar Examiner 33–35 (Fall 2019).
10. Aaron N. Taylor, PhD, Jason M. Scott, MPP, and Joshua L. Jackson, PhD, “It’s Not Where You Start, It’s How You Finish: Predicting Law School and Bar Success,” 21(10) Journal of Higher Education Theory and Practice 103–142 (2021).
11. See, e.g., Louis N. Schulze, Jr., “The Science of Learning Law: Academic Support Measures at Florida International University College of Law,” 88(2) The Bar Examiner 9–11 (Summer 2019).
12. Gallup and AccessLex Institute, “Law School in a Pandemic, Year 2: Moving from Emergency Remote Teaching to Emerging Best Practices in Distance Legal Education,” available at https://www.accesslex.org/law-school-in-a-pandemic-year-2.
13. Supra note 9.
14. The Law School Survey of Student Engagement (LSSSE) 2021 Annual Survey Results, “The Covid Crisis in Legal Education,” available at https://lssse.indiana.edu/wp-content/uploads/2015/12/COVID-Crisis-in-Legal-Education-Final-10.28.21.pdf.
15. See “2021 Statistics: COVID-19: Implications for 2020–2021 Admissions,” 91(1) The Bar Examiner 30–31 (Spring 2022).
16. Supra note 12.
Joanne E. Kane, PhD, is the Associate Director of Testing for the National Conference of Bar Examiners.
Contact us to request a PDF file of the original article as it appeared in the print edition.