This article originally appeared in The Bar Examiner print edition, June 2016 (Vol. 85, No. 2), pp 48–56.
By Mark A. Albanese, Ph.D.

The recent declines in Multistate Bar Examination (MBE) scores have caused a number of shock waves to reverberate through the bar examining community. The chief shock is how much damage a seemingly small change in the overall MBE mean score can generate in terms of declining passing rates at the individual jurisdiction level and especially at the individual law school level. The problem is that these declines are not happening in a vacuum. They coincide with declining academic credentials of the applicants being admitted to law school (e.g., low LSAT scores and undergraduate GPAs).1 Further complicating the problem is that the declines in MBE mean scores are not spread uniformly across law schools and jurisdictions, resulting in some jurisdictions seeing a three-or-more-point drop in MBE mean scores, while others have seen a more modest drop, or even, in fewer cases, an increase.
Another factor, particularly affecting passing rates at the individual law school level, is that law students are transferring to other law schools, particularly more prestigious law schools, in increasing numbers. In 2015, Harvard accepted 55 students who transferred from other law schools, whereas in the four previous years, Harvard had never accepted more than 35 transfer students per year.2 These are not just any law students, but are often law students at the tops of their classes at the schools from which they transfer. This has been called poaching,3 but from the point of view of the law school being left, which may have invested much in recruiting, scholarship money, and instruction, it can feel a lot like desertion, particularly when the simple fact of the students’ leaving decreases the bar exam pass rate for the law school (as I will show later).
A third contributing factor to declining passing rates is defection, where graduates leave the jurisdiction in which their law school resides to take the bar exam in another jurisdiction (or perhaps decide not to take the bar exam at all). Like the deserters, the defectors are likely to be those who see better opportunities in other jurisdictions and are often students at the top of the class, not just those who seek the fabled jurisdiction with a passing standard so low that anyone with a pulse can pass.
A final contributing factor is deferral, where graduates delay taking the bar exam so that they can better prepare. (How deferrals affect the law school’s bar exam pass rate depends on whether those graduates are low or high performers, as I will discuss later.)
In this article, I will use the results from the July 2014 and July 2015 MBE administrations to illustrate how the declines in MBE scores, desertions, defections, and deferrals can affect the passing rates in different jurisdictions and individual law schools.
How the Decline in the National MBE Mean Score Plays Out at the Jurisdiction and Law School Levels
In July 2014, the MBE mean scaled score was 141.5, a decline of 2.8 points from July 2013. In July 2015, the MBE mean score declined a further 1.6 points from the July 2014 level, breaking the 140 barrier at 139.9. MBE scaled scores generally range between about 40 and 190 and typically have a standard deviation (SD) of about 16. (The standard deviation measures the spread of scores; although not strictly accurate, it can be thought of as the average deviation of scores from the mean.) Against that scale, these declines may seem almost trivial. However, a seemingly trivial change in the national mean can have dramatic consequences for an individual jurisdiction or law school.
Figure 1 shows how the MBE mean score changed from July 2014 to July 2015 for the 50 jurisdictions administering the MBE with 10 or more examinees in both administrations. The mean changed from a loss of close to 7 points in the jurisdiction at the top of the figure to a gain of slightly over 3.5 points in the jurisdiction at the bottom of the figure. Thus, the overall loss of 1.6 points from the 2014 level was not uniformly spread over all jurisdictions in 2015. If we had analogous data for the 200+ individual law schools, it is likely that we would see even more variation. In describing how a relatively small overall change in the MBE mean can have a large impact at the individual law school level, several forces need to be considered.
Figure 1: Change in MBE mean score from July 2014 to July 2015 for the 50 jurisdictions administering the MBE with 10 or more examinees in both administrations
The Decline in the Overall Applicant Pool
The law school class graduating in spring 2015 entered in the fall of 2012. The applicant pool in 2012 was 67,900, a 13.5% decline from the 2011 level of 78,500. Enrollment decreased from 48,697 in 2011 to 44,481 in 2012, an 8.7% decline. There was thus a net decline of 4.8% in the applicant pool after adjusting for the decline in enrollment (13.5% decline in applicants − 8.7% decline in enrollment = 4.8% net decline in the applicant pool), indicating that law schools were drawing deeper into the applicant pool than they had in the past year. So what can this 4.8% deeper draw mean for a law school?
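The drawdown arithmetic above can be checked directly. A minimal sketch (figures are those cited in the article; the net-decline figure is the simple difference between the two percentage drops):

```python
# Applicant-pool and enrollment figures cited in the article.
applicants_2011, applicants_2012 = 78_500, 67_900
enrolled_2011, enrolled_2012 = 48_697, 44_481

# Percentage declines from 2011 to 2012.
applicant_decline = (applicants_2011 - applicants_2012) / applicants_2011 * 100
enrollment_decline = (enrolled_2011 - enrolled_2012) / enrolled_2011 * 100

# The article's "net decline": applicants fell faster than enrollment,
# so schools had to draw deeper into the pool to fill their classes.
net_drawdown = applicant_decline - enrollment_decline

print(round(applicant_decline, 1))   # 13.5
print(round(enrollment_decline, 1))  # 8.7
print(round(net_drawdown, 1))        # 4.8
```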
For starters, the decline in applicants for any given law school will depend on how well the school draws from the overall applicant pool. Just as the MBE mean score change varied over jurisdictions, the extent to which the 13.5% decline in the applicant pool affected jurisdictions and their law schools is also likely not to have been uniform. The prestigious private schools such as Harvard and Yale and state flagship institutions may have been less affected by the decline than less prestigious or at least less attractive law schools. The net effect is that the decline of 13.5% in the applicant pool and the drawdown required to fill classes may have been doubled or even tripled for law schools at the bottom of the law school attractiveness pile.
The other factor affecting the decline in applicants at the law school level is whether a law school reduced its class size or not. If it did not, it is likely that the drawdown into the depleted applicant pool required to fill the class would have been the full 13.5% or more, depending upon how competitive the school is in attracting students.
Replacing High-Performing Applicants with Lower-Performing Applicants
The other key issue is whether the decline in applicants has been distributed uniformly over the applicant pool or has been concentrated in one area, such as on the high-performing end of the distribution. A recent study demonstrated that the largest loss of entrants to law school has been in the LSAT score range of 160+; between 2010 and 2015, the percentage of law school matriculants in the 160+ range declined from 40.8% to 32.0%, while those in the <150 LSAT score range increased from 14.2% to 23.8%.4 These numbers indicate that on average, law schools have had just short of a 9% loss of matriculants from the high-performing end of the distribution and have replaced them with students from the lower-performing end of the distribution.
More specific to the 2014 and 2015 takers of the MBE who would have entered law school in 2011 and 2012, respectively, the decline in applicants from the high-performing end of the distribution was from 39% in 2011 to 36.3% in 2012, a 2.7% reduction, accompanied by a 3.6% rise in applicants from the lower-performing end of the distribution. These changes were reflected in the overall LSAT mean of matriculants, which declined from 157.4 in 2011 to 157 in 2012.5 Because the 157 reflected LSAC’s change (effective with the fall 2012 entering class) to reporting LSAT mean scores rounded to whole numbers instead of to the tenth of a point, the decline was at most 0.9 points (i.e., if the unrounded value in 2012 was 156.5, the lowest value that would round to 157).
The MBE Mean of the Replacement Group
To illustrate how a 1.6-point drop in the MBE mean score and the loss of a percentage of its high-performing group could affect an individual law school’s passing rate, I performed a simulation. (See Tables 1 and 2.) The simulation used a fictitious law school with 200 students in 2014 as well as 200 in 2015. The school had exactly the national MBE mean score of 141.5 in 2014 and 139.9 in 2015. The fictitious school lost between 5% and 35% of its class, and the lost students had a mean score that moved from a high value (165) to a lower value (135) in 5-point increments. With each of these values set, it was a mathematical exercise to determine what the mean of the replacement lower-performing group would be. This way I could get a sense for what would happen to the law school’s passing rate in the region where different jurisdictions set their cut scores (in 2015, the cut scores on the 200-point MBE scale ranged from 129 to 145).
Table 1 shows the mean score that the replacement group to our simulated law school would need in order to compensate for a loss of different percentages of the class who had mean MBE scores ranging from 135 to 165. The lowest MBE mean score modeled, 135, is the most common cut score used in the various jurisdictions. Table 1 is somewhat complex to understand, so I will take a couple of entries and explain their interpretation.
Table 1: MBE Means of Replacement Groups as a Function of the MBE Mean of Lost High Scorers and Percentage of the Class Lost
The upper left value in Table 1 is 103; if our law school lost 10 (5%) of its 200 students who had an MBE mean of 135, it would have had to replace them with 10 who had a mean MBE score of 103 in order for the overall mean of the class to drop the 1.6 points from 141.5 to 139.9. If that were the case, the large majority of these replacements would fail the bar exam, because a score of 103 is 26 points (over 1.5 times the typical SD of 16) below the lowest passing score used in any jurisdiction (129).
For a second, less extreme example, take the value on the upper right: 133. This value would be the mean replacement score if our school lost 5% of its students with a mean score of 165. Because a score of 133 would pass in fewer than 40% of the jurisdictions (20 of the 54 jurisdictions have cut scores between 129 and 133), these replacements would be challenged to pass the bar exam in over 60% of the jurisdictions.
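Every entry in Table 1 follows from one identity: if n of the N students, averaging m points, are replaced and the class mean must fall by d points, the replacement group's mean is m − N·d/n. A small sketch reproducing the two entries just discussed (the function name is mine):

```python
def replacement_mean(lost_mean, pct_lost, class_size=200, mean_drop=1.6):
    """Mean MBE score the replacement group must have so that the class
    mean falls by `mean_drop` after `pct_lost` percent of the class
    (who averaged `lost_mean`) is swapped out for new students."""
    n_lost = class_size * pct_lost / 100
    return lost_mean - class_size * mean_drop / n_lost

# Upper left of Table 1: 5% of the class lost with a mean of 135.
print(round(replacement_mean(135, 5), 1))   # 103.0
# Upper right of Table 1: 5% of the class lost with a mean of 165.
print(round(replacement_mean(165, 5), 1))   # 133.0
```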
Note that the mean score of the replacement group rises as you go down the table (as the percentage of the class being replaced increases) and as you go across the table from left to right (as the mean score of the lost students rises). Note also that the table is sectioned into three parts. The upper left section contains replacement means that are below 129, the lowest passing score used by any jurisdiction. Over half of the replacements in this region would fail the bar examination even in the jurisdiction with the lowest cut score (for a roughly symmetric distribution, about half the scores fall above the mean and half below, so a group whose mean lies below the passing score will have more than half its scores below the passing score), and the proportion failing would rise from there as these examinees attempt to gain entry to the bar in jurisdictions with more stringent passing scores. The lower right section contains replacement means that are at or above 145, the highest passing score used by any jurisdiction; more than half of these replacements would pass the bar exam even in the jurisdictions with the most stringent passing standards. The replacements in the center section are those whose fate depends heavily on the passing standard used in the jurisdiction where they sit for the bar exam.
The Effect of the Replacement Group on the Law School Passing Rate
The final comparison of interest related to declines in MBE scores is an estimate of how much a law school’s passing rate would decline as high performers constituting a percentage of the class were lost and replaced by lower-performing students. Table 2 shows the increases in the expected percentage of failures resulting from the replacement of the lost high scorers for passing scores of 129 (part I) and 145 (part II). Bold, underlined values indicate, for each mean of the lost high scorers, the percentage of the class that produced the maximum increase in failures among their replacements. (For the part II columns with lost-group means below 150, the maximum would occur at a percentage of the class above 30%, so no value is highlighted.) Actual values were computed to 6 decimal places, so ties are due only to rounding. For example, for the cut score of 129 shown in part I, a lost high-scoring group with a mean of 135 that comprised 18% of the class produced the largest failure increase among its replacements: 3.92%. This was also the largest increase for any of the means of the lost high-performing groups. Note that the mean of the high-performing group was only 6 points above the cut score in this comparison; all other comparisons involved lost-group means more than 6 points above the cut score. Part II of Table 2 shows similar results for a passing score of 145: there, the highest failure rate increase, 3.94%, occurred when the lost high-performing group had a mean score of 150 and constituted 21% of the class.
Table 2: Increase in Percentage of Scores Below a 129 or 145 Cut Score as a Function of the MBE Mean of Lost High Scorers, Percentage of the Class Lost, and MBE Means of Replacement Groups
I. Cut Score = 129
II. Cut Score = 145
An important caveat is that these percentages are highly dependent upon the SD used. I used an SD of 16, which is typical for the entire population taking the MBE at a given administration. If the SDs of the lost high-performing group and their lower-performing replacements are smaller, as is highly likely (the former are concentrated in the top half if not the top fifth of the distribution, and the replacements are likely to fall in the bottom half), the failure rate increase will be substantially larger. For example, taking the value in Table 1 corresponding to a lost high-performing group with a mean of 135 comprising 18% of the class, the replacement group mean would be 126.1. Applying a cut score of 129 and an SD of 8 instead of 16 gives a failure increase of 7.47% instead of 3.92%. So, given an average decline in mean MBE scores of 1.6 points, the increase in the failure percentage could be as high as 8% (maybe even 10% if the SD is less than 8). And as seen in Figure 1, the change in MBE scores was substantially more than 1.6 points in many jurisdictions, maxing out at nearly 7 points. In jurisdictions with multiple law schools, then, the failure rate increases implied by Figure 1 could translate into increases of 20% or more for the less attractive schools. But it does not stop there: we have yet to consider the impact of deserters, defectors, and deferrals.
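The calculation behind these figures can be sketched under the normality assumption that appears to underlie Table 2: treating scores as normally distributed, the share of a group scoring below the cut is the normal CDF evaluated at the cut score, and the class-level increase is that shift weighted by the group's share of the class. Function names here are mine:

```python
from math import erf, sqrt

def norm_cdf(x, mean, sd):
    """P(score < x) for a normal distribution, via the error function."""
    return 0.5 * (1.0 + erf((x - mean) / (sd * sqrt(2.0))))

def failure_increase(cut, lost_mean, repl_mean, pct_lost, sd=16):
    """Percentage-point rise in a class's failure rate when `pct_lost`
    percent of the class with mean `lost_mean` is replaced by a group
    with mean `repl_mean`, assuming normal scores with the given SD."""
    shift = norm_cdf(cut, repl_mean, sd) - norm_cdf(cut, lost_mean, sd)
    return pct_lost * shift  # pct_lost is already expressed in percent

# The example above: lost mean 135 (18% of the class), replacement
# mean 126.1, cut score 129; compare SD = 16 with SD = 8.
print(round(failure_increase(129, 135, 126.1, 18, sd=16), 1))  # 3.9
print(round(failure_increase(129, 135, 126.1, 18, sd=8), 1))   # 7.5
```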
How Deserters, Defectors, and Deferrals Affect Law School Passing Rates
Recall that deserters are students who choose to transfer to another law school after their first year. These are often students near the top of the class who sometimes are recruited (poached) by other law schools. Defectors are students who sit for the bar examination in another jurisdiction or who decide not to sit for the bar examination at all. They may enter a graduate program in another profession, take a legal job that does not require admission to the bar, or take a job outside the practice of law. Deferrals are students who delay taking the bar exam to a later date. Each of these groups, if composed of high performers, can have a negative impact on a law school’s passing rate.
In 2014, the largest number of transfer students lost by a school (100) constituted 21% of the class it admitted in 2013.6 Nationally, the percentage of students who transferred in 2014 was 5.5%, up 0.5–0.9% since before 2012, when the applicant pool began its descent to levels not seen since the 1970s.7 While one-half to one percent may seem like a trivial amount, it is not uniformly spread across all law schools. One study showed a disproportionate loss to transfer by law schools in tiers 3–4 compared to their tier 1–2 counterparts for the 2007 U.S. News & World Report ranking.8
On the assumption that students who desert (transfer) tend to be the relatively high performers in a law school and that at least some of those who defect will be high performers, Table 3 shows the decrease in pass rate resulting from the loss of high performers as they constitute from 5% to 30% of the class as a function of what the passing rate of the school would have been if the high performers had remained.
Table 3: Decrease in Pass Rates from the Loss of High Scorers as a Function of the Pass Rate of All Entering Students
Note: Percentage in each column heading indicates percentage of high-performing non-takers
There are two trends in Table 3 worth noting. The first trend is that the impact of the lost high-performing group increases as its percentage of the class increases (as you go from left to right). The second trend is that the lower the overall pass rate of the school, the more keenly it feels the loss of its top performers. Whereas a school with a base expectation of 50% of its graduates passing the bar exam would see a 21% drop in its pass rate (resulting in a pass rate of 29%) if it lost 30% of its class who were top performers, a school with a base expectation of 80% would see only a 9% decline in its pass rate (resulting in a pass rate of 71%) from the same 30% loss. Keep in mind that nothing changed in either of these schools except whether their top performers decided to stay or go.
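The Table 3 arithmetic can be reproduced with a short sketch: the lost students are assumed all to be passers, so the numerator (passers) and the denominator (takers) shrink by the same head count. The function name is mine:

```python
def pass_rate_after_loss(base_rate, pct_lost, class_size=200):
    """New pass rate (percent) when `pct_lost` percent of the class,
    all of whom would have passed, do not sit for the exam."""
    passers = class_size * base_rate / 100
    lost = class_size * pct_lost / 100
    return (passers - lost) / (class_size - lost) * 100

# The two cases contrasted above, each losing 30% of the class:
print(round(pass_rate_after_loss(50, 30)))  # 29, a 21-point drop
print(round(pass_rate_after_loss(80, 30)))  # 71, a 9-point drop
```

Note how the same head-count loss costs the weaker school far more: the lost passers are a larger fraction of its smaller passing pool.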
If the defectors are low performers who have little chance of passing the exam, their decision not to take the bar exam will not hurt the passing rates of their law schools. However, if they are high performers, their defection from taking the bar exam will have a negative effect on the passing rates of their law schools.
Similarly, if those who defer are low performers with little hope of passing who want extra time to study to bolster their likelihood of passing, their deferral will actually help the law school's passing rate. But high performers who defer will have a negative impact on the passing rates of their law schools.
To see how this works, suppose we have 10 students in a class, 5 of whom will pass the bar exam. The passing rate of the law school is then 50%. If, however, 1 of the 5 who will pass the bar exam sits out, the passing rate will be 4 out of 9, or 44.4%. (If we were to still count the student sitting out as part of the 10, the passing rate would be 4/10 = 40%. Generally, however, the bar passage rate for a law school only includes those who sit for the bar examination in a given administration.) On the other hand, if a student with little hope of passing the bar exam defers in order to study, this will help the passing rate because there will still be the 5 who will pass, but there will be only 9 takers for a 55.6% passing rate.
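The ten-student example reduces to a single ratio of passers to takers; a minimal sketch:

```python
def pass_rate(passers_sitting, takers):
    """Pass rate in percent among those who actually sit for the exam."""
    return passers_sitting / takers * 100

print(round(pass_rate(5, 10), 1))  # 50.0: all 10 sit, 5 pass
print(round(pass_rate(4, 9), 1))   # 44.4: a would-be passer sits out
print(round(pass_rate(5, 9), 1))   # 55.6: a would-be failer sits out
```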
These are tough times for law schools, particularly for those that are at the low end of the attractiveness spectrum. There are reports of sizable numbers of students transferring from even mid-tier law schools.9 The fact that Harvard had over a 50% increase in its transfer group in 2015 from previous years makes this quite clear. It is a very competitive world out there.
The results of the analyses in this article show that the loss of high performers constituting 10% of a class can increase the failure rate by 2 to 6 percentage points, even if nothing else changes. If the loss amounts to 30% of the class, the failure rate can increase by up to 21 percentage points, depending upon the competitiveness of the class. Add to that the potential that high performers are in shorter supply than in the past, which could require admitting students near the bar exam cut score where failure is a greater possibility, and the result is an additional increase in failure rates of 4 to 9 percentage points, depending upon the variability of the lost group and its replacements. The problem becomes even more acute if a school experiences more than the 1.6-point decline in the MBE mean score. As shown in Figure 1, a number of jurisdictions had declines of 3 or more points, and in larger jurisdictions with multiple law schools, some schools could see declines of double that or more, since increases at some law schools would magnify the declines at the less attractive ones. In such cases, it would not be surprising to see failure rates increase on the order of 15 to 20 percentage points, with further increases if the highest-ability students who remain desert, defect, or defer.
While the impact of the deserters, defectors, and deferrals gets sorted out when schools submit their bar passage rates to the ABA, there is a long period during which only the home-jurisdiction pass rates are available. Unless a law school removes from its class profile the students who did not sit for the bar examination in the home jurisdiction, the reported passing rate can differ considerably from the actual pass rate. Accounting for those who take the bar exam in a different jurisdiction takes time, and panic can set in before all the results have been compiled.
When scores on the July 2014 bar examination were released, some law schools were saying that their bar exam scores dropped but that their classes looked just like those that were admitted the year before in terms of their academic credentials (usually LSAT scores and undergraduate GPAs). This statement was repeated by some law schools in July 2015. Although the student body could appear to be the same as it had been in the past because of students who transferred in, those students’ LSAT scores and undergraduate GPAs generally stay with the law school they originally entered in the statistics provided to the ABA, which will cause a disconnect between LSAT scores and bar exam pass rates of the law schools from which these students eventually graduate.
So, bar exam pass rates are down, and there are schools that claim that their student qualifications are the same as in the past. Are the students the same? Maybe yes, maybe no; discovering the answer would take a complete analysis, accounting for the LSAT scores and undergraduate GPAs of students below the 25th percentile (which are not represented in the ABA law school data profiles), desertions, defections, and deferrals.
- Susan M. Case, Ph.D., The Testing Column: Identifying and Helping At-Risk Students, 80(4) The Bar Examiner 30–32 (December 2011).
- Posting of Derek T. Muller to Excess of Democracy, “Top 25 Law Schools Ranked by Law Student Transfer Preferences” (Dec. 15, 2014), http://excessofdemocracy.com/blog/2014/12/top-25-law-schools-ranked-by-law-student-transfer-preferences (accessed Feb. 19, 2016).
- Posting of Jerry Organ to The Legal Whiteboard, “Changes in Composition of the LSAT Profiles of Matriculants and Law Schools between 2010 and 2015” (Jan. 18, 2016), https://lawprofessors.typepad.com/legalwhiteboard/2016/01/in-late-december-2014-i-posted-a-blog-analyzing-how-the-distribution-of-matriculants-across-lsat-categories-had-changed-si.html (accessed Feb. 19, 2016).
- Posting of Natalie Kitroeff to Bloomberg BNA Big Law Business Legal Communities, “Is the Law School Crisis Affecting Harvard?” (Jan. 12, 2016), https://bol.bna.com/is-the-law-school-crisis-affecting-harvard/ (accessed Feb. 8, 2016).
- Posting of Ry Rivard to Inside Higher Ed, “Poaching Law Students” (Feb. 4, 2015), https://www.insidehighered.com/news/2015/02/04/law-school-transfer-market-heats-getting-some-deans-hot-under-collar (accessed Feb. 18, 2016).
- Organ, supra note 1.
- Organ, supra note 1.
- Rivard, supra note 3.
- Posting of Jerry Organ to The Legal Whiteboard, “Further Understanding the Transfer Market—A Look at the 2014 Transfer Data” (Dec. 20, 2014), https://lawprofessors.typepad.com/legalwhiteboard/2014/12/further-understanding-the-transfer-market-a-look-at-the-2014-transfer-data.html (accessed Feb. 19, 2016).
- Posting of Bill Henderson to Empirical Legal Studies, “Transfer Students—The Data” (June 3, 2008), http://www.elsblog.org/the_empirical_legal_studi/2008/06/transfer-studen.html (accessed Feb. 19, 2016).
- Organ, supra note 7.
Mark A. Albanese, Ph.D., is the Director of Testing and Research for the National Conference of Bar Examiners.
Contact us to request a PDF file of the original article as it appeared in the print edition.