This article originally appeared in The Bar Examiner print edition, December 2015 (Vol. 84, No. 4), pp. 37–41.

By Mark A. Albanese, PhD

In testing, speed refers to the extent to which examinees have the time to finish the examination without being unduly hurried. Tests are generally classified as speeded if time is considered relevant to the skill being assessed, and such tests are designed so that not all examinees (perhaps none) will be able to complete all items in the time allotted. A typing test is a good example of a speeded test: there is always more text than the typist could possibly hope to type in the allotted time. The point is to determine how many words the typist can type per minute and with how many errors.

For the most part, licensing tests like the MBE and MPRE are not designed to be speeded. This puts them into a category called a power test. The goal is to see whether the examinee has the requisite skill to complete the examination within reasonable time limits.

The generally accepted rule is that a test is speeded if more than 10% of the examinees fail to reach the last item.1 There are other definitions as well. The Educational Testing Service (ETS) has adopted a guideline whereby a test is considered speeded if 1) fewer than 100% of the examinees reach 75% of the test items and 2) fewer than 80% of the examinees finish 100% of the test.2

For the most part, the ETS rule is more lax than the generally adopted 10% rule. However, other indicators of speededness are sometimes considered, such as whether examinees give the same response to the last items on the test (e.g., an examinee selects the first option, A, for the last five items). In testing, a sequence of consecutive items is called a string; if option A is selected for the last five items, that sequence would be called a five-item string of A's. This is sometimes referred to as a straight-line answer pattern, because examinees sometimes simply draw a straight line through the bubbles on the answer sheet. Another indicator sometimes considered is whether performance on the last items of the test deteriorates, meaning that the percent correct on the last items is lower than the percent correct on the first items.
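These two end-of-test indicators can be sketched in code. The function names and the answer-sheet representation below are illustrative only, not NCBE's actual scoring logic:

```python
def is_straight_lined(responses, min_run=5):
    # True if the sheet ends in a run of min_run or more identical answers;
    # a run of at least min_run is equivalent to the last min_run
    # responses all matching.
    if len(responses) < min_run:
        return False
    return len(set(responses[-min_run:])) == 1

def pct_not_finishing(answer_sheets, total_items):
    # Percentage of examinees who failed to reach the final item;
    # each sheet lists only the items the examinee actually answered.
    unfinished = sum(1 for sheet in answer_sheets if len(sheet) < total_items)
    return 100.0 * unfinished / len(answer_sheets)
```

Under the traditional rule, a `pct_not_finishing` value above 10 would flag the test as speeded.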

In this article, I will show some data related to each of these indicators of speededness for the MBE. If you have any fear that the MBE is speeded, I hope that these data will put you at ease.

Percentage of Examinees Who Failed to Reach the Last Item

Table 1 shows the number and percentage of examinees who failed to reach the final item on the MBE over the last 10 years, for both February and July administrations and for both morning and afternoon test sessions. Not only did none of the MBE examinee groups reach the 10% threshold for speededness, but none reached even one-tenth of that value (1%). By the traditional definition of speededness, the MBE is clearly not a speeded examination.

Table 1: Number and Percentage of Examinees Failing to Reach the Final Item (Traditional Indicator of Speededness)

Percentage of Examinees Who Straight-Lined at Least Five Items

Next we examined whether there were strings of five or more items at the end of the test that were given the same response. Table 2 shows that the maximum percentage of examinees who straight-lined answers to a string of at least five items never exceeded 3.4% across the last 10 years, in either February or July, in either morning or afternoon test sessions. Thus, even using this expanded definition of what constitutes speededness, the MBE data did not reach the traditional criterion of 10%.

Table 2: Number and Percentage of Examinees Having a String of Five or More Items with the Same Response at the End

Percentage of Examinees Who Failed to Reach the Last Item or Who Straight-Lined the Last Five or More Items

Finally, we looked at the combined total of examinees who either failed to reach the final item or produced a straight-line answer pattern, the results of which are presented in Table 3. The maximum ever reached was less than 4%. Thus, even expanding the definition and combining results from two indicators, the percentage never came close to meeting the 10% criterion.

Table 3: Number and Percentage of Examinees Failing to Reach the Final Item or Having a String of Five or More Items with the Same Response at the End

Performance on the First 10 Versus the Last 10 Items

Taking our analysis to the next level, we examined whether performance deteriorated at the end of the examination compared to the beginning. This comparison is somewhat problematic, because fatigue can be an alternative explanation to speededness for a performance decrease. Further, there is no good way to equate individual items, so the intrinsic difficulty of the items could differ. In spite of these problems, we computed the average percent correct on the first 10 items and the last 10 items of the February and July 2014 examinations.

In July 2014, the mean percent correct on the last 10 items was 4% lower than for the first 10 items. In February 2014, the performance on the last 10 items was actually 2% higher than on the first 10 items. Even if we use the 4% reduction in performance in July as the best estimate of performance on the last 10 items, this amount is well within the variation in average difficulty we see between items on the examination. If we couple this with the potential for fatigue to be a competing explanation for the decline in performance, the amount does not reach levels that are considered problematic.
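The comparison described above amounts to a difference of means over item-level percent-correct values. The sketch below uses invented item data for illustration; on this representation, the July 2014 result would appear as a value of about -4:

```python
def position_effect(item_pct_correct, k=10):
    # Mean percent correct on the last k items minus the mean on the
    # first k items; a negative value indicates performance dropped
    # at the end of the form.
    first = sum(item_pct_correct[:k]) / k
    last = sum(item_pct_correct[-k:]) / k
    return last - first
```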

Expected Reading Rate and Level

A few other statistics from the February and July 2014 MBEs might also be helpful. The amount of time allowed for the MBE requires that examinees read at an average rate of 92 to 95 words per minute. To provide a context for interpreting how demanding this reading rate might be, one study that determined the optimal silent reading rate for different grade levels listed a rate of 114 words per minute as being optimal for the second grade (8-year-olds).3
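The required rate is simple arithmetic: the total words in the test booklet divided by the minutes allotted. The word count below is a hypothetical figure chosen to land in the 92-95 words-per-minute range reported above, not an actual MBE statistic:

```python
def required_wpm(total_words, minutes):
    # Average reading rate (words per minute) needed to read every
    # word of the booklet within the allotted time.
    return total_words / minutes

# Hypothetical: a booklet of roughly 33,500 words over a 6-hour
# (360-minute) testing day works out to about 93 words per minute.
rate = required_wpm(33480, 360)
```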

We also computed two indexes of the complexity of the language used on the examinations and expressed the result as a grade level. (Many indexes have been developed to try to measure the readability of English writing. They examine such things as the ratio of “complex words” to “simple words” and the average length of sentences within a passage. The indexes differ in terms of the factors they include, the constants they include in the formula used to estimate reading level, the scale on which they report reading level, and precise rules for whether particular kinds of words, such as proper nouns or compound words, should be included in the calculations. But the indexes are similar in terms of all attempting to represent text complexity in terms of the grade level at which students would likely be able to understand the passage.)
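The article does not name the two indexes used. As one widely used example of the genre, the Flesch-Kincaid grade-level formula combines average sentence length with average syllables per word. The syllable counter below is a naive vowel-group heuristic, so its results are only approximate:

```python
import re

def count_syllables(word):
    # Naive heuristic: one syllable per run of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text):
    # Flesch-Kincaid grade level:
    # 0.39 * (words/sentence) + 11.8 * (syllables/word) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words) - 15.59)
```

Longer sentences and more polysyllabic words both push the estimated grade level up, which is how such indexes map prose complexity onto school grades.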

The two indexes examined different aspects of the prose and gave fairly different results. The first index indicated that the average grade level of the language used on the 2014 MBEs was between 10.55 and 10.59, midyear of the tenth grade in high school. The other index put the language level at between 13.31 and 13.37, which would be early in the second year of college. Either way, the language level of the MBE is well within what should be expected of graduating law students.


In summary, we have no reason to believe that the MBE is a speeded examination, whether we examine failure to reach the final item, straight-line answers, or declines in performance on items at the end compared to those at the beginning. None of the indexes assessing patterns at the end of the exam reach anywhere near the 10% level traditionally used to define a test as speeded. Comparisons of the percent correct on the last 10 items versus the first 10 items for the 2014 examinations found at most a 4% reduction in performance in July, and that is counterbalanced by a 2% increase in February. The reading rate required to complete the MBE in the time allotted is less than 100 words per minute, a rate that would be expected of students somewhere between the first and second grades.4 Finally, the language used on the examination is well within the education level that should be expected of graduating law students, being at most targeted at the second year of college. If you are still concerned that the MBE is a speeded examination—well, maybe not everyone should be a lawyer.


The data for this article were provided by NCBE Testing Department staff members Dr. Joanne Kane and Douglas Ripkey.


  1. J.C. Nunnally, Psychometric Theory 565 (McGraw-Hill 1967).
  2. S. Ellerin-Rindler, “Pitfalls in Assessing Test Speededness,” 3 J. Educ. Meas. 261–270 (1979).
  3. J. Hasbrouck & G.A. Tindal, “Oral Reading Fluency Norms: A Valuable Assessment Tool for Reading Teachers,” 59 Reading Teacher 636–644 (2006).
  4. Id.

Mark A. Albanese, PhD, is the Director of Testing and Research for the National Conference of Bar Examiners.

