This article originally appeared in The Bar Examiner print edition, Fall 2018 (Vol. 87, No. 3), pp 18–24.

By NCBE Testing and Research Department

NCBE Testing and Research Department Staff: Front row: Juan Chen, Ph.D.; Mark A. Albanese, Ph.D.; Joanne Kane, Ph.D. Back row: Mengyao Zhang, Ph.D.; Andrew A. Mroch, Ph.D.; Mark Connally, Ph.D.; Douglas R. Ripkey, M.S.

NCBE receives questions about the bar examination from many sources on a continuing basis—questions related to the content of the bar exam and the evolution of the questions, or sometimes arising from misunderstandings about statistical methods used in the development and scoring of the bar exam. Broader questions have also been raised pertaining to the bar exam and licensure in general. In this Q&A article, NCBE Testing and Research Department staff members address a lucky 13 recently asked questions.

1 Does the fact that all MBE items are pretested before appearing as live items on the exam make the test more difficult and reduce scaled scores?

No. All MBE items (questions) are pretested by appearing as unscored items on exams so that their performance can be evaluated before they are used on a future exam. This process is an important step in verifying that the items demonstrate acceptable statistical characteristics. NCBE has been asked whether pretesting makes the MBE more difficult because it reduces (or eliminates) items that are too easy or items that have multiple correct answers (i.e., items for which more than one answer is correct, thereby allowing an examinee to receive credit for the question when choosing any of those correct answers).

Pretesting items reduces the number of items that do not function as intended on the MBE, which includes items that are too easy and items that would have been given credit for more than one answer (or, in some cases, for any answer), as well as items that are too difficult. Such questions are not statistically desirable, as they do not contribute enough information to separate examinees who are minimally competent from those who are not. The presence of such items would often result in a larger number of items correct (a higher raw score) than would be the case in their absence; however, after equating the MBE, scaled scores would be the same with or without these underperforming items. The statistical process of equating the MBE adjusts for differences in the difficulty of different versions of the MBE, thereby accounting for the inclusion or exclusion of these underperforming items in an examinee’s score. For example, if every examinee obtained credit for a particular item, raw scores would go up by a point, but equating would adjust for the extra “easiness” that this item contributes to raw scores and, as a result, scaled scores would be the same with or without the underperforming item.
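
To make the equating adjustment concrete, here is a toy numerical sketch in Python. The linear conversion and the one-point form offset are made-up constants for illustration only; NCBE’s operational equating is a more sophisticated statistical procedure.

```python
# Toy sketch of equating (made-up constants, not NCBE's operational
# procedure). Form B is identical to Form A except for one flawed item
# that every examinee answers correctly, so Form B raw scores run one
# point higher than they otherwise would.

def to_scale(raw_score, form_offset):
    """Convert a raw score to the common scale (arbitrary toy constants)."""
    return 0.9 * (raw_score + form_offset) + 18.0

raw_form_a = 135             # an examinee's raw score on Form A
raw_form_b = raw_form_a + 1  # the same performance on Form B

offset_a = 0.0   # Form A is the baseline
offset_b = -1.0  # equating detects Form B's extra "easiness" and removes it

print(to_scale(raw_form_a, offset_a))  # 139.5
print(to_scale(raw_form_b, offset_b))  # 139.5 -- the same scaled score
```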

2 Why did NCBE remove the “yes, if” and “no, if” answer format on MBE items and switch to “yes, because” and “no, because”?

“Yes, if” and “no, if” in this context refer to an answer format in which each answer choice contains a condition that must be separately considered in order to make an informed response. NCBE did not remove the “if”; NCBE moved the “if.” Instead of each of the four answers having its own conditional statement, the conditions now are built into the fact pattern of the question, which is the portion of a multiple-choice question that presents the problem (excluding scenarios in which “if” is part of the legal standard). This makes the question clearer by making it possible to answer it without needing to review the answer choices. We continually strive for clearly posed fact patterns in order to reduce the likelihood of an item testing only memorization rather than analysis; to reduce the demand on short-term memory (since each answer option need not be remembered and compared with the others); and to help make items more focused, which generally reduces item length. Because the condition is not removed but embedded in the fact pattern, the question retains its realism while reducing or eliminating the ambiguity and unnecessary cognitive demands associated with having to read all options to deduce the correct answer. NCBE made this change to MBE questions over 10 years ago.1

3 Why did NCBE eliminate MBE items with multiple combination choices of answers?

Items with multiple combination choices of answers allow the option of, for instance, choosing answer A, B, C, or D—or choosing combinations such as “both A and C,” “A, B, and C,” or “all of the above.” Although some have argued that items with multiple combination choices of answers are more realistic than single-best-answer items, research on such items has found that they are less reliable, are more confusing, and often contain inadvertent clues to the correct answer.2 For these reasons, questions like these were phased out of NCBE exams over 10 years ago.

4 Why has NCBE moved to MBE items with shorter fact patterns?

The shortening of fact patterns (i.e., the portion of the question setting up the problem) on MBE items was the result of efforts to streamline questions and remove verbiage that is extraneous to the content intended to be tested by the item. Sometimes, extra material is intended to make the item appear more realistic, or to give the fact pattern “texture,” but it can contribute to making items more confusing, messy, and distracting for examinees. The more extraneous material is added to an item, the greater the cognitive load it imposes on the examinee, who must read and evaluate the merits of the additional material; this adds to the fatigue factor in a long examination when an examinee has a limited amount of time to answer each question. NCBE’s goal is to make fact patterns clear and concise, and this has driven the evolution to shorter fact patterns on our MBE items today. NCBE moved to shorter fact patterns over 10 years ago.

5 Did adding Civil Procedure as a seventh topic area tested on the MBE make the MBE more difficult?

No. Examinees may perceive that the MBE is more difficult because it now involves a seventh topic area, but adding Civil Procedure has not made the MBE more difficult in the sense of having an effect on scaled scores. Civil Procedure was added to the MBE in February 2015. To make room for it without increasing the length of the MBE, the number of items on the other six content areas was reduced from 31 or 32 items to 27 (except for Contracts, which was reduced to 28), to maintain the number of scored items at 190 out of 200 (10 items being unscored pretest items). The number of items allocated to each of the seven topic areas was further reduced to 25 in February 2017 when the number of unscored items increased from 10 to 25.
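
Tallying the counts just described (the 27-item figure for Civil Procedure is not stated directly above but follows from the 190-item total):

```latex
% Scored-item tally after Civil Procedure was added (February 2015)
\[
\underbrace{5 \times 27}_{\text{five areas}} + \underbrace{28}_{\text{Contracts}}
  + \underbrace{27}_{\text{Civil Procedure}} = 190 \text{ scored}
  \qquad (190 + 10 \text{ pretest} = 200)
\]
% Since February 2017: 25 scored items per area, 25 unscored pretest items
\[
7 \times 25 = 175 \text{ scored} \qquad (175 + 25 \text{ pretest} = 200)
\]
```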

It is important to remember that the scale underlying the MBE (i.e., the scale that produces scaled scores) is maintained by the statistical process of equating that makes adjustments for fluctuations in the difficulty of MBE exams across time. This equating process helped ensure that the addition of Civil Procedure did not adversely affect scaled scores. To confirm this, we equated the MBE with and without Civil Procedure items in February 2015 and July 2015. The results were within 0.2 points for all scaled scores in the range where jurisdictions have set their passing scores (between 129 and 145 on the 200-point MBE scale) and did not result in a change in pass/fail decision for any examinee.

6 Jurisdictions differ in the number of subjects tested on the written portions of their bar examinations. Doesn’t this invalidate the standardization or the comparability of MBE scores across jurisdictions?

The MBE is designed to test the same content and to be administered and graded in as similar a fashion as possible regardless of the jurisdiction in which it is administered; this is typically the scope of standardization for an exam like the MBE. The fact that one jurisdiction tests more subject areas than another does not “invalidate” the standardization of the MBE any more than it would be invalidated because examinees attend different law schools with different curricular emphases. Performance does differ across jurisdictions, and the number of subjects tested could have some effect on examinee performance, but we have not seen research to support or refute this hypothesis.

NCBE has not conducted such a study because it presents a number of complexities. Even defining exactly what constitutes a subject area is no small challenge—one that takes thorough analysis to discern whether, for instance, a subject area tested in one jurisdiction is indeed different from a subject area called by a different name in another jurisdiction. And even if one can discern whether additional subject areas are tested in a given jurisdiction, examinees generally rise to whatever hurdle they must meet. Any such study would therefore require clear definitions of the subject areas tested in different jurisdictions, fairly comprehensive examinee background information (e.g., LSAT scores, law school GPAs), and data on how much time candidates devote to studying for the MBE content areas versus the additional jurisdiction-specific subject areas. If a jurisdiction were to approach NCBE to conduct such a study, we would certainly work with the jurisdiction to design the best study possible. It would, however, be up to the jurisdiction to obtain the background information on examinees and the cooperation of any other jurisdictions that would be needed. So, while the additional requirements that different jurisdictions add to the bar examination may or may not affect how examinees prepare for it, the MBE is designed to assess what new lawyers need to know and be able to do. Its content, scoring, and administration are consistent across jurisdictions (and schools), which allows it to serve as a consistent yardstick.

7 The bar exam is a pass/fail test, so why is a question considered bad if all examinees answer it correctly? Doesn’t that just show that they are all minimally competent?

There are really two issues to address in this question. First, strictly speaking, the bar examination is not a pass/fail test; jurisdictions use the bar examination scores to make pass/fail decisions. This may seem like splitting hairs, but it is an important distinction. Because jurisdictions set their own criteria for what constitutes a passing performance on the bar exam, there is a range of scores that determine passing performance, and we need to consider this when selecting questions for the exam so that each question contributes to differentiating scores near passing performance in different jurisdictions.

In this context, a “bad” question is shorthand for a question that is statistically not desirable. All else being equal, we want questions that cover the necessary content and have reasonable statistical characteristics. For example, we want questions that are not too hard, not too easy, and well targeted to the range of passing scores observed in jurisdictions. We avoid questions that 100% of examinees would answer correctly because (a) such questions provide no information separating examinees who are minimally competent from those who are not; (b) upon further review, extremely easy questions are often found to have flaws providing clues that point to the correct answer, thereby requiring little or no legal knowledge to answer the question correctly; and (c) such questions are not a good use of examinee time and effort, since they contribute very little to providing examinees with a scaled score on the exam that gives a good estimate of their performance.
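
As a rough numerical illustration of point (a), the sketch below runs a simplified item analysis on hypothetical data (NCBE’s actual item analyses are far more extensive). An item’s difficulty is the proportion of examinees answering it correctly; its discriminating power can be summarized by the correlation between item responses and total scores. An item that everyone answers correctly has no response variance and therefore zero discriminating power.

```python
# Simplified item analysis on hypothetical data (requires Python 3.10+
# for statistics.correlation).
import statistics

total_scores = [120, 128, 133, 141, 152, 160]  # six examinees' raw totals
item_easy    = [1, 1, 1, 1, 1, 1]              # everyone answers correctly
item_useful  = [0, 0, 1, 0, 1, 1]              # stronger examinees do better

def difficulty(responses):
    """Proportion of examinees answering the item correctly."""
    return sum(responses) / len(responses)

def discrimination(responses, totals):
    """Correlation between item responses and total scores."""
    if len(set(responses)) == 1:
        return 0.0  # no variance: the item carries no information
    return statistics.correlation(responses, totals)

print(difficulty(item_easy), discrimination(item_easy, total_scores))
# 1.0 0.0 -- perfectly easy, zero ability to separate examinees
print(difficulty(item_useful), round(discrimination(item_useful, total_scores), 2))
# 0.5 0.68 -- moderately difficult and positively discriminating
```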

8 After the July 2014 drop in bar passage rates, schools engaged in self-analysis, looking for causes for this drop in terms of what courses their students had taken, which professors they had studied with, and so on. Can bar passage rates really be helpful to law schools in this way?

As a licensing exam, the bar examination is not specifically designed for use to diagnose examinee weaknesses or to provide an evaluation of the quality of law school courses or professors. But if a law school considers bar passage to be a marker of whether it is achieving one or more of its goals—and because bar passage is an important metric used for things like ABA accreditation and law school ranking—it could be helpful for law schools to study what factors in their schools contribute to institutional-level bar passage rates. (These factors could include the required curriculum time devoted to topics on the bar examination and how this has changed over recent years, whether students have changed how they study for the bar examination, and whether the entry credentials of students have changed.) Such research could help inform what works (and what doesn’t) when law schools embark on reviewing admission standards and new programs or curriculum changes to help support their students.

9 Law schools have added new course offerings to their curricula and new opportunities for their students to reflect the changing nature of the law profession—including clinics, innovative courses, Alternative Dispute Resolution, and client counseling. Are the topics on the bar exam keeping pace?

The purpose of the bar examination is to provide the bar examining authorities with valid and reliable information regarding whether a candidate possesses the critical knowledge, skills, and abilities needed for entry-level practice so as not to pose a threat to the public. The MBE and other NCBE testing products are purchased by these authorities because they provide this information, and these products are continually evaluated by content experts and testing experts for their ability to meet the needs of the bar examining authorities in the various jurisdictions.

The topics and skills assessed on NCBE exams were confirmed to be highly relevant to entry into the legal profession by a job analysis survey of newly licensed lawyers conducted on behalf of NCBE in 2011–12. Law professors are heavily represented on NCBE’s exam drafting committees, keeping items grounded in what is being taught in law school (and ideally vice versa), and practitioners review every item for relevance and appropriateness for the newly licensed lawyer. Although the topics on the bar examination have remained relatively constant, through our development and review process the particular items associated with the topics keep evolving with changes in the law, law practice, and what is deemed critical for entry-level practice. While law schools have some flexibility in establishing their curricula, NCBE must meet the needs of the bar admissions authorities, whose number one goal is protection of the public.

NCBE supports innovation and evolution of our exams to keep pace with the rapidly evolving legal profession. Accordingly, we appointed a Testing Task Force in January 2018 charged with undertaking a three-year study to ensure that the bar examination continues to test the knowledge, skills, and abilities required for competent entry-level practice in the 21st century. The Task Force will solicit input from stakeholders, and the study will be supported by research and review conducted by independent professionals with relevant technical expertise. All aspects of the examination will be reviewed as part of this process, including the scope and depth of what is covered on current NCBE exams and how this relates to entry-level practice.

10 Instead of a bar examination, why not have diploma privilege, whereby candidates for the bar are assessed through their completion of and performance on a set of required courses at their law schools rather than via testing?

Whether or not to allow diploma privilege in lieu of passing a bar examination would be a decision made on a jurisdiction-by-jurisdiction basis by each jurisdiction’s Supreme (or highest) Court.3 Diploma privilege was common before 1917; however, the American Bar Association formally denounced the privilege in 1921, and a few years later the Association of American Law Schools, at its first regular meeting, passed a resolution critical of the admission of persons without examination. The rationale stated was that “bar admissions were too important to leave to anyone other than the state.”4 Licensure across many occupations and professions includes both an educational requirement and the required passage of a standardized examination. A major advantage of a licensure examination is that it is a consistent yardstick against which to assess those wanting to enter practice.

11 There are a number of issues about the present bar examination that merit analysis. For instance: (1) Instead of having bar passage be a categorical pass/fail decision based upon total performance on a single exam, why not have the exam be a step test, as in the medical profession, where the exam is composed of several parts to be taken over the course of a candidate’s education? (2) Or why not structure the exam so that passing certain subjects or certain parts of the exam would allow a candidate to engage in some kind of limited practice while continuing to try to pass the rest of the exam? (3) Why not have the bar exam be an open-book test, since few lawyers depend solely upon memory for much of their work? (4) Why not expand the methods of testing beyond multiple-choice and written components so that more skills can be assessed?

These are all valid issues that deserve consideration and that NCBE’s Testing Task Force will no doubt explore during the course of its study. In the meantime, there are some points worth making about each of these recommendations.

  1. Although there are advantages to step testing, there are also disadvantages—a stepped approach to testing and licensure would almost certainly involve more testing; be more complicated to develop and administer, since it would require targeting items on early steps at the student’s developmental level; and undoubtedly be more expensive than the current bar exam.
  2. Allowing examinees to separately pass individual parts of a licensing examination requires that each part be psychometrically acceptable to stand on its own for making a high-stakes decision. The current bar exam, were it to be administered in separate parts, would have difficulty meeting that requirement. All things being equal, longer tests produce more reliable scores than shorter tests. To attain a reliability of 0.80 (the typical reliability for the Multistate Professional Responsibility Examination [MPRE], which consists of 50 scored items and 10 unscored pretest items) in each of the seven content areas on the MBE, for example, we would have to double the number of items and increase the testing time from the current 6 hours to 12 hours (see the reliability sketch following this list). Besides the increase in personnel and time needed to administer what would end up being a 2-day examination, the cost of producing such an examination would be substantially increased.
  3. As to whether the bar examination should be open book, it depends upon whether bar exam decision makers believe that there are fundamental legal principles that all newly licensed lawyers should know, be able to apply, and not need to look up. Even if the answer to this question is “no,” open book might sound simple in theory, but in practice it becomes more complex. For example, what resources are allowed? Does everyone receive the same resource materials? What, if anything, is off limits? If all resources are at hand, does the time needed to reach an answer now become a variable that should be factored into scores?
  4. The bar examination could in theory be expanded to new testing formats. For example, simulations of some type might be a possibility. These new formats, however, come at a substantial cost—and what they gain in authenticity, they lose in breadth of knowledge, skills, and abilities assessed, because only one or a relatively small number of simulations would be realistically feasible and by necessity would cover a small number of topics. Such tests would increase the cost and amount of time required to take the bar exam.
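
Regarding point 2, the link between test length and reliability that drives the doubling estimate is conventionally captured by the Spearman-Brown prophecy formula. The per-area reliability used below (about 0.667) is an assumed figure chosen so that doubling projects to 0.80; it is an illustration, not a published MBE statistic.

```python
# Spearman-Brown prophecy formula: projected reliability when a test
# is lengthened by a factor k. Illustrative numbers only.

def spearman_brown(rho, k):
    """Reliability projected for a test lengthened by factor k."""
    return k * rho / (1 + (k - 1) * rho)

# If a 25-item content-area score had reliability ~0.667, doubling it
# to 50 items (the MPRE's scored length) would project to ~0.80:
print(round(spearman_brown(0.667, 2), 2))  # 0.8
# Hence doubling all seven areas implies roughly twice the items and
# twice the testing time (6 hours -> 12 hours), as noted above.
```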

It is important to keep in mind that the ultimate decision about whether the bar examination undergoes any such changes depends upon their acceptability to the jurisdictions—that is, their acceptability to the bar examiners and the state Supreme (or highest) Courts from which jurisdictions derive their authority. NCBE cannot impose such changes on the jurisdictions by fiat.

12 There are a number of issues about the present general process of licensure that deserve to be reconsidered. For instance: (1) Why don’t we provide alternate options for law school graduates who fail to pass the bar examination, such as allowing them to practice as paraprofessionals? (2) Why don’t we move toward specialization licensing instead of a single general license to practice? (3) Shouldn’t young lawyers be included in setting passing standards, since they are experiencing what it means to be a newly licensed lawyer?

These are all good questions, but NCBE has no role in determining the licensing process in the jurisdictions or how jurisdictions set their passing standards. These policies are set by the Supreme (or highest) Courts in the various jurisdictions. NCBE attempts to bring issues like these to the fore through articles in this publication and presentations at our Annual Bar Admissions Conferences, but the decisions about these issues are made by the bar examining authorities in the various jurisdictions. NCBE’s role is to provide examination materials and assistance to jurisdictions to support their licensing processes, including research to help them explore potential changes to those processes. NCBE’s Testing Task Force, in its comprehensive and future-focused study of the bar examination, is working to ensure that the examination continues to serve the licensing needs of the jurisdictions.

13 NCBE is located in Wisconsin, a state that has diploma privilege. How can NCBE’s exams be written by lawyers who never took the bar examination?

NCBE’s exams are not written by the attorneys who work at NCBE; they are written by the 64 members of NCBE’s 10 exam drafting committees, composed of a diverse group of law professors, practicing attorneys, and judges representing 25 jurisdictions. These drafting committees are supported by NCBE’s attorney editors (many of whom took the bar examination), who help coordinate the drafting of the items and work with the drafters to ensure that each item follows various conventions.

We always welcome questions about the bar examination and NCBE processes and initiatives, and we encourage you to contact us if you would like more information about the answers to any of these 13 questions or about other questions we have previously addressed. NCBE is committed to sharing information about our test development processes and decisions with all stakeholders and to engaging in important conversations about the future of the bar exam.

Notes

  1. See S.M. Case & B.E. Donahue, “Developing High-Quality Multiple-Choice Questions for Assessment in Legal Education,” 58(3) Journal of Legal Education (September 2008) 372–387.
  2. M.A. Albanese, T. Kent & D. Whitney, “Clueing in Multiple-Choice Test Items with Combinations of Correct Responses,” 54(12) Journal of Medical Education (1979) 948–950; M.A. Albanese, “Multiple-Choice Items with Combinations of Correct Responses: A Further Look at the Type K Format,” 5(2) Evaluation & the Health Professions (1982) 218–228; M.A. Albanese, “Type K and Other Complex Multiple-Choice Items: An Analysis of Research and Item Properties,” 12(1) Educational Measurement: Issues and Practice (Spring 1993) 28–33.
  3. Diploma privilege is currently allowed in Wisconsin. New Hampshire allows successful graduates of the University of New Hampshire’s Daniel Webster Scholars Honors Program—a two-year performance-based program to which students are accepted prior to their second year of law school—to be admitted to the New Hampshire Bar without taking the traditional bar examination. In 2013–14, the Iowa Supreme Court explored adopting the diploma privilege based upon recommendations from the Iowa State Bar Association regarding Iowa’s bar admission process; after considering responses received during a public comment period, the Court decided in September 2014 against adopting a diploma privilege.
  4. T.W. Goldman, “Use of the Diploma Privilege in the United States,” 10(1) Tulsa Law Review (1974) 41, available at https://cdn.ymaws.com/www.iowabar.org/resource/resmgr/files/use_of_the_diploma_privilege.pdf.

Contact us to request a PDF file of the original article as it appeared in the print edition.
