This article originally appeared in The Bar Examiner print edition, Spring 2024 (Vol. 93, No. 1), pp. 65-67. By Geoffrey R. Bok; Carole Wesenberg

In early 2024, NCBE had 4,124 participants from 88 law schools in 41 jurisdictions take part in a nationwide field test for the NextGen bar exam. The field test helped NCBE gather feedback and data on new question types, test the computer-based delivery system, and refine timing estimates. Jurisdiction graders also had the opportunity to gain experience in grading the new exam. Two graders share their insights on the process below.

Extensive Grading Materials and High-Quality Questions: The NextGen Bar Exam Ensures Thorough Assessment and Grader Uniformity

By Geoffrey R. Bok

I have been grading bar examinations since 1989, first in Massachusetts and later in Vermont—states that first administered the Uniform Bar Exam in July 2018 and July 2016, respectively. I have written and graded state law essay questions and have graded the Multistate Essay Examination, the Multistate Performance Test (MPT), and the earlier 2023 NextGen pilot test. Based on this experience, I was asked to grade six questions during the recent field test of the NextGen bar exam.

NCBE’s training for grading the NextGen field test was extensive. Each question came with a comprehensive document that contained the full text of the question, along with related materials that had been given to examinees (such as relevant legal documents, court decisions, newspaper articles, affidavits, and transcripts of client interviews). These materials also provided the grader with a detailed and clear list of the points a correct answer would make, as well as a list of incorrect answers. Many sample answers were included, as were extensive summaries of what score each of these sample answers should receive. Finally, graders watched a separate on-demand video for two of the questions, in which an expert went over how to score answers to each question.

The first type of question I graded required the examinee, acting as an attorney in a matter, to provide two short responses based on the factual (and sometimes legal) information provided. For example, a response might identify what the attorney should do next in representing the client, what arguments to make to the opposing side to try to settle the case favorably, what discovery to request, or what facts are adverse or helpful to the client. Sometimes a question had more than two correct answers, but an examinee received full points for identifying any two correct answers and half the points for identifying only one. The questions of this type that I graded forced the examinee to review a somewhat complex set of facts and then use their legal skills and knowledge to respond briefly to a specific inquiry for two pieces of information or future action.

The other type of question I graded was similar to a shorter version of the current MPT, in that it gave the examinee the law and facts needed to write a legal document (e.g., a legal memorandum to a judge in support of a client’s motion or revisions to a draft contract). Answers that received the most points found the applicable legal standards in the provided materials, found the most helpful facts in the provided facts to apply to these legal standards, and then brought them together in a document that did not misstate either the law or the facts. Some of these questions were longer than others, so examinees were tasked with writing documents of different lengths and levels of complexity.

There also were some multiple-choice questions, but these did not appear to need a human grader to score them. These multiple-choice questions appeared both within each item set of short-answer constructed-response questions and as standalone items.

I have a few observations based on my experience grading the NextGen field test. First, in all my years of grading bar examinations, I have never been so impressed with the scope and detail of the provided grading materials and resources. Access to such extensive materials should ensure that grading a NextGen answer is both accurate and uniform from grader to grader. For NextGen, two graders will grade each response, and when their scores fall outside a set tolerance, the response will receive a third score from a grading leader or more senior grader. Next, I found that the exam’s questions, and the answers they sought, appeared to test an examinee’s legal knowledge and skills in a very fair, thorough, and thoughtful manner. Furthermore, the fact patterns (and the types of questions asked based on them) reflected real legal problems and tasks that a newly licensed lawyer would face.

Based on this field test grading, and on my many years as a grader and a bar examiner, I am excited to see the NextGen bar exam move from this stage to implementation.

Geoffrey R. Bok is a member of the Vermont Board of Bar Examiners and former chair of the Massachusetts Board of Bar Examiners. He currently chairs the Massachusetts Supreme Judicial Court’s Bar Admissions Curriculum Committee.

Ensuring Progress: The NextGen Bar Exam Improves Lawyer Skills Testing and Grader Consistency

By Carole Wesenberg

“Change does not necessarily assure progress, but progress implacably requires change.” — Henry Steele Commager1

Over the past two decades, I have graded the bar exam in Idaho 36 times. The exam has certainly changed over this time, including via adoption of the Uniform Bar Exam (UBE) in March 2011. Idaho is now exploring whether to adopt the NextGen bar exam.

Besides grading the bar exam, I have attended NCBE conferences and workshops; worked with the Idaho State Bar regarding the grading process; and, most recently, served on Idaho’s NextGen Bar Examination Task Force. To better understand the grading-process changes that would be necessary in Idaho due to the new exam, I volunteered and was selected to grade the NextGen field test.

After grading the NextGen field test, I was struck by the progress made in ensuring that the new exam provided a better assessment of the real-world skills and knowledge necessary to be a competent attorney. Examinees still need to know and understand the law. However, they no longer have to memorize, for example, the doctrinal minutiae of every complex or nuanced legal topic. Instead, for some questions examinees must identify what legal research to undertake, what facts are important, and/or what their clients need to know.

Although the grading process may not be front of mind for most stakeholders, for longtime graders like me, grading a high-stakes bar exam is an enormous responsibility. Graders need to ensure fairness to both examinees and the public. And because of the UBE’s score portability, a benefit that continues with the NextGen exam, it’s important that one jurisdiction’s grading standards do not significantly differ from another’s.

Over my years of grading, I have thought a lot about best practices to ensure fairness and consistency. The challenge with the Multistate Essay Examination (MEE) and the Multistate Performance Test (MPT), however, is that it is impossible to entirely eliminate the subjectivity that can come with grading essay questions. NCBE provides many resources—rubrics, model answers, and workshops—to help graders calibrate their scoring. Even so, grading consistency remains a challenge requiring both awareness and vigilance.

The NextGen bar exam eliminates many of my concerns regarding grading and consistency. It directs examinees to provide written answers to address specific tasks in a practice-oriented context, such as identifying errors in a contract or interpreting a provided statute. These tasks require examinees to provide concise and focused responses compared to the more open-ended format of the MEE. These changes help ensure that grading will be more objective, fair, and uniform within and throughout jurisdictions.

NCBE has done an excellent job of making the graders’ job more straightforward. The NextGen bar exam includes integrated question sets and performance tasks; NCBE provides a grading rubric, grading notes, and benchmarks for these test items. The materials were generally thorough and easy to apply. In particular, the benchmarks provided model answers and examples of partially correct and incorrect answers, taking a substantial amount of guesswork out of determining the proper number of points to assign to an answer.

Integrated question sets are geared to quickly assess an applicant’s ability to analyze, synthesize, or evaluate the information provided. The sets I graded called for two or four correct answers per question, and those answers were limited to one or two sentences; answers were graded on a two-point scale. I found the grading less time-consuming and, more importantly, found it easier to discern the differences between answers. Relative grading and rank-ordering were no longer in play. In short, grading was simplified: the answer was correct, partially correct, or incorrect.

Performance tasks were similarly more straightforward to grade. They are more akin to the current MPT; however, each question’s call was narrowly tailored to assess an applicant’s skills in legal research and written legal analysis. Thus, instead of reviewing a lengthy answer typical of an MPT, performance task answers were shorter and more focused. In the task I graded, the rubric was not broken down holistically by issue but rather via a familiar IRAC/CREAC format,2 with each element graded on a two-point scale and another two-point assessment for the overall answer organization. Although I might question whether a conclusion element should be weighed the same as a rule or application element, I am pleased that many subjective components have been eliminated. And I expect NCBE to refine the grading process further in the future.

All in all, the changes present in the NextGen bar exam ensure progress on two fronts. First: it is a better test of lawyering skills. Competent representation does not require mere memorization of the law. Nor would we expect newly licensed attorneys to rely solely on their memory; to do so would constitute malpractice. A licensure exam should test for the skills a profession requires. Second: the NextGen bar exam has a more objective, fair, and consistent grading format. Graders can be more confident in the validity and reliability of the grading process for both the applicants and the public. These are the strengths of the NextGen bar exam, and I am excited about the changes and progress to come.

Notes

  1. Henry Steele Commager, “We Have Changed—and Must,” New York Times (April 30, 1961), at 77, available at https://www.nytimes.com/1961/04/30/archives/we-have-changed-and-must-a-historian-reviews-the-profound.html.
  2. IRAC (Issue, Rule, Analysis, Conclusion) and CREAC (Conclusion, Rule, Explanation, Analysis, Conclusion) are two methods for organizing the analysis of specific legal issues.

Carole Wesenberg is a Career Law Clerk for the Honorable N. Randy Smith with the Ninth Circuit Court of Appeals. She is also an instructor at Idaho State University in Paralegal Studies.

