This article originally appeared in The Bar Examiner print edition, Fall 2025 (Vol. 94, No. 3), pp. 20–23.

By Kara Smith, PhD
Establishing a recommended passing score range is among the most important psychometric responsibilities in the development of a high-stakes exam. But what does it mean to set a fair and meaningful passing standard before an exam has had a live administration? The answer is both technically complex and deeply human. For the NextGen UBE, this process is, at its core, about defining minimum competence in a real-world legal context and ensuring that those who pass the exam are ready to safely serve clients and the public with integrity and professionalism.
As NCBE prepares jurisdictions for the transition to the NextGen UBE, one of our key roles is to support them in determining appropriate passing scores. NCBE does not set passing scores for jurisdictions; rather, our responsibility is to recommend a score range based on empirical evidence and expert input. The aim is to provide jurisdictions with a defensible and transparent basis for decision making that respects their autonomy while ensuring consistency with the profession’s expectations for entry-level competence.
Gathering Evidence
In the absence of live data from actual test takers in high-stakes conditions, standard setting must rely on proxy tools and multipronged strategies that replicate, as closely as possible, the operational exam context. NCBE addressed this challenge through a series of linked efforts: a prototype exam, a base scale-setting process, statistical concordance studies, and a live standard-setting event. Each helped build a strong empirical and conceptual foundation. The prototype exam allowed stakeholders to experience the exam format and content firsthand. The base scale established a performance continuum grounded in observable behaviors. The concordance study helped relate new performance metrics to the legacy exam. The capstone was the live standard-setting event.
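Score-linking studies like the one described above are often built on equipercentile methods: a score on one exam is mapped to the score holding the same percentile rank on the other. As a simplified illustration of that general technique (not NCBE's actual procedure, and using hypothetical score data), such a mapping might look like:

```python
import numpy as np

def equipercentile_concordance(legacy_scores, nextgen_scores, legacy_cut):
    """Map a legacy-exam cut score to the new scale by matching percentiles.

    Finds the percentile rank of `legacy_cut` within the legacy score
    distribution, then returns the new-exam score at that same rank.
    Hypothetical illustration only; operational concordance work involves
    smoothing, sampling design, and equating error analysis.
    """
    legacy = np.sort(np.asarray(legacy_scores, dtype=float))
    # Percentile rank of the cut score within the legacy distribution
    pct = np.searchsorted(legacy, legacy_cut, side="right") / len(legacy) * 100
    # New-exam score holding the same percentile rank
    return float(np.percentile(nextgen_scores, pct))
```

With two synthetic score distributions, the function returns the point on the new scale that preserves the legacy cut score's percentile rank; real concordance studies add smoothing and report the uncertainty around that point.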
Live Standard-Setting Event
The live standard-setting event brought together a diverse group of 84 panelists (representing 40 jurisdictions) from across the legal profession—judges, attorneys, and educators—to participate in a structured, research-based process for identifying the level of performance that reflects minimal competence. The heart of this process, and the foundation for all subsequent judgments, was the development of a shared understanding of the minimally qualified candidate (MQC). This step is crucial in any standard-setting process; it is particularly important when developing a recommendation for a new exam.
Participants were first asked to generate and refine definitions of the MQC. A critical insight from this phase was recognizing that participants often begin with mental models of an ideal lawyer or a highly successful law school graduate, which tend to be aspirational. To counteract this tendency, facilitators emphasized the importance of anchoring judgments in what is necessary to ensure public protection—not perfection.
A full day of development, discussion, and debate resulted in a foundational understanding of the MQC.
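In many standard-setting designs, a shared MQC definition like the one developed here then anchors item-level judgments: each panelist estimates the probability that an MQC would answer each item correctly, and those judgments are aggregated into a candidate cut score. The sketch below follows the style of a modified Angoff procedure; this is an assumption for illustration, not necessarily the method used for the NextGen UBE.

```python
from statistics import mean

def angoff_cut_score(panelist_ratings):
    """Aggregate Angoff-style judgments into a panel-recommended cut score.

    `panelist_ratings` maps each panelist to a list of item-level
    probabilities: the chance that a minimally qualified candidate (MQC)
    answers each item correctly. Each panelist's expected MQC total score
    is the sum of their probabilities; the panel recommendation is the
    mean of those totals. (Hypothetical sketch; operational studies add
    feedback rounds and standard-error estimates.)
    """
    expected_totals = [sum(ratings) for ratings in panelist_ratings.values()]
    return mean(expected_totals)
```

The design choice worth noting is that averaging happens across panelists' whole-test expectations, so one panelist's outlier rating on a single item has limited influence on the panel's recommendation.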
Definition of a Minimally Qualified Candidate
The MQC is an individual who has attained a sufficient degree of knowledge, skill, or proficiency to provide at least minimally safe and competent legal services to the public. They can work with minimal supervision on uncontested cases or matters that follow a standard routine. Their understanding of doctrinal concepts and principles in foundational areas of the law is broad, though basic, and they can apply this knowledge to determine the types of legal problems a situation may involve and to identify the primary issues in a client matter.
They demonstrate legal reasoning ability by formulating analyses or arguments and reaching conclusions in common practice scenarios. In client representations, they can identify appropriate next steps based on relevant rules and standards and in alignment with the client’s objectives, interests, and constraints. Additionally, they can identify and implement effective legal research strategies to support their work.
Standard-Setting Participants’ Descriptors of the Minimally Qualified Candidate
The definition of an MQC was then supplemented with detailed descriptors that emerged through panelist discussion. These descriptors were synthesized and grouped by major thematic categories. Together, they provide a fuller picture of the MQC, moving beyond doctrinal knowledge to encompass applied skills and professional behaviors.
Foundational Legal Knowledge: Generally Strong
- Has baseline knowledge of legal concepts: procedural rules (criminal/civil), statutes vs. regulations
- Understands structure of the legal system (federalism, judiciary, case types)
- Knows principles of interpretation
- Can identify what area of law applies to a matter and start appropriate legal research
- Understands the procedural posture of a case and basic legal terminology
Issue Spotting: Competent, but Lacks Depth
- Can spot major legal issues and flag missing facts
- May miss nuanced or secondary issues in fact patterns
- Can identify claims but often struggles to assess their strength or rank importance
- Needs to learn how to prioritize legal issues and avoid getting lost in rabbit holes
Legal Research: Adequate for Discrete Tasks
- Can use basic research tools effectively
- Distinguishes between mandatory and persuasive authority
- Knows when and how to conduct basic research on unfamiliar issues
- May need direction for deeper or more complex legal research strategies
Legal Writing: Mixed Proficiency
- Can draft basic memos and communications to clients
- Often requires supervision when writing to courts or in complex legal contexts
- Weak writers struggle to
  - convey compelling arguments;
  - answer specific legal questions clearly; and
  - organize persuasive or strategic arguments.
Soft Skills and Client Interaction: Supervision Required
- Needs support in
  - managing client expectations;
  - conducting effective client interviews; and
  - communicating nuanced or sensitive information appropriately.
- May succeed at basic intake or gathering facts but struggles with
  - strategic communication; and
  - advising clients or managing adversarial relationships.
Critical Thinking and Judgment: Needs Development
- Often needs guidance to
  - make credible legal judgments;
  - weigh and prioritize arguments;
  - decide when to settle or escalate; and
  - avoid superficial or overly literal analysis.
- Needs to develop the crucial ability to “know what you don’t know”
Ethics: Basic Concepts Understood, but Complexity Is Challenging
- Understands foundational ethics (e.g., confidentiality, conflict checks)
- Needs supervision for
  - applying ethical rules in nuanced or complex contexts; and
  - making real-time judgment calls involving professional responsibility.
Work Habits and Professionalism: Vary Widely
- Shows two out of three: strong thinking, strong writing, or strong work ethic
- Can differentiate oneself by organization, preparation, and ability to manage workload
Tasks That Can Be Performed Without Supervision
- Can generally
  - spot key issues;
  - perform discrete legal research tasks;
  - draft internal memos and basic forms;
  - identify next steps in a case;
  - understand basic court procedures; and
  - conduct basic client intake.
Tasks That Require Supervision
- Complex writing (especially to courts)
- Evaluating case strength or credibility of legal positions
- Nuanced issue spotting and argument development
- Applying judgment in unfamiliar procedural or ethical situations
- Client communication with potential risk or legal impact
In summary, participants described MQCs as generally competent at foundational legal skills but still developing the deeper judgment, communication skills, and strategic thinking required for unsupervised practice in complex situations. They can perform discrete legal tasks and contribute meaningfully in supervised settings, and they benefit from clear expectations and structured feedback.
Integrating Standard Setting with Concordance and Outcomes Analysis
Following the live standard-setting event, NCBE brought together multiple lines of evidence to evaluate the effect of various passing score options. The detailed panelist judgments from the standard-setting event were analyzed alongside the statistical concordance study, which compared performance expectations between the current and NextGen exams. In addition, NCBE conducted an outcomes analysis to estimate how different recommended passing scores would affect pass rates under various scenarios. These projections provided insight into the practical consequences of score-setting decisions and helped ground the resulting conversation in real-world implications.
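An outcomes analysis of this kind asks, for each candidate passing score, what share of examinees would pass. Under the simplifying assumption of a fixed score distribution, the projection reduces to a cumulative count; a minimal sketch with hypothetical data (not NCBE's actual models, which project outcomes under multiple scenarios):

```python
import numpy as np

def projected_pass_rates(scores, candidate_cuts):
    """Estimate the pass rate at each candidate passing score.

    For every cut in `candidate_cuts`, returns the share of examinees in
    `scores` who would score at or above that cut. Hypothetical sketch:
    a real analysis would model score distributions under several
    scenarios rather than use a single observed sample.
    """
    scores = np.asarray(scores, dtype=float)
    return {cut: float(np.mean(scores >= cut)) for cut in candidate_cuts}
```

Tabulating these rates across the plausible range of cut scores is what lets policymakers see, at a glance, how sensitive pass rates are to each point on the scale.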
All this information was presented to NCBE’s Passing Score Advisory Panel, composed of the NextGen Implementation Steering Committee and the NCBE Board of Trustees. After reviewing the full body of evidence—psychometric data, expert judgment, and modeled outcomes—the Advisory Panel endorsed a recommended passing score range, which has been shared with jurisdictions for their consideration.
Supporting Jurisdictions in Making Informed Decisions
Developing a recommended passing score range reflects NCBE’s deep commitment to supporting jurisdictions in their implementation of the NextGen UBE. We recognize that each jurisdiction must weigh its own policy considerations and values, and we are committed to providing the tools, data, and expertise they need to make informed, defensible decisions. Our role is not to prescribe a single score but to offer empirically grounded guidance that reflects the profession’s expectations for minimum competence and prioritizes public protection.
As jurisdictions prepare for the transition to the NextGen UBE, NCBE remains available to consult, explain the process that led to the recommended passing score range, and assist in interpreting the impact of passing score choices. We understand the weight of these decisions and are proud to partner with jurisdictions as they continue to uphold integrity and fairness in the legal licensing process.
If your jurisdiction would like additional information or support regarding the recommended NextGen passing score range, please contact Kara Smith at ksmith@ncbex.org.
Kara Smith, PhD, is Chief Product Officer for the National Conference of Bar Examiners.
About the NextGen UBE
Set to debut in July 2026, the NextGen UBE will test a broad range of foundational legal doctrine and lawyering skills in the context of the current practice of law. The skills and concepts to be tested were developed through a nationwide legal practice analysis and reflect the most important knowledge and skills for newly licensed lawyers in both litigation and transactional practice. NCBE is committed to ensuring a systematic, transparent, and collaborative implementation process, informed by input from and participation by stakeholders, and guided by best practices and the professional standards for high-stakes testing. For more information, visit https://www.ncbex.org/exams/nextgen.
Contact us to request a pdf file of the original article as it appeared in the print edition.