Dissecting the Rankings: The U.S. News and World Report Law School Rankings

The U.S. News and World Report law school rankings have had a remarkable impact on law school admissions and, in a broader sense, American legal culture. This piece explores the methodology and effects of this influential and controversial ranking system.

I. Introduction
II. Methodology
III. Takeaways
IV. Conclusion

I. Introduction

In the world of legal education, the U.S. News and World Report’s annually published law school rankings are ubiquitous. U.S. News has been publishing its law school rankings (along with its other graduate and undergraduate rankings) since 1989, and today its rankings are the most widely read, closely followed, and hotly debated of any such publication. The U.S. News rankings have achieved such popularity that they exert enormous influence on aspiring law students, who often turn to the rankings when deciding which law schools to apply to and which law school to attend.

The rankings are so impactful that even a small drop in a school’s ranking can be disastrous for recruiting future students to that school. Because one of the largest components of the U.S. News rankings is student quality, a slight drop that hurts recruiting can have a devastating snowball effect on a school’s ranking and reputation. While law school administrators are usually indignant about this growing influence, they are simultaneously bound to play along with the U.S. News system.

If a school doesn’t volunteer the information requested by U.S. News for compiling its rankings, U.S. News estimates the missing data… conservatively. Any single school that chose to buck the trend and fight the rankings would likely take a hit in the rankings, a risk no school is willing to take. Such a drop would precipitate angry alumni phone calls, online speculation about declining school quality, a shift of next year’s applicants to competitor schools, and, very likely, teetering job security for that law school’s administrators.

Since schools cannot effectively opt out of the ranking system, the only realistic option is to work within the methodology created by U.S. News. Though they frequently detest this reality, law schools (specifically law school admissions offices) are beholden to the metrics selected by U.S. News.

The resulting transformation of the law school admissions world can be lauded or decried. On the positive side, the U.S. News rankings have pushed schools to use objective measures for assessing candidates and have led to increased financial aid (especially for desirable applicants, who often find themselves the subject of law school bidding wars). Furthermore, the rankings themselves provide some legitimate criteria by which applicants can assess the hundreds of law schools in the United States.

On the negative side, the criteria established by U.S. News are not wholly objective… undergraduate GPA, for example, does not account for differing degrees of difficulty across institutions or curricula. One could also argue that admissions counselors should have the freedom to admit a student with a slightly lower LSAT score who has demonstrated true passion and enthusiasm for the law over a student with a higher LSAT for whom law school is simply a fallback option.

But, for practical purposes, the “value” of the U.S. News rankings is not worth spending too much of your time debating. No matter how you feel about the U.S. News rankings, they are here to stay, and their influence should not be taken lightly. The best thing someone considering law school can do regarding the U.S. News rankings is to understand them. By studying the methodology, you can see what law school admissions offices are focusing on, and that is critical information.

I would advocate strongly against using a magazine’s (or any third party’s) rankings/preferences as the determining factor driving your decisions regarding law school, but that doesn’t mean there isn’t information here that is useful. Just make sure you are thoughtful and judicious when filtering this information and deciding how to act upon it.

II. Methodology

U.S. News and World Report uses twelve metrics, weighted according to their importance, to evaluate law schools. The resulting scores are then normalized so that the first-place school (always Yale) scores 100.

These are the criteria U.S. News uses, listed in descending order of weight. Following the list is a brief sketch of how the weighting and normalization might work, and then an explanation of what each metric is and how the data is gathered for it:

25.00% - Peer Assessment Score
15.00% - Assessment Score by Lawyers/Judges
14.00% - Employment Rate 9 Months after Graduation
12.50% - Median LSAT Score
10.00% - Median Undergraduate GPA
9.75% - Average Instruction, Library, and Supporting Services Expenses
4.00% - Employment Rate at Graduation
3.00% - Student/Faculty Ratio
2.50% - Acceptance Rate
2.00% - Bar Passage Rate
1.50% - Financial Aid
0.75% - Library Resources
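For readers who like to see the arithmetic, here is a rough Python sketch of how a weighted composite and the “top school scores 100” normalization could work. The weights come from the list above; everything else is an assumption made for illustration, including the idea that each metric has already been rescaled to a comparable 0–1, higher-is-better range (U.S. News’s actual standardization is more involved and not reproduced here).

```python
# Rough sketch of the weighting and normalization described above.
# Assumption: every metric has already been rescaled to 0-1 with "higher is
# better" (so lower-is-better metrics like acceptance rate were flipped first).

WEIGHTS = {
    "peer_assessment": 0.25,
    "lawyer_judge_assessment": 0.15,
    "employment_9_months": 0.14,
    "median_lsat": 0.125,
    "median_ugpa": 0.10,
    "expenditures": 0.0975,
    "employment_at_graduation": 0.04,
    "student_faculty_ratio": 0.03,
    "acceptance_rate": 0.025,
    "bar_passage": 0.02,
    "financial_aid": 0.015,
    "library_resources": 0.0075,
}  # the twelve weights sum to 1.00

def weighted_score(metrics: dict) -> float:
    """Multiply each rescaled metric by its weight and sum."""
    return sum(WEIGHTS[name] * value for name, value in metrics.items())

def normalize_to_100(raw: dict) -> dict:
    """Rescale composite scores so the top school lands exactly at 100."""
    top = max(raw.values())
    return {school: round(100 * score / top, 1) for school, score in raw.items()}

# Hypothetical schools with made-up, already-rescaled metric values.
schools = {
    "School A": {name: 0.95 for name in WEIGHTS},
    "School B": {name: 0.80 for name in WEIGHTS},
    "School C": {name: 0.60 for name in WEIGHTS},
}

raw_scores = {school: weighted_score(m) for school, m in schools.items()}
print(normalize_to_100(raw_scores))
# {'School A': 100.0, 'School B': 84.2, 'School C': 63.2}
```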

Peer Assessment Score (25%)

The peer assessment score measures the reputation of a law school in the eyes of its peers in the law school community; it is the most heavily weighted metric in the U.S. News system. Each fall, law school deans, deans of academic affairs, chairs of faculty appointments, and the most recently tenured faculty members are asked to rate law schools on a scale from marginal (1) to outstanding (5). If a respondent does not know enough about a school to evaluate it fairly, they are asked to mark "don't know." A school's score is the average rating from all the respondents who rated it. Responses of "don't know" count neither for nor against a school. About 71 percent of those surveyed responded in 2008.

Assessment Score by Lawyers/Judges (15%)

The lawyer/judge assessment score measures the reputation of a law school in the eyes of practicing lawyers and judges. Each fall, legal professionals, including the hiring partners of law firms, state attorneys general, and selected federal and state judges, are asked to rate programs on a scale from marginal (1) to outstanding (5). If a respondent does not know enough about a school to evaluate it fairly, they are asked to mark "don't know." A school's score is the average rating from all the respondents who rated it. The two most recent years' lawyer and judge surveys are averaged to get the final lawyer/judge assessment score. Responses of "don't know" count neither for nor against a school. Only about 31 percent of those surveyed responded in 2008.

Employment Rate 9 Months after Graduation (14%)

This is the percentage of students in the graduating class who were employed nine months after graduation. For this metric, 25 percent of those whose status is unknown are counted as employed (since this data is gathered by voluntary survey). Those who are unemployed and not seeking jobs are excluded from the calculation and are not counted as unemployed. Those who are unemployed and seeking work are counted as unemployed.
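To make that rule concrete, here is a small Python sketch of one plausible reading of the calculation. The figures are invented, and the exact treatment of the denominator is not spelled out above, so treat this purely as an illustration.

```python
# One plausible reading of the nine-month employment rate described above:
# 25% of graduates with unknown status are credited as employed, graduates who
# are unemployed and not seeking work are dropped from the calculation, and
# everyone else stays in the denominator. All figures are invented.

def nine_month_employment_rate(employed: int,
                               unemployed_seeking: int,
                               unemployed_not_seeking: int,
                               unknown: int) -> float:
    counted_employed = employed + 0.25 * unknown
    # unemployed_not_seeking graduates are excluded from the calculation entirely
    denominator = employed + unemployed_seeking + unknown
    return counted_employed / denominator

# Hypothetical class of 200: 150 employed, 20 seeking, 10 not seeking, 20 unknown.
rate = nine_month_employment_rate(employed=150, unemployed_seeking=20,
                                  unemployed_not_seeking=10, unknown=20)
print(f"{rate:.1%}")  # 81.6%
```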

Median LSAT Score (12.5%)

This is simply the combined median LSAT score of all full-time and part-time entrants to a law school. It is based solely on those who actually enter the law school; those who were admitted but did not attend are not counted. Schools can use each entrant’s highest LSAT score in determining this median, which is likely responsible for the shift away from averaging the scores of multiple LSAT takers toward simply taking the highest score. As with all aspects of the admissions process, the metrics used by U.S. News have led law schools to adapt how they measure candidates accordingly.

Median Undergrad GPA (10%)

This is the combined median undergraduate grade-point average of all the full-time and part-time entrants to a law school. The GPA used in this calculation is the LSAC-calculated GPA (LSAC has a specific formula that can lower, or in some cases even raise, a candidate’s actual GPA, which is never reported to U.S. News). Note that it is the raw LSAC GPA that is reported to U.S. News, not the grade-inflation-correcting percentile/class rank of an applicant (a data point that LSAC does have and that is available to law school admissions offices).

Average Instruction, Library, Supporting Services Expenses (9.75%)

This is the amount the school spent on its faculty and facilities for the previous year.

Employment Rate at Graduation (4%)

This is the percentage of students in the graduating class who were employed or had jobs lined up at the time of graduation. Graduates who are working or pursuing graduate degrees are considered employed.

Student/Faculty Ratio (3%)

This is the ratio of students to faculty members for the previous year, using the American Bar Association’s definition.

Acceptance Rate (2.5%)

The acceptance rate is the proportion of applicants to a law school who were accepted (e.g., 5,000 applicants, 500 admitted, acceptance rate = 10%). A lower acceptance rate yields a higher score for a law school on this metric, since lower acceptance rates indicate greater selectivity. Remember that the number of accepted applicants must always be higher than the number of spots available, since applicants apply and are accepted to multiple schools (meaning some of those admitted to each school will not enroll). This matters in the application process because it incentivizes schools to admit as few applicants as possible (while still, obviously, making sure to enroll a full class). Schools attempting to keep their acceptance rates low use a variety of techniques, including decision timing, early decision applications, extensive waitlisting (since a waitlist offer does not count as an acceptance), scholarships to entice admitted applicants to matriculate, and, of course, the dreaded “yield protection” technique, where a school either waitlists or rejects an applicant because their LSAT/GPA numbers are too high and the school believes the applicant will instead attend a higher-ranked law school.
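To see the arithmetic behind this in miniature, here is a short Python sketch of the relationship between yield and acceptance rate. All numbers are invented for illustration.

```python
# Invented numbers illustrating why raising yield lowers the acceptance rate.

def acceptance_rate(applicants: int, admitted: int) -> float:
    return admitted / applicants

def admits_needed(target_class: int, expected_yield: float) -> int:
    """How many offers a school must make to fill its class at a given yield."""
    return round(target_class / expected_yield)

applicants, target_class = 5_000, 250

# At a 25% yield the school must admit ~1,000 people: a 20% acceptance rate.
print(acceptance_rate(applicants, admits_needed(target_class, 0.25)))  # 0.2

# Early decision, scholarships, and heavy waitlist use push yield up; at 50%
# yield the same class is filled with half the offers: a 10% acceptance rate.
print(acceptance_rate(applicants, admits_needed(target_class, 0.50)))  # 0.1
```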

Bar Passage Rate (2%)

This is the ratio of the school's bar passage rate for the previous graduating class to the overall first-time-taker passage rate in that jurisdiction over the previous winter and summer bar administrations. (For example, if 90 percent of a school’s graduates passed in a state where the overall first-time passage rate was 80 percent, the school’s ratio would be 1.125.) The jurisdiction used for this calculation is the state where the largest number of the school’s graduates took the bar exam. The jurisdictions’ bar passage rates are provided by the National Conference of Bar Examiners.

Amount of Financial Aid (1.5%)

This is the amount of money the school provided students in financial aid for the previous year.

Library Resources (.75%)

This is the total number of volumes and titles in the school's law library at the end of the previous fiscal year.

III. Takeaways

Looking at this methodology, there are several things we can learn. First, it should be obvious why the U.S. News rankings have remained so stable over the past 20 years (the consistently inclusive “Top 14” is evidence of this stability): the largest factors in the ranking system are the peer and lawyer/judge assessment scores. Law school administrators and faculty almost never have first-hand experience with more than a handful of schools they might have worked at (sometimes even fewer), and lawyers and judges will likely only have first-hand experience with their own alma mater. As a group, then, peers and lawyers/judges are not necessarily going to have much information to use in making their assessments. So where do they turn to form opinions? You guessed it – the very rankings they are informing. This leads to an echo-chamber effect in the ranking system that reinforces schools’ positions and leaves most schools with little chance of upward mobility.

Since these assessment scores are essentially impossible to influence effectively, schools are likely to look next to their employment rate nine months after graduation, as this is the next most heavily weighted factor, but there is not too much that schools can do with this metric either. Obviously, career services offices are dedicated to finding their graduates employment, and, to be sure, the fact that this is such a heavy factor in the U.S. News rankings gives them good reason to work hard for their students. However, since this data is gathered by survey and not all surveys sent out are returned, a full picture of employment nine months after graduation is difficult to assemble. Additionally, the incentive for schools to count their graduates as employed (especially at the time of graduation) has had some deleterious implications.

There is ambiguity about whether a graduate has to have a job that requires a JD in order to be counted (or at least a job they could not have gotten without their law school education). Obviously, if a graduate chooses to start a business or go into investment banking, they should be counted as employed even though they don’t technically need a JD for either pursuit. What if, however, an unemployed graduate who can’t find work picks up a job at Starbucks to pay rent? Are they counted as employed? Many schools will say yes, and in the most surreptitious manipulation, some career services offices have even been accused of hiring recent graduates into temporary administrative positions to ensure they are counted as “employed” in the data gathered by U.S. News. In sum, honest schools are incentivized by this ranking metric to work hard for their students, but have few means of positively influencing their rank without resorting to misrepresentation.

So what can schools do to help their rank? They can play the numbers game. The next two most heavily weighted factors are the LSAT scores and GPAs of incoming students (each measured by median). Together these factors account for a whopping 22.5% of a school’s rank, and they are much more within schools’ ability to influence. Obviously, schools are limited by the applicants they receive and by how desirable the school is to the applicant pool in general; if a Tier 2 school began admitting only 170+ LSAT applicants, it might not enroll anybody in next year’s class. Every application cycle, however, schools must pick and choose among a plethora of worthy candidates with similar credentials. The simple fact is that schools are focusing more on an applicant’s LSAT and GPA than they ever have before, and the emphasis on these measures has grown so great that these numbers alone can predict an applicant’s chances of admission at different schools with considerable accuracy.

To see the heavy impact the rankings have had on schools’ admissions criteria, look at the history of how LSAT scores have been viewed by schools. It used to be that most schools would average the scores of an applicant with multiple LSAT results. But U.S. News only cares about an entering student’s highest LSAT score, and it is now difficult to find schools that average LSAT scores; by and large, schools focus on the highest score. The focus on higher numbers has also diminished the value of work experience for law school applicants. Ten years ago, an applicant’s work experience was much more valuable on a law school application than it is today. Increasingly, an applicant’s resume, recommendations, personal statement, etc., matter only on a secondary level compared to the calculable (and USNWR-watched) metrics. Schools are looking for certain LSAT/GPA numbers, and if you don’t have them, your “soft” factors are simply not likely to come into play.

The great upside for applicants with these high numbers is that they are now hot commodities for law schools. Schools compete for applicants with high numbers and will literally get into bidding wars, doling out financial aid to, essentially, bribe applicants. Need-based aid is increasingly difficult to come by, but if you have the right numbers and target the right schools, merit-based aid is everywhere.

One final takeaway is that schools are now working hard to make their acceptance rates as low as possible, since this is one other metric they are well positioned to influence. Schools today use fee waivers, early decision contracts, and the waitlist process to lower their acceptance rates and raise their yield. With fee waivers, schools can attract huge numbers of applicants who have no chance of admission; with early decision contracts, schools can be sure an applicant will attend before admitting them; and with the waitlist, schools can admit the fewest students required to fill their classes. One big consequence of the use of the waitlist is “yield protection,” where schools waitlist candidates with numbers higher than the school’s normal LSAT/GPA range. Schools do this because they assume such candidates will be admitted to higher-ranked schools, which they will opt to attend instead.

IV. Conclusion

Whether you laud or decry the U.S. News ranking methodology, it is an undeniable staple of the law school world. If you are a serious law school applicant, you will be best served by understanding the way these rankings work and the implications they have for the law school admissions process.

Do not: use the rankings to make all of your decisions, assume that a higher ranked school is a better school, or look condescendingly upon lower ranked schools or graduates from these schools.

Do: learn how the methodology works, understand the impact it has on law schools, be aware of its shortcomings and flaws (there are many), and remember that at the end of the day these rankings are a yearly publication by a magazine seeking to garner readership, with no specific affiliation to the law school world – they are not dogma; they are a consumer product. They are, however, a part of law school culture and will remain so for the foreseeable future.