USNWR Law School Ranking Methodology

texas man
Posts: 169
Joined: Tue Mar 16, 2010 10:59 pm

USNWR Law School Ranking Methodology

Postby texas man » Sun May 09, 2010 4:03 pm

Here is the methodology currently used by USNWR for their law school rankings. Please share your thoughts on any specific components and/or the validity of the methodology in general.

USNWR Methodology:

Quality Assessment (40):

25.00 — Peer Assessment Score
15.00 — Assessment Score by Lawyers/Judges

Selectivity (25):

12.50 — Median LSAT Score
10.00 — Median Undergrad GPA
2.50 — Acceptance Rate

Placement Success (20):
14.00 — Employment rate nine months after graduation
4.00 — Employment rate at graduation
2.00 — Bar Passage Rate

Faculty Resources (15):
9.75 — Expenditures Per Student: average instruction, library, and supporting services
1.50 — Expenditures Per Student: financial aid
3.00 — Student/Faculty Ratio
0.75 — Library Resources

For more details regarding the methodology see : http://www.usnews.com/articles/education/best-law-schools/2010/04/15/the-law-school-rankings-methodology.html
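For anyone who wants to see how these weights combine mechanically, here's a minimal sketch (not USNWR's actual code; the per-component scores are invented, and it assumes each component has already been standardized to a 0-1 scale, with lower-is-better components like acceptance rate already flipped so that higher = better):

```python
# Published USNWR weights from the list above; they sum to 100.
WEIGHTS = {
    "peer_assessment": 25.00,
    "lawyer_judge_assessment": 15.00,
    "median_lsat": 12.50,
    "median_ugpa": 10.00,
    "acceptance_rate": 2.50,
    "employed_9_months": 14.00,
    "employed_at_graduation": 4.00,
    "bar_passage": 2.00,
    "instruction_expenditures": 9.75,
    "financial_aid_expenditures": 1.50,
    "student_faculty_ratio": 3.00,
    "library_resources": 0.75,
}

def composite(scores):
    """Weighted sum of component scores standardized to 0-1."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# A hypothetical school scoring 0.8 on every standardized component:
school_x = {k: 0.8 for k in WEIGHTS}
print(round(composite(school_x), 2))  # 80.0
```

Note that everything interesting is hidden inside the standardization step, which is exactly the part USNWR doesn't publish in detail.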

DerrickRose
Posts: 1106
Joined: Sat Dec 06, 2008 5:08 pm

Re: USNWR Law School Ranking Methodology

Postby DerrickRose » Sun May 09, 2010 4:14 pm

There's nothing *wrong* with the methodology. If the employment stats were real, the USNWR would be pretty darn near perfect. But there are two problems.

1. The employment stats aren't real, plus they don't reflect salary differences, so the gaps between the good schools and the bad ones are masked.

2. It doesn't give a real-time dynamic look at the relative values of attending the various schools. Attending #19 WUSTL in 2010 is about as good as attending #80 Rutgers-Newark in 2005. The rankings don't, and can't, reflect this.

The data in the USNWR is valuable, and for the most part it does its intended job, which is to rank the schools one-by-one relative to each other.

The problem is that the rankings aren't placed in their proper context by the people who read them, and the numbers are tacked onto an existing set of assumptions and expectations that are flagrantly untrue.

People who are looking for law school rankings to tell them that law school isn't worth the price are pointing the finger in the wrong direction.

Tautology
Posts: 434
Joined: Thu Mar 18, 2010 12:40 pm

Re: USNWR Law School Ranking Methodology

Postby Tautology » Sun May 09, 2010 4:55 pm

Overall:

The weights given to each factor are completely arbitrary. Why should the judge/lawyer assessment be 60% of the peer assessment rather than 20% or 150%? Judge/lawyer data is probably less reliable because of the response rate, but is it 60% as reliable? There is no real answer. In fact, there is no justification for combining the different pieces of data at all, except that people would rather look at one list than many, no matter how meaningless the one is.

Peer Assessment:

This data is generally interesting and useful; I have little problem with it other than the arbitrariness of the weighting, as mentioned above.

Selectivity:

Acceptance rate is fairly meaningless. It has more to do with the size of the school and its relative popularity (hey, guess what, California schools get a lot of applicants!) than it does with quality. At least they gave it a fairly meaningless weight. Or do we think the fact that Harvard, Stanford and Yale top the list makes it meaningful even when the next three are Berkeley, U. Conn and U. of Maryland? Yield protection and fee waivers make this even stupider.

It also seems odd that the ratio of the arbitrary weights assigned to the LSAT and UGPA (12.5 : 10) is far, far lower than the relative weight almost any school out there actually gives the LSAT.

Placement Success:

Self-reported data is dubious and easily manipulable. Pretty worthless 20%.

Faculty Resources:

I don't understand this section at all. Only Student/Faculty ratio seems very relevant at all, and there I'd rather see average class size (from the perspective of the student). Another section that's pretty worthless.

Conclusion:

The methodology is pretty terrible, but because these rankings are as influential as they are, they become a self-fulfilling prophecy of sorts, and so they end up tracking what we subjectively expect of them anyway. Also, they're better than the Cooley rankings.


SaintClarence27
Posts: 700
Joined: Wed Jun 10, 2009 8:48 am

Re: USNWR Law School Ranking Methodology

Postby SaintClarence27 » Sun May 09, 2010 5:28 pm

Tautology wrote:Selectivity:

Acceptance rate is fairly meaningless. It has more to do with the size of the school and its relative popularity (hey, guess what, California schools get a lot of applicants!) than it does with quality. At least they gave it a fairly meaningless weight. Or do we think the fact that Harvard, Stanford and Yale top the list makes it meaningful even when the next three are Berkeley, U. Conn and U. of Maryland? Yield protection and fee waivers make this even stupider.



This is probably the most ridiculous one. It should seriously be dropped. There is *no* reason for it to be included in the rankings, other than that it encourages schools to give out fee waivers to less qualified applicants, then break their hearts when they are inevitably rejected.

Tautology wrote:Placement Success:

Self-reported data is dubious and easily manipulable. Pretty worthless 20%.



This is probably the most important statistic, but is meaningless due to the factors cited above. If there could be some independent review process, combined with a better set of numbers (like "graduates employed in a position requiring a J.D."), it would be infinitely more meaningful. If it could be done right (possibly even including salary data), then it should represent an even larger chunk of the numbers.

Tautology wrote:Faculty Resources:

I don't understand this section at all. Only Student/Faculty ratio seems very relevant at all, and there I'd rather see average class size (from the perspective of the student). Another section that's pretty worthless.


I think the financial aid thing could be pretty useful, but it would be better served as kind of an "average cost of attendance after aid" figure. I didn't actually quote Tautology to argue with him, just as a way to organize my post, btw. I agree regarding the rankings as a sorting mechanism. I would also like to see some kind of benefit for schools with a more holistic application process, but I'm not sure how that could be done.

djgoldbe
Posts: 141
Joined: Wed Oct 28, 2009 5:23 pm

Re: USNWR Law School Ranking Methodology

Postby djgoldbe » Sun May 09, 2010 9:13 pm

If they are going to include things like expenditures, I would much rather the financial aid portion be given far more weight (and I agree it should be "total cost after financial aid," not just financial aid). 9.75 for expenditures per student versus 1.5 for financial aid expenditures seems quite off, especially when you consider that expenditures don't scale linearly with larger class sizes.

mcdad
Posts: 37
Joined: Wed Oct 07, 2009 3:39 pm

Re: USNWR Law School Ranking Methodology

Postby mcdad » Sun May 09, 2010 9:22 pm

A lot of this is measuring what people already think. When you're asked to assess another school, a lot of your rating is based on what you've heard, so the rankings become self-reinforcing. An objective measure would be a test that everyone took at the end of the first year. Comparing how a school's students did on that test with its average LSAT would be a semi-objective measure of the school's effectiveness, rather than a measure of what people had heard about the school.

Renzo
Posts: 4265
Joined: Tue Dec 02, 2008 3:23 am

Re: USNWR Law School Ranking Methodology

Postby Renzo » Sun May 09, 2010 9:37 pm

DerrickRose wrote:There's nothing *wrong* with the methodology. If the employment stats were real, the USNWR would be pretty darn near perfect. But there are two problems.

1. The employment stats aren't real, plus they don't reflect salary differences, so the gaps between the good schools and the bad ones are masked.

I disagree that there is nothing wrong, and my biggest complaint is the one you listed--it heavily counts made-up employment data. My other beef is with the peer assessments. 40% of the rankings depends on rating more than 200 schools relative to one another on a scale of 1 to 5. If Yale is 5, is NYU just as good (5), or not quite as good (4)? If NYU is a 4, that leaves you 1, 2, and 3 to rank all but (roughly) 6ish schools in the US. Not exactly precise enough to be meaningful, especially when it's 40% of the rankings.

flcath
Posts: 1502
Joined: Fri Nov 06, 2009 11:39 pm

Re: USNWR Law School Ranking Methodology

Postby flcath » Sun May 09, 2010 10:13 pm

Renzo wrote:
DerrickRose wrote:There's nothing *wrong* with the methodology. If the employment stats were real, the USNWR would be pretty darn near perfect. But there are two problems.

1. The employment stats aren't real, plus they don't reflect salary differences, so the gaps between the good schools and the bad ones are masked.

I disagree that there is nothing wrong, and my biggest complaint is the one you listed--it heavily counts made-up employment data. My other beef is with the peer assessments. 40% of the rankings depends on rating more than 200 schools relative to one another on a scale of 1 to 5. If Yale is 5, is NYU just as good (5), or not quite as good (4)? If NYU is a 4, that leaves you 1, 2, and 3 to rank all but (roughly) 6ish schools in the US. Not exactly precise enough to be meaningful, especially when it's 40% of the rankings.

This is a big issue with all polls (how fine of a distinction do you let the respondents make). Using a smaller scale (1-5 vice 1-10, etc.) will give results that are more reliable, but less informative. Subjective 1-10 polls usually produce falsely sensational results.

In your hypo, I think law profs are sufficiently qualified to discern b/t Yale and NYU, and possibly every school in the T13, on a 1-10 scale. But what about the "5 to 10" half of the scale? Do you really think your prof can meaningfully distinguish the difference between Northeastern U and Arkansas-Fayetteville?

Shot007
Posts: 96
Joined: Fri Dec 05, 2008 1:46 pm

Re: USNWR Law School Ranking Methodology

Postby Shot007 » Sun May 09, 2010 10:47 pm

DerrickRose wrote:
2. It doesn't give a real-time dynamic look at the relative values of attending the various schools. Attending #19 WUSTL in 2010 is about as good as attending #80 Rutgers-Newark in 2005. The rankings don't, and can't, reflect this.


wanted to know what everyone thought of this -- things aren't really that bad -- are they?

Joga Bonito
Posts: 302
Joined: Thu Dec 10, 2009 4:46 pm

Re: USNWR Law School Ranking Methodology

Postby Joga Bonito » Sun May 09, 2010 10:52 pm

Shot007 wrote:
DerrickRose wrote:
2. It doesn't give a real-time dynamic look at the relative values of attending the various schools. Attending #19 WUSTL in 2010 is about as good as attending #80 Rutgers-Newark in 2005. The rankings don't, and can't, reflect this.


wanted to know what everyone thought of this -- things aren't really that bad -- are they?


I don't know about all that... it's a good hyperbole, though.

beesknees
Posts: 458
Joined: Wed Nov 11, 2009 10:46 am

Re: USNWR Law School Ranking Methodology

Postby beesknees » Sun May 09, 2010 11:01 pm

I'd like to preface by saying I think these rankings are useful as an introductory guide to understanding where schools stand in terms of general relative quality. But they become less useful for drawing clear distinctions between closely ranked schools or deciding between the T30 and T50.

As far as methodology goes, where I find fault is:

1) 20% of the assessment score is based on generalized placement data. Employment rate doesn't say anything about the type of employment graduates obtain, merely that this many grads were able to obtain A job.

2) A whopping 40% of the assessment score is based on qualitative data obtained from God-knows-what sample. Does USNWR attempt to take a random sample? Is there a regional bias? We simply cannot say, but their lax standards for employment data should tell us something here.

Also, while assessment scores are obviously important because these people are the ones doing the hiring, it is still a self-perpetuating system that lags behind reality. I'm not finding fault at the extremes (read: Yale = #1), but the middle is not so obvious. Schools that are underperforming relative to what they did, say, 10 or 20 years ago can still ride on their former glory, and vice versa.

3) 15% of the assessment score is determined by largely irrelevant data - expenditures per student, library resources, etc.

4) USNWR's apparent disregard for accuracy and truthfulness in the data they report (employment data).

TCScrutinizer
Posts: 497
Joined: Sun Sep 13, 2009 11:01 pm

Re: USNWR Law School Ranking Methodology

Postby TCScrutinizer » Sun May 09, 2010 11:04 pm

Shot007 wrote:
DerrickRose wrote:
2. It doesn't give a real-time dynamic look at the relative values of attending the various schools. Attending #19 WUSTL in 2010 is about as good as attending #80 Rutgers-Newark in 2005. The rankings don't, and can't, reflect this.


wanted to know what everyone thought of this -- things aren't really that bad -- are they?


Things are only that bad if you were one of the people expecting that going to a top law school equaled being handed shit for the rest of your life.

Pearalegal
Posts: 1433
Joined: Fri Jan 30, 2009 10:50 am

Re: USNWR Law School Ranking Methodology

Postby Pearalegal » Sun May 09, 2010 11:05 pm

TCS wrote:
Shot007 wrote:
DerrickRose wrote:
2. It doesn't give a real-time dynamic look at the relative values of attending the various schools. Attending #19 WUSTL in 2010 is about as good as attending #80 Rutgers-Newark in 2005. The rankings don't, and can't, reflect this.


wanted to know what everyone thought of this -- things aren't really that bad -- are they?


Things are only that bad if you were one of the people expecting that going to a top law school equaled being handed shit for the rest of your life.


This, but less dramatic.

DerrickRose
Posts: 1106
Joined: Sat Dec 06, 2008 5:08 pm

Re: USNWR Law School Ranking Methodology

Postby DerrickRose » Mon May 10, 2010 12:08 am

Shot007 wrote:
DerrickRose wrote:
2. It doesn't give a real-time dynamic look at the relative values of attending the various schools. Attending #19 WUSTL in 2010 is about as good as attending #80 Rutgers-Newark in 2005. The rankings don't, and can't, reflect this.


wanted to know what everyone thought of this -- things aren't really that bad -- are they?


In 2005 Rutgers Newark could be attended at in-state sticker for a grand total of approximately 100k and offered a 15% chance at Biglaw.

In 2010 Wash U at sticker costs a grand total of about 175k and offers a chance at Biglaw that for the current 1L class can't be too much higher than 20%. It might be less.

You tell me.

traehekat
Posts: 3195
Joined: Thu Apr 30, 2009 4:00 pm

Re: USNWR Law School Ranking Methodology

Postby traehekat » Mon May 10, 2010 12:16 am

DerrickRose wrote:
Shot007 wrote:
DerrickRose wrote:
2. It doesn't give a real-time dynamic look at the relative values of attending the various schools. Attending #19 WUSTL in 2010 is about as good as attending #80 Rutgers-Newark in 2005. The rankings don't, and can't, reflect this.


wanted to know what everyone thought of this -- things aren't really that bad -- are they?


In 2005 Rutgers Newark could be attended at in-state sticker for a grand total of approximately 100k and offered a 15% chance at Biglaw.

In 2010 Wash U at sticker costs a grand total of about 175k and offers a chance at Biglaw that for the current 1L class can't be too much higher than 20%. It might be less.

You tell me.


Yeah, but I mean, it's all relative. You are still better off attending WashU now than attending Rutgers, as far as employment prospects go. USNWR isn't interested in evaluating the condition of the legal market (if they were, maybe they would care a little bit more about bogus placement data); they are just ranking the schools.

kalvano
Posts: 11728
Joined: Mon Sep 07, 2009 2:24 am

Re: USNWR Law School Ranking Methodology

Postby kalvano » Mon May 10, 2010 12:26 am

1) Are you Yale, Harvard, or Stanford?

If yes, go to the front of the line, no waiting for you. Would you like champagne with your lunch?

If no, then piss off, we'll get to you later when we set up the dartboard.






2) Profit.

MoS
Posts: 404
Joined: Fri Oct 16, 2009 10:59 pm

Re: USNWR Law School Ranking Methodology

Postby MoS » Mon May 10, 2010 12:30 am

I think it would be more useful if they just gave us the score they came up with, without turning it into a scale of 0-100. I would like to see the raw score to see how close schools really are or aren't. I mean, Yale could be a .978 and Harvard a .972, or Yale could be 180 and Harvard 162 for all I know. I just think the raw score would make more sense to display instead of converting it.

mazzini
Posts: 120
Joined: Tue Feb 09, 2010 2:16 pm

Re: USNWR Law School Ranking Methodology

Postby mazzini » Mon May 10, 2010 12:38 am

DerrickRose wrote:In 2005 Rutgers Newark could be attended at in-state sticker for a grand total of approximately 100k and offered a 15% chance at Biglaw.

In 2010 Wash U at sticker costs a grand total of about 175k and offers a chance at Biglaw that for the current 1L class can't be too much higher than 20%. It might be less.

You tell me.


Big law really is the only thing that matters. Region has nothing to do with placement in said area of employment either. Christ, these threads are getting tiresome.

flcath
Posts: 1502
Joined: Fri Nov 06, 2009 11:39 pm

Re: USNWR Law School Ranking Methodology

Postby flcath » Mon May 10, 2010 1:03 am

mazzini wrote:
In 2005 Rutgers Newark could be attended at in-state sticker for a grand total of approximately 100k and offered a 15% chance at Biglaw.

In 2010 Wash U at sticker costs a grand total of about 175k and offers a chance at Biglaw that for the current 1L class can't be too much higher than 20%. It might be less.

You tell me.


Big law really is the only thing that matters. Region has nothing to do with placement in said area of employment either. Christ, these threads are getting tiresome.

Yeah, that was kind of an unfair comparison. Also, a 2005 Rutgers figure versus a 2009 WUSTL (low-ball) estimate?

voice of reason
Posts: 264
Joined: Thu Oct 29, 2009 12:18 am

Re: USNWR Law School Ranking Methodology

Postby voice of reason » Mon May 10, 2010 1:14 am

People (should only) attend law school so they can get jobs as lawyers. Thus, the main ranking criterion ought to be what job opportunities the degree offers. A few of the USNWR criteria -- the quality assessments by lawyers/judges and employment rate at and 9mos after graduation -- are proxies for this, albeit flawed ones. Together, these items count for 33% of the USNWR rating. The other 67% of the rating is based on variables that are secondary at best. Some are flatly irrelevant, like financial aid expenditures.

Apparently most law students want, or end up wanting, biglaw. Therefore one of the best ratings would be an estimate of the proportion of the class that can get biglaw upon graduation. (Note that this is not identical to the proportion going into biglaw.)

PDaddy
Posts: 2073
Joined: Sat Jan 16, 2010 4:40 am

Re: USNWR Law School Ranking Methodology

Postby PDaddy » Mon May 10, 2010 1:29 am

Tautology wrote:Overall:

The weights given to each factor are completely arbitrary. Why should Judge/Lawyer assessment be 60% of Peer assessment rather than 20% or 150%? Judge/lawyer data is probably less reliable because of the response rate, but is it 60% as reliable. There is no real answer. In fact, there is no justification for combining the different pieces of data at all, except that people would rather look at one list than many no matter how meaningless the one is.

Peer Assessment:

This data is generally interesting and useful, I have little problem with it other than the arbitrariness of the weighting as mentioned above.

Selectivity:

Acceptance rate is fairly meaningless.


I have said these same things many times. "ARBITRARY"


voice of reason wrote:People (should only) attend law school so they can get jobs as lawyers. Thus, the main ranking criterion ought to be what job opportunities the degree offers. A few of the USNWR criteria -- the quality assessments by lawyers/judges and employment rate at and 9mos after graduation -- are proxies for this, albeit flawed ones. Together, these items count for 33% of the USNWR rating. The other 67% of the rating is based on variables that are secondary at best. Some are flatly irrelevant, like financial aid expenditures.

Apparently most law students want, or end up wanting, biglaw. Therefore one of the best ratings would be an estimate of the proportion of the class that can get biglaw upon graduation. (Note that this is not identical to the proportion going into biglaw.)


That's the problem. The job opportunities that are now afforded to graduates from certain schools have resulted, in part, FROM the rankings and not from the quality of education graduates are receiving. Out of the 200 or so ABA-approved law schools, one can expect to get a good legal education from 150. Job prospects are another story.

I'll keep saying it forever: the firms drive this thing. If they would be willing to do their own homework, instead of relying on the schools to do their sorting for them, students would attend the schools that are more appropriate for them and they'd be much happier. That would make them better performers. If more students from a greater number of schools were better performers, the firms couldn't rely on just "top" schools for their talent. The cycle would reverse because firms would have to look objectively at talent from all of the schools, and not just the top 10% of that talent.

legalease9
Posts: 623
Joined: Tue Mar 23, 2010 8:41 pm

Re: USNWR Law School Ranking Methodology

Postby legalease9 » Mon May 10, 2010 3:33 am

voice of reason wrote:People (should only) attend law school so they can get jobs as lawyers. Thus, the main ranking criterion ought to be what job opportunities the degree offers. A few of the USNWR criteria -- the quality assessments by lawyers/judges and employment rate at and 9mos after graduation -- are proxies for this, albeit flawed ones. Together, these items count for 33% of the USNWR rating. The other 67% of the rating is based on variables that are secondary at best. Some are flatly irrelevant, like financial aid expenditures.

Apparently most law students want, or end up wanting, biglaw. Therefore one of the best ratings would be an estimate of the proportion of the class that can get biglaw upon graduation. (Note that this is not identical to the proportion going into biglaw.)


But how else do you get that metric? To be fair, while USNEWS is very flawed (I hate their library focus... in the age of the internet, who cares?), they are at least trying to find metrics they can measure. There is no way to measure the % of people who "could" get a biglaw job. That is impossible to determine. I'm sure Cooley thinks its grads could get "biglaw" if they just focused on it more. There is no scientific method to judge unexploited opportunities.

My opinion on Best ranking...

% of graduates who make six figures+
% of graduates who have Article III clerkships+
% of graduates who went into PI careers

This judges schools based on the three job types graduates actually want, while eliminating the low level shit-law jobs no one wants.

This is not perfect (or even close), but it at least gets at what matters. I still don't know how you will ever get truly accurate employment info w/o compulsory disclosure of salary and career path (which will never happen). That is the missing link that will forever make rankings at least partially BS.

Rand M.
Posts: 1033
Joined: Fri Aug 07, 2009 8:24 am

Re: USNWR Law School Ranking Methodology

Postby Rand M. » Mon May 10, 2010 5:54 am

MoS wrote:I think it would be more useful if they just gave us the score they came up with, without turning it into a scale of 0-100. I would like to see the raw score to see how close schools really are or aren't. I mean, Yale could be a .978 and Harvard a .972, or Yale could be 180 and Harvard 162 for all I know. I just think the raw score would make more sense to display instead of converting it.


No, converting the scores to a 100-point scale does not change anything. The gaps between the scores on the 100-point scale are just as representative of what you are looking for as the raw numbers you suggested would be. That's why there was so much hubbub this year about the two-point difference between S and C; people questioned the T3 premise, wanted to kick S down, and commented on how it may not stay T3 much longer.

ggocat
Posts: 1663
Joined: Sat Dec 13, 2008 1:51 pm

Re: USNWR Law School Ranking Methodology

Postby ggocat » Mon May 10, 2010 8:15 am

Some essential reading:

Theodore P. Seto, Understanding the U.S. News Law School Rankings, http://papers.ssrn.com/sol3/papers.cfm? ... _id=937017.

Andrew P. Morriss & William D. Henderson, Measuring Outcomes: Post-Graduation Measures of Success in the U.S. News & World Report Law School Rankings, http://papers.ssrn.com/sol3/papers.cfm? ... _id=954604.


Lighter reading: William D. Henderson & Andrew P. Morriss, What rankings don't say about costly choices, http://www.law.com/jsp/nlj/PubArticleNL ... hbxlogin=1.

MoS
Posts: 404
Joined: Fri Oct 16, 2009 10:59 pm

Re: USNWR Law School Ranking Methodology

Postby MoS » Mon May 10, 2010 10:20 am

Rand M. wrote:
MoS wrote:I think it would be more useful if they just gave us the score they came up with, without turning it into a scale of 0-100. I would like to see the raw score to see how close schools really are or aren't. I mean, Yale could be a .978 and Harvard a .972, or Yale could be 180 and Harvard 162 for all I know. I just think the raw score would make more sense to display instead of converting it.


No, converting the scores to a 100-point scale does not change anything. The gaps between the scores on the 100-point scale are just as representative of what you are looking for as the raw numbers you suggested would be. That's why there was so much hubbub this year about the two-point difference between S and C; people questioned the T3 premise, wanted to kick S down, and commented on how it may not stay T3 much longer.

Well, I would agree if they were simply multiplying by a constant, like when you turn a decimal into a percentage. But they skew the scores so the best school gets a 100 and the worst gets a 0, so they are stretching the scores on both ends. Plus, does any school really deserve a 0? A 20, maybe. When they do this, if you held every school's numbers constant except the top or bottom school's, fluctuations in those two schools' statistics would make other schools' displayed scores change; ties would typically no longer be ties, and a school could fall or jump more than 4 places. That is not a system I support. It's too reactive to small changes, and it punishes schools for factors that have nothing to do with them.
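The point about the 0-100 conversion can be made concrete with a toy example (the raw scores below are invented, not actual USNWR data): under min-max rescaling, moving only the top school's raw score shifts every other school's displayed score, even though their raw numbers never changed.

```python
def rescale(raw):
    """Min-max rescale raw scores so the best school gets 100 and the worst gets 0."""
    lo, hi = min(raw.values()), max(raw.values())
    return {k: 100 * (v - lo) / (hi - lo) for k, v in raw.items()}

before = {"A": 0.90, "B": 0.80, "C": 0.70, "D": 0.40}
after = dict(before, A=0.95)   # only the top school's raw score improves

b, a = rescale(before), rescale(after)
# B's displayed score falls from 80.0 to about 72.7 even though
# B's raw score is identical in both scenarios.
print(round(b["B"], 1), round(a["B"], 1))
```

The underlying rank order is preserved here, but since USNWR publishes rounded scores, shifts like this are exactly what break ties and move closely bunched schools around.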



