kappycaft1 wrote: While it might not be fair to everyone who goes to law school, I do think it would be beneficial to change the "employment rate" component of the USNWR rankings to count only full-time, long-term, JD-required positions. If we assume that the purpose of law schools is to train lawyers, then the best way to assess a school's ability to achieve that purpose is to exclude non-lawyer jobs from the employment rates. (If someone were interested in non-JD-required work but wanted to go to law school, they could use sites like LST to assess schools' placement into other fields.)
Also, I believe that instead of simply categorizing full-time, long-term, JD-required positions by "tier," it would be better to base the score on income. That is, divide the median starting salary by the cost of three years of tuition to get a score; this would give schools an incentive to lower tuition in order to raise their rank, while also serving as an outcome indicator for potential applicants. Rather than using absolute values, the score would be curved so that schools compete directly with each other for points. To make sure low salaries are not intentionally left out, the score would be multiplied by the % of the graduating class for which income was reported (so x 1.0 if 100% of graduates were accounted for, and x 0.8 if only 80% were).
The above could be used to replace the current 18% (4% for at graduation and 14% for 9 months after graduation) of a school's overall score that goes to employment rate as follows:
10%: Employment Rate (Full-time, Long-term, JD-required Positions within 9 months of graduation)
8%: (Income / Tuition) x Reliability (the full 8% would be awarded to the school with the highest ratio)
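The proposed score can be sketched in a few lines. This is a hypothetical illustration, not any official methodology; the function names, the example schools, and their salary/tuition figures are all made up for demonstration.

```python
# Hypothetical sketch of the proposed outcome score.
# score = (median starting salary / 3-year tuition) * reporting rate,
# then curved so the best school earns the full 8 points.

def outcome_score(median_salary, three_year_tuition, pct_reported):
    """Income-to-tuition ratio, discounted by the share of grads reporting."""
    return (median_salary / three_year_tuition) * pct_reported

def curved_scores(schools, max_points=8.0):
    """Curve raw ratios so schools compete directly for points."""
    raw = {name: outcome_score(*stats) for name, stats in schools.items()}
    best = max(raw.values())
    return {name: max_points * score / best for name, score in raw.items()}

schools = {
    # name: (median starting salary, 3-year tuition, fraction of class reported)
    "School A": (160_000, 180_000, 1.0),  # full reporting
    "School B": (190_000, 200_000, 0.8),  # only 80% of grads reported
}

print(curved_scores(schools))
```

Note how the reliability multiplier works: School B has the better raw income/tuition ratio, but because only 80% of its class reported, it ends up behind School A, which reported everyone.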
I'd like to see US News broken down into three broad categories: raw reputation, inputs, and outcomes. Raw reputation is probably measured somewhat well already with the peer/practitioner surveys, although I'd raise the "practitioner" weight and lower the "peer school" weight so that they are both 20%.
The inputs would measure LSAT and GPA, and account for about 20%. I do think a school's ability to attract students is a fairly large factor, as it influences how flat the grading curve is going to be, and, in general, better inputs lead to better outcomes. LSAT and GPA aren't the most comprehensive measures of inputs, but it's also hard to objectively measure other inputs on a standardized, industry-wide basis. I'd ditch the selectivity rankings, because those are just too easy to game with fee waivers, yield protection, etc.
The outcomes are probably the most poorly measured. This should be a balance between the type of job you get and your average debt at graduation: maybe 25% for job placement and 15% for average debt. It would have a similar effect to the ratio you describe. The current rankings place emphasis on "money spent on teaching" and "money spent on scholarships," but this creates perverse incentives, such as raising tuition while giving out more scholarships instead of just lowering tuition across the board. An "average debt at graduation" figure would solve that.
I would be somewhat in favor of measuring placement success by salary alone, but that has problems, too. You'd have to adjust for the public interest IBR factor, sort out which clerkships actually lead to prestigious biglaw and which ones are just a 12-month cure for chronic unemployment, etc.
Either way, I don't think it would take a room of educated people more than an hour of discussion to come up with a much better ranking system than US News. If someone came up with a smart one, the effects on the entire legal education complex would be astronomical. An hour or two of work is all it would take, yet no one wants to do it.