So I took the 2014 rankings and entered them into a spreadsheet I had made comparing the c/o 2011 employment data with the 2013 rankings. The 2014 rankings methodology doesn't seem much different, but thanks largely to people like those at LST, the leaps in transparency simply made the employment data better. Even though employment accounts for only 18% of the ranking, you'll see that the schools that whitewashed their numbers the most were generally the biggest losers, while schools that benefited from more accurate data saw the biggest jumps.
%JDReq is a ranking based exclusively on the school's percentage of graduates employed in full-time, long-term jobs requiring bar passage.
2013 and 2014 are the school's respective USNWR rankings.
Change is the movement from the 2013 to the 2014 USNWR rankings.
JD FTLT is the number of graduates employed in full-time, long-term jobs requiring bar passage.
Total Graduates is the number of graduates from the school in the c/o 2011 ABA data.
% JD FTLT is the percentage of graduates employed in full-time, long-term jobs requiring bar passage.
Color Key: A while ago I created a spreadsheet to see how schools would rank if the rankings were based exclusively on JD FTLT jobs held at nine months. That percentage is listed in the far right column, and that "JD ranking" is listed on the far left. Then I colored the schools shades of green and red based on their actual 2013 ranking relative to the c/o 2011 employment data. It was pretty scattered, but when you insert the 2014 rankings, many of those seemingly anomalous (or not-so-anomalous) rises and falls can be explained simply by the new, more transparent employment data.
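If you want to rebuild the comparison yourself, here's a rough Python sketch of the same calculation. The file name and column headers below are placeholders for however you export your own spreadsheet; they're not the actual file or columns I used:

    import csv

    # Load one row per school; "schools.csv" and its headers are assumed
    # stand-ins for your own export of the ABA c/o 2011 employment data.
    with open("schools.csv", newline="") as f:
        schools = list(csv.DictReader(f))

    # % JD FTLT = graduates in full-time, long-term, bar-passage-required
    # jobs divided by total graduates.
    for s in schools:
        s["pct_jd_ftlt"] = int(s["JD FTLT"]) / int(s["Total Graduates"])

    # Employment-only ranking: highest % JD FTLT gets rank 1.
    schools.sort(key=lambda s: s["pct_jd_ftlt"], reverse=True)
    for jd_rank, s in enumerate(schools, start=1):
        # Positive gap: the school's USNWR rank is better (lower) than its
        # employment-only rank would suggest (the rows shaded red above).
        # Negative gap: the reverse (the rows shaded green).
        s["gap"] = jd_rank - int(s["USNWR 2013"])

    # Biggest mismatches first, i.e. schools ranked far better by USNWR
    # than their employment outcomes alone would place them.
    for s in sorted(schools, key=lambda s: s["gap"], reverse=True)[:10]:
        print(s["School"], s["gap"])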
For example, American, which is widely considered a poorly disguised for-profit JD mill, ranks 180th out of 201 ABA-accredited law schools in percentage of graduates in full-time, long-term jobs requiring bar passage. Its actual 2013 ranking, 49, doesn't bear any real relation to the available employment data. (Hence the very dark red highlighting, and I marked this as "two tiers" worse. If schools were ranked by employment outcomes alone, American would be behind roughly 90% of accredited law schools, and possibly some unaccredited ones.)
(In case you didn't know or realize, the c/o 2010 data was very generalized and not especially useful; last year's c/o 2011 data was the first that was genuinely insightful. The 2013 rankings were based on the c/o 2010 employment data, while the rankings released yesterday are based on the more transparent c/o 2011 data.)
Without further introduction, here is a list of the biggest winners and losers in the new rankings, as compared to their c/o 2011 employment data.
