Sell Manilla wrote:
ggocat wrote: Keeping schools in the equation is a cost-effective way of ensuring reliable and valid data.
I agree that it's the most cost-efficient way, but I'm not so sure about "reliable" & "valid."
What I mean here by "reliable" is mainly the amount of data collected and the accuracy of that data. If you have a self-reported site, and 5% of the students report, then the data is not reliable as an indicator of the employment prospects for that school overall (which, I believe, is what the project is trying to ultimately measure).
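To illustrate with a back-of-the-envelope simulation (a minimal Python sketch; the class size, employment rate, and response probabilities are all made-up assumptions, not real school data): if employed grads are even modestly more likely to self-report than unemployed grads, a 5% response rate produces a badly skewed estimate.
Code:
# Hypothetical: self-selection bias at a ~5% response rate.
import random
random.seed(1)

N_GRADS = 400          # assumed graduating class size
TRUE_EMPLOYED = 0.60   # assumed true employment rate

# Assume employed grads respond at 7%, unemployed at 2%,
# so roughly 5% of the class responds overall.
P_RESPOND = {True: 0.07, False: 0.02}

employed = [random.random() < TRUE_EMPLOYED for _ in range(N_GRADS)]
respondents = [e for e in employed if random.random() < P_RESPOND[e]]

print(f"respondents: {len(respondents)} of {N_GRADS}")
print(f"true rate:   {TRUE_EMPLOYED:.0%}")
print(f"estimate:    {sum(respondents) / len(respondents):.0%}")
Under these assumptions the self-reported estimate generally lands well above the true 60% rate, because the respondent pool over-represents employed grads; and with only ~20 respondents it also swings wildly from class to class. That's the accuracy problem on top of the bias problem.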
With a pure open website that anyone can edit, you will also not be able to obtain accurate results unless you go through the additional steps you alluded to. But as you suggested, those steps would likely be prohibitive (or, rather, there would be more cost-effective ways of collecting data than trying to make a website open but "verified/regulated" at the same time).
I still believe you will get more reliable data if schools collect the data than if you have an open website. Schools can directly contact every graduate and ask for employment data. You can't do that externally with an open website. For that matter, you can't even do it with a direct mailer unless you persuade the schools to hand over updated contact information on their graduates, which they are unlikely to do.
I have had some experience researching U.S. News and the placement success variables. I am content to believe that, for the most part, the data is reliable. That is, the schools generally try to measure exactly what is asked. But the big problem with U.S. News employment data is validity. That seems to be the primary reason LST asks for the additional data: validity means ensuring the variables measured are actually related to employment prospects. LST will not "cure" reliability problems, particularly if it is relying on schools to report information.
Ideally, LST could cure reliability issues by obtaining contact information for every graduate and conducting the data collection itself. Then it would have complete control over the method of collection, which is how some schools currently (purportedly) make data inaccurate. In this sense, removing schools from the equation means more reliable data. But you would still need to generate a list of recent graduates, and the best way to do that is to get the list from the schools directly, which keeps them in the equation. There are other methods, but you would lose reliability.
(I probably should not have used the word "valid" in my first post. Validity is about making sure the variables actually measure what you're trying to measure. That's primarily why U.S. News sucks: the data collected do not allow students to draw meaningful conclusions about employment prospects. So we would say the data is invalid, although it is generally reliable, i.e., generally accurate.)
Sell Manilla wrote:
This might work if LST gains serious momentum, which would be amazing. I have a hard time believing that TTT schools which may or may not be fudging data would participate, though. They might have everything to lose & nothing to gain.
First, it would be incorrect to believe that the top schools don't engage in tactics to artificially improve the variables measured by U.S. News. They generally play by the rules (i.e., the data is reliable), but the rules allow for serious validity problems. You see more and more top schools employing their own grads or establishing post-grad fellowships (placements in public service entities, subsidized by the school).
Top schools have much more to lose from a drop in employment numbers, given how tightly the numbers cluster near the top. If GULC's 9-month employment stats drop 5-10% relative to peer schools, it's probably no longer a T14. So in a rough economy, GULC established a program to employ 60 of its graduates. Other schools have taken similar actions: Michigan, Columbia, BC, UT, SMU, Miami, just to name a few (see http://www.top-law-schools.com/forums/v ... 0#p3239811).
But lower-ranked schools overall seem not to engage in tactics to artificially inflate employment rates. If you look at the data, you see most schools either report accurately or refuse to report. (Refusing to report at-graduation employment in the years prior to the next rankings benefited schools with lower numbers.) Many schools report in the 50-60% range at graduation and the 80-90% range at nine months. If any of these schools wanted to, though, they could establish programs to artificially employ graduates, and that data would still be reliable as a measure of the variables U.S. News collects. But many of them appear not to take full advantage of the loopholes in U.S. News reporting. Why? Less incentive. They are grouped into large categories, and a drop of 10 slots in the overall rankings does not affect them unless they switch tiers (compared to a drop of 5 slots near the top end). And even then, switching between tier 3 and tier 4 is not as significant as switching between tier 2 and tier 3.
(There are, of course, exceptions to the rule; I can think of some schools in tier 3/4 that engage in shady tactics.)