Just how honest were law schools when they reported their data to U.S. News for our 2011 Best Law Schools rankings? Each year, we ask law schools to report the same statistical information to us that they report on the American Bar Association's (ABA) annual accreditation questionnaire. Despite some notable exceptions and data errors over the years, it turns out the schools are pretty reliable in their data reporting.

To support this absurd confusion of reliability with honesty (you can reliably lie, can't you?), he points to a correlative study done by Chapman's Tom Bell, which takes the data provided to the ABA and uses the USNWR's methodology to reach the same conclusions that USNWR did. Then Morse drops this absolute gem of propaganda bullshit:
The fact that Professor Bell was able to duplicate our methodology by using law school data he obtained directly from the ABA proves that the U.S. News rankings process is very transparent and can be duplicated using publicly available data. This exercise also establishes that U.S. News is calculating and weighting the ranking variables as stated in the posted methodology. Users of our law school rankings can be confident that the results are correct given the weights and rankings variables that U.S. News has chosen.

Finally, does Bell's study also prove that law schools are being accurate in how they report their statistical profile data to the ABA, the general public, and U.S. News? No [ed.: yet your opening speaks of "honest" schools because _______?], this only proves that law schools are being very careful to report the same data to U.S. News that they report to the ABA for accrediting purposes.
Has anyone, in the history of the USNWR rankings, ever suspected that USNWR was using data different from that given to the ABA, especially since the ABA information is public and USNWR, being excellent journalists, would surely cross-check their data? Has anyone ever suggested that law schools are not "being very careful to report the same data to U.S. News that they report to the ABA for accrediting purposes?"
As far as I know, they haven't. Instead, all Morse did here was publish a blog entry to say that some professor took the listed methodology, applied it to the exact same data set, and came up with scientifically similar results. What the hell is that supposed to show, again?
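To make the point concrete, here is a minimal, entirely hypothetical sketch. The schools, metrics, and weights below are invented for illustration and are not the actual USNWR formula or data. It shows that anyone who applies the same weights to the same numbers will reproduce the same ordering, which validates only the arithmetic, not the inputs.

```python
# Hypothetical weighted-sum ranking. All names and numbers are invented;
# this is NOT the real USNWR methodology or any school's actual data.

def score(metrics, weights):
    """Weighted sum of already-normalized metric values."""
    return sum(weights[k] * metrics[k] for k in weights)

# Invented, pre-normalized data (0-1 scale) for three fictional schools.
data = {
    "School A": {"peer_assessment": 0.9, "lsat": 0.80, "placement": 0.85},
    "School B": {"peer_assessment": 0.7, "lsat": 0.90, "placement": 0.80},
    "School C": {"peer_assessment": 0.6, "lsat": 0.70, "placement": 0.95},
}
weights = {"peer_assessment": 0.40, "lsat": 0.35, "placement": 0.25}

ranking = sorted(data, key=lambda s: score(data[s], weights), reverse=True)
print(ranking)
# Replicating this ordering from the same inputs says nothing about
# whether those inputs were honest in the first place.
```

The replication "proves" only that the sorting step is deterministic; garbage in, the same garbage out, every time.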
Ah, yes, that "the results are correct given the weights and rankings variables that U.S. News has chosen."
For most people, the word "correct" would imply that the substance of the rankings is correct, e.g., that Harvard is a better law school than Georgetown, rather than that the process used to form the rankings was scientifically sound. Take a look at certain fragments of this piece in order:
- ...schools are pretty reliable in their data reporting.
- ...the U.S. News rankings process is very transparent...
- ...U.S. News is calculating and weighting the ranking variables as stated...
- ...the results are correct...
The way this was composed is deliberately designed to defend the whole system, including the substantive results. The fact that in the next paragraph he disclaims "but the data may not be good" doesn't change the fact that he spent the rest of the piece defending the results derived from that data.
The first issue is, and always has been, that their methodology is fatally flawed. The second issue is, and always has been, that the numbers presented to them are inaccurate.
The fact that Morse was able to take a complete non-story not even worthy of a blog entry and turn it into a rhetorically sound defense of the rankings' value is admirable. I'm baffled that a professional writer and statistician can conflate the concepts of reliability, honesty, accuracy, and precision all in one post. Of course, he did almost the exact same thing a year and a half ago.
The fact that Professor Bell was able to duplicate our methodology and outcomes using data collected from a different source proves that the U.S. News rankings process is very transparent and accurate.
Professor Bell concludes:
"For now, I'll just offer this happy observation: The close fit between USN&WR's scores and the model's scores suggests that law schools did not try to game the rankings by telling USN&WR one thing and the ABA (the source of much of the data used in my model) another. Even a skeptic of law school rankings can find something to like in that."
Again, this is a complete non-story that does not justify the claim made ("the U.S. News rankings process is very . . . accurate") as lay readers would understand it. It's not a "happy" observation, and there is no hint that a "skeptic . . . can find something to like," since no ground of skepticism is weakened. The law schools' gaming of the rankings does not take place in presenting different data to the USNWR and the ABA. No one has ever thought that. What law school in its right mind would lie to a journalist when there's contradictory public information available? Do you think law school administrators are dopes?
On the bright side, at least this year's article had a one-line disclaimer that was somehow supposed to alleviate a plausible interpretation of the other four paragraphs. That still doesn't change the fact that it's shameless propaganda calculated to make the rankings seem "accurate," "honest," and "transparent," which seems odd given that just last week Bob was telling us not to take them so seriously.
Hahaha. That is pretty amusing. This is Mr. Morse's livelihood we are talking about.
But what else do you expect him to say? The problem (or reality, rather) is that rankings (I am speaking of the undergraduate rankings for the USNWR, because that is my area of interest) are just part of our society. Perhaps it's just our "obsession" with putting value on things. The problem is when people take them as gospel, rather than as one piece of the picture.