Even though I generally dislike the concept of the rankings, I realize they do have an effect. I obviously think this is good news; isn't anything that presents a more honest picture of the legal profession a good thing?
But I wanted to talk more about USNWR in general. I found this article, about a recent study in which USNWR was a partner, that looked at the nature of college rankings (the article is part 2 of a 3-part series):
3. Do the U.S. News rankings "put pressure on institutions to invest in strategies and practices primarily for the purpose of maintaining or strengthening position in the rankings," either consistently or occasionally? An overwhelming majority (95.1 percent) of NACAC's members at both colleges and high schools believe that "Yes, colleges either occasionally or consistently invest in strategies and policies to improve in the rankings."

Although this is for the undergraduate sector, I strongly suspect that it applies equally (or more strongly) to law schools, and I find the timing of these revelations significant.
...
4. Does your school or institution make programmatic changes because of rankings? The report concluded that 54.1 percent of NACAC's members representing colleges reported that their particular institutions do not make any programmatic changes based on the ranking. However, it said that 7.6 percent say that their school consistently makes changes and 38.4 percent say that their schools make changes occasionally because of the rankings.
Notice anything curious about questions 3 and 4 when put together? Question 3 asked whether schools in the generic sense adopt policies to improve their USNWR rank. Question 4 asked members whether their own institution in particular does so (although I'm not sure what they mean by "programmatic" change, which seems like a flaw in the question; under a liberal definition of the word it could mean almost anything, but respondents might interpret it narrowly).
In any event, I find the disparity between 3 and 4 interesting. About 95% of respondents believe schools make changes, but only 46% of those reporting (7.6% consistently plus 38.4% occasionally) say their own school does. Assuming no one thought they were the only school in the country that did it, that's a hypocrisy/sanctimony rate of around 49%: those who thought other schools put in programs with the rankings in mind but claimed their own school did not.
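The arithmetic behind that gap can be sketched quickly (the figures are the survey percentages quoted above; the "hypocrisy rate" is simply the difference between belief about schools generally and admissions about one's own school):

```python
# Percentages quoted from the NACAC survey discussed above
believe_any_school = 95.1   # Q3: believe colleges invest in rankings strategies
own_consistently = 7.6      # Q4: own school consistently makes changes
own_occasionally = 38.4     # Q4: own school occasionally makes changes

own_total = own_consistently + own_occasionally   # 46.0% admit their own school does it
gap = believe_any_school - own_total              # ~49.1% believe-vs-admit gap

print(f"Own school admits changes: {own_total:.1f}%")
print(f"Believe-vs-admit gap: {gap:.1f}%")
```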
That's really high, and USNWR isn't dumb. Although I'm normally cynical about such things, part of me honestly believes USNWR is getting the results back from this survey (and maybe others) and realizing how widespread and destructive to academia its rankings have become. In the past few decades, USNWR has gone from being a news magazine to being a journalistic outpost whose sole practical purpose, for most people, is putting out these silly rankings. And when someone's bread and butter may be tainted - and the above discrepancy shows that college administrators have no qualms about lying about their actions or their motives - it's usually wise to do something and find ways to correct the problem.
In the past few years, they've gone entirely online and faced major criticism from college presidents over their rankings. Given both of those developments, it wouldn't surprise me in the least if they were actually trying to clean up their rankings and nip accurate criticism - such as LST's point that the information presented was not giving a complete picture - in the bud in order to stay relevant.