Sunday, February 6, 2011

Building a Law School Ratings System: Part 2 - Top Line/"National" Ratings

For the uninformed: despite the one-size-fits-all salary figures reported by law schools and reporting agencies like USNWR, legal hiring is bimodal.

As you can see from the article linked above, of the roughly 19,000 graduates in the class of 2009 who reported salary information, about 25% reported salaries in excess of $160,000. A separate "bubble" formed at the lower end of the chart, representing the 34% of graduates with salaries between $40,000 and $65,000. So although the median of the data is around $93,000, relatively few salaries actually fell between $75,000 and $160,000.

What this part of the ratings seeks to do is account for that upper portion of the bimodal distribution: those roughly 5,000 students in the class of 2009 who were at the top of the salary heap. Essentially, I operate on the obvious assumption that these are the most coveted jobs for law graduates, and I want to know how good law schools are at getting their graduates into these upper-tier jobs.

To do this, I decided to go directly to the firms. Last week, I picked 50 firms off this list that also had a searchable database of their attorneys. I assume that these firms are those most likely to pay salaries on the high end of the distribution (some basic research into starting salaries confirmed this assumption). I tried to balance the selected firms by geography and lawyer demand, so D.C., New York, and Chicago-based firms were favored over firms in, say, South Carolina, Nebraska, or Seattle. But almost every major metropolitan area was represented in rough proportion to the others.

To keep the numbers manageable, I limited my search to attorneys whose last names began with certain letters of the alphabet, on the assumption that last names are evenly distributed across law schools. Where the database allowed, I applied those filters directly. I then counted the number of attorneys each firm had listed as (1) Partner, Shareholder, or the equivalent; and (2) Associate. These are fairly universal terms. Because designations like "Of counsel" or "staff attorney" are not consistent in their use across firms, I decided to ignore those individuals. I do not believe such a move significantly disadvantaged any single law school.

Looking at law firm websites has numerous advantages. First of all, these are the people the firms themselves are holding out as their representatives and employees. Although some of the websites are horribly, horribly designed, they double as advertising, so who the firms hold out as their employees seems, to me, to be a better indicator of "prestige" (i.e., what people think of certain law schools) than asking judges or other bigwigs. Actions speak louder than words, after all.

After I had accumulated the data, I rated the law schools on four criteria:
  • Overall lawyers - how many graduates of a particular school were associates or partners at these firms.
  • Partnership saturation - how many graduates of the law schools had become owners of the largest law firms (to correct for any possibility that some schools hired well, but were historically poor at moving people up).
  • Associate saturation - how many graduates of the law schools had been recent hires of the largest law firms (to correct for any possibility that some schools may have been great at getting people to partnership, but firms no longer hire from there).
  • Range of Firms - how many different firms a given school had attorneys at (to correct for any possibility that some schools stocked one or two particular firms; this actually happened multiple times).
I then took the resulting numbers and adjusted them with a multiple that accounted for the number of graduates each school puts out in a given year; if Harvard and Yale both land 10 associates somewhere, Yale did a better job.
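The per-capita adjustment can be sketched in a few lines. This is a minimal illustration, not the actual formula (the post doesn't disclose the exact multiple used); the function name, the baseline constant, and the class sizes are all hypothetical.

```python
# Hypothetical sketch of the per-capita adjustment: raw placement counts
# are scaled by class size, so a small school landing the same number of
# graduates scores higher than a large one.

def per_capita_score(placements: int, class_size: int, baseline: int = 500) -> float:
    """Scale a raw placement count by a multiple reflecting class size.

    The baseline of 500 graduates is an arbitrary normalization constant.
    """
    return placements * (baseline / class_size)

# Harvard and Yale both land 10 associates, but Yale's class is much smaller
# (class sizes below are illustrative, not actual figures):
harvard = per_capita_score(10, class_size=550)
yale = per_capita_score(10, class_size=200)
assert yale > harvard  # Yale did a better job per graduate
```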

The resulting numbers were then assigned letter grades. For each category, a "B" was set at the amount of equal distribution. For example, if the entire pool of associates was 1,000 people and there were 200 law schools of equal size, a school that had 5 associates would receive a B. Grades were then distributed using the same method my 1L professors used to distribute grades.
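The equal-distribution baseline can be sketched as follows. Only the "B" point is fixed by the method described above; the cutoffs for the other grades are hypothetical placeholders, since the post doesn't disclose the actual curve used.

```python
# Sketch of the "B" baseline: a B is the grade a school earns if the
# attorney pool were divided evenly among all schools. All cutoffs other
# than the B point are invented for illustration.

def baseline_share(pool_size: int, num_schools: int) -> float:
    """Count a school would have if attorneys were spread evenly."""
    return pool_size / num_schools

def letter_grade(count: float, expected: float) -> str:
    ratio = count / expected  # 1.0 == exactly the equal-distribution share
    if ratio >= 3.0:
        return "A"
    if ratio >= 2.0:
        return "B+"
    if ratio >= 1.0:
        return "B"      # the only cutoff the post actually specifies
    if ratio >= 0.5:
        return "C"
    if ratio > 0.0:
        return "D"
    return "F"

# The post's example: 1,000 associates across 200 equal-size schools.
expected = baseline_share(1000, 200)   # 5 associates per school
assert letter_grade(5, expected) == "B"
```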

I then added a fifth category: Federal Appellate Clerkships. These are very coveted jobs that are often a destination for the top students in each class. Using a site that gave information about clerk hiring, I applied the same basic method as above.

When all this was said and done, each school had 5 grades. These grades were then amalgamated into a GPA with equal weight. Each final GPA was then converted into one final grade using standard metrics and rounding to the closest marker (e.g., a 3.53 was an A-) for how well the school does in landing top graduates in the nation's top jobs.
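The amalgamation step above can be sketched with a standard 4.0 scale. The point values are the conventional ones; the "closest marker" rounding matches the post's example (a 3.53 GPA lands nearest A- at 3.7), but the function itself is an illustration, not the author's actual code.

```python
# Sketch of collapsing five equally weighted category grades into one
# final grade: average the grade points, then snap to the nearest marker.

POINTS = {"A": 4.0, "A-": 3.7, "B+": 3.3, "B": 3.0, "B-": 2.7,
          "C+": 2.3, "C": 2.0, "C-": 1.7, "D+": 1.3, "D": 1.0,
          "D-": 0.7, "F": 0.0}

def final_grade(grades: list[str]) -> str:
    """Equal-weight GPA across categories, rounded to the closest marker."""
    gpa = sum(POINTS[g] for g in grades) / len(grades)
    return min(POINTS, key=lambda letter: abs(POINTS[letter] - gpa))

# Five hypothetical category grades averaging to a 3.6 GPA -> A-:
assert final_grade(["A", "A-", "B+", "A", "B"]) == "A-"
```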

I will not defend this methodology as statistically perfect. It is not scientific, nor did I attempt to make it so (neither, really, is USNWR or any other system). Because this is not hard science, and to avoid unnecessary side debates, I'm not going to disclose which firms I reviewed or what specific grades were assigned for the categories. Additionally, my own biases may have clouded firm selection or the methodology itself. I have connections that may have biased my choices in some way or another at all six law schools in Chicago, the U. of Illinois, Notre Dame, Indiana, UCLA, Texas, Denver, St. Louis, Emory, Mercer, Georgia, UMKC, Thomas Cooley, and Capital.

That said, I do feel this system is better at accurately portraying the chances of a top graduate at getting a top-line job than other systems, at least for the class of 2009. I will defend it on that front. This type of system is a significant improvement over self-selected salary surveys from graduates.

Prospective law students should read the chart below as relative. For example, finishing in the top 10% at an "A" school is better than doing so at a "B" school, etc. If a law school has a grade of "F," it means that in the entire pool (over 7,000 attorneys) I found very few (often 0) representatives from that school. Because we don't know the overall number of graduates that the large firms will take in the future, there is nothing more I can give than a relative guide, e.g., that a C+ school is better than a D school in terms of placing BigLaw members.

Here are some pertinent summary stats:
  • The pool was the 197 schools with at least ABA provisional accreditation.
  • Using the method established above for setting the "B" level, the median cumulative grade wound up being a D-.
  • 85 schools scored a grade of F. 32 did not have a single attorney show up in the pool.
  • The highest scorers in each category were no surprise: the University of Chicago (overall employment and partnership saturation), Columbia (associate saturation), Georgetown (firm range), and Yale (clerkship hiring).
  • Many people do not know what law school they went to. Many attorneys list "Rutgers Law School" as their school, even though there are two. Absent evidence to the contrary, these were marked as Rutgers-Newark. Generic "John Marshall" listings were assigned based on which campus the attorney was closer to. Generic listings of "Indiana" were counted as IU-Bloomington. If I had no idea what school an attorney attended, I ignored it; same with non-approved schools. Old names for law schools (e.g., Puget Sound) were counted under their new names (e.g., Seattle University). Not properly listing the law school's name was also embarrassingly common.
  • Many state schools, like Tennessee, New Mexico, UConn, and Alabama, did relatively poorly in this metric compared to their USNWR rating, but perhaps the biggest surprise to me was Wake Forest's poor performance. Conversely, Howard, Wayne State, Syracuse, and New York Law School all did better than their USNWR rating would suggest.
  • California may have been slightly underrepresented, but it appeared to me that east-coast graduates fared much better in west-coast firms than vice versa. Furthermore, Loyola, Pepperdine, Hastings, and Davis all did better than their USNWR rating would suggest.

Without further ado, here is the full list that will be used later in the rating system; schools are listed alphabetically within each grade level:

SCHOOL TOP-LINE GRADE
Chicago A
Columbia A
Duke A
Georgetown A
Harvard A
Michigan A
Northwestern A
NYU A
Penn A
Virginia A
Yale A
Cornell A-
GWU A-
Stanford A-
Texas A-
UC-Berkeley A-
UCLA A-
Boston B+
Emory B+
Notre Dame B+
USC B+
Vanderbilt B+
Boston Coll. B
Fordham B
Illinois B
BYU B-
Iowa B-
Minnesota B-
UC-Hastings B-
Wash U. (St. Louis) B-
American C+
Florida C+
Wash & Lee C+
Brooklyn C
Houston C
Indiana C
Loyola-Los Angeles C
North Carolina C
Rutgers-Newark C
St. Johns C
Tulane C
UC-Davis C
Utah C
William & Mary C
Cardozo C-
Case Western C-
Catholic C-
George Mason C-
Georgia C-
Howard C-
Miami C-
Ohio State C-
Pepperdine C-
Southern Methodist C-
Temple C-
Villanova C-
Washington C-
Wisconsin C-
Alabama D+
Arizona D+
Arizona St. D+
Baylor D+
Chicago-IIT D+
Colorado D+
Denver D+
Kansas D+
Maryland D+
Pitt D+
Richmond D+
San Diego D+
Syracuse D+
Cincinnati D
Connecticut D
DePaul D
Hofstra D
Kentucky D
Louisiana St. D
Loyola-Chicago D
Missouri D
New York LS D
Oklahoma D
Rutgers-Camden D
Santa Clara D
Seton Hall D
St. Louis D
Wake Forest D
Wayne State D
Arkansas-LR D-
Ave Maria D-
Buffalo D-
Creighton D-
Florida St. D-
Franklin Pierce D-
Georgia St. D-
John Marshall (Chi.) D-
Mercer D-
Michigan St. D-
Nebraska D-
Northeastern D-
Pacific D-
San Francisco D-
Seattle D-
South Carolina D-
South Texas D-
Southwestern D-
Stetson D-
Suffolk D-
Tennessee D-
Texas Tech D-
Toledo D-
Akron F
Albany F
Appalachian F
Arkansas F
Baltimore F
Barry F
Cal-Western F
Campbell F
Capital F
Chapman F
Charleston F
Charlotte F
Cleveland St/Marshall F
Cooley F
CUNY-Queens F
Dayton F
Detroit F
Drake F
Drexel F
Duquesne F
Elon F
Florida A&M F
Florida Coastal F
Florida Intl. F
Golden Gate F
Gonzaga F
Hamline F
Hawaii F
Idaho F
Indiana-Indianapolis F
John Marshall (Atl.) F
Jones College (Faulkner) F
La Verne F
Lewis & Clark F
Liberty F
Louisville F
Loyola-New Orleans F
Maine F
Marquette F
Memphis F
Mississippi F
Mississippi College F
Missouri-Kansas City F
Montana F
NC Central F
New England F
New Mexico F
North Dakota F
Northern Ill. F
Northern Kent. F
Nova F
Ohio Northern F
Oklahoma City F
Oregon F
Pace F
Penn St. F
Phoenix F
Quinnipiac F
Regent F
Roger Williams F
Samford F
South Dakota F
Southern F
Southern Ill. F
St. Mary's F
St. Thomas (Fl) F
St. Thomas (MN) F
Texas Southern F
Texas Wesleyan F
Thomas Jefferson F
Touro F
Tulsa F
U. of District of Columbia F
UC-Irvine F
UNLV F
Valparaiso F
Vermont F
W. New England F
Washburn F
West Virginia F
Western St. F
Whittier F
Widener F
Willamette F
Wm. Mitchell F
Wyoming F

Note: These are NOT final ratings. They are merely top-line components that will be used later on in computing the final grade for each school.

As a final note, do not expect Part 3 for some time. This took me a long time, and Part 3 is a significantly greater undertaking.

To see more, check out one of the other sections:

Part 1: Introduction
Part 2: Top-Line/"National" Employment (this page)
Part 3: Bottom-Line/"Local" Employment
Part 4: Saturation and Regional Considerations
Part 5: Applying Investment Principles
Part 6: Conclusion

2 comments:

  1. At the risk of sniping and being totally unwilling to create my own ranking system, I once looked at where the judges on state highest courts got their law degrees. Quite illuminating.

  2. Third Tier Drake is an embarrassment. People at the very top of the class ended up in toiletlaw. But tuition is now more than $30K per year.
