
July 2, 2008

BCS-style Law School Rankings (Alpha Test Version)

In the late 1990s, college football was at a crossroads.  The power conferences formed an alliance, the Bowl Championship Series, and then faced the dilemma of creating a ranking system through which teams could be selected for the big-money bowl games.  At that time, there were two major rankings already in place-- a coaches' poll and a writers' poll.  The BCS created a system which (after adjustments in 2004) gave those two polls most of the weight in the BCS rankings, with the rest of the weight going to a number of computer-generated rankings that used significantly different formulas.  The result was a ranking that took some of the edges off the problems with any one system.

Legal education finds itself at the mercy of a controversial ranking system, the annual U.S. News ranking of law schools.  Few other ranking systems receive much attention when law schools are discussed, and critics believe that the U.S. News rankings are both manipulable and a false measure of educational quality.

What if we created a BCS-type ranking for law schools?  I asked Baylor student Sid Earnheart to run some numbers towards this end.  In taking a run at this, we took guidance from, but did not strictly replicate, the BCS system.  For example, in our field there is only one major ranking, not two, a fact which requires a different methodology than that used by the BCS. 

In calculating the rankings listed below, we gave 50% of the weight to the U.S. News poll, giving each school a score equal to the number of schools behind it in the rankings.  For the remaining 50%, we looked to five other ranking systems, each of which received a weight of 10% (a short sketch of the arithmetic appears after the list):

1)  The Internet Legal Research Group:  The ILRG publishes a compilation of raw data rather than a single ranking.  We merged that data into a ranking of our own, first excluding two of the listed categories, raw bar passage rate and percentage employed at graduation, because the data set already includes the difference between each school's bar passage rate and its state's rate, as well as employment nine months after graduation.  (www.ilrg.com/rankings.html)

2)  LawSchool100.com:  This web site claims to create rankings "based on qualitative, rather than quantitative data."  Unfortunately, the site does not further explain its methodology.  (www.lawschool100.com)

3)  The Cooley Rankings:  These rankings are created by Thomas E. Brennan and Don LeDuc of Cooley Law School in Michigan.  Their system analyzes 32 factors.  (www.cooley.edu/rankings/)

4)  The Leiter Rankings:  The Educational Quality Rankings are compiled by Professor Brian Leiter of the University of Texas (soon to be of the University of Chicago).  He uses three factors:  faculty quality, student quality, and teaching quality.  (www.leiterrankings.com)

5)  The Hylton Rankings:  These are compiled by Professor J. Gordon Hylton of Marquette Law School.  He uses only two factors-- LSAT scores and the peer assessments from the U.S. News survey.  (www.elsblog.org)
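For those who want to check the arithmetic, here is a minimal sketch in Python of the weighting described above.  The weights are the ones stated (50% for U.S. News, 10% for each of the other five); the two score tuples come from the list below, and the variable and function names are purely illustrative, ours rather than anything from the source rankings:

    # Weights: 50% U.S. News, then 10% each for ILRG, LawSchool100,
    # Cooley, Leiter, and Hylton, in that order.
    WEIGHTS = (0.50, 0.10, 0.10, 0.10, 0.10, 0.10)

    # Each score is the number of schools behind that school in the
    # given ranking; tuples follow the order of the weights above.
    scores = {
        "Harvard": (184, 185, 185, 185, 184, 184),
        "Yale":    (185, 184, 184, 178, 185, 185),
    }

    def composite(school_scores):
        """Weighted sum of the six component scores."""
        return sum(w * s for w, s in zip(WEIGHTS, school_scores))

    for school in sorted(scores, key=lambda k: -composite(scores[k])):
        print(f"{school}: {composite(scores[school]):.1f}")

On those two tuples, Harvard comes out at 184.3 and Yale at 184.1, which matches the ordering below.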

Using this combination, we come up with the following top-20 ranking.  Each listing includes the name of the school followed by (U.S. News score/ILRG score/LawSchool100 score/Cooley score/Leiter score/Hylton score).

1)   Harvard   (184/185/185/185/184/184)

2)   Yale   (185/184/184/178/185/185)

3)  Columbia (182/183/182/179/183/183)

4)  NYU  (181/181/182/180/182/180)

5)  Stanford  (183/182/184/167/180/182)

6)  Penn   (179/180/179/174/175/177)

7)  Virginia  (177/174/179/182/178/178)

8)  Michigan  (177/177/179/177/176/179)

9)  Northwestern (177/175/179/181/177/174)

10)  Cal-Berkeley (180/173/179/169/172/175)

11)  U. of Chicago  (179/179/182/148/181/181)

12)  Georgetown   (172/170/179/184/179/176)

13)   Cornell   (174/178/179/152/173/173)

14)   Duke   (174/172/172/162/174/172)

15)   UCLA  (170/171/172/173/171/171)

16)  Texas  (170/158/172/183/167/170)

17)   George Washington (166/167/167/175/168/164)

18)   Minnesota  (164/163/172/176/163/167)

19)   Boston University  (165/169/167/164/157/165)

20)  Wash. U.- St. Louis (167/160/161/156/162/166)

This composite, of course, is no better than any of the rankings contained within it, and may even be worse than all of them.  For example, it is hard to get contemporaneous figures, and certain elements (i.e., the U.S. News peer assessment) may get too much emphasis because they are heavily weighted in multiple rankings.  Further, it looks like some of the surprising results (Chicago, Harvard over Yale) are driven mostly by the Cooley rankings.
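One way to test how much the Cooley rankings drive those surprises is a leave-one-out check: recompute each composite with the Cooley column dropped and the remaining weights rescaled to sum to one.  A hedged sketch along the lines of the earlier snippet, again with illustrative names and just the Harvard and Yale tuples:

    WEIGHTS = (0.50, 0.10, 0.10, 0.10, 0.10, 0.10)
    COOLEY = 3  # position of the Cooley score in each tuple

    scores = {
        "Harvard": (184, 185, 185, 185, 184, 184),
        "Yale":    (185, 184, 184, 178, 185, 185),
    }

    def composite(s, drop=None):
        """Weighted average of the components; if drop is given, that
        component is excluded and the remaining weights rescaled."""
        pairs = [(w, x) for i, (w, x) in enumerate(zip(WEIGHTS, s))
                 if i != drop]
        total = sum(w for w, _ in pairs)
        return sum(w * x for w, x in pairs) / total

    for school, s in scores.items():
        print(f"{school}: full {composite(s):.1f}, "
              f"without Cooley {composite(s, drop=COOLEY):.1f}")

Dropping Cooley flips the top two (Yale edges ahead of Harvard, roughly 184.8 to 184.2), which is consistent with the suspicion above.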

My hope in posting this is to get input from those with knowledge in the area.  Is this a worthwhile project at all?  What changes should we consider in methodology?  Are there different rankings which deserve consideration?  And which law school might win the Rose Bowl against Ohio State, if the law students got to use motor vehicles?

I look forward to receiving some advice.

-- Mark Osler

 


Comments

The Cooley rankings account for all of the variability. The other ranking systems just replicate each other because they are looking at the same factors and underlying data. The BCS system, to the extent it has logic, works because it employs ranking systems based on different methodologies. While all rely to some extent on wins and losses, some use opinion and others use different weightings of various statistics about those wins and losses. You probably would need to limit the ranking systems to Leiter (excluding student quality), Hylton (since his peer assessments are potentially from a different pool than Leiter's), and perhaps some of the Cooley factors (although it's hard to justify why they do things like explicitly weighting the size of the school as a positive, as if the vast majority of schools can't simply admit more students to be bigger). You might even play with Stake's rankings by individual criteria. Even then, the rankings are all too much of an echo chamber for U.S. News to really measure anything differently.

Posted by: Anon | Jul 3, 2008 4:05:03 PM

I'd add Princeton Review on teaching/student experience and Vault, which are a bit different, I think.

Posted by: Jason | Jul 4, 2008 8:49:21 AM

The Cooley rankings provide a great deal of swing (bringing down Stanford/Chicago/Duke/Cornell/WUStL, bringing up Minnesota/Northwestern/Georgetown).

Posted by: Chris | Jul 20, 2008 2:35:53 PM

The Cooley rankings also brought down UC-Berkeley relative to all the other rankings, just reinforcing the fact that you would need to be very confident and explicit about your reasons for including Cooley.

Posted by: Anon | Jul 20, 2008 2:45:22 PM
