April 1, 2008
Isn't bar passage a terrible law school ranking metric?
Responding to Mark's post on US News rankings, co-blogger Anupam comments that a useful ranking metric "might be Bar Passage, adjusted to reflect the jurisdiction's overall bar passage rate." I could not disagree more, in part because I think bar passage is a very harmful aspect of the US News ranking system. Let me explain:
Bar passage rates tell us what percentage of a law school class has passed the — silly? culturally biased? poorly graded? — timed, high-pressure test that many jurisdictions use as one barrier to becoming a licensed lawyer. I have long been troubled by bar exams for lots of reasons (too numerous to detail here), and I am especially troubled that US News gives these exams extra legitimacy through its ranking criteria. Let me (too) quickly explain my anti-bar bias:
1. I do not think the sole or chief goal of law schools is to help a student pass the timed, high-pressure bar exam. Notably, major law schools clearly don't think this should be their sole or chief goal: if they did, law school classes would look and sound and operate much more like Bar-Bri classes.
2. Because law school is obviously about a lot more than bar passage, every rational law student (with sufficient resources) takes a bar prep course. Consequently, it seems fair to assume that bar passage rates reflect the quality of a bar prep course more than the quality of law school instruction.
3. Bar passage rates also, obviously, reflect the quality of the student body that a law school admits. But US News and other rankings already use a variety of other metrics to directly assess/reward the quality of the student body that a school admits.
4. In my view, students and faculty at most schools — at least those outside the top 10 — already obsess way too much about bar passage rates (in part because US News has used this as a metric). I do not want there to be even more energy focused on a timed, high-pressure test that seems to me pernicious in many ways.
Of course, I may be wailing on Anupam's comment principally because I have long wanted to wail on the craziness I see in bar exam realities. So, because I realize I may be blinded by my anti-bar biases, I would like to hear Anupam and others explain why bar passage might be a useful and valuable law school ranking metric.
Comments
I agree with almost all that you write above. Because bar passage is highly correlated with incoming student quality, the rate becomes an "echo chamber," as some have described it. Schools ranked higher get students with better incoming numbers, which leads to continued high rates - and vice versa.
I do disagree with this statement: "In my view, students and faculty at most schools — at least those outside the top 10 — already obsess way too much about bar passage rates..."
The reason why is this: students pay good money to go to law school with (usually) one goal - to become lawyers. We may teach them an infinite amount of useful knowledge, but if we fail to help them pass the bar, then their primary goal is frustrated.
It is true that many students who get into law school will fail the bar the first time, and some may never pass the bar. However, for those students on the borderline, law schools should work to a) assist those students in leaping the hurdles to a career in law (however absurd we might think those hurdles are) and b) assist those students in then being the best lawyers they can be.
A law school that truly didn't care about whether its students actually became good, practicing lawyers would be a pretty backward place.
Posted by: Michael Risch | Apr 1, 2008 12:53:32 PM
The NCBE should provide all schools with the MBE scores of the students who take that exam, and schools should be required to report MBE score data, along with the number of students who took the MBE. Which state's bar a student takes should not be a variable in providing consumer data, which should be the ABA's primary concern. US News would then follow.
Posted by: E | Apr 1, 2008 3:21:54 PM
Most students, I assume, read a school's bar passage rate as an indication of how they are likely to do on the bar exam if they go to that school. If this is true, the inclusion of bar passage rates is highly deceptive. Whether a student is likely to pass the bar depends mostly on his or her personal characteristics, very little on his or her school. A student admitted to both Harvard and Suffolk is NOT more likely to pass the bar exam if she decides to go to Harvard -- indeed, Harvard may teach her less about passing the bar than Suffolk.
For the most part, schools' bar passage rates merely measure the quality of their student bodies. They differ from median LSATs and GPAs in that they measure overall quality, not merely median quality. Secondarily, they measure disqualification rates. A school with higher bar passage rates than its incoming LSATs would predict probably flunks out more students after first year. (This is a standard way of boosting bar passage.) The factor having the smallest impact on bar passage is probably the value added by the school. There are a few exceptions to this generalization, but they are few and far between.
Posted by: Theodore Seto | Apr 1, 2008 6:50:18 PM
As a proud member of the Wisconsin Bar, most of whose members were not forced to submit to the bar exam because of the diploma privilege for Marquette and Wisconsin law grads, I think that there is absolutely no basis for the continued existence of bar exams. Before the profound waste of time and money is allowed to continue, there should be a careful study of the level of practice in Wisconsin versus other jurisdictions that require a bar exam to see if a bar exam makes a positive difference. Absent such a showing, no one should believe that the bar exam requirement has anything to do with a better level of practice.
Posted by: Mike Zimmer | Apr 8, 2008 9:38:52 AM
Both the LSAT and the MBE are lousy tests as measures of qualification for the endeavors to which they are the gateways. That said, however, they do provide an opportunity for measuring some of the "value added" to which E refers. If the bar passage rates of a law school's graduates were compared to the LSATs of those taking the later exam (not just to the LSAT numbers for all incoming students, a number likely lowered by those who don't make it through law school), there would be a clear indication of the degree to which the law school helped its students. Any bias from bar-prep courses would be washed out by the fact that virtually all students take them and by the limited selection of such courses.
I'd love to test my theory for the law school at which I teach--a school spurned by the ABA for its low-cost, practice- and teaching-focused structure--but alas, we don't require the LSAT (an ABA gripe) and few of our students take it. Yet our bar-pass rates have historically been about 80% of those achieved by all in-state law schools, which include Harvard, B.C., B.U., and three other ABA-approved schools.
Posted by: Andy Starkis | May 20, 2008 2:02:17 PM