Major Flaw in Law School Rankings

(Photo: http://www.flickr.com/photos/unclefuz/4605482933/)

The law school rankings issued by U.S. News and World Report are widely reported, and are used by many law school applicants as a guide to prospective schools. In fact, you would be hard pressed to find an applicant who did not at least consider the rankings during the application process.

A recent article, however, points out a major flaw in the rankings that has led to a change in future ranking calculations.

Schools not required to report at-graduation employment numbers

In the latest rankings, 74 schools did not report their at-graduation employment numbers. The schools are not technically penalized for withholding the figure. Instead, the rankings automatically assign them a number approximately 30% lower than the percentage of graduates employed nine months after graduation.

Because so many schools failed to report the data, U.S. News and World Report concluded that schools were attempting to skew the rankings by withholding the statistic. As a result, the company has indicated it will significantly change its calculation in the future when schools do not self-report.

Employment numbers are important, especially in this economy

Frankly, it seems ridiculous that schools were not more harshly penalized for failing to report the data. In this economy, there must be at least one school that realized its at-graduation employment numbers were more than 30% below its employment rate at nine months.

Applicants have a right to know (and should want to know) how many students are actually employed at the time of graduation. While it does take many graduates a number of months to find a job, that information is still very relevant.

If the at-graduation employment rate is only 50% but the nine-month rate is 80%, that is still a problem. Some students may not be able to afford up to nine months of unemployment. Even students who decide they can afford the lapse should know that they might be unemployed when they walk across the stage in front of mom and dad.

  • Jacklyn

    As someone who just recently went through the application process…this is fascinating. I didn’t put huge weight on rankings but still…good to know where the gap is. Thanks for the post!

  • Susan Gainen

    While tens of thousands of dollars of post-law-school debt should focus everyone’s attention on post-JD employment, the intense focus on the “at graduation” statistic is flawed because it values only those jobs that can be acquired before graduation.

    With my blood pressure in check, I will reflect on a meeting that many NALP members attended a few years ago with The Ever Popular Mr. Morse from USNews.

    As recently as 2007, there was a group of schools with the majority of students who knew that they were going to large law firms by Fall of their third year. This was a happy circumstance for the students, for their deans, and for their “at graduation” statistics.

    What was most frustrating about our face-to-face with Mr. Morse was his complete dismissal of the career goals of students who sought other types of employment. He was particularly harsh on the collective experience of students in California schools whose goal was to be public defenders, or, for statistical purposes, those who sought a job that they could not apply for until after they had bar results. He dismissed the concern of school representatives who pointed out that a huge number of their students sought those jobs, making an “at graduation” statistic a cool combination of meaningless and hurtful.

    He was similarly dismissive of students seeking small-firm jobs, legal services, non-profit, and other government positions, even though those employers are often unable to offer permanent positions precisely at graduation.

    Mr. Morse and his USNews statistics value the big firm dollar and what was once the big firm certainty of employment at graduation.

    While this may be a conversation for another forum, if “at graduation” is the measure and sole sign of success, then the only worthwhile job is with a large law firm. In the past, that was achievable for a limited number of law students. Recent experience shows unequivocally that the Big Firm first job is neither certain nor permanent. To create a statistical regime that denigrates every other choice seems wrong.

  • Randall Ryder

    @ Susan – I am very confused. The post was about the rankings failing to accurately account for the at-graduation statistic. By no means am I advocating that the statistic is “the sole sign of success.” But it is certainly relevant, and the flawed handling of it certainly reflects poorly on the value of the rankings.

  • Harvey Birdman

    Great post. The TaxProf blog had a similar one the other day as well. I can’t help but notice that no one is addressing the real problem here. Yes, the fact that some schools didn’t report “at graduation” numbers is troubling. However, the real problem is that the schools self-report both numbers (at graduation and nine months out). I have firsthand experience that most schools lie about both of these. Think about it: if a Tier 4 school charging $35k per year reported the real numbers (around 30% employment), then no one would go there.

    A WSJ article from a few years back gave some insight into how the numbers are fabricated. Here is an excerpt about Brooklyn Law School, a top Tier 2 school:

    A glossy admissions brochure for Brooklyn Law School, considered second-tier, reports a median salary for recent graduates at law firms of well above $100,000. But that figure doesn’t reflect all incomes of graduates at firms; fewer than half of graduates at firms responded to the survey, the school reported to U.S. News. On its Web site, the school reports that 41% of last year’s graduates work for firms of more than 100 lawyers, but it fails to mention that that percentage includes temporary attorneys, often working for hourly wages without benefits, Joan King, director of the school’s career center, concedes.

    This explains how the schools lie. They send a survey out to all 300 graduates from the past year. Roughly 40% reply (probably mostly those with jobs). The school then counts that 40% as 100% and tells U.S. News that it has 98% employment with a median salary of $100k. The actual numbers may be more like 40% employment with a median salary of $25k (due to 60% unemployment). I have worked in the doc review sweatshops in NYC, which are completely filled with graduates of these types of schools. These temp jobs, which the schools will gladly count as full employment at a $100k salary, are not stable employment. If a case settles, the doc review agency gives the reviewers five minutes to get out. In addition, the reviewers are not treated like attorneys and rarely can even go into the actual offices of the firm. Temp doc reviewers often go months with no work and spend time on unemployment. However, schools will never tell prospective students that this is what the future holds for them.

    Think about it. Even in this terrible economy, when firms of all sizes have laid off associates, almost every school from Yale to the last Tier 4 school is reporting employment in the 80–98% range. Could that be possible? Even federal law clerks from Courts of Appeals are unemployed, but somehow New England School of Law has 80% employment nine months out. This simply cannot be.

    Finally, I will suggest the reason that no one reports on this situation and why this will never change. There is too much money to be made. These schools, with their salesmanship of the hope of a lucrative future, feed unsuspecting students to loan companies, who in turn sell the debt to the investment banks, who in turn package the debt and sell student-loan-backed securities (which do not appear to be risky because doctors and lawyers will typically find a way to make the payments). Everyone from the school to the bank to the hedge fund investing in the pooled loans makes tons of money. The only ones who lose are the students.

    That WSJ article can be found here: http://online.wsj.com/article/SB119040786780835602.html

  • Anonymous

    USNWR reaps untold revenue from its rankings issues, which are marketed to consumers as decision-making input about products on which consumers will spend tens of thousands of dollars. But what if the product it is selling (i.e., the information) is demonstrably faulty, relies on statistically insignificant samples, and is invalid and/or unreliable? Should Morse and USNWR be held responsible for perpetrating fraud?
    The problem is, USNWR’s law school rankings answer only one question: what are arguably the top 15-20 law schools in the country? Beyond this, as the lumping of scores for the lowest-ranking 160 schools illustrates, the rankings provide no additional insights of value to the consumer. That is, for all prospective consumers of legal education except those going to the top 15-20 schools, the rankings speak to nothing other than the fact that a school is not in the top 15-20.
    Consider the most significant factor in the methodology: quality assessment. Quality assessment is not a measure of the product itself, but of the product’s image. Unlike Consumer Reports, USNWR doesn’t ask for user reviews. Instead, it asks individuals to rate products that they themselves did not buy, did not test, and certainly did not use. What? You’re going to ask the Toyota buyer to rate the Honda, the Volkswagen, and the Ford? Ahem. Okay. That’s information? And this is the weightiest measure in the formula.
    And consider who is being asked to provide an assessment: law deans and professors. What programs do they know most about? Where did they choose to invest their own pennies? Anyone? By an immense margin, in the top 15-20 schools. Almost exclusively. So not only are you going to ask about something the reviewer did not buy, did not test, and did not use, but you are going to ask only the buyers of one product in particular to review all the products available. Right. That’s useful. Really. Okay. Would that qualify as “information” for a car buyer?
    Do we really need to go on about lawyer and judge assessments? It is simply more of the same.
    To the extent Consumer Reports provides information about product quality, and to the extent that user reviews are indicia of actual quality, USNWR should either disclose that its rankings tell nothing other than which schools are in the top 15-20, scrap the so-called “assessment” scores, or move to a user-review system akin to those for other products consumers spend so much time considering and ultimately paying for. Reports from actual users could, for example, be taken at intervals of 1, 5, 10, and 20+ years after graduation. Questions to ask:
    1. on a scale of 1-10, the quality of the education I received was:
    2. the net cost of my education was x, my income is y
    [3. calculated cost to earnings ratio]
    4. on a scale of 1-10, I am satisfied with the quality of the education (or: the educational experience met or exceeded my expectations)
    5. on a scale of 1-10, this education/institution provided access to the level of opportunity which I was seeking
    6. My level of career satisfaction is x on scale of 1-10
    7. If I had the opportunity to go back in time and choose a different school, I would choose the same school, Y/N.
    This is information.
    The USNWR methodology is an awful system that misinforms consumers. At the same time, it is incredibly influential in the decision making of both legal-education consumers and law school administrations. In reality, it is information for, by, and about 7% of the products available; products that 93% of consumers cannot or do not buy.
    How does this relate to the topic of this post? Accurate “at grad” numbers are a small part (.04) of the formula, whereas quality assessment comprises .40, nearly half of the entire score. As goes the assessment score, so goes the ranking. While interesting, the at-grad score is nearly irrelevant. The real problem is that the USNWR methodology is just awful.

  • um

    harvey

    this is the common wisdom for a law school stats skeptic.

    however, in reality the “response rate” for the national nine-month nalp data has been over 90% for many years in a row, with some schools reporting 100%.

    are the schools and nalp collusively making up the response rate too? just curious.

    i know the market sucks, that law schools are thieves, etc., but isn’t it better to use facts that are not patently false to help your case?

    just saying
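
The survey arithmetic described in Harvey Birdman's comment above can be sketched in a few lines. The numbers below are illustrative only (hypothetical, not actual data for any school): if the employed graduates are the ones who tend to answer the survey, and the non-respondents are simply dropped, the reported rate can run far ahead of the true class-wide rate.

```python
# Illustrative sketch of survey response bias (hypothetical numbers,
# not actual data for any school).

def reported_vs_actual(grads, response_rate,
                       employed_among_respondents,
                       employed_among_nonrespondents):
    """Compare the rate a school reports (respondents only) with the
    rate across the entire graduating class."""
    respondents = grads * response_rate
    nonrespondents = grads - respondents
    employed = (respondents * employed_among_respondents
                + nonrespondents * employed_among_nonrespondents)
    reported = employed_among_respondents  # respondents stand in for everyone
    actual = employed / grads              # true class-wide rate
    return reported, actual

# 300 graduates, 40% reply, nearly all respondents employed; assume
# (hypothetically) that only 10% of non-respondents found jobs.
reported, actual = reported_vs_actual(300, 0.40, 0.98, 0.10)
print(f"reported: {reported:.0%}, actual: {actual:.0%}")  # reported: 98%, actual: 45%
```

Under these assumed inputs, counting only respondents roughly doubles the apparent employment rate, which is the mechanism the comment describes.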