Author: Michael Simkovic


Brian Tamanaha’s Straw Men (Part 2): Who’s Cherry Picking?

(Reposted from Brian Leiter’s Law School Reports)

BT Claim 2:  Using more years of data would reduce the earnings premium

BT Quote: “There is no doubt that including 1992 to 1995 in their study would measurably reduce the ‘earnings premium.’”

Response:  Using more years of historical data is as likely to increase the earnings premium as to reduce it

We have doubts about the effect of more data, even if Professor Tamanaha does not.

Without seeing data that would enable us to calculate earnings premiums, we can’t know for sure if introducing more years of comparable data would increase our estimates of the earnings premium or reduce it.

The issue is not simply the state of the legal market or entry-level legal hiring. We must also consider how our control group of bachelor’s degree holders (who appear to be similar to the law degree holders but for the law degree) was doing.  To measure the value of a law degree, we must measure earnings premiums, not absolute earnings levels.

As a commenter on Tamanaha’s blog helpfully points out:

“I think you make far too much of the exclusion of the period from 1992-1995. Entry-level employment was similar to 1995-98 (as indicated by table 2 on page 9).

But this does not necessarily mean that the earnings premium was the same or lower. One cannot form conclusions about all JD holders based solely on entry-level employment numbers. As S&M’s data suggests, the earnings premium tends to be larger during recessions and their immediate aftermath and the U.S. economy only began an economic recovery in late 1992.

Lastly, even if you are right about the earnings premium from 1992-1995, what about 1987-91 when the legal economy appeared to be quite strong (as illustrated by the same chart referenced above)? Your suggestion to look at a twenty year period excludes this time frame even though it might offset the diminution in the earnings premium that would allegedly occur if S&M considered 1992-95.”

There is nothing magical about 1992.  If good quality data were available, why not go back to the 1980s or beyond?   Stephen Diamond and others make this point.

The 1980s are generally believed to be a boom time in the legal market.  Assuming for the sake of argument that law degree earnings premiums are pro-cyclical (we are not sure whether they are), inclusion of more historical data going back past 1992 is just as likely to increase our earnings premium as to reduce it.  Older data might suggest an upward trend in education earnings premiums, which could mean that our assumption of flat earnings premiums is too conservative. Leaving aside the data quality and continuity issues we discussed before (which led us to pick 1996 as our start year), there is no objective reason to stop in the early 1990s instead of going back further to the 1980s.

Our sample from 1996 to 2011 includes both good times and bad for law graduates and for the overall economy, and in every part of the cycle, law graduates appear to earn substantially more than similar individuals with only bachelor’s degrees.
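The distinction between absolute earnings and the earnings premium can be made concrete with a small sketch (all dollar figures below are invented for illustration, not SIPP values): if both groups earn less in a downturn but the control group falls further, the premium rises.

```python
# Hypothetical numbers: the earnings premium is the difference between
# law graduates' earnings and similar bachelor's holders' earnings,
# so it can rise in a recession even though both groups earn less.
boom = {"law": 120_000, "ba": 80_000}
recession = {"law": 105_000, "ba": 62_000}  # both fall; BA earnings fall more

premium_boom = boom["law"] - boom["ba"]                 # 40,000
premium_recession = recession["law"] - recession["ba"]  # 43,000

print(premium_boom, premium_recession)
```

In this stylized downturn, absolute earnings fall for everyone, yet the quantity we estimate, the premium, goes up.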

 

[Chart: Cycles]

 

This might be as good a place as any to affirm that we certainly did not pick 1996 for any nefarious purpose.  Having worked with the SIPP before and being aware of the change in design, we chose 1996 purely because of the benefits we described here.  Once again, should Professor Tamanaha or any other group wish to use the publicly available SIPP data to extend the series farther back, we’ll be interested to see the results.


Brian Tamanaha’s Straw Men (Part 1): Why we used SIPP data from 1996 to 2011

(Reposted from Brian Leiter’s Law School Reports)

 

BT Claim:  We could have used more historical data without introducing continuity and other methodological problems

BT quote:  “Although SIPP was redesigned in 1996, there are surveys for 1993 and 1992, which allow continuity . . .”

Response:  Using more historical data from SIPP would likely have introduced continuity and other methodological problems

SIPP does indeed go back farther than 1996.  We chose that date because it was the beginning of an updated and revitalized SIPP that continues to this day.  SIPP was substantially redesigned in 1996 to increase sample size and improve data quality.  Combining different versions of SIPP could have introduced methodological problems.  That doesn’t mean one could not do it in the future, but it might raise as many questions as it would answer.

Had we used earlier data, it could be difficult to know to what extent changes to our earnings premiums estimates were caused by changes in the real world, and to what extent they were artifacts caused by changes to the SIPP methodology.

Because SIPP has developed and improved over time, the more recent data is more reliable than older historical data.  All else being equal, a larger sample size and more years of data are preferable.  However, data quality issues suggest focusing on more recent data.

If older data were included, it probably would have been appropriate to weight more recent and higher quality data more heavily than older and lower quality data.  We would likely also have had to make adjustments for differences that might have been caused by changes in survey methodology.  Such adjustments would inevitably have been controversial.

Because the sample size increased dramatically after 1996, including a few years of pre-1996 data would not provide as much new data or have the potential to change our estimates by nearly as much as Professor Tamanaha believes.  There are also gaps in SIPP data from the 1980s because of insufficient funding.

These issues and the 1996 changes are explained at length in the Survey of Income and Program Participation User’s Guide.

Changes to the new 1996 version of SIPP include:

  • Roughly doubling the sample size
    • This improves the precision of estimates and shrinks standard errors
  • Lengthening the panels from 3 years to 4 years
    • This reduces the severity of the regression to the median problem
  • Introducing computer-assisted interviewing to improve data collection and reduce errors or the need to impute for missing data
  • Introducing oversampling of low-income neighborhoods
    • This mitigates the response bias issues we previously discussed, which are most likely to affect the bottom of the distribution
  • Instituting new income topcoding procedures with the 1996 Panel
    • This affects both means and various points in the distribution
    • Topcoding is done on a monthly or quarterly basis, and can therefore undercount end-of-year bonuses, even for those who are not extremely high income year-round
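The first item above, the effect of a larger sample on precision, follows directly from the formula for the standard error of a mean; a quick sketch (sample sizes and standard deviation are made up):

```python
import math

# All else equal, the standard error of a mean shrinks with the square
# root of the sample size, so doubling n cuts standard errors by
# roughly 29 percent.  The sd and sample sizes below are invented.
def standard_error(sd, n):
    return sd / math.sqrt(n)

se_before = standard_error(sd=30_000, n=15_000)
se_after = standard_error(sd=30_000, n=30_000)

print(round(se_after / se_before, 3))  # 0.707, i.e. 1/sqrt(2)
```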

Most government surveys topcode income data—that is, there is a maximum income that they will report.  This is done to protect the privacy of high-income individuals who could more easily be identified from ostensibly confidential survey data if their incomes were revealed.

Because law graduates tend to have higher incomes than bachelor’s degree holders, topcoding introduces downward bias into earnings premium estimates. Midstream changes to topcoding procedures can change this bias and create problems with respect to consistency and continuity.

Without going into more detail, the topcoding procedure that began in 1996 appears to be an improvement over the earlier topcoding procedure.

These are only a subset of the problems extending the SIPP data back past 1996 would have introduced.  For us, the costs of backfilling data appear to outweigh the benefits.  If other parties wish to pursue that course, we’ll be interested in what they find, just as we hope others were interested in our findings.


Brian Tamanaha’s Straw Men (Overview)

(Cross posted from Brian Leiter’s Law School Reports)

Brian Tamanaha previously told Inside Higher Education that our research only looked at average earnings premiums and did not consider the low end of the distribution.  Dylan Matthews at the Washington Post reported that Professor Tamanaha’s description of our research was “false”. 

In his latest post, Professor Tamanaha combines interesting critiques with some not very interesting errors and claims that are not supported by data.   Responding to his blog post is a little tricky as his ongoing edits rendered it something of a moving target.  While we’re happy with improvements, a PDF of the version to which we are responding is available here just so we all know what page we’re on.

Stephen Diamond explains why Tamanaha apparently changed his post: Ted Seto and Eric Rasmusen expressed concerns about Tamanaha’s use of ad hominem attacks.

Some of Tamanaha’s new errors are surprising, because they come after an email exchange with him in which we addressed them.  For example, Tamanaha’s description of our approach to ability sorting constitutes a gross misreading of our research.  Tamanaha also references the wrong chart for earnings premium trends and misinterprets confidence intervals.  And his description of our present value calculations is way off the mark.

Here are some quick bullet point responses, with details below in subsequent posts:

  • Forecasting and Backfilling
    • Using more historical data from SIPP would likely have introduced continuity and other methodological problems
    • Using more years of data is as likely to increase the historical earnings premium as to reduce it
    • If pre-1996 historical data finds lower earnings premiums, that may suggest a long term upward trend and could mean that our estimates of flat future earnings premiums are too conservative and the premium estimates should be higher
    • The earnings premium in the future is just as likely to be higher as it is to be lower than it was in 1996-2011
    • In the future, the earnings premium would have to be lower by 85 percent for an investment in law school to destroy economic value at the median
  • Data sufficiency
    • 16 years of data is more than is used in similar studies to establish a baseline.  This includes studies Tamanaha cited and praised in his book.
    • Our data includes both peaks and troughs in the cycle.  Across the cycle, law graduates earn substantially more than bachelor’s degree holders.
  • Tamanaha’s errors and misreading
    • We control for ability sorting and selection using extensive controls for socio-economic, academic, and demographic characteristics
    • This substantially reduces our earnings premium estimates
    • Any lingering ability sorting and selection is likely offset by response bias in SIPP, topcoding, and other problems that cut in the opposite direction
    • Tamanaha references the wrong chart for earnings premium trends and misinterprets confidence intervals
    • Tamanaha is confused about present value, opportunity cost, and discounting
    • Our in-school earnings are based on data, but, in any event, “correcting” to zero would not meaningfully change our conclusions
  • Tamanaha’s best line
    • “Let me also confirm that [Simkovic & McIntyre’s] study is far more sophisticated than my admittedly crude efforts.”
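The 85 percent figure in the bullets above can be restated as a one-line calculation. A minimal sketch, in which the all-in cost of the degree is a hypothetical placeholder rather than a number from the paper:

```python
# If the present value of the median earnings premium is P and the
# all-in cost of law school is C, the premium can fall by (1 - C / P)
# before the investment destroys value at the median.
median_premium_pv = 600_000  # median pre-tax lifetime value from the study
cost_of_degree = 90_000      # hypothetical all-in cost, for illustration

max_decline = 1 - cost_of_degree / median_premium_pv
print(f"{max_decline:.0%}")  # 85%
```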

Nonrespondent law graduates and other sampling questions

The Washington Post reports one possible concern with estimates of the Economic Value of a Law Degree:

“[Paul] Campos argues that low-earning lawyers may be less likely to participate in SIPP in the first place because of the stigma involved in admitting that, even anonymously.”

By email, Jerry Organ asks related questions about the representativeness of our sample.

“SIPP” is the United States Census Bureau’s Survey of Income and Program Participation, and is one of the primary data sources used in The Economic Value of a Law Degree.  Campos worries about stigma and non-response.  Thankfully SIPP is specifically designed to deal with these problems and to include impoverished and stigmatized members of the population, including those who receive government aid.

The Census Bureau explains SIPP’s purpose as follows:

 “To collect source and amount of income, labor force information, program participation and eligibility data, and general demographic characteristics to measure the effectiveness of existing federal, state, and local programs; to estimate future costs and coverage for government programs, such as food stamps; and to provide improved statistics on the distribution of income and measures of economic well-being in the country.”

The Census Bureau elaborates on the use of SIPP to analyze participation in Food Stamps and other anti-poverty programs here.

Census explains in greater detail how SIPP handles issues related to response bias, non-response bias, and weighting here.  SIPP oversamples in poor neighborhoods, imputes when necessary, and adjusts the sample weights to approach a nationally representative sample.

It is about as good a survey as one is likely to find, conducted by people who care a great deal about nonresponse and accurate estimates.

Additionally, to the extent that any lingering nonresponse bias may cause those with low earnings to be less inclined to participate, this bias will affect both law graduates and bachelor’s degree holders.  What we measure in The Economic Value of a Law Degree is the earnings premium, or difference in earnings that is attributable to the law degree.  The biases should wash out or, more likely, bias down our estimates of the law degree earnings premium, because bachelor’s degree holders are far more likely than law graduates to live in poverty.
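The wash-out point is simple arithmetic: a bias that shifts both groups' measured earnings by the same amount leaves their difference untouched. A sketch with invented numbers:

```python
# Hypothetical numbers: a nonresponse bias that depresses both groups'
# measured earnings equally leaves the earnings premium (the difference
# between the groups) unchanged.
bias = -5_000  # same downward bias for both groups

law_true, ba_true = 100_000, 60_000
law_observed, ba_observed = law_true + bias, ba_true + bias

premium_true = law_true - ba_true              # 40,000
premium_observed = law_observed - ba_observed  # still 40,000

print(premium_true == premium_observed)  # True
```

If instead the bias hits bachelor’s degree holders harder, the measured premium is biased downward, which is the more likely scenario described above.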

Indeed, studies that have compared earnings reported in SIPP to earnings from administrative data (tax and Social Security Administration data) find that SIPP underestimates earnings premiums, because more highly educated and higher income individuals tend to underreport earnings, while less educated and lower income individuals tend to over-report.  We make no attempt to correct for this downward bias in our earnings premium estimates to offset any lingering selection on unobservables.

Individual response bias issues also won’t affect federal student loan default data, which is administrative data from the Department of Education.  As noted in the article and in previous blog posts, former law students default on their student loans much less frequently than former students of bachelor’s degree or other graduate degree programs.

 


Brian Tamanaha Says We Should Look at the Below Average Outcomes (And We Did)

Brian Tamanaha’s response to The Economic Value of a Law Degree, as reported by Inside Higher Education doesn’t capture the contents of the study.  According to IHE, Tamanaha said:

“The study blends the winners and losers, to come up with its $1,000,000 earnings figure, but that misses the point of my book: which is that getting a law degree outside of top law school – and especially at bottom law schools – is a risky proposition . . . Nothing in the article refutes this point.”

Professor Tamanaha is correct that the $1 million figure is an average, but we didn’t write a 70-page article with only one number in it.

The Economic Value of a Law Degree not only reports the mean or average—it reports percentiles, or different points in the distribution.  At the 75th percentile, the pre-tax lifetime value is $1.1 million, $100,000 more than at the mean.  At the 50th percentile, the value is $600,000.  At the 25th percentile, the value is $350,000.  These points in the earnings distribution do better than breaking out returns by school—they reflect that even some people at good schools have bad outcomes (and vice versa).  Thus we capture, at length, exactly the concern Tamanaha expresses.

[Chart: Lifetime earnings distribution]

 

As we discuss in the article, for technical reasons related to regression of earnings to the median, our 75th and 25th percentile values are probably too extreme. The “75th percentile” value is likely closer to the 80th or 85th percentile for lifetime earnings, and the “25th percentile” is likely closer to the 20th or 15th percentile.
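The regression-to-the-median point can be illustrated with a small simulation (all parameters invented): each person's annual earnings mix a persistent component with year-to-year noise, so the spread of lifetime (average) earnings is narrower than the spread of any single year's earnings, and a single-year 75th percentile maps to a higher lifetime percentile.

```python
import random

random.seed(0)

# Each person has a persistent component plus 30 years of transitory
# noise; lifetime averages are less dispersed than one year's earnings.
people = [(random.gauss(0, 1), [random.gauss(0, 1) for _ in range(30)])
          for _ in range(5_000)]

one_year = sorted(persistent + noise[0] for persistent, noise in people)
lifetime = sorted(persistent + sum(noise) / 30 for persistent, noise in people)

def iqr(xs):
    """Gap between the 75th and 25th percentile of a sorted sample."""
    return xs[int(0.75 * len(xs))] - xs[int(0.25 * len(xs))]

print(iqr(one_year) > iqr(lifetime))  # True
```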

In other words, roughly the top 15 to 20 percent of law school graduates obtain a lifetime earnings premium worth more than $1.1 million as of the start of law school. Roughly the next 30 to 35 percent obtain an earnings premium between $600,000 and $1.1 million. Below the median, roughly the next 30 to 35 percent obtain an earnings premium between $350,000 and $600,000, and roughly the bottom 15 to 20 percent obtain an earnings premium below $350,000. These numbers are pre-tax and pre-tuition.

Even toward the bottom of the distribution, even after taxes, and even after tuition, a law degree is a profitable investment.  And that is before income based repayment, which can substantially reduce the risk at the bottom of the distribution.

We also present student loan default rates for 25 standalone law schools, most of which are low ranked institutions, and all of which have student loan default rates that are below the average for bachelor’s and graduate degree programs.  The average law school default rate is approximately one third of the average default rate for bachelor’s and graduate programs.

[Chart: Student loan defaults]

 

People with law degrees are not immune from risk.  No one is.  But the law degree reduces the risk of financial hardship.  Law degree holders face significantly less risk of low earnings than those with bachelor’s degrees, and also face lower risk of unemployment.  Increased earnings and reduced risk appear to more than offset the cost of the law degree for the overwhelming majority of law students.

Frank McIntyre and I did not miss the point of Brian Tamanaha’s Failing Law Schools.   Rather, we disagree with his conclusions about the riskiness of a law degree because data on law degree holders does not support his conclusions.  We discuss Tamanaha’s analysis on pages 20 to 24 of The Economic Value of a Law Degree.

We believe that Professor Tamanaha’s views deserve more attention than we could give them in the Economic Value of a Law Degree. Because of this, last Spring, we also wrote a book review of Failing Law Schools, pointing out both the strengths and weaknesses of his analysis.  We will make the book review available on SSRN soon.

If Professor Tamanaha disagrees with our estimates of the value of a law degree at the low end, we’re happy to hear it.  But he should not say that we ignored the issue.  We look forward to a productive exchange with him, on the merits.


The Economic Value of a Law Degree (part 1 of about 5)

In the classic film It’s a Wonderful Life, George Bailey suffers financial hardship, becomes depressed, and wishes he had never been born. As Bailey attempts suicide, a Guardian Angel, Clarence, intervenes. Clarence magically shows Bailey an alternate universe in which Bailey never existed. Clarence helps Bailey realize that although his life may be hard, a world without Bailey would be far worse for those Bailey cares about.

In an ideal world, we could do for law students what Clarence does for Bailey: run the universe twice. In the first version, the law student attends law school. In the second version, he or she follows another path. With perfect knowledge of long-term outcomes, the student could decide which choice leads to the better life.

In the real world, the closest we can come to this ideal is to compare past outcomes for two groups of individuals who are similar to our prospective law student and were substantially similar to each other, until one group obtained law degrees while the other group did not.

This is the approach that Frank McIntyre and I take in The Economic Value of a Law Degree.  Using large samples and detailed earnings data from the U.S. Census Bureau’s Survey of Income and Program Participation, we measure differences in annual earnings, hourly wages, and work hours between those with law degrees and those who end their education with a bachelor’s degree. Because we include those who are unemployed or disabled, our analysis incorporates differences in risk of unemployment.
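The comparison described above can be sketched on synthetic data: estimate the earnings premium by comparing law graduates with bachelor's degree holders who are similar on an observed control, here a single "ability" score. Everything below, including the 0.5 log-point true premium, is invented for illustration and is not the paper's method or data.

```python
import random
from collections import defaultdict

random.seed(1)

TRUE_PREMIUM = 0.5  # synthetic ground truth, in log points

# Simulate people: an observed ability score, a law-degree indicator,
# and log earnings that depend on both plus noise.
people = []
for _ in range(20_000):
    ability = random.gauss(0, 1)
    law = random.random() < 0.5
    log_earn = 10.8 + TRUE_PREMIUM * law + 0.3 * ability + random.gauss(0, 0.5)
    people.append((ability, law, log_earn))

# Stratify on ability and average the within-stratum law/BA earnings
# gaps, weighting by stratum size (a crude stand-in for controls).
strata = defaultdict(lambda: {"law": [], "ba": []})
for ability, law, log_earn in people:
    strata[round(ability, 1)]["law" if law else "ba"].append(log_earn)

num = den = 0.0
for group in strata.values():
    if len(group["law"]) >= 20 and len(group["ba"]) >= 20:
        gap = (sum(group["law"]) / len(group["law"])
               - sum(group["ba"]) / len(group["ba"]))
        weight = len(group["law"]) + len(group["ba"])
        num += weight * gap
        den += weight

print(round(num / den, 2))  # close to the 0.5 ground truth
```

The stratified comparison recovers the premium because, within each ability stratum, the two groups are similar but for the law degree, which is the logic of the counterfactual comparison described above.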