Saturday, December 14, 2013

Girls' Growing Advantage on PISA Tests

There is a long history of academic articles and social commentary on boys' advantage on tests in math and science. Two low points: the Barbie doll who chirped "Math class is tough!" and Larry Summers's off-the-cuff, uninformed opinions on why there are not more women in science.

Girls have now caught up with boys in math and science on many tests, including the PISA, whose latest results were released last week. At least one article pointed out that, in the US, there is no longer a gender gap in math and science scores.

I saw no articles on a much more striking result: girls' advantage on reading scores is absolutely enormous. This gender gap is much, much larger than the math gap was ten years ago.



When do we get a Ken doll whining "Literacy is hard"? Will pundits soon enlighten us as to why boys are intrinsically unmotivated to read?

Tuesday, October 29, 2013

Proposal for Reforming Student Loan Repayment

Last week I released, through the Hamilton Project of the Brookings Institution, a proposal for reforming the repayment of student loans. This news item from my school's website sums up the event and press coverage:

On October 21, The Hamilton Project hosted a forum on the evolving role of higher education in American society.  At the forum, Susan Dynarski presented a paper titled "Loans for Educational Opportunity: Making Borrowing Work for Today's Students," which served as the focus of a roundtable discussion. The paper was co-authored by Education Policy Initiative postdoctoral research fellow Daniel Kreisman.

Dynarski and Kreisman propose a single, income-based loan repayment system that automatically deducts payments from borrowers' paychecks to replace the current array of repayment options. Dynarski emphasized that the U.S. currently has "not a debt crisis, but a repayment crisis," stating: 

"We have a repayment crisis because student loans are due when borrowers have the least capacity to pay. It often takes years for college graduates to settle into a steady, high-paying job that reflects the value of their education."

Dynarski and Kreisman's proposal was one of three papers released by The Hamilton Project suggesting major changes to the financial aid system. Their work received coverage in The New York Times, The Chronicle of Higher Education and Inside Higher Education.

Thursday, October 17, 2013

Postdoc Position in Education Research at Univ of Michigan

We are looking for a great postdoc to work on our team here at University of Michigan. We have a tremendous set of resources for a recently-minted social scientist (econ, sociology, poli sci, education, psych...) who wants to broaden and deepen skills in experiments, quasi-experimental analysis, and working with state and district partners in evaluation.

http://www.edpolicy.umich.edu/training/epi-postdoc.pdf

Sunday, October 13, 2013

The Missing Manual: Using National Student Clearinghouse Data to Track Postsecondary Outcomes

I have a new paper with two great colleagues, Steve Hemelt (former Michigan post-doc, now a prof at UNC-Chapel Hill) and Josh Hyman (current post-doc, newly-minted PhD in economics and public policy from Michigan). The three of us have worked extensively with data from the National Student Clearinghouse, and in this paper we share insights and advice about this relatively new data source.

"This paper explores the promises and pitfalls of using National Student Clearinghouse (NSC) data to measure a variety of postsecondary outcomes. We first describe the history of the NSC, the basic structure of its data, and recent research interest in using NSC data. Second, using information from the Integrated Postsecondary Education Data System (IPEDS), we calculate enrollment coverage rates for NSC data over time, by state, institution type, and demographic student subgroups. We find that coverage is highest among public institutions and lowest (but growing) among for-profit colleges. Across students, enrollment coverage is lower for minorities but similar for males and females. We also explore two potentially less salient sources of non-coverage: suppressed student records due to privacy laws and matching errors due to typographic inaccuracies in student names. To illustrate how this collection of measurement errors may affect estimates of the levels and gaps in postsecondary attendance and persistence, we perform several case-study analyses using administrative transcript data from Michigan public colleges. We close with a discussion of practical issues for program evaluators using NSC data."

Thursday, October 3, 2013

Government Jobs and "Miserly" Pay

This article lacks perspective on what low-wage workers earn in the US:
"And while pay for senior civil servants can be generous, other salaries can be equally miserly. Wages can vary depending on the location, but in Philadelphia, jobs at salary level GS-2 — a post that typically goes to someone with a high school diploma and no experience — pay as little as $24,379 annually."
My quick tabulation of the March 2012 Current Population Survey (I have it on Dropbox, if you are stymied by the BLS blackout!) shows that, among full-time workers in their early twenties with only a high school degree, median earnings are $18,000 and the 75th percentile is $28,000. Government work has always been a safe haven for those with low skills, and $25,000 a year does not look miserly for those competing for jobs in this particular market. A GS-2 is a great job for someone with no experience and little education.
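
If you want to replicate this kind of quick tabulation yourself, here is a minimal sketch in Python. The file path and column names are placeholders for whatever CPS extract you happen to be working with, not the actual variable names in the public-use file.

```python
import pandas as pd

# Hypothetical March CPS extract; the file name and column names are
# placeholders, not the official CPS variable names.
cps = pd.read_csv("cps_march_2012.csv")

# Full-time workers in their early twenties whose highest credential
# is a high school diploma.
sample = cps[
    cps["age"].between(21, 24)
    & (cps["educ"] == "high school")
    & (cps["full_time"] == 1)
]

# Median and 75th percentile of annual earnings.
print(sample["earnings"].quantile([0.50, 0.75]))
```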

Wednesday, August 28, 2013

Sex! Money! Religion!

I spend a lot of time bashing epidemiological papers, because they are such easy targets and get such breathless headlines. I am pleased to have the opportunity to bash an economics paper, courtesy of a tip from esteemed colleague David Figlio. David pointed me to a paper that concludes that more sex produces higher wages. It's forthcoming in the International Journal of Manpower (no, I am not making this up).

When we compare the sexual activity of those with high and low wages, we might worry that there are confounding factors that lead a person to succeed in both the mating market and labor market. For example, if you are persuasive enough to talk someone out of their pants, you may well be persuasive at talking yourself into a raise.  If you have the health and energy to be productive at work, those traits help you be more reproductive, too. Simultaneity bias is also plausible: the relationship could run the other way, with higher wages leading to more success in the sex market.

If we wanted to apply gold-standard research methods to this question, we would gather up volunteers, randomize them into treatment and control groups, and expose members of the treatment group to an intervention that increases their sexual activity. I will leave this intervention to your imagination. We would then compare the wages of the treatment and control groups to get the effect of the intervention. A bit more statistical fiddling would get us the causal impact of sex on wages, at least for those who were stimulated to action by the intervention.

In this paper, the author does not run a randomized trial, but instead draws his conclusions from observational data using an instrumental-variables strategy. The idea is to find a variable (the "instrument") that is correlated with the "treatment" of interest (sex) but affects the outcome (wages) only through that treatment.

In this particular context, an instrument is valid if it 1) is correlated with sexual activity and 2) is not correlated with wages through any channel except sexual activity. The first condition is called the relevance condition and the second the exclusion restriction. 

The relevance condition is testable. A strong instrument (must ... resist ... salacious ... pun) produces an F-statistic of 10 or above in the first-stage regression of sex on the instrument. Instruments that pass this test are a dime a dozen. Don't let a strong instrument turn your head - hold out for a plausible exclusion restriction. The exclusion restriction is not formally testable - it's an identifying assumption. A good instrumental-variables paper will kick the tires hard on the instrument, using both data and knowledge of the world to make the case that there is no channel, other than the treatment, through which the instrument affects the outcome of interest.
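
To make the mechanics concrete, here is a minimal two-stage least squares sketch in Python on simulated data. The variable names are invented for illustration, and the instrument is valid by construction, which is exactly the part that can never be verified from data alone in the religiosity example.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5_000

# Simulated data: an unobserved confounder raises both sexual activity
# and wages, so naive OLS of wages on sex is biased upward.
confounder = rng.normal(size=n)
instrument = rng.normal(size=n)          # valid by construction in this sketch
sex = 0.5 * instrument + confounder + rng.normal(size=n)
wages = 0.0 * sex + 2.0 * confounder + rng.normal(size=n)  # true effect of sex is zero

# Relevance check: first-stage regression of the treatment on the instrument.
first_stage = sm.OLS(sex, sm.add_constant(instrument)).fit()
print("first-stage F-statistic:", first_stage.fvalue)   # should exceed ~10

# Second stage: replace the treatment with its first-stage fitted values.
second_stage = sm.OLS(wages, sm.add_constant(first_stage.fittedvalues)).fit()
print("IV estimate of the effect of sex on wages:", second_stage.params[1])

# Note: standard errors from this manual second stage are not correct;
# a real application would use a dedicated IV routine.
```

A large first-stage F-statistic here tells us nothing about the exclusion restriction; that part rests entirely on the assumption built into the simulation, just as it rests on an untestable assumption in the paper.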

In this paper, the author's identifying assumption is that religiosity has no relationship with economic activity except through its (negative) correlation with sexual activity.

Could religion have a relationship with wages, other than through its "effect" on sex? Are these results believable?

I now channel Max Weber (who penned The Protestant Ethic and the Spirit of Capitalism):

Max says "Ja," and "Nein."





Friday, August 16, 2013

Caffeine Drives the Finnish Education Miracle

I have discovered the driver of the Finnish education miracle! It's not the lack of testing, or the well-paid teachers, or the late school-starting age. It's that Finns are the top coffee drinkers in the world, by a long shot. Per-capita consumption is 12 kilos a year, three times that in the US. Norway is a distant second at 10 kilos.

Since each kilo produces about 120 shots of espresso, this implies that Finns (including kids!) average four cups of coffee a day. Take the kids out of this denominator (17% of the population is under 15) and you get closer to five cups a day per adult. According to recent, crappy research on the health effects of coffee, everyone in Finland should now be dead.
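
For those who want to check the arithmetic, here it is as a few lines of Python (the 120-shots-per-kilo figure is the same rough conversion used above):

```python
KILOS_PER_CAPITA = 12     # annual Finnish coffee consumption per person
SHOTS_PER_KILO = 120      # rough espresso-shot conversion used above
SHARE_UNDER_15 = 0.17     # share of the Finnish population under age 15

cups_per_person = KILOS_PER_CAPITA * SHOTS_PER_KILO / 365
cups_per_adult = cups_per_person / (1 - SHARE_UNDER_15)

print(round(cups_per_person, 1))  # roughly 4 cups per person per day
print(round(cups_per_adult, 1))   # nearly 5 cups per adult per day
```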

Causal Inference: Coffee Edition

Aigh! Coffee won't kill me, but bad science (and science reporting) might. 
"In a study of more than 40,000 individuals, researchers found that people who drink more than 28 cups per week (that's about four a day) have a 21 percent increased mortality risk and a more than 50 percent increased risk if under 55."
Kudos to the reporter for helpfully dividing 28 by 7 for the reader! I might have severely misinterpreted this research otherwise.

Now, how about we give the magnitudes of our estimates a sniff test before publishing? Four cups of coffee a day increases mortality by 50%? This is enormous. Note that this is an increase in mortality from all causes - not just mortality from conditions that might have a plausible, theoretical link to coffee consumption, such as bladder cancer, uncontrolled tremors, and concussion from bouncing off walls. 

Here is a sniff test of the magnitude of this estimate: a similar, correlational analysis showed that light smoking (less than half a pack a day) is associated with an increase in all-cause mortality of 30%. Heavy smoking (more than half a pack a day), an increase of 80%. These magnitudes are in the same ballpark as the coffee study, which immediately suggests to me that the coffee estimates are absurd.

I would hazard that maybe, just maybe, there are confounding factors that the coffee study did not pick up. For example, the authors did not control for physical activity, education or marital status. If inactive, single, high school dropouts drink more coffee, we would get the inflated estimates we see in this paper, since this population dies younger than those who are active, married and better educated.
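
A quick simulation makes the point. The numbers below are invented, not taken from the study; the sketch only shows how an omitted risk factor that travels with heavy coffee drinking can generate a sizable coffee "effect" on mortality when the true effect is zero.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 40_000

# Hypothetical confounder: a high-risk group (think inactive, single
# dropouts) that both drinks more coffee and dies younger.
high_risk = rng.binomial(1, 0.3, size=n)
cups_per_day = rng.poisson(2 + 2 * high_risk)
died = rng.binomial(1, 0.05 + 0.05 * high_risk)   # mortality unrelated to coffee

# A model that omits the confounder finds a spurious coffee "effect";
# controlling for the confounder makes the effect vanish.
naive = sm.Logit(died, sm.add_constant(cups_per_day)).fit(disp=0)
adjusted = sm.Logit(
    died, sm.add_constant(np.column_stack([cups_per_day, high_risk]))
).fit(disp=0)

print("naive coffee coefficient:   ", round(naive.params[1], 3))     # positive
print("adjusted coffee coefficient:", round(adjusted.params[1], 3))  # near zero
```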

Now, the smoking estimates have the same weakness as the coffee estimates: they are conditional correlations that do not imply cause and effect. The critical difference is that in the case of cigarettes we have plausible theoretical links between exposure and mortality (lung cancer, emphysema) and decades of clinical and lab science that shows convincingly that cigarettes kill people. Coffee? Nada, though the bluestockings have been trying for a long time to show that it must be harmful. With this lack of theory and evidence linking coffee and mortality, the coffee researchers should be especially cautious in interpreting their correlations as causal.

So, file this one under lousy research as well as lousy reporting. This study gives the cocoa case a run for its money. We are fast building a hall of shame of hot-beverage research and reporting. 


Tuesday, August 13, 2013

Causal Inference: Better than Cocoa Edition

In today's headlines, a story about autism that communicates science so much better than was the case in the cocoa debacle. This is a wonderful example of how reporters and researchers can communicate research in accurate yet non-technical language. Reporters, researchers and press offices, take note!

First, the headline: "Preliminary study suggests link between inducing labor and autism." The wording does an excellent job of communicating that this is suggestive work. The headline appropriately makes no health recommendations, which would not be justified by this research.

Second, the reporting: "A new study suggests that babies born after their mother's labor is medically induced or accelerated might have an increased risk of autism. The study, published today in the academic journal JAMA Pediatrics is preliminary and does not prove cause and effect." Be still, my heart. Perfect. 

Third, the researcher's quotes: "Still, it’s a statistical signpost directing researchers to take a closer look at possible links between expediting labor — often it’s to save the life of the mother and child — and autism, said Marie Lynn Miranda, lead author of the paper and a University of Michigan professor of pediatrics and environmental informatics. 'We have a lot of kids with autism and we know the rates are increasing, but we don’t know the causes,' she said." Professor Miranda describes the process of science beautifully. She found an association - a statistical signpost - that has produced a testable hypothesis that scientists can pursue with methods that can extract a causal relationship.

And she's from Michigan! Go Blue!

Monday, August 12, 2013

Stand and Deliver: Effects of Boston's Charter High Schools on College Preparation, Entry, and Choice

I have a new paper with my Cambridge collaborators on the effects of Boston's charter schools on preparation for college and college choice. In previous work (which I blogged about here), we looked at the effects of these schools on the state's standardized test, the MCAS. We found large and positive effects of charter attendance, with effects largest for kids who most need help: Blacks, Hispanics, those with limited English proficiency, special ed kids, those who have the lowest baseline scores. The effect sizes are huge - kids at charters gain 0.1-0.2 standard deviations each year on their peers at the traditional public schools. This earlier paper on test scores has now been published in the Quarterly Journal of Economics.

Test scores are not what makes the world go round, however - the aim of education is to make better, smarter, happier, well-rounded citizens. While we don't measure all of these things in our new work, we gain some ground by examining preparation for college (in the form of the SAT and Advanced Placement scores), college entry and choice of college. As the children attending these schools age, we hope to look at yet more outcomes.

Summary of the paper's findings: 
We use admissions lotteries to estimate the effects of attendance at Boston's charter high schools on college preparation, college attendance, and college choice. Charter attendance increases pass rates on the high-stakes exam required for high school graduation in Massachusetts, with especially large effects on the likelihood of qualifying for a state-sponsored college scholarship. Charter attendance has little effect on the likelihood of taking the SAT, but shifts the distribution of scores rightward, moving students into higher quartiles of the state SAT score distribution. Boston's charter high schools also increase the likelihood of taking an Advanced Placement (AP) exam, the number of AP exams taken, and scores on AP Calculus tests.
Finally, charter attendance induces a substantial shift from two- to four-year institutions, though the effect on overall college enrollment is modest. The increase in four-year enrollment is concentrated among four-year public institutions in Massachusetts. The large gains generated by Boston's charter high schools are unlikely to be generated by changes in peer composition or other peer effects.
The numbers behind this summary are pretty impressive. Charter attendance increases SAT scores by about a third of a standard deviation and doubles the likelihood of taking and passing an Advanced Placement test. Charter attendance quadruples the likelihood of taking the AP calculus exam (from 6% to 27%) and quintuples the likelihood of getting a passing score (from 1.5% to 9%). As these numbers make clear, a lot of the kids induced to take an AP course don't end up getting college credit for it. The fact that they are able to take the class at all, however, indicates that they have taken a strong set of college-prep classes. In particular, you can't take calculus without having taken algebra, trigonometry and geometry - all of which you need to be in the running for a selective, four-year college and a STEM career.

Charter attendance also affects postsecondary outcomes. Most strikingly, kids who attend Boston's charters are 17 percentage points more likely than their comparable peers to attend a four-year college. There also appears to be a positive effect on attending any college at all, but these estimates are not precise enough to take to the bank. We have to wait for more cohorts of these kids to age into college before we can say anything definitive on this point.

I discussed why we use admissions lotteries to get at these results in my earlier post. To recap: the key empirical challenge in understanding the effect of charter schools is selection bias: kids who go to charter schools are different in both observable and unobservable ways from kids who don't.  Are kids whose parents are highly educated or motivated concentrated at charters? Kids whose test scores were plummeting in the public schools? Kids who were not challenged in the public schools? All of these differences would contaminate any effort to compare the achievement of kids at charters and kids at public schools. 

We solve this problem by exploiting the randomized lotteries conducted by over-subscribed charter schools. The lottery approach focuses on students who apply to charters, comparing outcomes for those who lose the lottery to those who win. A mere coin flip (or randomly-generated number) separates the lottery winners and losers, so we can be confident that the two groups are alike, on average, in every observable and unobservable way - except for their charter school attendance. This closely approximates the gold standard of a randomized, controlled trial. The What Works Clearinghouse has reviewed an earlier version of our paper and given it a provisional "Meets Standards without Reservation."
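
For the methodologically curious, here is a stripped-down sketch of the lottery logic in Python. This is not the authors' code; the file and column names are hypothetical. The simple winner-loser comparison gives an intent-to-treat effect; dividing by the lottery's effect on actual attendance (a Wald-style instrumental-variables calculation) scales it up to the effect of attending a charter.

```python
import pandas as pd

# Hypothetical applicant-level file: one row per charter lottery applicant,
# with columns won_lottery (0/1), attended_charter (0/1), enrolled_4yr (0/1).
df = pd.read_csv("charter_lottery_applicants.csv")

winners = df[df["won_lottery"] == 1]
losers = df[df["won_lottery"] == 0]

# Intent-to-treat effect: the effect of a charter offer on 4-year enrollment.
itt = winners["enrolled_4yr"].mean() - losers["enrolled_4yr"].mean()

# First stage: how much winning the lottery raises actual charter attendance.
first_stage = winners["attended_charter"].mean() - losers["attended_charter"].mean()

# Effect of charter attendance for lottery compliers (Wald/IV estimate).
print(round(itt / first_stage, 3))
```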



Thursday, August 8, 2013

Causal Inference: Hot Cocoa Edition

"A study of 60 elderly people with no dementia found two cups of cocoa a day improved blood flow to the brain in those who had problems to start with."
There was no randomized control group in this study, nor even a non-randomized control group. It has a pre-post design:
"[R]esearchers asked 60 people with an average age of 73 to drink two cups of cocoa a day - one group given high-flavanol cocoa and another a low-flavanol cocoa - and consume no other chocolate."
Time passed, lives were lived, and people drank cocoa. Then, the researchers attributed changes to the drinking of cocoa:
"Study author Dr. Farzaneh Sorond, a neurologist at Brigham and Women's Hospital and assistant professor of neurology at Harvard Medical School, said chocolate seemed to boost the brain's blood supply, citing an 8.3 percent increase in blood flow after a month's worth of hot cocoa...'In people with impaired blood flow, she added, "cocoa may be beneficial by delivering more fuel.'"
This research design is not up to the inferences and recommendations being made. It is a flimsy foundation for any medical advice. Yet that is how it is being sold: "Chocolate Is the New Brain Food," "Cocoa Can Prevent Memory Loss."

Crappy science reporting is often the dangerous offspring of a press office that writes a sexy, misleading press release and lazy reporters who swallow it whole. A researcher can lose control of the message. The quotes above indicate that, in this case, the researcher is also complicit.  

Friday, July 19, 2013

Leaning In

In my usual cranky manner, I refuse to read a book (or watch a TV series) until the buzz has died down. Lean In by Sheryl Sandberg is full of sensible, uncomfortable truths. A safe bet is that few people who have pissed on it have actually read it. 

On the applicability of her advice to professional vs. working-class women:
"I am also acutely aware that the vast majority of women are struggling to make ends meet and take care of their families.  Part of this book will be most relevant to women fortunate to have choices about how much and when and where to work; other parts apply to situations that women face in every workplace, in every community, and in every home."
On women's increasing education, compared to their stagnant leadership roles:  
"But while compliant, raise-your-hand-and-speak-when-called-on behaviors might be rewarded in school, they are less valued in the workplace."
Worth a read. Then, form an opinion.

Tuesday, July 16, 2013

College Graduates and the Recession

The Pew Charitable Trust has released a terrific report (OK, they released it in January, when I was in blogger hibernation) on how the recession affected those with and without college degrees. 

The bottom line: the recession sucked, but it sucked most if you did not have a college degree. From the report:
"Although all 21- to 24-year-olds experienced declines in employment and wages during the recession, the decline was considerably more severe for those with less education." 
"The comparatively high employment rate of recent college graduates was not driven by a sharp increase in those settling for lesser jobs or lower wages." 
"Out-of-work college graduates were able to find jobs during the downturn with more success than their less-educated counterparts." 
College is insurance against bad economic times. If you are (or know) a recent college graduate upset about current job prospects, go out and agitate for more stimulus spending, or looser monetary policy, or extended unemployment benefits. But don't go telling people (especially teenagers!) that a college education is a raw deal. It's the safest harbor during stormy economic times.

Sunday, July 14, 2013

The Value of Clinical Trials, in Education and Cancer Research

A great op-ed in today's New York Times explains the value of randomized trials. The parallels with education are striking.

As explained in the article, many pharmaceutical companies, doctors and patients are frustrated by the failure in randomized trials of so many cancer drugs that looked promising in uncontrolled trials. I have heard the same frustration vented in education circles, with the What Works Clearinghouse disparagingly referred to as the "What DOESN'T Work Clearinghouse."

The education trials that find no positive effects are not failed studies. They are successful studies in that they keep us from wasting millions of dollars (and kids' and teachers' time) on the latest cool, sexy, exciting, elegant fad that doesn't work. They also keep us searching for something better.

I admire the patient (and beautifully scientific) perspective of the cancer researcher quoted at the end of the article: 
"His definition of a successful clinical trial? 'At the end of the the day,' he says, 'regardless of the result, you've learned something.'"


Thursday, July 11, 2013

Headlights Off, Driving in the Dark

Canada pretty much destroyed the long form of its household census by making response optional. The response rate has plummeted, and cities now find themselves without the data they need to make policy decisions.

Friend Nolan Miller informs me that legislation was recently introduced in the US Senate that would eliminate all federal data collection except the constitutionally-mandated, decennial head count. Unemployment rate? GDP growth? Nah, who needs to know?


Call for Papers Using State Longitudinal Data



Along with Mark Berends of Notre Dame, I am editing a special issue of Educational Evaluation and Policy Analysis. The call is below; submissions are due October 1, 2013.


Research Using Longitudinal Student Data Systems: Findings, Lessons, and Prospects
Issue Editors: Mark Berends and Susan Dynarski
Expected Publication Date: 2014
Over the past decade, there has been an explosion in the availability to education researchers of large-scale, longitudinal, student-level data sets. Chicago, Florida, North Carolina, and Texas were leaders in the move to open these databases to researchers. The Institute of Education Sciences of the U.S. Department of Education, through a variety of funding initiatives, has encouraged researchers to partner with states, districts, and other education practitioners to use the data to develop research that can inform education policy. IES points to the need for these research partnerships in a recent publication: 
The Institute recognizes that evidence-based answers for all of the decisions that education decision-makers and practitioners must make every day do not yet exist. Furthermore, education leaders cannot always wait for scientists to provide answers. One solution for this dilemma is for the education system to integrate rigorous evaluation into the core of its activities. The Institute believes that the education system needs to be at the forefront of a learning society—a society that plans for and invests in learning how to improve its education programs by turning to rigorous evidence when it is available and by insisting that, when we cannot wait for evidence of effectiveness, the program or policy we decide to implement be evaluated as part of the implementation. (Request for Applications, CFDA Number 84.305E)
In a 2014 special issue of Educational Evaluation and Policy Analysis (EEPA), we will publish original research findings that have emerged from these kinds of partnerships. We are soliciting papers from researchers who have worked with states, districts, and other practitioners to formulate research questions and use administrative data sets to answer those questions. We are especially interested in studies that have used experimental and quasi-experimental methods to extract causal relationships from such data. For example: 
  • What is the effect of a state’s need-based grant program on college attendance, choice, and persistence?
  • How does a mandatory algebra requirement in secondary schools affect high school graduation and college attainment?  
We also welcome descriptive, exploratory papers that suggest directions for policy, research, and future partnerships and answer questions such as:
  • What are the achievement gaps between poor and nonpoor children from kindergarten to high school and into postsecondary education?
  • What lessons have researchers and practitioners learned about forming and maintaining research partnerships that they can pass on to future collaborators?
Submission Guidelines 
When you submit your manuscript, please make sure to indicate in the cover letter that it is for the special issue. Submissions received by October 1, 2013, will be considered, but earlier submission is appreciated. When submitting your manuscript, please follow the EEPA manuscript submission guidelines. All manuscripts must be submitted electronically at http://mc.manuscriptcentral.com/eepa.

What Works Clearinghouse reviews one of my charter school papers



The What Works Clearinghouse has done a quick, positive review of one of my charter school papers. This arm of the US Department of Education gives its seal of approval to research studies that meet its high standards for research design. They have given the paper (tentatively) the highest rating possible: Meets WWC Standards, Without Reservation.