Applied Microeconomics
Research Interests
Economics of Education, Labor Economics, Applied Microeconomics, and Big Data Analytics
Articles
- Speaking on Data's Behalf: What Researchers Say and How Audiences Choose (with Jesse Chandler, Mariel Finucane, Jeffrey G. Terziev, and Alexandra M. Resch)
Abstract: Bayesian statistics have become popular in the social sciences, in part because they are thought to present more useful information than traditional frequentist statistics. Unfortunately, little is known about whether or how interpretations of frequentist and Bayesian results differ. In this paper, we test whether presenting Bayesian or frequentist results based on the same underlying data influences the decisions people make.
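The contrast at issue can be made concrete with a small simulation. The sketch below (hypothetical data and effect sizes, not the paper's materials) reports the same simulated treatment-control difference first as a frequentist estimate with a confidence interval and p-value, then as a Bayesian posterior probability that the effect is positive.

```python
# Hypothetical illustration: the same simulated data summarized two ways.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
treatment = rng.normal(0.3, 1.0, 500)  # invented outcome data
control = rng.normal(0.0, 1.0, 500)

diff = treatment.mean() - control.mean()
se = np.sqrt(treatment.var(ddof=1) / 500 + control.var(ddof=1) / 500)

# Frequentist presentation: point estimate, 95% confidence interval, p-value
_, p_value = stats.ttest_ind(treatment, control)
print(f"estimate {diff:.2f}, "
      f"95% CI [{diff - 1.96 * se:.2f}, {diff + 1.96 * se:.2f}], "
      f"p = {p_value:.3f}")

# Bayesian presentation (flat prior, normal approximation): the probability
# that the effect is positive, given the data
print(f"P(effect > 0 | data) = {1 - stats.norm.cdf(0, loc=diff, scale=se):.3f}")
```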
- What Works for Whom? A Bayesian Approach to Channeling Big Data Streams for Public Program Evaluation (with Mariel Finucane and Scott Cody)
Abstract: In the coming years, public programs will capture even more and richer data than they do now, including data from web-based tools used by participants in employment services, from tablet-based educational curricula, and from electronic health records for Medicaid beneficiaries. Program evaluators seeking to take full advantage of these data streams will require novel statistical methods, such as Bayesian approaches. A Bayesian approach to randomized program evaluations efficiently identifies what works for whom. The Bayesian adaptive design adapts to accumulating evidence: over the course of an evaluation, more study subjects are allocated to treatment arms that are more promising, given the specific subgroup from which each subject comes. We identify conditions under which there is more than a 90% chance that inference from the Bayesian adaptive design is superior to inference from a standard design, using less than one third of the sample size.
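A toy version of the adaptive allocation idea is sketched below: Beta-Bernoulli Thompson sampling run separately within each subgroup, so that enrollment drifts toward whichever arm is accumulating better outcomes for that subgroup. The subgroups, arms, and response rates are all invented; this shows the general technique, not the paper's implementation.

```python
# A minimal sketch of subgroup-specific Bayesian adaptive allocation.
import numpy as np

rng = np.random.default_rng(42)
true_rates = {"subgroup_A": [0.30, 0.50], "subgroup_B": [0.45, 0.35]}  # hypothetical
posteriors = {g: [[1, 1], [1, 1]] for g in true_rates}  # Beta(alpha, beta) per arm

for _ in range(2000):
    group = rng.choice(list(true_rates))
    # Thompson sampling: draw from each arm's posterior, assign to the best draw
    draws = [rng.beta(a, b) for a, b in posteriors[group]]
    arm = int(np.argmax(draws))
    outcome = rng.random() < true_rates[group][arm]  # simulated subject response
    posteriors[group][arm][0 if outcome else 1] += 1  # update alpha or beta

for group, arms in posteriors.items():
    allocations = [a + b - 2 for a, b in arms]  # subjects assigned to each arm
    print(group, "allocations:", allocations)
```

In this simulation, subgroup A's enrollment concentrates in arm 1 and subgroup B's in arm 0, which is the "what works for whom" behavior the abstract describes.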
- Understanding Types of Evidence: A Guide for Educators (with Greg Chojnacki, Alex Resch, Alma Vigil, and Steve Bates)
Abstract: This document was created as a resource for districts seeking to evaluate the effectiveness of the educational technologies they use. Although the examples focus on educational technologies, the concepts apply to other programmatic decisions undertaken by school districts or other organizations, such as adopting a curriculum or a case management approach.
- MOOCs as a Massive Research Laboratory: Opportunities and Challenges (with Paul Diver)
Abstract: This paper explores the opportunities and challenges that Massive Open Online Courses (MOOCs) are generating for research. A wide variety of topics related to pedagogical methods and student incentives lend themselves to research using MOOCs, and throughout we discuss lessons that can be drawn both from observational comparisons and from the opportunity to run experiments on randomly chosen groups of students. We start by discussing dropout rates and study how students who decide to drop out differ from those who continue in the course. We then discuss class forums and video lectures and how interaction with this material is correlated with achievement. After that, we explore the strong correlation between procrastination and achievement and its implications for course design. We also examine the role of certificates in MOOCs and how they can affect choices and outcomes. Finally, we examine the potential of linking data across courses and the opportunities and challenges of working with data that originate in surveys of MOOC participants. All of these research opportunities also pose Big Data challenges, which have to be addressed with parallel computing.
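On the closing point about parallel computing, the sketch below fans per-shard clickstream aggregation across worker processes; the record layout and event names are hypothetical stand-ins for real MOOC log data.

```python
# Toy sketch: MOOC logs are large enough that per-shard aggregation is
# naturally split across processes and merged afterward.
from multiprocessing import Pool
from collections import Counter

def count_events(log_lines):
    """Tally event types (e.g., video plays, forum posts) in one log shard."""
    counts = Counter()
    for line in log_lines:
        student_id, event_type = line.split(",")[:2]
        counts[event_type] += 1
    return counts

if __name__ == "__main__":
    shards = [["s1,video_play", "s2,forum_post"], ["s3,video_play"]]  # stand-in data
    with Pool(processes=2) as pool:
        partial_counts = pool.map(count_events, shards)
    total = sum(partial_counts, Counter())
    print(total)  # Counter({'video_play': 2, 'forum_post': 1})
```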
- The Productivity of Pell Grant Spending: Enrollment Versus Attainment (with Sarah Turner)
Abstract: Does the share of enrolled students receiving federal Pell grants correspond to a college's effectiveness in equipping students with economically meaningful postsecondary credentials? Focusing on four-year institutions, we propose an alternative measure that estimates the expenditures in Pell grants needed to produce one baccalaureate degree recipient at an institution ("Pell cost"). Our estimates of "Pell cost" tell a compelling story that contrasts sharply with public pronouncements made by organizations such as the New York Times and Education Trust, which have chastised institutions for low representation of students receiving Pell grants. There is wide variation in Pell cost among four-year colleges, and institutions with the same Pell shares vary considerably in their efficiency in using federal dollars. The measures presented in this paper are a "proof of concept" that relies on a number of assumptions necessitated by the limitations of available aggregate data. As such, they are intended to underscore the need for fuller use of detailed administrative data to assess how well colleges and universities are helping low-income students achieve economic prosperity.
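The proposed measure lends itself to a back-of-the-envelope calculation. The figures below are invented, but they show how two colleges receiving identical Pell dollars can differ sharply in Pell cost:

```python
# Hypothetical illustration of "Pell cost": Pell grant dollars expended per
# Pell-recipient BA produced. The paper builds its version from aggregate
# institutional data; none of these numbers are its estimates.
institutions = {
    # name: (annual Pell dollars received, Pell-recipient BA completions per year)
    "College A": (12_000_000, 900),
    "College B": (12_000_000, 400),
}

for name, (pell_dollars, pell_ba_degrees) in institutions.items():
    pell_cost = pell_dollars / pell_ba_degrees  # dollars per degree
    print(f"{name}: ${pell_cost:,.0f} in Pell spending per BA recipient")
```

Here the two colleges have identical Pell receipts, yet College B spends more than twice as much per degree, which is the efficiency-versus-representation distinction the abstract draws.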
- Never put off till tomorrow?
Abstract: This paper uses two randomized controlled trials to show that a very low-cost intervention can increase student achievement in a massive open online course (MOOC). First, an email (a directive nudge) sent to a randomly selected group of students encouraged them to procrastinate less. Students assigned to the treatment group were 16.85 percent more likely to complete the course. Second, another randomized controlled trial demonstrated that the effect on the completion rate cannot be attributed to the Hawthorne effect.
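The comparison behind a completion-rate result like this can be sketched as a standard two-proportion test. The counts below are made up, chosen only so the relative increase lands near the abstract's figure:

```python
# Hypothetical completion counts for nudged (treatment) vs. non-nudged (control)
# students; not the paper's data.
from statsmodels.stats.proportion import proportions_ztest

completed = [230, 200]   # completers in treatment, control
enrolled = [2000, 2030]  # group sizes

rate_t, rate_c = completed[0] / enrolled[0], completed[1] / enrolled[1]
print(f"relative increase in completion: {(rate_t / rate_c - 1) * 100:.2f}%")

z, p = proportions_ztest(completed, enrolled)  # two-sample test of proportions
print(f"z = {z:.2f}, p = {p:.3f}")
```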
- The effects of informational nudges on students' effort and performance: Lessons from a MOOC
Abstract: I evaluate the impact of giving students information that compares their performance to that of their classmates. I run a randomized experiment in the context of a Coursera massive open online course (MOOC), assigning students to one of two possible treatments. In the first, framed positively, students are told how many of their classmates they outperformed on a quiz. In the second, framed negatively, students are told how many of their classmates outperformed them. I find evidence that students respond to this informational nudge, and that framing matters. Students who are doing relatively poorly respond to the negative treatment by exerting more effort, and this effort translates, in some cases, into higher achievement.
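The two treatments frame the same rank information. A minimal sketch of how such messages could be generated (the wording is illustrative, not the experiment's):

```python
# Illustrative framing of identical rank information, ignoring ties.
def feedback_message(rank, class_size, positive_frame):
    """rank = 1 is the top score; the two frames report the same fact."""
    outperformed = class_size - rank  # classmates the student beat
    outperformed_by = rank - 1        # classmates who beat the student
    if positive_frame:
        return f"You outperformed {outperformed} of your {class_size - 1} classmates."
    return f"{outperformed_by} of your {class_size - 1} classmates outperformed you."

print(feedback_message(rank=120, class_size=500, positive_frame=True))
print(feedback_message(rank=120, class_size=500, positive_frame=False))
```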
- Massive Open Online Courses (MOOCs) as a Brick-and-Mortar Complement (with Louis A. Bloomfield and Sarah Turner)
Abstract: We show that a Massive Open Online Course (MOOC) can serve as a complement to a brick-and-mortar introductory physics course. In two different randomized controlled trials, we use small monetary incentives and informational nudges to encourage students to enroll in a MOOC. Using these experiments as instruments, we show that MOOC enrollment significantly improves performance in the brick-and-mortar classroom. For example, a student who chooses to enroll in the MOOC will score, on average, 1.64 standard deviations higher on the quiz with the most MOOC-related content.
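The identification strategy here is an encouragement design, which can be sketched as a Wald (instrumental variables) ratio on simulated data. The take-up rates and effect sizes below are invented, not the paper's estimates:

```python
# Encouragement-design sketch: random assignment to the nudge is the instrument,
# MOOC enrollment is the endogenous treatment, quiz score is the outcome.
import numpy as np

rng = np.random.default_rng(1)
n = 5000
encouraged = rng.integers(0, 2, n)                  # randomized instrument
enrolls = rng.random(n) < 0.10 + 0.25 * encouraged  # nudge raises enrollment
score = 0.5 * enrolls + rng.normal(0, 1, n)         # enrollment raises scores

first_stage = enrolls[encouraged == 1].mean() - enrolls[encouraged == 0].mean()
reduced_form = score[encouraged == 1].mean() - score[encouraged == 0].mean()
print(f"IV (Wald) estimate: {reduced_form / first_stage:.2f} score units")  # ~0.5
```

Dividing the reduced form by the first stage scales the intent-to-treat effect up to the effect of actually enrolling, for the students whose enrollment decision was moved by the nudge.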