Applying Behavioral Insights in Policy Analysis:
Recent Trends in the United States

Maithreyi Gopalan and Maureen A. Pirog


Abstract


An understanding of human nature and of the motivations that drive human behavior has always informed public policies. The use of behavioral research in public policy analysis, which flows largely from social and cognitive psychology, behavioral economics, and other behavioral sciences, came into sharp focus in the last decade. Since then, policy initiatives incorporating behavioral insights have flourished, and thousands of research articles have been published on the topic. Much of this research has focused on how governments at all levels can use behavioral insights to improve the delivery of governmental services and to improve the public's compliance with and use of those services. We review recent trends in policy initiatives that specifically incorporate behavioral insights in the United States and outline a framework for further integrating behavioral insights into the various stages of policy analysis and policy design.

Key Words

policy analysis, behavioral economics, nudges, policy evaluation


Introduction


An e-mail informed by behavioral insights, encouraging U.S. Department of Defense (DOD) service members to participate in a thrift savings plan, led to roughly 4,930 new enrollments and $1.3 million in savings in just one month (Social and Behavioral Sciences Team [SBST], 2015). A series of eight personalized text messages sent to low-income high school students, reminding them to complete required pre-matriculation tasks, led to a 5.7-percentage-point increase in college enrollment (Castleman & Page, 2015). While the above examples sound like marketing campaigns executed by multinational corporations, they were in fact initiatives pilot-tested by the U.S. federal government over the last few years, ushering in what just might be a new way for a government to engage with its citizens to improve social welfare.

Governments have always tried to improve social welfare by introducing policies that often entail bringing about a change in citizens' behavior. However, not until recently has the behavioral paradigm permeated public policy in a more pervasive way. We define the behavioral paradigm as the incorporation of findings from the behavioral sciences—such as social and cognitive psychology, and behavioral economics—into public policy. We observe such a trend both across countries and in international organizations. For example, in 2015, the World Bank published its flagship World Development Report titled “Mind, Society, and Behavior,” which aimed to advance a new framework for development policy based on a “fuller consideration of psychological and social influences.” Similarly, the European Commission (EC) recently released a report reviewing the use of behavioral insights in policymaking across several different countries in Europe (Lourenço, Ciriolo, Almeida, & Xavier, 2016). Simultaneously, several national governments have begun to integrate evidence-based research from the behavioral sciences into policymaking by establishing dedicated teams within the bureaucracy. The United Kingdom formed the Behavioural Insights Team (BIT) in 2010, a first-of-its-kind government entity dedicated to the application of insights from the behavioral sciences to public policy issues. Since then, countries such as Denmark, Sweden, Canada, Australia, and the United States have formed dedicated departments or “nudge units” to develop and apply such behavioral insights to policymaking.

In September 2015, President Obama issued an executive order titled “Using Behavioral Science Insights to Better Serve the American People,” and formally established the SBST. This team, established under the National Science and Technology Council in the United States, consists of behavioral scientists tasked with incorporating behavioral insights into federal policies and programs. In its first year, SBST executed several proof-of-concept projects. These projects ranged from text-messaging campaigns designed to increase college enrollment of low-income students to projects intended to increase retirement savings among federal employees. The growing influence of behavioral insights on public policy is thus undeniable. In this article, following Chetty (2015), we argue that the incorporation of behavioral factors should be seen as a “natural progression of (rather than a challenge to) neoclassical economic tools.”

This article makes three contributions. First, we summarize recent trends in U.S. policy initiatives that have begun to incorporate behavioral insights. We primarily focus our review on research on U.S. policy initiatives within two substantive policy fields—social policy and education policy—defined broadly, because they are at the forefront of testing and evaluating initiatives embedded with a behavioral component. We include research published in peer-reviewed academic journals, working papers, and reports from research think tanks and government agencies (at the federal, state, and local levels) between 2010 and 2015 in our review. Second, we organize the research into a conceptual framework by adapting the taxonomy used by the EC in its report (Lourenço et al., 2016). Our article reviews research on policy initiatives embedded with a behavioral component in the United States and should be viewed as a complement to the recent EC report. Finally, we identify emerging themes from these policy initiatives with the specific aim of providing insights for policy design as well as ex-ante and ex-post policy analyses.1


Figure 1. Classification of Policy Analysis of Initiatives Embedded with Behavioral Insights.

Our thematic review of research on policy initiatives that have incorporated insights from the behavioral sciences showcases the tremendous promise of this approach to public policy analysis and policy design. Furthermore, we outline a framework for incorporating such behavioral insights into all stages of policy analysis and effective policy design.


Conceptual Framework


In this article, we adapt the taxonomy recently used by the EC to classify research on policy initiatives into three broad categories: behaviorally tested, behaviorally informed, and behaviorally aligned. Figure 1 illustrates our framework for classifying the research on various behaviorally embedded policy initiatives in this review.

Behaviorally tested policy analysis includes the evaluation/analysis of policy initiatives that have been rigorously tested in smaller experiments before scale-up or large-scale implementation. For example, educational interventions aimed at improving students’ noncognitive outcomes, such as grit and growth mindset (the belief that intelligence is not innate but can be developed with deliberate practice), were tested in social psychology labs in universities before being scaled to several schools (Paunesku et al., 2015). Behaviorally informed policy analysis includes the evaluation/analysis of policy initiatives that have been designed based on previously available behavioral evidence; however, these initiatives are often not tested as rigorously as the behaviorally tested initiatives before implementation. For example, based on past evidence that reminder notices improved people’s adherence to payment schedules in domains such as savings and child support payments, SBST designed an e-mail campaign reminding federal student loan borrowers about their repayments. This policy initiative was not piloted before implementation, given the robust evidence on other similarly tested initiatives. Last, behaviorally aligned policy analysis includes the evaluation/analysis of policy initiatives that most often use traditional policy tools, such as taxes or subsidies, and do not explicitly rely on any existing behavioral evidence; however, the evaluation/analysis of these initiatives is aligned with a behavioral insight when they are analyzed after implementation. For example, Chetty, Friedman, and Saez (2013) observed that people in different states responded differently to variation in Earned Income Tax Credit (EITC) policies. They then used behavioral insights to explore those differences and found that differences in people’s knowledge about the EITC’s incentive structure explained such spatial variation. They showed how the neoclassical model, which typically assumes perfect information or knowledge about tax codes, needed to be updated to understand the complexities of human behavioral responses to even traditional policy interventions, such as tax credits.

The subtle distinctions we draw here in our classification of behaviorally embedded policy analysis can help unpack the similarities and differences in how various behavioral insights are embedded in policy analysis and help identify emerging themes for more effective policy analysis and design. To date, behavioral insights have largely been treated as synonymous with nudges. However, our framework recognizes that policy initiatives that incorporate behavioral insights go well beyond nudging (Bhargava & Loewenstein, 2015; Lourenço et al., 2016).

For example, Thaler and Sunstein (2008) define a nudge as “any aspect of the choice architecture that alters people’s behavior in a predictable way without forbidding any options or significantly changing their economic incentives” (p. 6). One of the nudge-type initiatives that they highlight relates to the use of automatic defaults to increase retirement savings. The underlying behavioral insight was that more people would enroll in a savings plan, and would likely save more, if the default option was to enroll everybody automatically and let people “opt out” if they wanted to, rather than enrolling people only when they “opt in.” Increases of as much as 50 percentage points in savings participation rates were observed in some studies (Madrian & Shea, 2001). These results have been replicated in many subsequent studies on savings plan participation (Beshears, Choi, Laibson, & Madrian, 2008) as well as in other domains with similar opt-in/opt-out structures, such as organ donation (Johnson & Goldstein, 2003).

However, nudges are just a subset of the policy initiatives that embed an underlying behavioral insight about people’s responses to a choice architecture. Behavioral insights often go beyond merely altering choice architecture. For example, monetary incentives tied to specific savings commitments that encourage savings among low-income individuals have been tested recently and show great promise (Jones & Mahajan, 2015). These interventions are designed to encourage savings, just like the automatic enrollment default nudges. However, they do so by providing a new policy tool—commitment devices—that goes beyond changing the choice architecture for potential savers. Similarly, reminder letters that prompt action, or that reframe a message in terms of gains rather than losses based on insights from loss aversion (Tversky & Kahneman, 1991) to encourage or discourage a behavior, are somewhat different from the nuanced design changes advocated specifically by nudges. We support the perspective of Lourenço et al. (2016) and review the literature using this broader view of behavioral insights as applied to policy analysis and research. Thus, we include the analysis/evaluation of nudge-type policy initiatives within our broader framework depending on how a nudge-type policy initiative was evaluated. As illustrated in Figure 1, we observe that most nudge-type policy initiatives were analyzed using behaviorally tested or behaviorally informed approaches.

A systematic review of research on all policy initiatives that have incorporated a behavioral insight in the United States is beyond the scope of this article; however, we employed multiple search strategies to provide a snapshot of such research. First, we identified appropriate studies in Google Scholar and Thomson Reuters Web of Science using relevant keyword searches.2 Close to 80,000 published articles, books, and book chapters emerged from that search within relevant Web of Science categories between 2010 and 2015. We also conducted manual searches in a variety of relevant peer-reviewed academic journals, working papers, and reports to sharpen the focus of our search to research on initiatives in the United States within the substantive fields of social and education policy, broadly defined. From these 50 short-listed studies, we identify key examples in each category (behaviorally tested, informed, aligned) and use them to illustrate the application of behavioral insights for policy analysis in the main text of this review. In the Appendix, we include Table A1 summarizing results from the more extensive list of short-listed studies to provide an easily accessible reference for scholarship in this burgeoning field of study.


Behaviorally Tested Policy Analysis


Behaviorally tested policy analysis includes the evaluation/analysis of policy initiatives that are piloted in labs or smaller field experiments before being scaled up. These examples showcase how insights from basic research in the behavioral sciences can be harnessed to inform policy.

Behavioral Insight: Provide Timely Information and Increase Saliency of Information

Evidence from the behavioral sciences shows that the provision of timely information, and an increase in the salience of the information presented, can improve the take-up of government services. The e-mail campaigns carried out by the SBST in collaboration with the DOD (mentioned in the introduction of this article) are great examples of behaviorally tested policy analyses. While past evidence exists on how timely informational messages about the benefits of a program sent to potential beneficiaries resulted in higher uptake of savings programs in other countries (Karlan, McConnell, Mullainathan, & Zinman, 2016), the SBST wanted to ensure that such informational messages would also work within the context of retirement savings for military service members in the United States. In one of the most successful pilot tests conducted by SBST, the DOD sent approximately 720,000 not-enrolled service members one of nine e-mails, with messages incorporating various behavioral insights—framing the decision to enroll as a “Yes/No” choice, making the benefits of enrollment more salient, clarifying the next steps needed to enroll in the plan, and/or providing information about the projected financial benefits of retirement security (SBST, 2015). The positive results ($1.3 million in increased savings in just a month) from the most effective e-mail message have prompted the DOD to scale up this intervention: the DOD will send periodic e-mails with behaviorally framed messages to service members going forward.

MDRC, a nonprofit education and social policy research organization, has led several behaviorally tested policy analyses in collaboration with the U.S. Department of Health and Human Services (U.S. DHHS). They have conducted about 15 randomized controlled trials that incorporated behavioral evidence on information provision as well as information salience. One of MDRC’s most successful policy initiatives, carried out in collaboration with the Texas Office of the Attorney General’s Child Support Division, provides an example. Several states, including Texas, allow child support payments to be lowered for an incarcerated parent during his/her prison term. However, the incarcerated parent has to apply for such an order modification. Many prisoners fail to apply for a child support modification and accrue very high child support arrears. To reduce the complexity of the process, MDRC sent a random set of incarcerated parents a postcard informing them about the order modification program, along with an application form prefilled with available personal information. Those who received the reminders and the simplified application were 11 percent more likely to apply for the child support modification option (Richburg-Hayes et al., 2014).

Interventions that provided timely information have also shown positive effects on students’ postsecondary outcomes. For example, a text messaging campaign that reminded students to complete tasks needed for matriculation (Castleman & Page, 2015) improved college enrollment, particularly among low-income students, by 5.7 percentage points (as compared to the control group of low-income students who did not receive text messages). Castleman and Page (2015) also evaluated the impact of a peer mentoring intervention in which college student mentors reached out to a randomized group of high school students via text messages to help them navigate their transition to college. Peer mentoring increased college enrollment by 4.5 percentage points. Similarly, randomized controlled trials conducted by researchers in collaboration with a nonprofit research think tank, ideas42, included a text messaging campaign that provided information about student loan borrowing costs to students in a community college. Students who received the text messages borrowed less than students who did not, $2,218 compared to $2,401 (ideas42, 2016). Online campaigns using smartphone apps that reminded students about the priority deadline for the Free Application for Federal Student Aid (FAFSA) at a large public university also had a significant impact on FAFSA completion rates as well as on receipt of financial aid awards (ideas42, 2016). Lavecchia, Liu, and Oreopoulos (2014) review several other educational interventions that have incorporated insights from the behavioral sciences, many of which are behaviorally tested.
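The treatment effects reported above are simple contrasts between randomized groups. As a rough illustration of how a behaviorally tested analysis might compute such an effect and its uncertainty, consider the following minimal sketch in Python; the function name, sample sizes, and enrollment counts are hypothetical inputs of our own, not data from the cited studies.

# Minimal sketch: percentage-point treatment effect from a two-arm RCT,
# with a normal-approximation 95% confidence interval. Illustrative only.
import math

def diff_in_proportions(success_t, n_t, success_c, n_c):
    p_t, p_c = success_t / n_t, success_c / n_c
    effect = p_t - p_c
    se = math.sqrt(p_t * (1 - p_t) / n_t + p_c * (1 - p_c) / n_c)
    return effect, (effect - 1.96 * se, effect + 1.96 * se)

# Hypothetical counts loosely patterned on a text-message enrollment trial
effect, ci = diff_in_proportions(success_t=670, n_t=1000, success_c=613, n_c=1000)
print(f"Effect: {effect:.1%} (95% CI: {ci[0]:.1%} to {ci[1]:.1%})")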

Interventions designed to provide effective information to aid decision making have also been used in other social policy domains. The U.S. Department of Agriculture Food and Nutrition Service (USDA FNS) pilot-tested four initiatives under the Supplemental Nutrition Assistance Program-Education (SNAP-Ed) project that provided information to low-income children and women about the benefits of healthy eating (USDA FNS, 2012). These initiatives, which included direct and online education (with substantial variations in program design and levels of exposure), were all rigorously evaluated using randomized controlled trials and/or quasi-experimental methods. However, only one of the four educational interventions showed a statistically significant positive effect on children’s eating behaviors and on caregivers’ purchase and offering of healthy food items such as fruits and vegetables.

Interventions that go beyond just providing information or increasing its salience have also been examined in several behaviorally tested policy analyses. A central insight from the behavioral sciences is that the framing of a message, and the affective response it invokes, matter as much as, if not more than, the specific contents of the message. We review a few behaviorally tested policy analyses that use such behavioral insights.

Behavioral Insight: Reframe the Information Content to Change the Affective Response of the Recipient

The DOD collaborated with the SBST to increase re-enrollment of service members in the thrift savings plan by pilot-testing another behaviorally informed e-mail campaign. The e-mail included three behavioral components—a personalized greeting that included the service member’s name, a message emphasizing the timing (a new year) as an opportunity for service members to make a renewed commitment to their finances, and clear information about the steps needed to complete the re-enrollment process. The redesigned e-mail embedded with behavioral insights led to a 5.2-percentage-point increase (from 23.5 to 28.7 percent) in re-enrollments in the first week (SBST, 2015). Based on this result, the DOD scaled up the effective behavioral messaging for encouraging re-enrollment.

In another similar initiative, MDRC, in collaboration with the Franklin County Child Support Enforcement Agency in Ohio, carried out a randomized controlled trial to increase overall collections of child support from noncustodial parents. The team designed reminder notices incorporating several behavioral insights, which were sent to a random group of noncustodial parents; the control group received no reminder notice. The number of noncustodial parents who made a payment when sent a reminder was statistically significantly larger (by 3 percentage points) than the number in the control group. However, the reminder notice did not result in a statistically significant increase in total collections per person (Richburg-Hayes et al., 2014). Several other interventions implemented by MDRC resulted in treatment effects that ranged between 2 and 3 percentage points relative to the control group.

In another behaviorally tested policy analysis, MDRC evaluated a policy initiative to increase the number of Temporary Assistance for Needy Families (TANF) recipients in Los Angeles County who signed up for services such as job search assistance, community service, employment, education, and/or other specialized services as part of their new welfare-to-work participation rules. MDRC used two different messaging strategies embedded with behavioral insights regarding loss aversion (Tversky & Kahneman, 1991). The message that highlighted the losses participants might experience by not attending a required activity initially increased the likelihood of participation by 4 percentage points, although these results were not sustained over time (Farrell, Smith, Reardon, & Obara, 2016).

Behavioral Insight: Reduce Complexity of Task

Low take-up of government support programs cannot always be improved by informational messages or by reframing those messages, especially in certain educational domains for vulnerable populations such as low-income students. This insight became apparent as a result of another clever behaviorally tested policy analysis. The FAFSA, which students have to fill out to access government aid and other need-based institutional aid, is infamous for its length and complexity. Research shows that such complexity acts as a significant barrier to many students accessing higher education and thereby exacerbates the enrollment gap between high- and low-income students (Dynarski & Scott-Clayton, 2006). Researchers, in collaboration with a tax-preparation software company, conducted a randomized field experiment that went beyond informational nudges to students and parents. Low-income families who were receiving tax-preparation help were offered personal assistance to complete the FAFSA. Because much of the information on tax forms duplicates that on the FAFSA, the treated participants received largely prepopulated FAFSA forms in addition to extra guidance for completing the rest of the application and automatic online submission. Treated participants were also provided with personalized aid estimates and comparisons with tuition costs for nearby colleges. The effects of the personal assistance were large. High school seniors whose parents received the treatment were 8 percentage points more likely to have completed two years of college (going from 28 to 36 percent) during the first three years following the experiment. Families who received aid information but no assistance with the FAFSA did not experience improved outcomes (Bettinger, Terry Long, Oreopoulos, & Sanbonmatsu, 2012).

Such heterogeneous, context-dependent effects of several educational interventions highlight the need for designing, implementing, and evaluating several proof-of-concept projects before scaling up a behaviorally inspired intervention. Social-psychological interventions that have begun to show tremendous promise in education in the United States recently follow such an approach (Yeager & Walton, 2011).

Behavioral Insight: Target Students’ Subjective Experiences and Beliefs

These interventions use subtle reading and writing exercises to influence students’ subjective experiences and beliefs in order to promote their educational and psychological well-being. Social psychologists have traditionally used lab experiments as a first step, in which the independent variables of interest are manipulated in a controlled experimental set-up, before implementing adapted experiments in the field. For example, Paunesku et al. (2015) show that interventions targeting students’ beliefs about their ability and motivation in school have significant effects on students’ academic outcomes. Such interventions were implemented at scale (1,500 students in 13 high schools in the United States) after rigorous testing in smaller lab and field experiments (Aronson, Fried, & Good, 2002; Blackwell, Trzesniewski, & Dweck, 2007). Social-psychological interventions that target students’ feelings of belonging on campus, especially during their transition to college, were tested in a variety of university settings—such as large public universities and smaller selective universities (Yeager et al., 2016)—before being scaled to a variety of colleges. These interventions were also tested in smaller lab settings (Walton & Cohen, 2011) before being pilot-tested in the field. The central behavioral insight of such social-psychological interventions is that by precisely targeting students’ subjective experiences in school, educators can positively affect students’ academic outcomes. Such precise psychological mechanisms, however, need to be drawn from basic laboratory research on attitude change and persuasion, and customized to different contexts in smaller proof-of-concept studies before large-scale implementation (Yeager & Walton, 2011).

Behavioral Insight: Invoke Social Norms to Promote Desired Behavior

Finally, the above examples may suggest that most pilot-tested policy initiatives had their intended effects; however, not all behaviorally tested policy analyses reveal such positive effects. We argue that reporting and understanding the causes of such null findings is extremely important for moving this research forward. For example, behavioral insights from social psychology have shown that individuals are very sensitive to social pressure and social norms. Smaller lab and field experiments on charitable giving have shown that social pressure—an individual’s fundamental dislike of saying “no”—can be used to increase an individual’s charitable giving (DellaVigna, List, & Malmendier, 2012). Similarly, invoking social norms, that is, describing an individual’s peers’ behavior to encourage or discourage one’s own behavior, revealed positive impacts in some settings; however, such insights might not translate to an alternative policy domain, as the following behaviorally tested policy analysis reveals.

The SBST, in collaboration with the U.S. DHHS’s Centers for Medicare and Medicaid Services (CMS), sent letters informed by behavioral insights to a randomized group of medical providers with high prescription rates of controlled substances. Based on past evidence that medical providers respond to normative messages providing feedback about their own vaccination rates compared to those of their peers (Kiefe et al., 2001), the SBST letter included details about the medical providers’ controlled-substance prescription rates compared to their peers’ rates. The control group did not receive any letter. In this case, no significant impact was seen in the subsequent year (Sacarny, Yokum, Finkelstein, & Agrawal, 2016). The null results spurred more analysis into the mechanisms of behavior change. Subsequent randomized controlled trials using letters with revised language based on more recent psychological evidence are currently under way.


Behaviorally Informed Policy Analysis


In contrast to behaviorally tested policy analyses, behaviorally informed policy analyses include the evaluation/analysis of initiatives that are not explicitly tested, either in labs or in field experiments, before being implemented at scale. These initiatives, in most cases, are based on past behavioral evidence that has been rigorously tested in another (often related) policy domain. For example, information provision strategies that have been proven to work in several pilot tests in related policy domains have often been implemented at scale without additional tests. We review a few behaviorally informed analyses of initiatives that were adapted to new policy issues based on robust past evidence.

Behavioral Insight: Provide Timely Information and Increase Saliency of Information

Providing reminders to encourage people to follow through on a desired course of action has shown huge promise in many domains, such as personal savings (Karlan et al., 2016) and child support payments (Richburg-Hayes et al., 2014). Based on such evidence, the SBST designed an e-mail campaign reminding federal student loan borrowers about their repayments. The SBST and the Department of Education’s Office of Federal Student Aid (FSA) sent a reminder e-mail to over 100,000 borrowers who had missed their first payment. The e-mail specified that the borrower had missed a payment and included additional salient information about the steps needed to complete payment, including an easily accessible link to the service provider’s payment system. This policy initiative was evaluated using a quasi-experimental pre-post design (i.e., overall payment rates were compared before and after the e-mail campaign). Although the SBST team did not use a more rigorous experimental approach, their pre-post comparison suggests that the reminder e-mail led to a 29.6 percent increase in the fraction of borrowers making a payment by the end of the first week after delivery of the e-mail; in absolute terms, overall payment rates went up by 0.8 percentage points (SBST, 2015).
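Note that the two figures reported above are consistent if the baseline payment rate was small. The following back-of-the-envelope sketch makes the arithmetic explicit; the 2.7 percent baseline is an assumption of ours chosen to reconcile the two reported numbers, not a figure from the SBST report, and, unlike the RCT sketch earlier, a pre-post comparison has no control group to rule out other causes of the change.

# Pre-post arithmetic: a small absolute change can be a large relative one.
rate_before = 0.027   # assumed baseline share of borrowers making a payment
rate_after = 0.035    # assumed share in the week after the reminder e-mail

pp_change = rate_after - rate_before            # 0.8 percentage points
relative_change = pp_change / rate_before       # ~29.6 percent
print(f"{pp_change:.1%} points; {relative_change:.1%} relative increase")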

Similarly, based on past evidence on how timely notices increased the use of tax credits, the SBST, in collaboration with the FSA, sent informational e-mails about income-driven repayment (IDR) plans to approximately three million student borrowers. The e-mail included information about the eligibility criteria for IDR plans, the benefits of IDR, the costs of not enrolling in IDR, and easily accessible online links to reach the service provider. To evaluate the impact of this initiative, SBST varied the timing of the e-mails—they were sent in two waves three weeks apart. The informational e-mail led to a substantial increase in applications for IDR plans within 20 days of the e-mail being sent: 4,327 recipients applied for IDR, as opposed to 982 IDR applications from the comparison group that had not yet received the informational e-mail. The SBST and the FSA were most concerned about the campaign’s impact on the approximately 800,000 seriously delinquent (90–180 days) student loan borrowers. Based on the positive results of these initiatives, the FSA has continued to collaborate with the SBST on initiatives designed to simplify the use of IDR. The ongoing efforts range from revising the IDR application form to innovative communication campaigns targeting struggling student borrowers based on scientific evidence from other domains, such as the take-up of tax credits following informational notices (SBST, 2015). However, it is important to note that rigorous behaviorally informed policy evaluations of initiatives such as those described above are the exception rather than the rule.

Behavioral Insight: Increase Salience of Information to Mitigate Effects of Limited Attention

The U.S. federal government has also implemented certain regulations in the financial sector that have been evaluated in behaviorally informed policy analyses. For example, the Credit Card Accountability Responsibility and Disclosure (CARD) Act of 2009 mandated changes to credit card statements to protect consumers from financial institutions that had previously taken advantage of consumers’ limited attention by obfuscating the true costs of certain financial instruments. These legislated changes were based on past research on people’s cognitive biases, such as limited attention. The new law required financial institutions to disclose the length of time it would take to pay off a credit card balance in full if borrowers paid only the minimum monthly amount. It thus increased the salience of fees and other costs to consumers in order to mitigate the effects of limited attention. Additionally, credit card companies had to disclose the minimum monthly payment needed to pay off the balance in three years. Agarwal, Chomsisengphet, Mahoney, and Stroebel (2015) evaluate the effectiveness of the CARD Act using a quasi-experimental research design and find that the information disclosure requirements had only a negligible (but statistically significant) effect on borrowers’ repayment behavior. The share of account holders who paid at a rate that would repay the balance within three years increased by less than a percentage point (0.4 percentage points on a base of 5.3 percent). However, other evaluations that tested the mechanism of information disclosure reducing consumer indebtedness, in the context of payday loans, lend some support to such regulations. A Jameel Poverty Action Lab-led study by Bertrand and Morse (2011) showed that providing in-depth information about the cost of payday loans to a randomized group of low-income consumers significantly reduced their borrowing frequency and overall borrowing amounts.
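Quasi-experimental evaluations of this kind typically contrast covered accounts with a comparison group not subject to the regulation, before and after it takes effect. The sketch below shows the bare difference-in-differences arithmetic; the rates are invented for illustration (chosen to echo the 0.4-point effect on a 5.3 percent base reported above), and the use of exempt accounts as a comparison group is our simplified reading of the study's design.

# Difference-in-differences with illustrative payment rates.
rates = {
    ("covered", "pre"): 0.053, ("covered", "post"): 0.057,   # cards subject to the Act
    ("exempt", "pre"): 0.052, ("exempt", "post"): 0.052,     # comparison cards
}
did = ((rates[("covered", "post")] - rates[("covered", "pre")])
       - (rates[("exempt", "post")] - rates[("exempt", "pre")]))
print(f"Diff-in-diff estimate: {did:.1%} points")  # ~0.4 percentage points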

Behavioral Insight: Reduce Choice Overload

Similarly, psychological research has shown that many individuals suffer from choice overload—the inability to meaningfully compare choices when too many are provided at once. For example, in a series of classic experiments, Iyengar and Lepper (2000) showed that an extensive-choice context not only increased the burden on mental resources and the time and energy required to make a choice, but also reduced participants’ overall satisfaction. In some cases, the overwhelming number of choices even paralyzed some individuals, preventing them from making any decision at all. One regulatory approach undertaken by the U.S. government to facilitate better decision making, informed by research on choice overload, is a mandate to standardize product attributes. For example, the federal government mandated that Medicare supplemental insurance plans (Medigap) for senior citizens conform to one of 10 standardized plan options (Medicare Improvements for Patients and Providers Act of 2008). These plans, denoted with letters of the alphabet (as delineated by the CMS), are standardized across 47 states. For example, the level of coverage (or benefits) under Plan A in Florida is the same as that of Plan A in Indiana.

Neither the number of product choices nor the outcomes of senior citizens who chose these standardized products were explicitly tested using lab or field experiments; however, such behaviorally informed regulatory approaches seem to provide a promising avenue for incorporating insights based on evidence from basic psychological (and other behavioral sciences) research. Still, we highlight the need to evaluate such initiatives using behaviorally informed empirical analysis to understand both the impact of the initiative on the outcome of interest and the mediating mechanisms that the underlying behavioral insights presume. Such an evaluation is particularly pertinent in light of a recent review of the literature on the impacts of laws and regulations that require public information disclosure (Loewenstein, Sunstein, & Golman, 2014). That review finds that while information disclosure, in many cases, does not affect the behavior of the recipients of the information, it seems to significantly affect the behavior of the providers of the information.


Behaviorally Aligned Policy Analysis


Last, behaviorally aligned policy analysis includes the evaluation/analysis of policy initiatives that most often use traditional policy tools, such as taxes or subsidies, and do not explicitly rely on any existing behavioral evidence; however, the evaluation/analysis of these initiatives is aligned with a behavioral insight when they are analyzed after implementation.

Behavioral Insight: Differential Awareness of Programmatic Components Affects Take-Up

For example, Chetty et al. (2013) show that the EITC, the largest anti-poverty program in the United States, affects labor supply decisions of people differentially based on their knowledge of the tax code. They demonstrate how the neoclassical model of labor supply that typically assumes perfect information about tax codes needs to be updated to incorporate the effects of imperfect information on traditional policy instruments such as tax credits.

Past research on the EITC had demonstrated the effect of the tax credit in increasing the labor force participation of low-income workers. However, the evidence on the intensive margin, that is, hours of work and earnings (conditional on labor force participation), was mixed (Eissa & Hoynes, 2006). Chetty et al. (2013) exploited the variation in tax-credit top-up levels across states to identify the effects of the EITC on wage earnings using detailed tax-return data that were not previously available. One of the researchers’ primary insights was that claimants differed in their responses to the EITC (measured using the distribution of claimants’ reported incomes right around the EITC refund-maximizing amounts) both within and across states. The researchers hypothesized that the differential response might be driven by differences in people’s knowledge about the EITC’s incentive structure and used empirical techniques to unpack those differences. This insight helped explain the spatial variation in responses to the EITC. The researchers also help explain how information diffusion might drive the differential responses across the intensive and extensive margins uncovered by earlier research.
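The key measurement idea, often called “bunching,” is that claimants who understand the credit’s incentive structure will disproportionately report incomes near the refund-maximizing level. The simulation below is a deliberately stylized sketch of that diagnostic; the kink location, income distribution, and share of informed claimants are all invented parameters, not estimates from Chetty et al. (2013).

# Stylized "bunching" diagnostic: excess mass of reported incomes near the
# refund-maximizing kink proxies for knowledge of the EITC schedule.
import numpy as np

rng = np.random.default_rng(0)
kink = 15_000                                   # hypothetical refund-maximizing income
incomes = rng.normal(20_000, 8_000, 100_000)    # baseline reported incomes

informed = rng.random(100_000) < 0.10           # assume 10% know the schedule
incomes[informed] = rng.normal(kink, 500, informed.sum())  # they bunch at the kink

near_kink = np.abs(incomes - kink) < 1_000
print(f"Share reporting within $1,000 of the kink: {near_kink.mean():.1%}")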

Behavioral Insight: Reframing the Timing and Mode of Delivery of Programmatic Components Affects Take-Up

Another example of a behaviorally aligned policy analysis is provided by Richards and Sindelar (2013). They evaluate existing proposals to encourage healthy food choices in the Supplemental Nutrition Assistance Program (SNAP), one of the largest food assistance programs in the United States, using behavioral principles. For example, they evaluate the proposal to subsidize purchases of healthy foods in SNAP through a behavioral lens. They recommend changes to the timing and mode of delivery of the subsidy that would increase its salience, thereby promoting healthy eating behavior. They also make innovative recommendations for changes to SNAP to promote healthy eating, including the use of default options to encourage healthy food choices and commitment devices that can be harnessed in addition to traditional price subsidies. We classify the above analysis as behaviorally aligned because the proposals evaluated by the authors have not been implemented to date. However, the ex-ante evaluation of the proposed reforms to SNAP using behavioral insights is an excellent example of how behavioral insights can drive policy reforms. We also hope that such reforms are pilot-tested and evaluated using behaviorally tested/informed policy analyses in the future.

Behavioral Insight: Can Default Options Result in Crowd-Out Effects?

Another excellent example of behaviorally aligned policy analysis is the use of innovative empirical strategies and “big data” to explicate the behavioral lever underlying a policy initiative and/or the mechanisms of behavior change. For example, a primary concern regarding initiatives that boost savings using automatic enrollment options (described earlier) has been that the increases one observes in savings produced by automatic enrollment might in fact be offset by reductions in savings (or increases in borrowing) in other accounts. Chetty, Friedman, Leth-Petersen, Nielsen, and Olsen (2014) explore this hypothesis empirically. They study the impacts of defaults on individuals’ total savings by exploiting variation in employers’ contributions to retirement savings accounts (for all practical purposes, similar to an automatic enrollment default option) using rich panel data from Denmark. They analyze the savings behavior of employees who switch jobs and experience variation in employer contributions to their retirement savings accounts and find limited evidence of crowd-out effects. Specifically, they find that employees who move to a firm with a more generous pension contribution (at least 3 percentage points higher than the prior employer’s) on average reduce their own savings contribution by just 0.56 percentage points, with no change in their savings in any other taxable account.

In the same study, Chetty et al. (2014) also compare the effectiveness of tax subsidies for pension contributions with the effects of automatic enrollment defaults into employer pension programs. The automated enrollment into pension savings has huge impacts relative to tax subsidies. Essentially, a dollar of government expenditure on tax subsidies for pensions increases total savings by only 1 cent, whereas the effect of an automatic enrollment default into pension savings is approximately 80 cents. The authors estimate that approximately 85 percent of individuals are “passive savers” who are unresponsive to subsidies (and also unresponsive to changes in employer contribution amounts); 15 percent of individuals are “active savers” who respond to tax subsidies and reallocate their savings to other tax-saving instruments. Thus, automated defaults appear to work better than tax subsidies that require action on the part of savers. This study is consistent with others that lend increasing confidence to the notion that automated defaults embedded in policy instruments work well. While the above study was carried out using data from Denmark, we include this example in our review to showcase the importance of such behaviorally aligned policy analyses.
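The passive/active decomposition also makes the headline numbers easy to reconstruct. In the sketch below, the 85/15 split and the 1-cent and 80-cent totals come from the study as summarized above, but the per-group responses are hypothetical values we chose so that the weighted averages come out to those totals.

# Back-of-the-envelope decomposition of savings responses by saver type.
passive_share, active_share = 0.85, 0.15

# Per-dollar responses below are assumed for illustration, not estimated.
subsidy_effect = passive_share * 0.00 + active_share * 0.07     # ~= $0.01 new saving
default_effect = passive_share * 1.00 + active_share * (-0.35)  # ~= $0.80 new saving

print(f"New saving per $1 of tax subsidy:          ${subsidy_effect:.2f}")
print(f"New saving per $1 of default contribution: ${default_effect:.2f}")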

Behavioral Insight: Reduce Complexity of Task

Finally, behavioral insights revealing that individuals suffer from choice overload can inform behaviorally aligned policy analyses as well. Specifically, the impact of poverty on cognitive capacity (Mani, Mullainathan, Shafir, & Zhao, 2013) raises several concerns about the design and impact of many policy initiatives. For example, Bhargava, Loewenstein, and Sydnor (2015) analyze the health plan choices of employees at a large U.S. firm and find that, while everyone struggles when choosing from a complex choice set, low-income households struggle the most. As a consequence, when the government offers numerous and complex options to citizens, the result can be lower average welfare as well as adverse distributional consequences. From a policy perspective, this is salient when offering health insurance plans under health care exchanges, as these plans vary across parameters such as deductibles, copay rates, and out-of-pocket maximums.


Figure 2. Framework for Applying Behavioral Insights in Policy Analysis and Policy Design.


Lessons for Policy Analysis and Design


As the above review demonstrates, behavioral insights have informed public policy and continue to do so across a growing number of policy domains. While there is overwhelming agreement about the usefulness of incorporating behavioral insights into policy analysis, the exact approach for doing so remains unclear (Congdon, 2013). In this section, we provide some thoughts on how to apply behavioral insights to the various stages of policy analysis—ex-ante policy analysis, ex-post policy analysis, and future policy design. We distinguish between ex-post policy analysis, which occurs upon or after a policy’s implementation, and ex-ante policy analysis, which occurs before the policy is implemented, to better delineate the specific insights that can be gained from incorporating the behavioral perspective in each of those domains.

As Figure 2 illustrates, the thematic organization of the analyses of policy initiatives embedded with a behavioral element (i.e., behaviorally tested, behaviorally informed, and behaviorally aligned initiatives) can be used to inform the various stages of policy analysis and policy design. First, we illustrate how lessons from behaviorally informed and behaviorally tested policy analysis can inform ex-ante policy analysis. For example, evidence from the behavioral sciences can be used to enhance our understanding of the underlying policy problem that a policy initiative is trying to solve. Specifically, we describe how a diagnosis of the policy problem can reveal how people’s psychological impediments may interact with traditionally defined policy problems such as market failures and the government delivery of services.

Second, we describe how lessons from the behaviorally tested policy analysis can be effectively applied to future policy design. We show how behavioral insights can provide a wider repertoire of policy tools at a government’s disposal to intervene and influence behavior change. Furthermore, behavioral testing of policy initiatives can also be used to study and understand the mechanisms of behavior change that can act as building blocks for designing more effective future policies. We also highlight how the government can combine traditional policy tools with newer behaviorally enhanced policy tools to intervene cost-effectively.

Third, we discuss how lessons from behaviorally aligned policy analysis can inform ex-post policy analysis. We show how one can better evaluate the impact of existing policies (that may use traditional policy tools such as taxes/subsidies) if we incorporate the rich evidence available from behavioral sciences.

Finally, we show how a fuller incorporation of behavioral insights into the various stages of policy analysis entails viewing the policy process itself as cyclical. Lessons from ex-post policy analysis should indeed inform ex-ante policy analysis and future policy design. We describe such examples in the last section, where past behaviorally aligned ex-post policy analyses have provided new insights that have driven the implementation and evaluation of new policy initiatives.

Lessons for Ex-Ante Policy Analysis

One of the primary steps in any ex-ante policy analysis is to understand the underlying policy problem that a policy initiative is intended to solve, that is, the rationale for government intervention in the first place. Traditional ex-ante policy analysis, based primarily within the neoclassical welfare economics framework, has focused broadly on two categories of situations that demand government intervention—efficiency and equity. Inefficiency within the neoclassical framework has been explored primarily in the form of various market failures, while equity demands that the government intervene to alleviate poverty and/or redistribute resources within a society, even in situations where doing so does not necessarily promote efficiency.

The incorporation of behavioral insights into ex-ante policy analysis demands that we reframe how we think about the underlying policy problem that a policy initiative is trying to solve (Congdon, 2013). The primary taxonomy of market failures—public goods, externalities, natural monopolies (or other inefficient market structures), and information asymmetry—can be enhanced with another category that includes people’s psychological impediments, such as imperfect optimization, bounded self-control, and nonstandard preferences3 (Congdon, Kling, & Mullainathan, 2011, p. 20; Madrian, 2014). Psychological impediments can also be explored as an underlying factor that may exacerbate or attenuate any of the existing categories of market failure or the effective delivery of government programs (Congdon et al., 2011). We believe that a diagnosis of the policy problem that analyzes the interaction between psychological impediments, the existing sources of market failure, and the delivery of governmental programs can be used to integrate behavioral insights more comprehensively into policy analysis. Using a case study of a policy initiative that we reviewed earlier, we illustrate how a behavioral diagnosis of the underlying policy problem can be incorporated into an ex-ante policy analysis.

Nonprofit research organizations such as ideas42 and MDRC have developed systematic approaches to diagnosing a policy problem in the delivery of government programs by applying behavioral principles. This approach, referred to as “behavioral diagnosis and design” (Richburg-Hayes et al., 2014) or “behavioral mapping” (Hall, Galvez, & Sederbaum, 2014), includes a series of steps that aims to methodically diagnose the psychological impediments that cause programmatic outcomes to deviate from a policy’s intended effects. Before MDRC sent out reminder postcards to incarcerated parents in Texas, they carried out a diagnosis of the decision-making environment an incarcerated parent faced in the Texas child support order modification context. Researchers identified several psychological impediments in that environment. For example, many parents avoided even opening the letter notifying them of their eligibility for order modification because of the negative emotional response they had to any communication from the child support office. Neither the content of the letter nor the application process for child support order modification was simple, adding to the cognitive load the parents already faced. The interventions MDRC designed, including the reminder postcards and the effective reframing of the order modification message, addressed each of these psychological impediments.

Such a diagnosis should also be carried out to understand how psychological impediments may interact with market failures. For example, education provision in the United States is a classic example of a public good (with some features of a positive externality). Traditional ex-ante policy analysis advocates government intervention in education provision to mitigate the potential market failure that might result in the underconsumption of education. College education in the United States is thus heavily subsidized by the government. However, college enrollment and completion rates, especially for racial minority and first-generation students, are lower than those of white and continuing-generation students (Ifill, Radford, Cataldi, Wilson, & Hill, 2016). Evidence from social psychology has shown how students’ sense of belonging on campus, particularly for racial minority and first-generation students, can affect their engagement and performance in college (Walton & Cohen, 2011). Such impediments to the psychological processes that affect students’ persistence and performance in college further exacerbate the market failure of underconsumption of education. Social-psychological interventions that target and promote students’ sense of belonging are being behaviorally tested across college campuses in the United States (collegetransitioncollaborative.org) to mitigate such impediments. Such policy initiatives can often complement the traditional policy tools that address the public good/positive externality nature of education provision through subsidized loans.

Lessons for Policy Design

Once the policy problem has been diagnosed, behavioral insights can (and should) be applied to effective policy design. First, behavioral insights can be used to enhance the policy toolkit—default options, e-mail reminders, and text campaigns are examples of new policy tools ushered in by evidence from behavioral sciences research. Traditional policy design has predominantly focused on the use of price incentives and regulations to change behavior—provide subsidies or tax credits to encourage a particular behavior, tax those behaviors that need to be curtailed, or regulate markets to encourage/discourage behaviors. While price incentives are incredibly powerful in changing behavior in many instances, there are limits to the impact price incentives alone can have. The size of incentives, the structure of incentives, the framing of the incentive message, and the salience of the message all strongly influence behavior change (Kamenica, 2012). Fryer, Levitt, List, and Sadoff (2012) analyze the impact of reframing a teacher incentive program using principles of loss aversion in nine schools in Chicago. They find that reframing the incentive structure around loss aversion (i.e., teachers are paid in advance and asked to give back the money if their students do not improve) had significant effects on students’ math test scores—students whose teachers received the reframed incentive structure showed gains of between 0.2 and 0.4 standard deviations in math test scores.

Second, behavioral insights can (and should) be applied to tweak traditional policy tools such as taxes and subsidies. For example, text/e-mail reminders can enhance the effectiveness of existing policy tools such as the subsidized federal student loans accessed through the FAFSA.

Third, well-designed pilot studies can be used as a first stage before rolling out policy initiatives across a state or country. The UK BIT uses a “test, learn, adapt” approach for policy design (Lourenço et al., 2016). The approach is based on three key principles: “test,” that is, identify various policy interventions that can be evaluated and analyzed for their effectiveness; “learn” by measuring the results and identifying “what works”; and “adapt” by using the findings to adjust and design future policy interventions accordingly. We propose that an additional consideration, “how” it works, is also crucial in the design of new policies. The mediating mechanisms through which behavior change can be influenced, as well as the contexts with which these mediating mechanisms interact and covary, demand as much attention as “what works” to move this research forward.

The main advantage of testing an intervention in a controlled experiment (in a lab or in smaller field experiments) is that the underlying theory and mechanism of change can be better understood (Mortensen & Cialdini, 2010). Recently, economists have also argued for the use of such mechanism experiments as a precursor to larger policy evaluations using randomized controlled trials (Ludwig, Kling, & Mullainathan, 2011). They argue that if a hypothesized causal mechanism is not effective in a controlled experimental set-up, a policy evaluation using a larger randomized controlled trial might be wasteful. Conversely, if the causal mechanism proves effective, especially under multiple contexts, a policy evaluation using a well-designed randomized controlled trial targeting the causal mechanism of change is not only more cost-effective but also provides more insight into the efficacy of the intervention. Meta-analyses of experimental studies and policy evaluations must also strive to understand and review the candidate mediating mechanisms along with providing estimates of effect sizes across multiple trials and contexts, as was the case with the growth mindset studies (Burnette, Boyle, Vanepps, Pollack, & Finkel, 2013).


Figure 3. Stylized Example of a Behaviorally Enhanced Policy Analysis.
Note: The above figure is adapted from Figure 9.1 in Weimer and Vining (2015, p. 205).

Lessons for Ex-Post Policy Analysis

Behavioral insights often enhance the evaluation of existing policy impacts when incorporated meaningfully in an ex-post policy analysis. As discussed earlier, Chetty et al. (2013) incorporate several behavioral insights to estimate the precise impacts of the EITC on both the number of hours worked and wage earnings of individuals across states.

A final lesson from the behavioral sciences-inspired approach to public policy is an appreciation for the cyclical nature of the policy process. For example, lessons from behaviorally tested policy analyses—both positive and null findings—have subsequently been used to design other policy initiatives. Chetty et al. (2013) found that people’s differential knowledge about the EITC across states resulted in differential EITC take-up rates. That insight provided the input for other pilot tests that directly evaluated the mediating mechanism—knowledge about tax codes (Bhargava & Manoli, 2015; Chetty & Saez, 2013). Chetty and Saez (2013) conducted an experiment with 43,000 EITC clients of the tax-preparation company H&R Block. Half of the tax filers were randomly selected to receive information from their tax preparer about the marginal incentive structure of the EITC, while the other half did not. They found that this informational intervention had no effect on individuals’ earnings in the subsequent year on average. However, in another experiment, Bhargava and Manoli (2015) mailed simplified information about the EITC to 35,000 individuals who were eligible for the credit but had not filed the tax forms needed to claim it. They found that such an informational mailing significantly raised EITC filing rates.

A potential reason why providing information about EITC eligibility increased take-up rates, while information about the EITC tax code structure did not appear to affect earnings, could be another behavioral lever: cognitive overload (Chetty, 2015). Individuals may pay more attention to information that they have unclaimed benefits, that is, money on the table, than to information that their marginal wage differs from what they would mistakenly calculate it to be in the absence of the EITC. The latter requires additional mental processing and intermediate steps before any immediate gain is realized. Behavioral insights that can uncover such heterogeneous treatment effects of the EITC deserve much more attention and are fruitful lines of future research.

In sum, we summarize the steps involved in a behaviorally enhanced policy analysis using a stylized example. Figure 1, adapted from Weimer and Vining (2015),4 illustrates how a diagnosis of the policy problem that identifies psychological impediments can enrich the standard neoclassical framework of linking the policy problem (i.e., the rationale for government intervention) to various policy alternatives.

As Figure 1 illustrates, the first step in any policy analysis within the neoclassical framework is to examine the presence of, or potential for, a market failure that provides a rationale for government intervention. Our example starts from the same premise. However, if evidence or theory suggests a potential market failure, we further recommend that the policy analyst carry out a diagnosis to identify the psychological impediments that might exacerbate or attenuate the market failure. Such a diagnosis can inform the design and implementation of initiatives that can then be evaluated using behaviorally informed and tested policy analyses. In the absence of any source of psychological impediment, we recommend traditional policy tools such as taxes, subsidies, and/or regulations. Even then, we suggest that the policy analyst explore an ex-post policy analysis of these traditional tools using relevant behavioral levers, as evidenced by the behaviorally aligned policy analyses we reviewed earlier.

Similarly, even in the absence of a market failure, government intervention can be justified on the grounds of equity, especially in the context of antipoverty programs. A diagnosis of the decision-making environment faced by the targets of such policies will help shed light on the policy alternatives that might be effective. The take-up rates of such interventions and their effects on the outcomes of interest can (and should) be evaluated using the behavioral paradigm, as shown in the stylized example of a behaviorally enhanced policy analysis.
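The branching logic just described can be written down directly. The function below is our own schematic rendering of that flow in Python, not a reproduction of Weimer and Vining's figure; the argument names and returned labels are assumptions chosen for readability.

```python
def recommend_analysis(market_failure: bool,
                       equity_rationale: bool,
                       psychological_impediment: bool) -> str:
    """Schematic decision flow for a behaviorally enhanced policy analysis."""
    if market_failure:
        if psychological_impediment:
            # Diagnose the impediment, then design, pilot-test, and
            # evaluate behaviorally informed alternatives.
            return "behaviorally informed design + tested policy analysis"
        # No impediment identified: use traditional instruments, but still
        # evaluate them ex post with relevant behavioral levers.
        return "taxes/subsidies/regulation + behavioral ex-post analysis"
    if equity_rationale:
        # No market failure, but an equity case for intervention: diagnose
        # the targets' decision-making environment, then evaluate take-up
        # and outcomes within the behavioral paradigm.
        return "diagnose decision environment + behavioral evaluation of take-up"
    return "no rationale for government intervention"

print(recommend_analysis(market_failure=True, equity_rationale=False,
                         psychological_impediment=True))
```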


Criticisms


The proliferation of behavioral sciences-inspired research for designing public policies is not without its share of critics. Some contend that an over-reliance on low-cost interventions has distracted governments from implementing more ambitious policies that rely on traditional policy instruments. They argue that while traditional instruments such as taxes and/or subsidies may be costlier to implement and may find it harder to win bipartisan support, they have a much larger potential to change behavior. For example, in a provocative op-ed titled "Economics Behaving Badly," George Loewenstein and Peter Ubel (2010) argued that informational e-mails encouraging lower energy use have only a modest impact on driving and household energy consumption. A traditional price-based solution, such as a well-calibrated carbon tax that internalizes the externality, could have a much larger impact, but such an effort lacks political will. Similarly, studies show that information-provision interventions such as laws mandating calorie labeling on restaurant menus have had only a modest impact on the provision and selection of healthy food options (Elbel, Kersh, Brescoll, & Dixon, 2009; Namba, Auchincloss, Leonberg, & Wootan, 2013), thereby doing little to combat a public health problem such as obesity. Instead, Loewenstein and Ubel (2010) argue that the lack of political will to end the corn subsidies that lower the cost of high-fructose corn syrup and keep unhealthy processed foods cheap continues to exacerbate the obesity epidemic.

The small effect sizes of many behaviorally enhanced policy interventions have been endlessly debated by both vehement critics and ardent defenders. Critics argue that some of the problems these initiatives try to solve are systemic and thus cannot be solved through small, simple tweaks to policy design features (Bhargava & Loewenstein, 2015). Defenders maintain that such small interventions, each producing marginal changes, can add up to more than the sum of their parts. Indeed, Professor Daniel Kahneman, the psychologist whom many consider a pioneer of this line of applied behavioral science, has repeatedly communicated his optimism about incorporating behavioral insights into the public policy process, emphasizing that these initiatives can achieve "medium-sized gains by nanosized investments" (Kahneman, 2013). By analyzing and explicating the relative costs and benefits of behaviorally embedded policy initiatives, we hope public policy researchers can move past such polarized reactions. We believe that behavioral insights complement, rather than replace, traditional policy tools such as taxes and subsidies.
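One way to discipline this debate is to compare interventions on cost per unit of behavior change rather than on effect size alone. The sketch below does so with invented figures; neither the costs nor the effects come from the studies cited in this article.

```python
def cost_per_takeup(cost: float, population: int, effect_pp: float) -> float:
    """Cost per additional person changing behavior, given total program
    cost, the treated population, and an effect in percentage points."""
    additional_takeup = population * effect_pp / 100
    return cost / additional_takeup

# Hypothetical programs for illustration only.
nudge = cost_per_takeup(cost=50_000, population=100_000, effect_pp=2.0)
subsidy = cost_per_takeup(cost=5_000_000, population=100_000, effect_pp=10.0)
print(f"Nudge:   ${nudge:,.0f} per additional take-up")    # $25
print(f"Subsidy: ${subsidy:,.0f} per additional take-up")  # $500
```

On these invented numbers, the nudge moves each additional person at one twentieth of the subsidy's cost, yet the subsidy moves five times as many people; both facts matter, which is why we frame the two as complements rather than substitutes.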

Finally, some critics have questioned the ethics of nudge-type policy initiatives. The criticism rests on the claim that such initiatives allow policymakers to manipulate citizens' choices by exploiting their automatic psychological processes (Bovens, 2009). While Hansen and Jespersen (2013) contend that not all nudges rely on automatic psychological processes, Thaler and Sunstein (2008) have argued that most nudges are liberty-preserving because they do not alter the overall availability of choices to an individual. Hansen and Jespersen (2013) also emphasize the importance of distinguishing between transparent and nontransparent nudges when evaluating the ethics of behaviorally enhanced policy initiatives. Most recently, Steffel, Williams, and Pogacar (2016) show how most nudges can be made completely transparent without reducing their benefits. We encourage more research on the ethics and ultimate welfare implications of applying the behavioral paradigm to policymaking. However, such research should be conducted within the larger context of evaluating the relative trade-offs and benefit-cost comparisons of alternative policy initiatives.


Conclusion


Insights from the behavioral sciences hold tremendous promise for applied policy analytic work. In less than a decade since the publication of Nudge (Thaler & Sunstein, 2008), viewed by many as the beginning of behavioral insights permeating the policy process, several key advances have been made in incorporating such insights into policy. However, there is little consensus about how these insights can be fully integrated into policy analysis. Our review aims to contribute toward building a coherent approach for applying behavioral insights across the various stages of policy analysis and policy design.

First, behavioral insights should be integrated more thoroughly into ex-ante policy analysis by incorporating a diagnosis of the underlying policy problem, be it a market failure or an antipoverty/equity concern that demands government intervention. Specifically, people's psychological impediments, which might exacerbate or attenuate the policy problem(s), need to be explored before policy tools are designed and implemented to resolve them.

Second, behavioral insights should be incorporated into policy design and implementation. Automatic defaults, reminder e-mails, and text campaigns that enhance take-up of governmental programs, as well as social psychological interventions that help smooth psychological frictions for vulnerable populations such as students and low-income households, expand the policy toolkit at the government's disposal. Such behaviorally informed policy tools can also be used in combination with existing tools such as taxes and subsidies to enhance the intended consequences of government intervention. The behavioral revolution in public policy encourages not only rigorous pilot-testing of policy initiatives using randomized control trials, to increase the internal validity of causal impacts, but also efforts to understand the mechanisms of change underlying those initiatives.

Last, behavioral insights enhance ex-post policy analysis by providing better models of people's behavioral responses to policy changes. By incorporating various behavioral levers into the empirical models that evaluate policy impacts, we gain a better understanding of the expected and unexpected consequences of policies. Furthermore, such an integration of behavioral insights highlights the need to view the stages of the policy process as a cyclical and reinforcing whole: lessons from one stage, such as ex-post policy analysis, should inform the design of new policy tools and the evaluation of policy alternatives in an ex-ante policy analysis.

As the field matures and our understanding of human behavior continues to improve, we see that some problems can be addressed through minor tweaks to policy design features, while others demand a fundamental rethinking of the underlying assumptions about human nature. Kurt Lewin's classic social psychology research on behavior change studied a tension system of conflicting forces in the environment that simultaneously push and pull an individual's behavior. Lewin distinguished between two kinds of forces that form the basis for behavior change: "driving forces," which promote behavior change, and "restraining forces," which preclude it. Kahneman (2013) explains that the Lewinian approach to identifying the restraining forces entails answering the question: "Why don't people already do what I wish they do?"

Just as Lewin favored reducing the "restraining forces" over increasing the "driving forces" for behavior change, the first decade of the nudge approach to policymaking has followed a similar trend of targeting the "restraining forces." Behaviorally informed and tested policy analyses of initiatives that involve changes in framing, or tweaks to choice architecture, can essentially be classified as the government or change agent reducing the "restraining forces" that preclude human beings from making choices in their own best interest. However, an over-reliance on eliminating the "restraining forces" should not preclude a deeper exploration of the "driving forces" that can be harnessed for behavior change. For example, in a recent Huffington Post article, Lamberton and Castleman (2016) call for "Nudge 2.0," an expanded nudge toolkit, especially in education, that goes beyond simplifying information or sending reminders that help students follow through on their commitments. They call for additional interventions that provide professional assistance to aid students' decision making by specifically incorporating their identity, beliefs, psychological biases, and emotions. We call for further research on educational interventions grounded in basic social psychological theory that can support, rather than merely nudge, students toward decisions that enable them to achieve a high-quality educational experience.

Finally, we recommend the "pragmatic approach" advocated by the economist Raj Chetty (2015) for incorporating behavioral elements into the policy process. Rather than debating the validity of the behavioral approach as against the neoclassical framework and its assumptions, behavioral insights should be judged by the usefulness and empirical validity of their predictions. Scholars and practitioners carrying out policy analysis should strive to incorporate behavioral insights into policy design and all stages of policy analysis.


Maithreyi Gopalan is a Ph.D. candidate at the School of Public and Environmental Affairs at Indiana University, Bloomington. She has completed a doctoral minor in Psychology and holds a Master's in Economics. Her research interests lie in bringing psychological insights to bear on social and education policy.

Maureen A. Pirog is Rudy Professor of Policy Analysis at the School of Public and Environmental Affairs at Indiana University, Bloomington. She is also a distinguished visiting professor at the University of Johannesburg.


Notes

The authors thank two anonymous reviewers, Dave Warren, Shannon Lea Watkins, and Michael D. Perkins for their thoughtful and encouraging comments on earlier drafts of this manuscript. Any remaining errors are our own.


1 We adopt a broad definition of policy analysis in this article that includes Weimer and Vining's (2015) description of "policy analysis," "policy research," and "academic research." Weimer and Vining define "policy analysis" as a systematic assessment of alternative policy choices for policy problems that is largely carried out by analysts in a variety of public organizational settings such as federal, state, and local agencies. In contrast, they define "policy research" as pertaining to policy evaluations that aim at predicting the impact of changes driven by policies or the impact of changes in outcomes that can be "altered by public policy" (p. 26). Finally, they define "academic research" as empirical and theoretical analysis of public policy issues that aims to "contribute to a better understanding of society" (p. 25) but is not always relevant to specific public policies; this includes analyses published more traditionally in peer-reviewed academic journals. Given the blurring distinction between these categories, and the fact that the empirical orientation and practice of policy analysts in governmental agencies have begun to resemble academic and policy research, we adopt a definition spanning all three categories in this review.

2 Keywords used: nudges, behavioral economics, behavioral science, behavioral insights, interventions, behavioral insights for social policy, behavioral insights for education policy, behavioral foundation of public policy.

3 Imperfect optimization: a category of psychological impediments referring to the errors people make when choosing among alternatives; bounded self-control: a category reflecting people's general tendency not to take actions with future benefits, even when they recognize those benefits and would like to act; nonstandard preferences: a category pertaining to people whose preferences differ from the standard model. In this last case, people are not making errors in choosing, nor are they unable to act on their intentions; rather, their preferences (accurately identified and executed) are simply assumed away in standard economic models.

4 Weimer and Vining (2015) include an additional category, government failure, that explores situations in which government intervention might fail. In those cases, they advocate policy solutions such as deregulation, legalization, and privatization. A burgeoning literature in political science and public administration is incorporating behavioral insights to understand and address some sources of government failure. These most often pertain to direct democracy and representative government and essentially explore cognitive biases in electoral processes as well as in bureaucracies; however, a review and analysis of the insights from those studies is beyond the scope of this article. We thus focus on how behavioral insights can enhance our understanding of the sources of market failure and promote government delivery of antipoverty programs in the context of policy analysis and evaluation.


References

  • Agarwal, Sumit, Souphala Chomsisengphet, Neale Mahoney, and Johannes Stroebel. 2015. “Regulating Consumer Financial Products: Evidence from Credit Cards.” The Quarterly Journal of Economics 130 (1): 111–64.
  • Aronson, Joshua, Carrie B. Fried, and Catherine Good. 2002. “Reducing the Effects of Stereotype Threat on African American College Students by Shaping Theories of Intelligence.” Journal of Experimental Social Psychology 38: 113–25.
  • Avery, Christopher. 2013. Evaluation of the College Possible Program. National Bureau of Economic Research Working Paper 19562. http://www.nber.org/papers/w19562. Accessed December 31, 2016.
  • Bergman, Peter. n.d. Parent-Child Information Frictions and Human Capital Investment: Evidence from a Field Experiment. Working Paper. http://www.columbia.edu/~psb2101/BergmanSubmission.pdf. Accessed December 31, 2016.
  • Bertrand, Marianne, and Adair Morse. 2011. “Information Disclosure, Cognitive Biases, and Payday Borrowing.” The Journal of Finance 66 (6): 1865–93.
  • Beshears, John, James J. Choi, David Laibson, and Brigitte C. Madrian. 2008. “The Importance of Default Options for Retirement Savings Outcomes: Evidence from the United States.” In Lessons from Pension Reform in the Americas, ed. Stephen J. Kay, and Tapen Sinha. New York: Oxford University Press, 59–87.
  • Bettinger, Eric, Bridget Terry Long, Philip Oreopoulos, and Lisa Sanbonmatsu. 2012. “The Role of Application Assistance and Information in College Decisions: Results from the H&R Block FAFSA Experiment.” The Quarterly Journal of Economics 127 (3): 1205–42.
  • Bhargava, Saurabh, and George Loewenstein. 2015. “Behavioral Economics and Public Policy 102: Beyond Nudging.” American Economic Review: Papers & Proceedings 105 (5): 396–401.
  • Bhargava, Saurabh, George Loewenstein, and Justin Sydnor. 2015. Do Individuals Make Sensible Health Insurance Decisions? National Bureau of Economic Research Working Paper No. 21160. http://www.nber.org/papers/w21160. Accessed December 31, 2016.
  • Bhargava, Saurabh, and Dayanand Manoli. 2015. “Psychological Frictions and the Incomplete Take-Up of Social Benefits: Evidence from an IRS Field Experiment.” American Economic Review 105 (11): 3489–529.
  • Blackwell, Lisa S., Kali H. Trzesniewski, and Carol S. Dweck. 2007. “Implicit Theories of Intelligence Predict Achievement Across an Adolescent Transition: A Longitudinal Study and an Intervention.” Child Development 78 (1): 246–63.
  • Bovens, Luc. 2009. "The Ethics of Nudge." In Preference Change: Approaches from Philosophy, Economics and Psychology, ed. Till Grüne-Yanoff, and Sven Ove Hansson. Amsterdam: Springer, 207–19.
  • Burnette, Jeni L., Ernest H. O. Boyle, Eric M. Vanepps, Jeffrey M. Pollack, and Eli J. Finkel. 2013. “Mind-Sets Matter: A Meta-Analytic Review of Implicit Theories and Self-Regulation.” Psychological Bulletin 139 (3): 655–701.
  • Castleman, Benjamin L., and Lindsay C. Page. 2015. “Can Personalized Text Messages and Peer Mentor Outreach Increase College Going Among Low-Income High School Graduates?” Journal of Economic Behavior & Organization 115: 144–60.
  • Chetty, Raj. 2015. “Behavioral Economics and Public Policy: A Pragmatic Perspective.” American Economic Review: Papers & Proceedings 105 (5): 1–33.
  • Chetty, Raj, John N. Friedman, Soren Leth-Petersen, Torben Nielsen, and Tore Olsen. 2014. “Active vs. Passive Decisions and Crowd-Out in Retirement Savings Accounts: Evidence from Denmark.” The Quarterly Journal of Economics 129 (3): 1141–219.
  • Chetty, Raj, John N. Friedman, and Emmanuel Saez. 2013. “Using Differences in Knowledge across Neighborhoods to Uncover the Impacts of the EITC on Earnings.” American Economic Review 103 (7): 2683–721.
  • Chetty, Raj, and Emmanuel Saez. 2013. “Teaching the Tax Code: Earnings Responses to an Experiment with EITC Recipients.” American Economic Journal: Applied Economics 5 (1): 1–31.
  • Congdon, William J. 2013. “Psychology and Economic Policy.” In Behavioral Foundations of Public Policy, ed. Eldar Shafir. Princeton, NJ: Princeton University Press, 465.
  • Congdon, William J., Jeffrey R. Kling, and Sendhil Mullainathan. 2011. Policy and Choice: Public Finance through the Lens of Behavioral Economics. Washington, DC: Brookings Institution Press.
  • Cook, Philip J., Kenneth Dodge, Roland G. Fryer, Jonathan Guryan, Jens Ludwig, Susan Mayer, Harold Pollack, and Laurence Steinberg. 2014. The (Surprising) Efficacy of Academic and Behavioral Intervention with Disadvantaged Youth: Results from a Randomized Experiment in Chicago. National Bureau of Economic Research Working Paper No. 19862. http://www.nber.org/papers/w19862. Accessed December 31, 2016.
  • Dechausay, Nadine, and Caitlin Anzelone. 2016. Cutting Through Complexity: Using Behavioral Science to Improve Indiana’s Child Care Subsidy Program. OPRE Report No. 2016-03. Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.
  • Dechausay, Nadine, Caitlin Anzelone, and Leigh Reardon. 2015. The Power of Prompts Using Behavioral Insights to Encourage People to Participate. OPRE Report No. 2015-75. Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.
  • Dellavigna, Stefano, John A. List, and Ulrike Malmendier. 2012. “Testing for Altruism and Social Pressure in Charitable Giving.” The Quarterly Journal of Economics 127 (1): 1–56.
  • Dynarski, Susan M., and Judith E. Scott-Clayton. 2006. The Cost of Complexity in Federal Student Aid: Lessons from Optimal Tax Theory and Behavioral Economics. National Bureau of Economic Research Working Paper No. 12227. http://www.nber.org/papers/w12227.pdf. Accessed December 31, 2016.
  • Eissa, Nada, and Hilary W. Hoynes. 2006. “Behavioral Responses to Taxes: Lessons from the EITC and Labor Supply.” Tax Policy and the Economy 20: 73–110.
  • Elbel, Brian, Rogan Kersh, Victoria L. Brescoll, and L. Beth Dixon. 2009. “Calorie Labeling and Food Choices: A First Look at the Effects on Low-Income People in New York City.” Health Affairs 28 (6): w1110–21.
  • Farrell, Mary, Jared Smith, Leigh Reardon, and Emmi Obara. 2016. Framing the Message: Using Behavioral Economics to Engage TANF Recipients. OPRE Report No 2015-02. Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.
  • Farrell, Mary, Peter Baird, Bret Barden, Mike Fishman, and Rachel Pardoe. 2013. The TANF/SSI Disability Transition Project: Innovative Strategies for Serving TANF Recipients with Disabilities. OPRE Report No. 2013-51. Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services. http://www.mdrc.org/sites/default/files/tanf_ssi_disability_transition_project_fr.pdf. Accessed December 31, 2016.
  • Fryer, Roland G. 2013. Information and Student Achievement: Evidence from a Cellular Phone Experiment. http://scholar.harvard.edu/files/fryer/files/million_manuscriptjune2013_0.pdf. Accessed December 31, 2016.
  • Fryer, Roland G., Steven D. Levitt, John List, and Sally Sadoff. 2012. Enhancing the Efficacy of Teacher Incentives through Loss Aversion: A Field Experiment. National Bureau of Economic Research Working Paper No. 18237. http://www.nber.org/papers/w18237. Accessed December 31, 2016.
  • Hall, Crystal C., Martha M. Galvez, and Isaac M. Sederbaum. 2014. “Assumptions About Behavior and Choice in Response to Public Assistance: A Behavioral Decision Analysis.” Policy Insights from the Behavioral and Brain Sciences 1 (1): 137–43.
  • Hansen, Pelle Guldborg, and Andreas Maaløe Jespersen. 2013. "Nudge and the Manipulation of Choice: A Framework for the Responsible Use of the Nudge Approach to Behaviour Change in Public Policy." European Journal of Risk Regulation 4 (1): 3–28.
  • Harackiewicz, Judith M., Christopher S. Rozek, Chris S. Hulleman, and Janet S. Hyde. 2012. “Helping Parents to Motivate Adolescents in Mathematics and Science: An Experimental Test of a Utility-Value Intervention.” Psychological Science 23 (8): 899–906.
  • ideas42. 2016. Using Behavioral Economics for Postsecondary Success. http://www.ideas42.org/wp-content/uploads/2016/09/Nudging-For-Success-FINAL.pdf. Accessed December 31, 2016.
  • Ifill, Nicole, Alexandria Walton Radford, Emily Forrest Cataldi, David Wilson, and Jason Hill. 2016. "Persistence and Attainment of 2011–12 First-Time Postsecondary Students After 3 Years (BPS:12/14)." NCES 2016-401. U.S. Department of Education. Washington, DC: National Center for Education Statistics. https://nces.ed.gov/pubs2016/2016401.pdf. Accessed February 8, 2017.
  • Iyengar, Sheena S., and Mark R. Lepper. 2000. “When Choice Is Demotivating: Can One Desire Too Much of a Good Thing?” Journal of Personality and Social Psychology 79 (6): 995–1006.
  • Johnson, Eric J., and Daniel Goldstein. 2003. “Do Defaults Save Lives?” Science 302 (5649): 1338–39.
  • Jones, Damon, and Aparajit Mahajan. 2015. Time-Inconsistency and Saving: Experimental Evidence from Low-Income Tax Filers. National Bureau of Economic Research Working Paper No. 21272. http://www.nber.org/papers/w21272. Accessed December 31, 2016.
  • Kahneman, Daniel. 2013. “Foreword.” In Behavioral Foundations of Public Policy, ed. Eldar Shafir. Princeton, NJ: Princeton University Press.
  • Kamenica, Emir. 2012. “Behavioral Economics and Psychology of Incentives.” Annual Review of Economics 4: 13.1–13.26.
  • Karlan, Dean, Margaret McConnell, Sendhil Mullainathan, and Jonathan Zinman. 2016. "Getting to the Top of Mind: How Reminders Increase Saving." Management Science 62 (12): 3393–411.
  • Karlan, Dean, and Jonathan Zinman. 2012. Borrow Less Tomorrow: Behavioral Approaches to Debt Reduction. http://crr.bc.edu/wp-content/uploads/2012/05/FSP-WP-2012-1.pdf. Accessed February 7, 2017.
  • Kiefe, Catarina, Jeroan Allison, Dale Williams, Sharina Person, Michael Weaver, and Norman Weissman. 2001. "Improving Quality Improvement Using Achievable Benchmarks for Physician Feedback: A Randomized Controlled Trial." Journal of the American Medical Association 285 (22): 2871–79.
  • Kling, Jeffrey R., Sendhil Mullainathan, Eldar Shafir, Lee Vermeulen, and Marian Wrobel. 2012. “Comparison Friction: Experimental Evidence from Medicare Drug Plans.” The Quarterly Journal of Economics 127: 199–235.
  • Lamberton, Cait, and Benjamin Castleman. 2016. “Nudge 2.0: A Broader Toolkit for Lasting Behavior Change.” The Huffington Post. http://www.huffingtonpost.com/cait-lamberton/nudge-20-a-broader-toolki_b_10108728.html. Accessed January 17, 2017.
  • Lavecchia, Adam M., Heidi Liu, and Philip Oreopoulos. 2014. Behavioral Economics of Education: Progress and Possibilities. National Bureau of Economic Research Working Paper No. 20609. http://www.nber.org/papers/w20609. Accessed December 31, 2016.
  • Liebman, Jeffrey B., and Erzo F. P. Luttmer. 2011. Would People Behave Differently If They Better Understood Social Security? Evidence from a Field Experiment. National Bureau of Economic Research Working Paper No. 17287. http://www.nber.org/papers/w17287. Accessed December 31, 2016.
  • Loewenstein, George, Cass R. Sunstein, and Russell Golman. 2014. “Disclosure: Psychology Changes Everything.” Annual Review of Economics 6: 391–419.
  • Loewenstein, George, and Peter Ubel. 2010. “Economics Behaving Badly.” The New York Times (July 14).
  • Lourenço, Joana Sousa, Emanuele Ciriolo, Sara Rafael Almeida, and Xavier Troussard. 2016. Behavioural Insights Applied to Policy: European Report 2016. EUR 27726 EN. European Commission. doi:10.2760/903938.
  • Luca, Michael, and Jonathan Smith. 2013. “Salience in Quality Disclosure: Evidence from the U.S. News College Rankings.” Journal of Economics & Management Strategy 22 (1): 58–77.
  • Ludwig, Jens, Jeffrey R. Kling, and Sendhil Mullainathan. 2011. “Mechanism Experiments and Policy Evaluation.” Journal of Economic Perspectives 25 (3): 17–38.
  • Madrian, Brigitte C. 2014. “Applying Insights from Behavioral Economics to Policy Design.” Annual Review of Economics 6: 663–88.
  • Madrian, Brigitte C., and Dennis F. Shea. 2001. “The Power of Suggestion: Inertia in 401(k) Participation and Savings Behavior.” The Quarterly Journal of Economics 116 (4): 1149–87.
  • Mani, Anandi, Sendhil Mullainathan, Eldar Shafir, and Jiaying Zhao. 2013. “Poverty Impedes Cognitive Function.” Science 341 (6149): 976–80.
  • Manoli, Day, and Nick Turner. 2016. Nudges and Learning: Evidence from Informational Interventions for Low-Income Taxpayers. National Bureau of Economic Research Working Paper No. 20718. http://www.nber.org/papers/w20718.pdf. Accessed December 31, 2016.
  • Mayer, Susan E., Ariel Kalil, Philip Oreopoulos, and Sebastian Gallegos. 2015. Using Behavioral Insights to Increase Parental Engagement: The Parents and Children Together (PACT) Intervention. National Bureau of Economic Research Working Paper No. 21602. http://www.nber.org/papers/w21602.
  • Mortensen, Chad R., and Robert B. Cialdini. 2010. “Full-Cycle Social Psychology for Theory and Application.” Social and Personality Psychology Compass 4 (1): 53–63.
  • Namba, Alexa, Amy Auchincloss, Beth L. Leonberg, and Margo G. Wootan. 2013. "Exploratory Analysis of Fast-Food Chain Restaurant Menus Before and After Implementation of Local Calorie-Labeling Policies." Preventing Chronic Disease 10: 120224.
  • Oreopoulos, Philip, Robert S. Brown, and Adam M. Lavecchia. 2014. Pathways to Education: An Integrated Approach to Helping At-Risk High School Students. National Bureau of Economic Research Working Paper No. 20430. http://www.nber.org/papers/w20430. Accessed December 31, 2016.
  • Patterson, Mark, Saurabh Bhargava, and George Loewenstein. 2017. "An Unhealthy Attitude? New Insight into the Modest Effects of the NLEA." Working Paper. http://dd42c9be-a-62cb3a1a-s-sites.googlegroups.com/site/sbhargav/NLEA_Patterson_Bhargava_Loewenstein.pdf?attachauth=ANoY7cruyLyrfHnbRQjFridCWG-KAzqohiCpxrNWePQBeTUjZKoTNRPLowi1I0NHEbZmSQQSBeFwnMaI7cmOVXquxTkEWqJ6nHynL8wtS9SyP2OaNA1wbXV2N_IgRGd-C8lM7kpMMq_j4t3FBaGh94LTodihb9EK-c6o1Fw06WuZTusNHUv1hrnUiVqy8eZb9dCQ1VY47Sq1REyhRgUMXpK62wwfNKrn1JuCjtrvEODyWnEOjMD8rX0%3D&attredirects=0. Accessed February 7, 2017.
  • Paunesku, David, Gregory M. Walton, Carissa Romero, Eric N. Smith, David S. Yeager, and Carol S. Dweck. 2015. “Mind-Set Interventions Are a Scalable Treatment for Academic Underachievement.” Psychological Science 26 (6): 784–93.
  • Richards, Michael R., and Jody Sindelar. 2013. “Rewarding Healthy Food Choices in SNAP: Behavioral Economic Applications.” The Milbank Quarterly 91 (2): 395–412.
  • Richburg-Hayes, Lashawn, Caitlin Anzelone, Nadine Dechausay, Saugato Datta, Alexandra Fiorillo, Louis Potok, Matthew Darling, and John Balz. 2014. Behavioral Economics and Social Policy: Designing Innovative Solutions for Programs Supported by the Administration for Children and Families. OPRE Report No. 2014-16a. Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.
  • Rosinger, Kelly. 2016. Can Simplifying Financial Aid Information Impact College Enrollment and Borrowing? Experimental and Quasi-Experimental Evidence. EdPolicyWorks Working Paper Series No. 49. http://sbst.gov/download/2015%20SBST%20Annual%20Report.pdf. Accessed February 7, 2017.
  • Sacarny, Adam, David Yokum, Amy Finkelstein, and Shantanu Agrawal. 2016. “Medicare Letters to Curb Overprescribing of Controlled Substances Had No Detectable Effect on Providers.” Health Affairs 35 (3): 471–79.
  • Social and Behavioral Sciences Team (SBST). 2015. Annual Report. https://sbst.gov/assets/files/2015-annual-report.pdf. Accessed December 31, 2016.
  • Starc, Amanda. 2014. "Insurer Pricing and Consumer Welfare: Evidence from Medigap." The RAND Journal of Economics 45 (1): 198–220.
  • Steffel, Mary, Elanor Williams, and Ruth Pogacar. 2016. “Ethically Deployed Defaults: Transparency and Consumer Protection Via Disclosure and Preference Articulation.” Journal of Marketing Research 53 (5): 865–80.
  • Thaler, Richard, and Cass Sunstein. 2003. “Libertarian Paternalism.” American Economic Review 93 (2): 175–79.
  • ———. 2008. Nudge: Improving Decisions About Health, Wealth, and Happiness. New Haven, CT: Yale University Press.
  • Tversky, Amos, and Daniel Kahneman. 1991. “Loss Aversion in Riskless Choice: A Reference-Dependent Model.” The Quarterly Journal of Economics 106: 1039–61.
  • U.S. Department of Agriculture Food and Nutrition Service (USDA FNS). 2012. SNAP Education and Evaluation Study (Wave I): Final Report. http://www.fns.usda.gov/sites/default/files/SNAPEdWaveI_0.pdf. Accessed February 7, 2017.
  • ———. 2014. Evaluation of the Healthy Incentives Pilot (HIP): Final Report. Prepared by Abt Associates for the U.S. Department of Agriculture, Food and Nutrition Service. http://www.fns.usda.gov/snap/healthy-incentives-pilot-final-evaluation-report. http://www.fns.usda.gov/sites/default/files/HIP-Final.pdf. Accessed February 7, 2017.
  • U.S. Department of Education. 2013. Education Department Releases College Scorecard to Help Students Choose Best College for Them. Washington, DC: U.S. Government Printing Office.
  • Walton, Gregory M., and Geoffrey Cohen. 2011. “A Brief Social-Belonging Intervention Improves Academic and Health Outcomes of Minority Students.” Science 331: 1447–51.
  • Weimer, David L., and Aidan R. Vining. 2015. Policy Analysis: Concepts and Practice. New York: Routledge.
  • Yeager, David S., and Gregory M. Walton. 2011. "Social-Psychological Interventions in Education: They're Not Magic." Review of Educational Research 81 (2): 267–301.
  • Yeager, David S., Gregory M. Walton, Shannon T. Brady, Ezgi N. Akcinar, David Paunesku, Laura Keane, Donald Kamentz et al. 2016. "Teaching a Lay Theory Before College Narrows Achievement Gaps at Scale." Proceedings of the National Academy of Sciences 113 (24): E3341–48.