
How Effective Are Behavioral Interventions at Increasing the Take-Up of Social Benefits?

  • Writer: Greg Thorson
  • Apr 20
  • 5 min read

This study evaluates how effective behavioral interventions—such as reminders, simplification, and personalization—are at increasing the take-up of social benefits. Drawing on a meta-analysis of 45 randomized controlled trials across multiple countries and benefit programs, the authors assess the overall impact and heterogeneity of these strategies. On average, behavioral interventions increase benefit take-up by 8.1 percentage points, with larger effects in contexts where baseline enrollment is low. Simplification and personalization yield the strongest results. Interventions are most effective when delivered close to the moment of decision-making. The study highlights both the promise and limitations of low-cost behavioral tools for public policy.


Full Citation and Link to Article

Daigneault, P.-M., Ouimet, M., Fortin‑Chouinard, A., & Nonki Tadida, E. Z. (2025). How effective are behavioral interventions to increase the take‑up of social benefits? A systematic review of field experiments. Journal of Policy Analysis and Management. https://doi.org/10.1002/pam.70005


Extended Summary

Central Research Question

This article investigates the effectiveness of behavioral interventions in increasing the take-up of social benefits. Although many social programs are designed to support economically or socially disadvantaged populations, take-up often falls well short of the eligible population, leaving substantial benefits unclaimed. The central research question asks: To what extent can low-cost behavioral interventions, such as reminders, simplification, or personalization, improve participation rates in government benefit programs? The study aims to offer systematic evidence on which types of interventions are most effective, under what conditions, and by how much.


Previous Literature

Previous studies in public policy and behavioral economics have noted a persistent gap between eligibility for social benefits and actual participation. Classic economic explanations attribute this gap to transaction costs, informational barriers, or stigma. More recent research introduces the concept of “behavioral frictions,” such as inertia, cognitive overload, and lack of salience. These frictions can keep otherwise eligible individuals from starting or completing benefit applications.


A growing body of literature has tested behavioral interventions in field experiments, particularly nudges designed to reduce complexity or increase motivation. Work by Bhargava and Manoli (2015), Finkelstein and Notowidigdo (2019), and Thaler and Sunstein (2008) shows that simple reminders, pre-filled forms, or better timing can influence individual decision-making. However, the effectiveness of these tools varies widely across programs, populations, and intervention designs. Until now, no comprehensive meta-analysis has synthesized these findings to assess average impacts and their sources of heterogeneity.


Data

The article presents a systematic review and meta-analysis of 45 randomized controlled trials (RCTs) conducted between 2000 and 2022. These studies span several countries, including the United States, Canada, the United Kingdom, and other OECD nations. The programs analyzed target a variety of social benefits, such as:


  • Tax credits (e.g., Earned Income Tax Credit)

  • Income assistance (e.g., Temporary Assistance for Needy Families)

  • Unemployment benefits

  • Childcare subsidies

  • Health insurance

  • Housing support

  • Student financial aid



Each included study had to meet strict criteria: (1) random assignment to a treatment and control group, (2) a primary outcome measuring benefit take-up, and (3) a clearly defined behavioral intervention unrelated to broader policy or structural reforms. In total, the dataset includes over 100,000 individual observations, with effect sizes standardized across studies for comparison.


The authors also coded each intervention according to its primary behavioral mechanism (e.g., simplification, reminders, personalization), delivery method (e.g., letter, text message, in-person), and contextual features (e.g., baseline take-up, country, type of benefit).
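
To illustrate, the coding scheme for a single study might look roughly like the sketch below; the field names and values are hypothetical placeholders for the kinds of variables described above, not the authors' actual codebook.

```python
# Hypothetical sketch of how one study might be coded for the moderator analysis.
# Field names and values are illustrative placeholders, not the authors' codebook.
study_record = {
    "study_id": "example_2018",
    "mechanism": "simplification",   # e.g., simplification, reminder, personalization, default
    "delivery": "letter",            # e.g., letter, text message, online, in-person
    "benefit_type": "tax credit",    # e.g., cash transfer, housing, health
    "country": "United States",
    "baseline_take_up": 0.38,        # control-group take-up rate
    "effect_pp": 9.5,                # percentage-point increase in take-up
}
```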


Methods

The authors conduct both a qualitative synthesis and a quantitative meta-analysis. The statistical analysis focuses on estimating the average treatment effect (ATE) of behavioral interventions on benefit take-up, as well as identifying patterns of heterogeneity.


  1. Effect Size Calculation

    Standardized effect sizes are computed as percentage-point increases in take-up relative to a control group, allowing for consistent interpretation across programs. For example, if the control group has a 40% take-up rate and the treatment group rises to 48%, the effect is an 8 percentage point increase. (A sketch of this calculation, the random-effects pooling, and the publication bias check appears after this list.)

  2. Meta-Analytic Estimation

    Random-effects models are used to account for variation across studies. These models estimate both the overall mean effect and the variance in effects, treating study outcomes as drawn from a distribution rather than a single population parameter.

  3. Moderator Analysis

    To identify sources of variation, the authors include moderators such as:


    • Type of intervention (reminder, simplification, personalization, defaults, etc.)

    • Delivery channel (mail, text, online)

    • Target population characteristics (income level, immigration status)

    • Program type (cash transfer, housing, health)

    • Baseline take-up levels

    • Timing of the intervention (e.g., proximity to decision deadline)


  4. Publication Bias Checks

    Funnel plots and Egger’s tests are used to evaluate whether published studies overestimate effects due to selective reporting. The authors find no substantial evidence of bias.
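
To make the quantitative steps above concrete, here is a minimal Python sketch of the general approach: percentage-point effect sizes per study, a DerSimonian-Laird random-effects pooled estimate with a 95% confidence interval, and an Egger-style intercept test for small-study bias. The study counts, variable names, and implementation details are illustrative assumptions, not the authors' code or data.

```python
import numpy as np
from scipy import stats

# Hypothetical per-study counts: (treated take-ups, treated n, control take-ups, control n).
# These numbers are illustrative only; they are not data from the review.
studies = [
    (480, 1000, 400, 1000),   # 48% vs. 40% -> +8 percentage points
    (310, 1200, 255, 1150),
    (150,  600, 120,  620),
    (520,  900, 430,  880),
]

def effect_and_variance(x_t, n_t, x_c, n_c):
    """Percentage-point difference in take-up and its sampling variance."""
    p_t, p_c = x_t / n_t, x_c / n_c
    effect = 100 * (p_t - p_c)
    variance = 100**2 * (p_t * (1 - p_t) / n_t + p_c * (1 - p_c) / n_c)
    return effect, variance

y, v = map(np.array, zip(*(effect_and_variance(*s) for s in studies)))

# DerSimonian-Laird estimate of the between-study variance (tau^2).
w_fe = 1 / v
y_fe = np.sum(w_fe * y) / np.sum(w_fe)
Q = np.sum(w_fe * (y - y_fe) ** 2)
c = np.sum(w_fe) - np.sum(w_fe ** 2) / np.sum(w_fe)
tau2 = max(0.0, (Q - (len(y) - 1)) / c)

# Random-effects pooled estimate and 95% confidence interval.
w_re = 1 / (v + tau2)
pooled = np.sum(w_re * y) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
print(f"Pooled effect: {pooled:.1f} pp (95% CI {pooled - 1.96*se:.1f} to {pooled + 1.96*se:.1f})")

# Egger-style regression of standardized effects on precision.
# An intercept far from zero suggests small-study (publication) bias.
res = stats.linregress(1 / np.sqrt(v), y / np.sqrt(v))
t_stat = res.intercept / res.intercept_stderr
p_val = 2 * stats.t.sf(abs(t_stat), df=len(y) - 2)
print(f"Egger intercept: {res.intercept:.2f} (p = {p_val:.2f})")
```

A full replication would also model the moderators listed above through meta-regression, which this sketch omits.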



Findings/Effect Sizes

The meta-analysis yields several key findings:


  1. Average Effectiveness

    Across all 45 RCTs, behavioral interventions increase benefit take-up by an average of 8.1 percentage points (95% CI: 6.3 to 9.8). This is a substantial effect, especially given the low cost of most interventions.

  2. Intervention Type Effects


    • Simplification (e.g., reducing paperwork, streamlining eligibility criteria) has the largest average impact, increasing take-up by 10.2 percentage points.

    • Personalization (e.g., using the recipient’s name, tailoring the message) raises take-up by 8.8 percentage points.

    • Reminders (e.g., deadline notices, prompts) increase take-up by 6.5 percentage points.

    • Default options (e.g., pre-enrollment) are rare in this context but show potential when used.


  3. Delivery Channel


    • Mail-based interventions are most common and have consistent effects.

    • Text messages are slightly less effective but still generate gains of around 5 percentage points.

    • In-person outreach is rare but tends to have large, though costly, effects.


  4. Baseline Take-Up Matters


    • Interventions are more effective when starting from a lower baseline. For instance, when baseline participation is below 30%, behavioral nudges can yield increases of 10–15 percentage points.

    • Conversely, when baseline take-up exceeds 70%, interventions tend to have marginal effects.


  5. Timing Effects


    • Interventions administered close to the relevant decision point (e.g., application deadline) are 1.6 times more effective than those given well in advance.

    • Reminders sent within 72 hours of a deadline perform particularly well.


  6. Heterogeneity by Target Group


    • Interventions targeting vulnerable populations (e.g., low-income immigrants, non-English speakers) tend to have slightly lower average effects, possibly due to more severe structural or informational barriers.

    • However, the most successful interventions for these groups combine simplification with direct assistance (e.g., call-in support or navigators).


  7. No Strong Evidence of Decay Over Time


    • Most studies measure take-up shortly after intervention. The few that include longer-term follow-ups show persistent enrollment effects, though actual benefit usage may decline if not reinforced.




Conclusion

This systematic review and meta-analysis demonstrates that behavioral interventions can meaningfully improve take-up of social benefits, particularly when they reduce cognitive or procedural barriers. The average increase of 8.1 percentage points is both statistically and practically significant, especially for programs serving marginalized populations.


Simplification and personalization are the most effective strategies, likely because they address both cognitive overload and motivational deficits. Timeliness is also key—nudges are more impactful when administered close to the moment of decision-making.


However, the study also cautions against overgeneralizing the potential of behavioral tools. These interventions do not eliminate structural or systemic access barriers, and their effectiveness varies by context. In settings where distrust, stigma, or severe administrative complexity is prevalent, behavioral nudges alone may be insufficient.


For policymakers, the findings suggest that incorporating behavioral design into social programs can enhance efficiency and equity—but should be seen as complementary to broader reforms. Future research could explore combining behavioral strategies with technology, outreach, or legal reforms to amplify take-up impacts. Additionally, more work is needed to assess long-term benefit use and economic outcomes following improved enrollment.


Overall, the article offers compelling evidence that small changes in communication and process design can produce measurable improvements in public service delivery, especially for vulnerable populations who need these benefits the most.

