Can Text Message Reminders Improve SNAP Interview Completion and Benefit Approval Rates?
- Greg Thorson

- Apr 9
- 5 min read

This study investigates whether human-centered text message reminders can improve compliance with mandatory interviews and increase SNAP benefit approval rates. Researchers conducted a randomized controlled trial with 1,554 SNAP applicants in Boulder County, Colorado, comparing a control group that received standard mail reminders with an intervention group that also received text messages offering flexible “interview anytime” options. The results showed that texted applicants were 10 percentage points more likely to complete interviews, completed them 3–4 days earlier, and had approval rates 6–7 percentage points higher. Survey data confirmed reduced learning costs and greater confidence among treated participants, demonstrating the effectiveness of human-centered communication strategies.
Full Citation and Link to Article
"Administrative Checkpoints, Burdens, and Human‑Centered Design: 'Flexible Interview Anytime' Text Message Reminders Increase SNAP Participation in Boulder County. Moynihan, D. P., Somerville, S., Homonoff, T., Giannella, S., & Mendes, A. (2025). Public Administration Review.Advance online publication. https://doi.org/10.1002/pam.70007
Extended Summary
Central Research Question
This study investigates whether a human-centered design approach—specifically text message reminders offering flexible interview scheduling options—can reduce administrative burdens and increase participation in the Supplemental Nutrition Assistance Program (SNAP). The central research question is: Can flexible, human-centered text message reminders help SNAP applicants complete mandatory interviews at higher rates and receive benefits more successfully compared to traditional mail-only communications? The research explores how administrative checkpoints—particularly mandatory interviews—can hinder access to public benefits, and whether simple, low-cost interventions can reduce learning costs and improve compliance without altering the underlying policy requirements.
Previous Literature
The study draws on several strands of scholarship. First, it builds on the literature on administrative burdens, which conceptualizes the frictions individuals face in navigating public services. These include learning costs (difficulty understanding requirements), compliance costs (the effort to meet them), and psychological costs (stress or stigma). Scholars like Herd and Moynihan have emphasized how burdens can unintentionally reduce program participation, particularly among the most vulnerable.
Second, behavioral public administration has explored how nudges and simplified communication can improve outcomes in public services. Prior studies have shown that reminders—especially text-based ones—can increase follow-through in areas such as court appearances, immunizations, and social program renewals.
Third, the field of human-centered design has begun to influence public administration. This approach emphasizes iterative development based on the experiences of end users—both clients and administrators. Although more common in the private sector, human-centered design has gained traction in civic tech and digital government, notably in organizations like Code for America (CfA), which helped develop the intervention in this study.
Data
The primary data source was a randomized controlled trial involving 1,554 SNAP applicants in Boulder County, Colorado, between March and June 2023. All applicants received standard mailed interview appointment letters. The treatment group (n = 772) also received text messages reminding them of their interviews and explaining that they could call at their convenience to complete an interview. The control group (n = 782) received only the traditional mailer.
The study also included a supplementary text-based survey to assess learning costs and perceived ease of navigating the process. A total of 470 respondents answered at least one survey question, and 225 completed a secondary question on interview preparation confidence.
Administrative data were used to measure three primary outcomes: interview completion rate, the number of days before or after the scheduled interview the applicant completed their interview, and benefit approval. Additional analyses estimated the effect of the intervention on learning costs and confidence.
Methods
Researchers employed a randomized controlled trial (RCT) design. Applicants were randomly assigned to treatment or control groups based on whether their case number was odd or even. The intervention consisted of three text messages sent at key points: shortly after application submission, a few days before the scheduled interview, and 24 hours before the scheduled time.
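A rough sketch of that design in Python is below; the column names, the direction of the parity rule, and the exact reminder lags are illustrative assumptions, since the summary only gives approximate timings.

```python
from datetime import timedelta

import pandas as pd

# Hypothetical applicant-level file; column names are assumptions for this sketch.
df = pd.read_csv("applicants.csv", parse_dates=["application_date", "interview_date"])

# Assignment by case-number parity, as described above
# (which parity maps to treatment is assumed here).
df["treated"] = (df["case_number"] % 2 == 1).astype(int)

# Three reminder texts for the treatment group: shortly after application,
# a few days before the interview, and 24 hours before the scheduled time.
reminder_schedule = df.loc[df["treated"] == 1].assign(
    text_1=lambda d: d["application_date"] + timedelta(days=1),   # assumed lag
    text_2=lambda d: d["interview_date"] - timedelta(days=3),     # assumed "a few days"
    text_3=lambda d: d["interview_date"] - timedelta(hours=24),
)
```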
Outcomes were measured using both county and state administrative data, including interview completion, timing, and benefit approval status. Survey data were used to evaluate the mechanisms behind the intervention—particularly learning costs and confidence in navigating the process.
To analyze outcomes, the researchers used ordinary least squares (OLS) regression with robust standard errors. For estimating the causal effect of interview completion on benefit approval (a downstream effect), they used two-stage least squares (2SLS) with the treatment assignment as an instrument. They also tested for heterogeneous treatment effects based on language preference, gender, homelessness status, and household size.
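A minimal sketch of this estimation strategy, assuming hypothetical 0/1 columns named treated, completed, and approved, might look like the following. With a single binary instrument and no covariates, the 2SLS estimate of completion on approval reduces to the Wald ratio computed at the end; a dedicated IV routine would be needed for proper standard errors.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical applicant-level data: treated, completed, approved are 0/1 indicators.
df = pd.read_csv("applicants.csv")

# Intent-to-treat effects via OLS with heteroskedasticity-robust standard errors.
itt_completion = smf.ols("completed ~ treated", data=df).fit(cov_type="HC1")
itt_approval = smf.ols("approved ~ treated", data=df).fit(cov_type="HC1")

print(itt_completion.params["treated"])  # effect on interview completion
print(itt_approval.params["treated"])    # effect on benefit approval

# Complier average causal effect of interview completion on approval:
# the ITT effect on approval divided by the first-stage effect on completion.
cace = itt_approval.params["treated"] / itt_completion.params["treated"]
print(f"Estimated effect of completing the interview on approval: {cace:.2f}")
```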
Findings/Size Effects
The intervention had statistically significant and substantively meaningful effects on all primary outcomes.
Interview Completion
Applicants in the treatment group were 10.7 percentage points more likely to complete their interview than the control group (p < 0.0001). When including rescheduled interviews from state data, the effect size remained significant at 8.4 percentage points.
Interview Timing
Treated applicants completed their interviews 3 to 4 days earlier than the control group (p < 0.0001), suggesting that many took advantage of the flexible “call anytime” option. Among applicants who completed interviews before their scheduled date, 88% were in the treatment group.
Benefit Approval
Treated applicants were 6 to 7 percentage points more likely to be approved for SNAP benefits than the control group. The estimated treatment effects across three models ranged from 6.3 to 6.9 percentage points, with all p-values between 0.0016 and 0.011.
In the 2SLS analysis examining the effect of actually completing the interview (complier average causal effect), interview completion was associated with a 61–67 percentage point increase in the likelihood of benefit approval. This large effect size reflects that interview non-completion is a major procedural reason for denial.
Learning Costs and Confidence
Survey results showed that text recipients were:
- 9 percentage points more likely to know an interview was required (p = 0.042)
- 11 percentage points more likely to report that it was easy to know what to do for the interview (p = 0.014)
- 13 percentage points more likely to score higher on a combined learning-cost index (p = 0.0069)
These results support the idea that the text messages effectively reduced confusion and increased self-efficacy among applicants, making it more likely that they completed necessary steps.
Heterogeneous Effects
Most subgroup analyses showed consistent results across demographic groups. However, applicants from larger households were less likely to complete interviews when treated, possibly reflecting greater logistical challenges. This suggests that even beneficial nudges may not fully overcome deeper structural burdens in certain populations.
Conclusion
The study demonstrates that even modest interventions, grounded in human-centered design principles, can significantly improve access to public benefits. The texting intervention addressed a specific administrative checkpoint—the mandatory SNAP interview—without changing federal rules or removing requirements. By simply reframing communication and offering flexible scheduling, the researchers increased interview completion rates, reduced delays, and boosted benefit approval.
These findings have broad implications for policy design and implementation. First, they support a shift in focus from only reducing administrative burdens to examining specific checkpoints that act as veto points in access. Second, they illustrate that human-centered approaches can complement behavioral insights by targeting real pain points identified through fieldwork with users. Finally, the results underscore the importance of local experimentation and collaboration with civic tech organizations to test scalable, low-cost improvements in service delivery.
Despite some limitations—including a sample drawn from a single county—the findings align with previous research from more diverse settings. The consistency of results suggests this approach could be adapted more widely. Moreover, the research highlights a scalable, low-cost model for improving benefit uptake with existing technology and a replicable approach to integrating human-centered design and rigorous evaluation into public administration.