Does Including a Citizenship Question Reduce Survey Response Rates Among Immigrant Households?
- Greg Thorson

- Apr 12
Updated: Jun 18

This study examines whether adding a citizenship question to the decennial census reduces survey response rates, especially among immigrant and mixed-status households. Using data from the 2019 Census Test, a randomized controlled trial of 480,000 households, the authors linked administrative records to household addresses to assess the impact. They found that the citizenship question widened the self-response gap between households with likely undocumented members and all-U.S.-born non-Hispanic White households by 2.4 percentage points. Roster omissions also rose sharply, doubling from 0.12 to 0.24 persons per household when the question was included. These findings indicate increased undercounts among vulnerable populations.
Full Citation and Link to Article
Brown, J. D., & Heggeness, M. L. (2025). Citizenship question effects on household survey response. Journal of Policy Analysis and Management. https://doi.org/10.1002/pam.70004
Extended Summary
Central Research Question
The article investigates whether the inclusion of a citizenship question on a national census questionnaire affects household survey response rates and the accuracy of self-reported household rosters. Specifically, it examines how this potential change in census content influences response behaviors among different household types, particularly those with non-citizen or likely undocumented members. The central research question is: Does adding a citizenship question to the decennial census reduce self-response rates and increase underreporting or omissions in vulnerable immigrant households compared to U.S.-born non-Hispanic White households?
Previous Literature
Prior research has raised concerns that survey items related to immigration status, particularly citizenship, may suppress participation among non-citizens or those living in mixed-status households. Ethnographic studies have shown that fear, confusion, and mistrust of government surveys can lead to intentional underreporting or omission of undocumented individuals. For example, de la Puente (1995) and Evans et al. (2019) documented how non-citizen residents may be omitted from household rosters out of fear of government repercussions.
Empirical studies such as Baum et al. (2022) and Poehler et al. (2020) used community-level data to analyze variation in census response rates across census tracts with differing demographic compositions. These studies found limited average effects of including a citizenship question on response rates, although they suggested possible effects among specific groups such as Hispanics or foreign-born individuals. However, the tract-level approach potentially obscured household-level heterogeneity.
Recent administrative record (AR)-based research has found that non-citizens, particularly likely undocumented immigrants, are significantly underrepresented in official survey-based population estimates compared to AR-based benchmarks (Brown, Heggeness, & Murray-Close, 2023). The present study builds on this insight by linking AR data directly to individual addresses in the 2019 Census Test to examine citizenship question effects at the household level, rather than aggregating up to the tract.
Data
The study uses data from the 2019 Census Test, a randomized controlled trial (RCT) conducted by the U.S. Census Bureau to evaluate the potential effects of a citizenship question. The sample includes 480,000 housing units across the United States, with tracts oversampled based on historically low self-response rates and high shares of non-citizens.
Roughly half the sample received a census questionnaire that included a citizenship question, while the other half received the standard 2020 Census version without the question. The survey was administered via mail, online, and phone, with materials available in both English and Spanish, depending on the language profile of the tract.
Researchers linked administrative records—including tax filings, Social Security data, and other federal databases—to each household using unique identifiers. These records enabled the construction of household-level variables indicating citizenship, immigration status, race/ethnicity, household structure, income, and other demographic characteristics. The AR data also allowed identification of households likely to include undocumented members, particularly through the presence of Individual Taxpayer Identification Numbers (ITINs), which are commonly held by individuals ineligible for Social Security numbers.
Methods
The authors conduct a randomized experimental analysis using difference-in-differences (DiD) and regression modeling. The primary dependent variables include:
Self-response rate: Whether a household responded to the census questionnaire.
Household roster omission rate: The gap between the number of household members reported at the beginning of the survey and the number for whom complete data were provided.
Citizenship item nonresponse: The frequency with which households left the citizenship question unanswered in the version that included it.
The treatment variable is the randomized assignment to a panel that included or excluded the citizenship question. To detect heterogeneous effects, the authors estimate logistic regressions and OLS models with interaction terms between household characteristics (e.g., presence of non-citizens, race/ethnicity, immigrant origin) and the treatment assignment. Controls include household structure, income quintile, housing type, and other “structural” variables known to affect census response rates.
By focusing on household-level, rather than tract-level, data, the authors aim to identify variation in response behaviors that might otherwise be masked in aggregated analyses.
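To make the specification concrete, the following is a minimal sketch (not the authors' code) of the kind of treatment-by-household-type interaction model described above, estimated with statsmodels on a small synthetic dataset. All variable names (responded, cq_panel, hh_type, income_quintile) are hypothetical stand-ins for the AR-linked household characteristics the study constructs, and the simulated effect sizes only loosely mimic the reported pattern.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 20_000
    df = pd.DataFrame({
        "cq_panel": rng.integers(0, 2, n),  # 1 = received the citizenship-question panel
        "hh_type": rng.choice(["usborn_white", "itin", "ssn_noncitizen"], n),
        "income_quintile": rng.integers(1, 6, n),
    })
    # Synthetic response propensities loosely mimicking the reported pattern:
    # a small drop for all-U.S.-born White households, a larger drop for
    # households with a likely undocumented (ITIN) member.
    base = df["hh_type"].map({"usborn_white": 0.70, "itin": 0.28, "ssn_noncitizen": 0.45})
    effect = df["hh_type"].map({"usborn_white": -0.007, "itin": -0.031, "ssn_noncitizen": -0.015})
    df["responded"] = (rng.random(n) < base + effect * df["cq_panel"]).astype(int)

    # Linear probability model with treatment x household-type interactions;
    # the interaction coefficients are the difference-in-differences of interest.
    lpm = smf.ols(
        "responded ~ cq_panel * C(hh_type, Treatment('usborn_white')) + C(income_quintile)",
        data=df,
    ).fit(cov_type="HC1")
    print(lpm.summary())

    # Logistic analogue, since the summary notes the authors also estimate logits.
    logit = smf.logit(
        "responded ~ cq_panel * C(hh_type, Treatment('usborn_white')) + C(income_quintile)",
        data=df,
    ).fit(disp=False)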
Findings/Size Effects
The key findings reveal that adding a citizenship question had significant negative effects on survey response rates and household roster completeness, but these effects were concentrated among specific vulnerable populations:
Overall Self-Response Rates
The overall self-response rate dropped slightly—from 51.96% without the question to 51.50% with it—a statistically insignificant 0.46 percentage point change. This finding aligns with prior tract-level research (Poehler et al., 2020), which reported minimal average effects.
Differential Effects by Household Type
The aggregate results mask much larger effects among subgroups:
Households with at least one likely undocumented member (ITIN holder) had a response rate of 27.5% without the citizenship question and 24.4% with it—a 3.1 percentage point drop, which is statistically significant.
Households with at least one non-citizen with a Social Security number (SSN) experienced a 1.3 to 2 percentage point decline, depending on whether they were born in Latin America or elsewhere.
In contrast, all-U.S.-born non-Hispanic White households saw only a 0.67 percentage point drop, from 70.4% to 69.7%.
These patterns indicate that the citizenship question had a chilling effect specifically on immigrant and mixed-status households, not on the general population.
Household Roster Omissions
The citizenship question also increased the underreporting of household members:
Among households with likely undocumented members, the average number of omitted individuals (measured as the difference between initial household count and number of complete responses) rose from 0.12 to 0.24 persons per household, a doubling of omissions.
For all-U.S.-born non-Hispanic White households, the omission rate was stable at 0.02 persons per household, regardless of treatment.
This finding suggests that even when such households did respond, they were more likely to exclude certain members—most likely those perceived as vulnerable—when the citizenship question was included.
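As a toy illustration of how the omission measure defined in the Methods section could be tabulated, the snippet below uses made-up numbers and hypothetical column names; it is not the study's data or code.

    import pandas as pd

    # Hypothetical household-level extract: initial_count is the number of people
    # reported in the household-count question; complete_persons is the number
    # with full person-level data; cq_panel marks the citizenship-question panel.
    toy = pd.DataFrame({
        "cq_panel":         [0, 0, 1, 1],
        "initial_count":    [4, 3, 4, 3],
        "complete_persons": [4, 3, 3, 3],
    })
    toy["omitted"] = (toy["initial_count"] - toy["complete_persons"]).clip(lower=0)

    # Average omitted persons per household by treatment arm; the study reports
    # 0.12 vs. 0.24 for households with a likely undocumented member.
    print(toy.groupby("cq_panel")["omitted"].mean())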
Difference-in-Differences Results
Regression models confirmed the statistical significance of these subgroup effects. Interaction terms indicated that the response rate gap between likely undocumented households and all-U.S.-born White households widened by 2.4 percentage points when the citizenship question was present. The authors conclude that the inclusion of the citizenship question would exacerbate existing undercounts among non-citizens.
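As a back-of-the-envelope check (using the rounded subgroup rates reported above, not the authors' exact regression estimate), the 2.4 point figure can be reconstructed directly:

    Gap without the question: 70.4% - 27.5% = 42.9 points
    Gap with the question:    69.7% - 24.4% = 45.3 points
    Widening of the gap:      45.3 - 42.9 ≈ 2.4 points (equivalently, 3.1 - 0.67 ≈ 2.4)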
Implications for Official Statistics
These results help explain previous findings that administrative records consistently show more non-citizens than official population surveys like the American Community Survey (ACS) or the decennial census. Undercounts are not random—they disproportionately exclude the most vulnerable populations when sensitive questions are added to the survey instrument.
Conclusion
The study provides clear and policy-relevant evidence that adding a citizenship question to the census disproportionately reduces response rates and increases omission of household members among non-citizen and likely undocumented populations. Although the overall effect on response rates is small, the magnitude of the impact among these specific groups is both statistically and practically significant.
The research underscores the risks of using survey instruments that introduce fear, confusion, or mistrust among populations already facing barriers to participation. It also illustrates the limits of tract-level analyses in detecting group-specific vulnerabilities and makes the case for using administrative data to improve precision and inclusiveness in survey design.
From a policy standpoint, the findings suggest that the inclusion of a citizenship question in future censuses—as proposed in legislation like the Equal Representation Act—would likely increase the differential undercount of non-citizens, reduce the accuracy of population estimates, and potentially distort apportionment and federal funding formulas.
Ultimately, the article calls for a careful balance between information needs and the integrity of statistical data collection. Survey designers must weigh the potential policy benefits of asking sensitive questions against their chilling effects on data quality, especially for marginalized groups. By revealing these trade-offs using rigorous experimental evidence, this study makes a significant contribution to the field of public administration and population statistics.