
Does Code Louisville Improve Earnings and Employment Outcomes Compared to Traditional Job Training Programs?

  • Writer: Greg Thorson
  • Aug 23
  • 5 min read

This study asks whether Code Louisville, a federally supported sectoral-style training program in computer coding, improves participants’ labor market outcomes compared to traditional job training programs. Using Kentucky Longitudinal Data System records matched to over 190,000 comparison individuals, the authors estimate effects for 1,108 enrollees. Men initially show little earnings impact but gain 5–10% in quarterly earnings and 5 percentage points in employment by three years after enrollment. Women see faster earnings growth, about 5% within a year and roughly 15% after three years, along with a 3-percentage-point employment increase. Benefits are largest for participants with bachelor’s degrees or higher.


Full Citation and Link to Article

Bollinger, Christopher R., and Kenneth R. Troske. 2025. Evaluation of a new job training program: Code Louisville. Journal of Policy Analysis and Management. https://doi.org/10.1002/pam.70028


Extended Summary


Central Research Question


The article evaluates whether Code Louisville, a job training program in computer coding supported by federal Workforce Innovation Fund dollars and administered locally by KentuckianaWorks, improves participants’ labor market outcomes. The authors ask whether this hybrid, a sectorally designed program funded in part with federal dollars, delivers returns closer to those of private sectoral training programs or to those of more traditional federal job training initiatives, which have historically shown modest impacts. A central question is whether Code Louisville produces meaningful and sustained gains in employment and earnings for participants, and how these impacts vary by gender, education level, and program completion.



Previous Literature


Job training program evaluations have a long history in labor economics and policy analysis. Research on broad federal programs such as the Workforce Investment Act (WIA), its successor the Workforce Innovation and Opportunity Act (WIOA), and Trade Adjustment Assistance (TAA) generally finds modest but positive returns. These programs typically produce incremental earnings gains in the range of 10–25%, depending on subgroup and context, though results are often heterogeneous and sometimes fade over time (Andersson et al., 2022; Heinrich et al., 2013; Hyman, 2018).


By contrast, evaluations of sectoral training programs—privately run or funded by nongovernmental organizations—tend to show much larger and more durable effects. Rigorous studies, including randomized controlled trials, find sustained gains in earnings of 12–34% for participants in programs focused on specific industries like information technology, health care, or advanced manufacturing (Katz et al., 2022; Baird et al., 2022).


The literature also highlights that training impacts vary by demographic characteristics and educational background. Human capital theory suggests that returns to additional training may be higher for individuals who already possess substantial educational foundations. Research on computer literacy and digital skills (Krueger, 1993; Autor et al., 1998) shows that technology proficiency is strongly correlated with higher wages, suggesting IT-focused programs may generate significant labor market returns.


Against this backdrop, Code Louisville presents a hybrid model: a federally supported but sectorally designed program focused on IT skills. The authors’ hypothesis, rooted in prior work, is that Code Louisville’s returns should exceed those of traditional federal programs but fall short of the highest returns reported in nongovernmental sectoral training programs.



Data


The study uses administrative data from the Kentucky Longitudinal Data System (KLDS), maintained by the Kentucky Center for Statistics. KLDS aggregates data from multiple state sources including K–12 education records, higher education institutions, unemployment insurance (UI) earnings and employment records, driver’s licenses, and vital statistics.


The dataset includes information on 1,222 Code Louisville participants, narrowed to 1,108 after applying restrictions such as age (20–63), residence in the Kentucky portion of the Louisville metropolitan area, and prior employment earnings. The comparison group is drawn from over 4 million individuals in the same geographic area who were between 20 and 64 years old from 2013 to 2020, excluding Code Louisville participants. After applying matching methods, the final comparison sample includes 192,690 individuals.


Key variables include demographic characteristics (age, gender, race, education), quarterly earnings and employment status, and county of residence. The study further distinguishes between enrollees and completers (defined as those who finish at least one 12-week training module). Of the 1,108 enrollees, 635 individuals are classified as completers.



Methods


The authors employ a quasi-experimental matching design combined with regression techniques to estimate the average treatment effect on the treated (ATT). The approach uses exact matching on gender, race, and education, along with close matching on prior earnings and age. This ensures treatment and comparison groups are highly similar on observable characteristics.
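To make the design concrete, here is a minimal sketch (in Python, with hypothetical variable names, and not the authors' actual code) of how exact matching on categorical traits can be combined with nearest-neighbor matching on prior earnings and age:

```python
import pandas as pd

# Hypothetical variable names; the real KLDS fields are not public.
EXACT_COLS = ["gender", "race", "education"]   # matched exactly
CLOSE_COLS = ["prior_earnings", "age"]         # matched on proximity

def match_comparisons(treated: pd.DataFrame, pool: pd.DataFrame, k: int = 5) -> pd.DataFrame:
    """For each treated person, keep the k comparison individuals in the same
    gender-race-education cell whose prior earnings and age are closest."""
    cells = dict(tuple(pool.groupby(EXACT_COLS)))   # split the pool into exact-match cells
    scale = pool[CLOSE_COLS].std()
    matched = []
    for _, person in treated.iterrows():
        cell = cells.get(tuple(person[EXACT_COLS]))
        if cell is None:
            continue  # no comparison individuals share this exact-match cell
        # Squared normalized distance on the continuous covariates
        dist = sum(((cell[c] - person[c]) / scale[c]) ** 2 for c in CLOSE_COLS)
        matched.append(cell.loc[dist.nsmallest(k).index])
    return pd.concat(matched).drop_duplicates()
```

The actual procedure in the paper is more involved, but the basic logic is the same: comparison individuals are drawn only from the same gender-race-education cell and must look similar on pre-enrollment earnings and age.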


The preferred econometric specification is a panel fixed-effects regression model:


  • Dependent variables: quarterly earnings, log earnings, and employment status.

  • Key independent variables: treatment indicators interacted with time (quarters before and after enrollment).

  • Controls: individual fixed effects, cubic in age, and calendar quarter dummies to account for macroeconomic shifts.



The model estimates outcomes up to three years (12 quarters) after enrollment, separately for men and women, and separately for enrollees and completers. The study also explores heterogeneity by educational attainment (high school or less, some college, bachelor’s degree, master’s or higher, unknown).
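In stylized form (the notation here is mine, not reproduced from the paper), this amounts to an event-study regression:

y_{it} = \alpha_i + f(\mathrm{age}_{it}) + \lambda_t + \sum_{k} \beta_k D^{k}_{it} + \varepsilon_{it}

where y_{it} is quarterly earnings, log earnings, or an employment indicator for person i in quarter t; \alpha_i is an individual fixed effect; f(\cdot) is a cubic in age; \lambda_t are calendar-quarter dummies capturing macroeconomic shifts; D^{k}_{it} indicates that quarter t falls k quarters before or after person i's enrollment (always zero for comparison individuals); and the \beta_k trace out the treatment-effect path through the 12 post-enrollment quarters.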


Robustness checks include alternative regression specifications with and without fixed effects, inclusion of demographic covariates, and subgroup analyses by completion status.
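As an illustration of how such a specification might be estimated, the sketch below uses the linearmodels Python package on a toy panel; it collapses the event-time dummies into a single post-enrollment indicator for brevity and uses made-up variable names, so it is an assumption-laden stand-in rather than the authors' code.

```python
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS

# Toy person-by-quarter panel standing in for the KLDS extract (all names hypothetical).
rng = np.random.default_rng(0)
n_people, n_quarters = 500, 16
df = pd.DataFrame({
    "person_id": np.repeat(np.arange(n_people), n_quarters),
    "quarter": np.tile(np.arange(n_quarters), n_people),
})
df["treated"] = (df["person_id"] < 50).astype(int)                     # "enrollees"
df["post"] = ((df["treated"] == 1) & (df["quarter"] >= 8)).astype(int)  # after enrollment
df["age"] = 30 + df["person_id"] % 30 + df["quarter"] / 4
df["log_earnings"] = 8 + 0.02 * df["age"] + 0.08 * df["post"] + rng.normal(0, 0.3, len(df))

# Individual and calendar-quarter fixed effects, with a cubic in age.
df = df.set_index(["person_id", "quarter"])
df["age2"], df["age3"] = df["age"] ** 2, df["age"] ** 3
mod = PanelOLS(
    df["log_earnings"],
    df[["age", "age2", "age3", "post"]],
    entity_effects=True,     # individual fixed effects
    time_effects=True,       # calendar-quarter dummies
    drop_absorbed=True,      # age terms can be collinear with the two sets of effects
)
res = mod.fit(cov_type="clustered", cluster_entity=True)
print(res.params["post"])    # average post-enrollment change in log earnings
```

Swapping the single post-enrollment indicator for a full set of quarters-since-enrollment dummies would recover the dynamic path described above.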



Findings/Size Effects


The analysis reveals significant but heterogeneous impacts by gender, education, and completion status.


Men


  • Employment: Men see rapid employment gains, with about a 5-percentage-point increase within one year of enrollment compared to the matched group. This gain persists through the three-year follow-up period.

  • Earnings: Earnings effects are modest initially but become more pronounced over time. By three years post-enrollment, men experience 5–10% higher quarterly earnings relative to the comparison group.

  • Education: Gains are concentrated among those with at least a bachelor’s degree; effects for less educated men are smaller or statistically insignificant.


Women


  • Employment: Women’s employment effects are slower to appear. By three years post-enrollment, women gain about 3 percentage points in employment relative to the comparison group.

  • Earnings: Earnings impacts are stronger and faster than for men. Women experience about a 5% increase within one year of enrollment, growing to roughly 15% after three years. This growth is statistically significant and robust across models.

  • Education: As with men, women with higher education benefit the most, while women with lower educational attainment see smaller effects.


Completion status


  • The largest benefits accrue to participants who complete at least one training module. Completers exhibit larger and more consistent gains in both earnings and employment. Non-completers see little measurable improvement.

  • Among completers, women’s earnings growth is particularly strong, while men’s employment gains remain robust.


Comparison to other programs


  • Impacts are comparable to or slightly stronger than traditional federal training programs like WIOA, which typically yield 10–25% earnings gains for certain subgroups.

  • However, Code Louisville’s impacts remain below those of the most successful sectoral programs evaluated in randomized controlled trials, which report earnings increases of 12–34%.

  • A key distinction is delivery format: Code Louisville’s online training may reduce impacts relative to in-person models but at significantly lower cost, enhancing efficiency.



Conclusion


The evaluation of Code Louisville demonstrates that sectoral-style training programs supported by federal funding can generate meaningful improvements in labor market outcomes, though effects vary across groups. Men benefit primarily through increased employment rates, while women benefit more strongly through earnings growth. Across both genders, the largest returns accrue to those with higher levels of education and to individuals who complete at least one program module.


Although the effect sizes are somewhat smaller than those reported for nongovernmental sectoral training programs, the impacts are stronger than those of traditional federal programs and occur at substantially lower cost due to the program’s online delivery model and reliance on volunteer mentors. This efficiency strengthens the case for expanding similar blended-learning, sectorally focused programs to other regions.


The study also illustrates the value of using comprehensive state administrative data systems for program evaluation. By linking training participants to population-level labor market records, the authors provide rigorous, long-term evidence on program effectiveness that could guide future workforce policy.


In sum, Code Louisville offers a promising model for aligning federal workforce investments with local labor market needs while leveraging cost-saving innovations in delivery. Policymakers seeking to improve the returns to public training programs should consider scaling similar approaches, while researchers should continue to exploit state-level longitudinal data systems to assess program impacts with precision.


