
Chapter 5: Return on Investment and Economic Impact

The Comptroller contracted for services to help analyze the state’s return on its investment in, and the economic impact of, the Smart Jobs and Skills Development Programs one year after training was completed and again after three years. This analysis is required by a statute enacted by the 1999 Legislature.

This chapter addresses that requirement within the limitations imposed by the poor quality of the available program data. An accurate control group analysis was not possible because sufficient information to draw a valid control group was not available. Moreover, wage gains cannot be directly attributed to training.


Findings

Between its inception in fiscal 1996 and fiscal 2000, the Skills Development Fund (SDF) spent $61.7 million. Based on the data collected, the Comptroller’s office estimates that SDF trainees gained an aggregate of $119.3 million in their first two quarters after training. By the end of the first year, that figure had increased to $133.3 million. The data on trainees’ earnings after three years, however, is insufficient to accurately estimate overall earnings gains. While the state’s $61.7 million in expenditures cannot be directly linked to workers’ training, it appears to have helped increase short-term earnings in the aggregate.

Between its inception in fiscal 1995 and fiscal 2000, the Smart Jobs Fund (SJF) spent $183.1 million. Based on the data collected, the Comptroller’s office estimates that Smart Jobs trainees gained an aggregate of $154.6 million in their first two quarters after training. By the end of the first year, that figure had declined to $111.0 million. Data on trainees’ earnings after three years, however, is insufficient to accurately estimate overall earnings gains. The decline over the first year is attributable for the most part to subsequent unemployment among trainees. While the state’s $183.1 million in expenditures again cannot be directly linked to workers’ training, it appears to have helped increase earnings.

In terms of tax revenues, separate analysis by the Comptroller’s office suggests that 12.9 percent of an individual’s income returns to the state in the form of tax collections and fees. Applying this percentage, the return to taxpayers on SDF spending in the first six months after training is $15.4 million, representing a 25 percent return on total program costs.

For the Smart Jobs program, the implied public sector revenue in the first six months is $19.9 million, representing a return on total program costs of 10.9 percent. Based on this analysis, the short-term impact appears fairly positive for both programs.
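The implied-revenue arithmetic above can be sketched in a few lines of Python. The dollar figures are those cited in the text; the 12.9 percent tax-take share is the Comptroller's estimate quoted above, and the helper function itself is purely illustrative.

```python
# Illustrative sketch of the report's implied-revenue arithmetic.
# Figures are taken from the text; the function name is hypothetical.

TAX_TAKE_SHARE = 0.129  # share of personal income returned to the state as taxes and fees

def implied_return(earnings_gain_m: float, program_cost_m: float) -> tuple:
    """Return (implied state revenue in $M, return on program cost in percent)."""
    revenue = earnings_gain_m * TAX_TAKE_SHARE
    return revenue, revenue / program_cost_m * 100

# SDF: $119.3M aggregate earnings gain in the first two quarters, $61.7M spent
sdf_revenue, sdf_roi = implied_return(119.3, 61.7)   # ~ $15.4M, ~ 25 percent

# Smart Jobs: $154.6M gain in the first two quarters, $183.1M spent
sjf_revenue, sjf_roi = implied_return(154.6, 183.1)  # ~ $19.9M, ~ 10.9 percent
```

The asymmetry in the two returns follows directly from program cost: SDF's earnings gain is smaller in absolute terms, but its much lower total spending yields the higher percentage return.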

While the experts who conducted the return on investment analysis are not confident in the results, this report has been prepared with the best available data to comply with the Legislature’s mandate.


Conclusions

The need for skills development and an educated Texas workforce is clear. The quality of the state’s labor pool may well be the single most important determinant of its future economic prosperity. Programs such as SDF and SJF appear to have helped raise short-term earnings, in the aggregate.

This statement conceals certain details of the results of the programs. While there likely are individuals who have received significant benefits from these programs, there are also those who have gained virtually nothing.

Given the data limitations, attributing outcomes to these programs is not wise. For example, the inability to calculate the impact on earnings of external factors makes it difficult to state that a program caused earnings to increase, especially over time. By the same token, the negative results reported for SJF for one year after training are artificial, as participation in a training program almost certainly did not cause a person to become unemployed.

What emerges is that data limitations confine the conclusions that can appropriately be drawn to a broad statement about modestly positive results in the near term. Such a conclusion has some use for program evaluation, but little for future policy development, program operations, and planning.

One of the study’s inherent obstacles stems from using information that was not gathered and structured to measure return on investment and economic impact. At present, the impact of training programs cannot be accurately distinguished from other factors that affect employment. The following general steps should be considered to improve the quality of available data for future studies.

  • Expand the information reported in the SDF and SJF databases and improve the quality of the data collected through better database management. The databases do not include dates of birth or, in the case of SDF, gender. Much of the data was clearly miscoded or mismatched. A better unique identifier than a Social Security Number (SSN) alone is needed to join program databases with the state's UI database. Some people have had more than one training spell, complicating the identification of program impact.

  • Adopt better methods for matching records from program databases and the UI database for the purposes of analysis. More advanced methods could be used to link files using the SSN.

  • Collect data with analysis in mind. Determine an economical way to match program records with UI records beyond six months.

  • Incorporate information on appropriately constructed control groups that match trainee groups. Analysis of control groups provides a better set of benchmarks for comparing post-training earnings.
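The matching and multiple-spell problems raised in these recommendations can be illustrated with a minimal sketch. All field names and records below are hypothetical; real linkage would require validation and the more advanced matching methods the report recommends.

```python
# Hypothetical illustration of joining program records to UI wage records on SSN,
# flagging unmatched records and multiple training spells. Not real data.

from collections import defaultdict

program_records = [
    {"ssn": "111-11-1111", "program": "SDF", "training_end": "1998Q2"},
    {"ssn": "111-11-1111", "program": "SDF", "training_end": "1999Q1"},  # second spell
    {"ssn": "222-22-2222", "program": "SJF", "training_end": "1998Q4"},
]

ui_wages = {
    "111-11-1111": [("1998Q3", 6200), ("1998Q4", 6400)],
    # SSN 222-22-2222 is absent from the UI file: an unmatched record
}

# Group program records by SSN to surface multiple training spells.
spells = defaultdict(list)
for rec in program_records:
    spells[rec["ssn"]].append(rec)

for ssn, recs in spells.items():
    if len(recs) > 1:
        print(f"{ssn}: {len(recs)} training spells -- impact hard to attribute")
    if ssn not in ui_wages:
        print(f"{ssn}: no UI wage match -- excluded from earnings analysis")
```

Both conditions flagged here (multiple spells and failed matches) shrink the usable sample, which is one reason the three-year earnings data proved insufficient.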

Another option exists. Instead of attempting to track individual trainees, a quasi-experimental framework could be constructed in which a carefully selected, stratified random sample of trainees is followed over time, scientifically matched to a control group. This framework would facilitate detailed survey and interview-based follow-up, which would permit more consistent and solid benchmark program evaluation and return on investment analysis.
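The stratified random sample suggested above could be drawn roughly as follows. The strata (here, industry), the trainee records, and the function are hypothetical illustrations of the sampling step only, not the study's actual design.

```python
# Illustrative only: draw up to a fixed number of trainees at random from each
# stratum so the sample mirrors the trainee population. Strata and records
# are hypothetical.

import random

def stratified_sample(people, stratum_of, per_stratum, seed=0):
    """Return up to per_stratum randomly chosen people from each stratum."""
    rng = random.Random(seed)
    by_stratum = {}
    for p in people:
        by_stratum.setdefault(stratum_of(p), []).append(p)
    sample = []
    for members in by_stratum.values():
        sample.extend(rng.sample(members, min(per_stratum, len(members))))
    return sample

trainees = [{"id": i, "industry": ind}
            for i, ind in enumerate(["mfg", "mfg", "mfg", "svc", "svc", "tech"])]
sample = stratified_sample(trainees, lambda p: p["industry"], per_stratum=2)
# sample contains at most 2 trainees per industry
```

Each sampled trainee would then be scientifically matched to a control-group member from the same stratum, giving the benchmark comparison the bullet list above calls for.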