
Chapter 4: Trainee Profiles, Employment, and Wage Gains

The Comptroller’s office contracted with the State Occupational Information Coordinating Council, formerly an independent group and now the Career Development Resources (CDR) division of the Texas Workforce Commission, to analyze information about trainees in the Smart Jobs and Skills Development Fund (SDF) programs. CDR attempted to determine trainees’ wage levels, employment retention rates and average wage gains. Data was not available to determine the one-year and three-year post-training results required by the enabling legislation, so CDR used the best information available, primarily data obtained from three to six months after individuals were trained.

CDR was able to use about 33,700 individuals’ training records for the Smart Jobs program and slightly fewer for SDF. CDR performed data matches with the Texas Workforce Commission’s Unemployment Insurance (UI) system for employment wage records and the Texas Department of Human Services’ welfare records.
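The sketch below illustrates the general form of such a match: trainee records keyed by Social Security number are joined to quarterly UI wage records to attach pre- and post-training earnings. The file layouts and field names are illustrative assumptions, not the actual CDR or agency schemas.

```python
# Minimal sketch of the kind of record match CDR performed: trainee seed
# records joined to quarterly UI wage records by Social Security number.
# File names, field names and layouts are illustrative assumptions only.
import csv
from collections import defaultdict

def load_ui_wages(path):
    """Index UI wage records as {ssn: {quarter: total earnings}}."""
    wages = defaultdict(lambda: defaultdict(float))
    with open(path, newline="") as f:
        for row in csv.DictReader(f):          # assumed columns: ssn, quarter, earnings
            wages[row["ssn"]][row["quarter"]] += float(row["earnings"])
    return wages

def match_trainees(trainee_path, ui_wages, pre_q, post_q):
    """Attach pre- and post-training quarterly earnings to each trainee record."""
    matched = []
    with open(trainee_path, newline="") as f:
        for row in csv.DictReader(f):          # assumed columns: ssn, program, ...
            quarters = ui_wages.get(row["ssn"], {})
            matched.append({
                **row,
                "pre_earnings": quarters.get(pre_q),    # None = no UI match
                "post_earnings": quarters.get(post_q),
            })
    return matched
```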

CDR found many errors and omissions in the trainee records because of after-the-fact data entry, multiple groups entering data and a lack of audit procedures to ensure data quality. These data problems make CDR’s report less reliable than the study’s project managers had envisioned.

Moreover, SDF and Smart Jobs staff did not collect certain types of information that would have been useful for this report. For example, the data collected lacked geographic identifiers for the place of employment, so CDR could not identify the effect of training on economically distressed areas. Additional demographic information on trainees and on specific training types could also have improved the analysis.

For future reports, both the Texas Department of Economic Development and the Texas Workforce Commission should be required to modify their databases to collect the information necessary for effective follow-up performance reviews. The agencies should work with the Comptroller’s office and CDR on the needed changes so they can improve their programs based on full and accurate information.

This project also required CDR to create control groups to compare trainees’ performance with that of people who did not participate in these training programs. The control groups were matched to the programs’ trainees by industry group and by median earnings in the quarter before training. The quality of the data, however, was so poor that accurate conclusions about the control groups’ behavior and subsequent work history are not possible.

The training start and finish dates in individual trainee records were based on the contracts’ beginning and ending dates. In the Smart Jobs program, some ending dates fell well after the training was actually completed because contracts were not closed out in a timely fashion during this period. This delay could have inflated the recorded average wage gains for trainees, since a longer observation period gives trainees additional time to find better jobs with higher wages.


Wage Levels of Trainees

Pre-training quarterly earnings varied widely in both programs. Pre-training quarterly earnings among Smart Jobs trainees ranged from $1 to $172,723 or a maximum of $690,892 per year. For SDF, the range was from $0 to $168,000 or a maximum of $672,000 per year. While the maximum incomes listed appear unusually high for a small percentage of the trainees, the data came directly from the Unemployment Insurance database. There is no way to determine if this is accurate information or the result of data entry errors.

Approximately 19 percent of the persons served by both programs had no earnings in the pre-training quarter. About 10 percent of the persons trained by Smart Jobs and SDF had pre-training earnings which, if annualized, were between $50,000 and $100,000 per year (Tables 20 and 21).
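As an illustration of the arithmetic behind Tables 20 and 21, the short sketch below annualizes quarterly earnings (quarterly earnings times four) and assigns each trainee to one of the earnings bands used in the tables. The band edges mirror the tables; everything else is assumed for illustration.

```python
from collections import Counter

def annualize(quarterly_earnings):
    """Quarterly UI-reported earnings times four, as used in Tables 20 and 21."""
    return 4 * quarterly_earnings

def band(quarterly_earnings):
    """Assign pre-training quarterly earnings to one of the table bands."""
    if quarterly_earnings == 0:
        return "$0"
    if quarterly_earnings <= 6_250:
        return "$1 to $6,250 (annualized <= $25,000)"
    if quarterly_earnings <= 12_500:
        return "$6,251 to $12,500 (annualized <= $50,000)"
    if quarterly_earnings <= 25_000:
        return "$12,501 to $25,000 (annualized <= $100,000)"
    return "more than $25,000 (annualized > $100,000)"

def distribution(pre_training_quarterly):
    """Counts and percentages by band, as in the body of Tables 20 and 21."""
    counts = Counter(band(q) for q in pre_training_quarterly)
    total = len(pre_training_quarterly)
    return {label: (n, round(100 * n / total, 1)) for label, n in counts.items()}

# e.g. a trainee earning $6,312 in the pre-training quarter annualizes to
# annualize(6_312) = $25,248 and falls in the $6,251 to $12,500 band.
```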

Table 20

Smart Jobs Trainees’ Pre-Service Earnings

UI-verifiable total earnings in the first
full quarter before being trained (Q-1)                        Number        %
$0                                                              6,552    19.4%
$1 to $6,250 (top of category annualized = $25,000)            13,428    39.8%
$6,251 to $12,500 (top of category annualized = $50,000)        9,734    28.9%
$12,501 to $25,000 (top of category annualized = $100,000)      3,543    10.5%
More than $25,001 (more than $100,000 per year)                   441     1.3%
Column totals                                                  33,698   100.0%

Range of quarterly pre-service earnings                        $0 to $172,723
Median quarterly earnings (of those with Q-1 earnings > $1)            $6,312
Mean quarterly earnings (of those with Q-1 earnings > $1)              $7,649

Table 21

SDF Trainees’ Pre-Service Earnings

UI-verifiable total earnings in the first
full quarter before being trained (Q-1)                        Number        %
$0                                                              6,486    19.3%
$1 to $6,250 (top of category annualized = $25,000)            13,950    41.5%
$6,251 to $12,500 (top of category annualized = $50,000)        9,837    29.2%
$12,501 to $25,000 (top of category annualized = $100,000)      3,099     9.2%
More than $25,001 (more than $100,000 per year)                   266     0.8%
Column totals                                                  33,638   100.0%

Range of quarterly pre-service earnings                        $0 to $168,000
Median quarterly earnings (of those with Q-1 earnings > $1)            $6,113
Mean quarterly earnings (of those with Q-1 earnings > $1)              $7,181

Median pre-training quarterly earnings among Smart Jobs trainees were $6,312; among SDF trainees, median pre-training earnings were $6,113 per quarter. If annualized, the median earnings of Smart Jobs and SDF trainees would both be approximately $25,000. The median, rather than the mean, is the more representative measure.

Because both programs served significant numbers of trainees whose pre-training earnings, if annualized, exceeded $50,000, mean figures were much higher than the medians. Mean pre-training quarterly earnings among Smart Jobs trainees were $7,649 (or $30,596 per year). For SDF, the mean was $7,181 (or $28,724 per year). In other words, both programs served individuals whose earnings were approximately the same as those of the average Texan, whose mean annual earnings were $30,004 in 1998.
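The brief example below, using made-up figures rather than program data, shows how a single very high quarterly earner pulls the mean well above the median, which is why the median is treated as the more representative measure.

```python
# Illustration with invented quarterly earnings, not program data: one outlier
# drags the mean far above the median of the group.
from statistics import mean, median

quarterly = [4_800, 5_900, 6_300, 6_500, 7_200, 8_100, 95_000]  # one outlier
print(median(quarterly))   # 6,500 -- typical of the group
print(mean(quarterly))     # about 19,114 -- pulled upward by the outlier
```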

In both programs, the median and mean earnings were well above the federal poverty definition for a single head of household ($7,980 per year) and above the commonly used definition of the working poor, 175 percent of the poverty threshold ($13,965 per year).


Employment Retention

More than 80 percent of the participants served by Smart Jobs and SDF were employed in the first full quarter before they received training (Table 22). There are several possible explanations for the high pre-training employment rate. The training program’s notice of grant awards may have allowed grant-receiving firms time to hire new workers for training-targeted positions before training actually began.

For both Smart Jobs and SDF trainees, employment rates in the first full quarter after training increased by several percentage points. The percentage increase in employment during the second quarter after training was slightly larger among those trained through Smart Jobs.

Table 22

UI-Verified Employment in Texas in Targeted Quarters

                                              Smart Jobs                 SDF
                                         (Total Records = 33,698)  (Total Records = 33,638)
Target Quarter                              Number        %          Number        %
First full pre-service quarter              27,146      81%          27,139      81%
First full post-exit quarter                30,743      91%          29,588      88%
Second full post-exit quarter               30,434      90%          29,025      86%
Control groups as of the second
full post-exit quarter                       1,817      58%           2,045      68%

In both cases, the post-training employment rate decreased slightly by the second quarter after training. This decrease was slightly larger for those trained through SDF. In neither program, however, did the employment rate in the second quarter after training fall below the pre-training employment rate.

At least in the short run, the post-training employment rates among both Smart Jobs and SDF trainees appear to be higher than the employment rates of those programs’ respective control groups in the second quarter after training.

The poor quality and reliability of the control group data, however, make it impossible to compare accurately the results of training with the employment experience of the control groups, whose members were not trained.

Although the quality and relevance of instruction may account for some differences in results, many other factors, such as initial contract selection criteria, industry, trainee characteristics and other information, could affect the study’s results.

One program may have focused more attention on trying to make firms in declining industries more productive to weather the decline. To find an appropriate explanation, a more detailed analysis must be performed on the frequency distributions of grants across industries in conjunction with contextual information on the concurrent condition of those industries. That analysis was not feasible in this initial study.

Differences in post-training employment rates could be the result of different levels of selectivity or patterns in the grants awarded. A program may have put more effort into attempts to shore up marginally viable firms which are part of declining industries located in weak labor markets. Some declines in the post-training employment rate also will result if some of the grant-receiving firms fail. Without reliable information on the work sites that grant-receiving firms selected for training, there is no way to estimate the extent to which employment rate declines can be attributed to overall economic conditions.

Low-wage workers tend to have higher turnover rates and more frequent and longer bouts of unemployment. Low wages may indicate that workers lack the abilities that employers value most, or they may result from a variety of unknown or personal factors. The greater the proportion of low-wage earners among the persons served, the lower the expected post-training employment rates.

Other workforce development programs show that the lower the pre-training earnings, the less likely it is that trainees will have UI-verifiable employment after training. The difference in the change in pre-training and post-training employment rates for Smart Jobs and SDF may be due in part to a slightly greater tendency by the Smart Jobs program to gear training services to persons with higher pre-training earnings.

To some extent, performance measures can be “gamed” by choosing more viable firms from those applying for grants, the work sites targeted for training services and the direction of the trend in the industry’s economic activities. Moreover, “creaming” – serving a disproportionate number of trainees whose characteristics and pre-training histories indicate a stronger commitment to the labor force – can help make a training program appear more successful than it is.

Programs will be managed differently depending on the performance expected of them. If a program like Smart Jobs or SDF is expected to shore up productivity in marginal firms in declining industries in economically distressed areas and to serve persons with significant barriers to employment, then more weight should be given in performance evaluations to their faithful adherence to stringent criteria for awarding grants and to the characteristics of the participants selected for training.

On the other hand, these programs may be treated primarily as engines of economic development. In that case, emphasis in performance evaluations should be placed not only on the post-training employment and earnings of the trainees but also on: the employment demand growth among the grant-receiving firms; increased worker productivity and profitability of the grant-receiving firms; and the degree to which they stimulate economic growth and the purchasing power of their employees.

An examination of UI wage records for SDF trainees for whom employment and earnings data were available four quarters after training shows that employment retention drops off significantly. In Table 23, note that data was not yet available for about 13,100 trainees. This analysis was not performed on Smart Jobs data because the records were received too late.

Table 23

Employment and Earnings Among Former Trainees at Q+4

                              Employed 1 year after exit    Earnings 1 year after exit
Program                          Number          %             Median         Mean
Smart Jobs                          NA           NA                NA           NA
SDF (Subset N = 20,538)         12,497        60.8%            $7,526       $8,120

Table 23 shows that about 60.8 percent of these SDF trainees had UI-verified employment one year after training.

A preliminary analysis shows that one year after SDF training, less than half of the trainees (43 percent) were retained by the same employer. Another 8 percent remained in the same industry classification but were employed by a different firm.

Trainees with the highest pre-training earnings were the most likely to have UI-verifiable employment at each post-training measurement interval. Differences in performance levels, therefore, may say more about the grant recipients’ selection of participants for training than about instructional quality and relevance.


Welfare Programs

To help estimate the return on investment and the economic impact of these programs, an examination of trainee participation in other state programs, such as welfare and unemployment, was performed. To determine the impact these programs have on welfare recipients, records from Smart Jobs and SDF were linked to the payment history files maintained by the Texas Department of Human Services (TDHS).

The study found that the training programs did not reduce the state’s public assistance payments significantly because the vast majority of those receiving Smart Jobs or SDF training were not receiving public assistance. The total savings to the state in the first quarter after training amounted to $170,585 among all Smart Jobs trainees and $227,673 among all SDF trainees.


Earnings Gains

Earnings gains measures are designed to estimate the market value of the knowledge, skills and abilities acquired by a trainee. Other factors can also affect changes in a trainee’s earnings: inflation or cost-of-living adjustments, seasonal changes in hours worked, end-of-year or periodic bonuses, the employer’s changing labor demands for time-specific production runs or peak workloads, accrued seniority, a new contract negotiated by an organized labor group and other influences.

Short of engaging in an experimental design, it is impossible to infer that the Smart Jobs and SDF training was the sole cause of any earnings gains. However, for purposes of this report and to meet the requirements of the legislation, earnings gains were calculated for each of the programs and for the control groups.

Earnings gains were realized by trainees in both programs. Among Smart Jobs trainees, 78 percent were employed in both the first full pre-training quarter and the first full post-exit quarter. Their median quarterly earnings gain was $1,590.

Among SDF trainees, 76 percent were employed in the first full pre-training quarter and the first full post-exit quarter. Their median quarterly earnings gain was $1,090.

Assuming that all trainees from both programs worked full-time during the pre-training quarter, the first full post-training quarter and the next three consecutive quarters, those median earnings gains would add between $4,360 and $6,360 to each trainee’s gross annual earnings.
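A minimal sketch of the earnings-gain calculation described above follows: each trainee’s gain is post-exit quarterly earnings minus pre-training quarterly earnings, the median is taken over trainees employed in both quarters, and multiplying the quarterly median by four gives the annualized figure. The field names are assumptions.

```python
# Sketch of the earnings-gain arithmetic; record fields are illustrative.
from statistics import median

def median_quarterly_gain(records):
    gains = [r["post_earnings"] - r["pre_earnings"]
             for r in records
             if r.get("pre_earnings") is not None
             and r.get("post_earnings") is not None]   # employed in both quarters
    return median(gains)

# The reported medians annualize as: $1,590 * 4 = $6,360 (Smart Jobs)
# and $1,090 * 4 = $4,360 (SDF).
```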


Incumbent Workers and New Hires

More than half of the persons trained by Smart Jobs or SDF appeared to be new hires of the grant-receiving firms (Table 24). Most of the new hires trained in both programs appeared to come from firms in other industries or from the ranks of the unemployed (or, at least, were persons without UI-verified earnings from covered employment in Texas in the pre-training quarter). There is little evidence that grant-receiving firms hired workers away from their competitors in the same industry.

Table 24

Were Trainees Incumbent Workers or New Hires?

(Where were the trainees employed before the training period?)

                                                    Smart Jobs            SDF
Number employed in post-training quarter                30,743         29,588

Where were they employed in the pre-service quarter?
                                                  Number       %     Number       %
No UI-verified employment                          4,583   14.9%      4,177   14.1%
Same employer                                     12,675   41.2%     13,756   46.5%
Same industry, different employer                  1,718    5.6%      1,524    5.2%
Different employer, different industry            11,767   38.3%     10,131   34.2%
Column totals                                     30,743  100.0%     29,588  100.0%

Among the trainees of both programs, less than half of those employed in the first quarter after training had actually worked for the same employer in the pre-training quarter.

Some workers hired late in the pre-training quarter also could be considered new hires, not incumbents. Incumbent workers could be defined as those having had more than one or two months’ tenure with a firm.
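A sketch of that tenure-based definition appears below. Because UI wage records do not carry hire dates, the hire-date field is hypothetical, and the two-month cutoff simply reflects the threshold suggested above rather than an adopted rule.

```python
# Hedged sketch of a tenure-based incumbency test. The hire_date field is
# hypothetical (not present in UI wage records); the 60-day threshold stands
# in for "more than one or two months' tenure."
from datetime import date, timedelta

TENURE_THRESHOLD = timedelta(days=60)

def is_incumbent(hire_date: date, training_start: date) -> bool:
    """Treat only workers hired well before training as incumbents."""
    return (training_start - hire_date) > TENURE_THRESHOLD

# e.g. someone hired three weeks before training begins would count as a new
# hire under this definition, even though they appear in the pre-training
# quarter's UI wage records for the grant-receiving firm.
```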

Employment appears to reach a relatively stable plateau through the first two quarters following training, with modest earnings gains for the average trainee (Table 25). Coupled with evidence of a flurry of hiring activity in the training quarters, this suggests that earnings gains may have been realized because trainees worked more hours during the post-training quarters.

Table 25

Pre-Training Employment Continuity or Mobility and Earnings Gains

Employment continuity or mobility            Median earnings gains or <losses>
between the pre-service quarter                      among trainees from
and the post-exit quarter                       Smart Jobs            SDF
Same employer                                       $1,141           $865
Different employer, same industry                   $1,579         $1,240
Different industry                                  $2,058         $1,575


Post-training Employment Mobility Among Trainees

UI wage records contain data on the employer of record and the industry of employment for covered workers. Using these data, CDR analyzed employment mobility among trainees and the effect of training on their quarterly earnings. Comparing any two target quarters yields three possible scenarios:

  • Trainees worked for the same employer in both quarters;

  • Trainees switched employers but stayed in the same industry;

  • Trainees found employment in another industry.

Of those three scenarios, the last is seen by employers as a cause for concern. Employers fear that they will not get a return on their training investment if the workers they train leverage their newly acquired skills to obtain employment at another company — especially a competitor.

This analysis focused on the first two quarters after training. Of both Smart Jobs and SDF trainees employed in the first quarter after training, most were still employed by the same firm in the second quarter (Table 26). Trainees under SDF were slightly more likely to go to work for another firm in the same industry. This tendency, however, may be a function of differences in program design between SDF and Smart Jobs. While Smart Jobs grants were awarded primarily to individual firms, some SDF grants went to consortia of firms within the same industry or directly to service providers who devised training to address the needs of several employers in an area.
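The sketch below implements the three-way classification of the scenarios listed earlier for any two target quarters. Representing each quarter’s employment as an employer and industry pair is an assumption for illustration, not the structure of the UI wage file.

```python
def classify_mobility(first_quarter, second_quarter):
    """Classify a trainee employed in the first target quarter against the second.

    first_quarter is an (employer_id, industry_code) tuple; second_quarter is
    the same, or None when the trainee has no UI-verified employment then.
    """
    if second_quarter is None:
        return "no UI-verified employment"
    employer_1, industry_1 = first_quarter
    employer_2, industry_2 = second_quarter
    if employer_1 == employer_2:
        return "same employer"
    if industry_1 == industry_2:
        return "same industry, different employer"
    return "different employer, different industry"

# e.g. classify_mobility(("A123", "3571"), ("B987", "3571"))
# -> "same industry, different employer"
```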

Table 26

Employment Mobility Among Trainees Between the First Post-exit Quarter and the Second Post-exit Quarter

(Did they go or stay after receiving training?)

                                                    Smart Jobs            SDF
Number employed in first post-exit quarter              30,743         29,588

Where were they employed in the second post-exit quarter?
                                                  Number       %     Number       %
Same employer                                     27,781   90.4%     24,394   82.4%
Same industry, different employer                    339    1.1%      1,356    4.6%
Different employer, different industry             1,792    5.8%      2,525    8.5%
No UI-verified employment                            831    2.7%      1,313    4.4%
Column totals                                     30,743  100.0%     29,588   99.9%

The SDF column total is 99.9 percent due only to rounding.

Despite commonly expressed employer fears, there is little evidence that trainees marketed the skills they acquired on one firm’s time to land a job with a competitor in the same industry. If trainees left a grant-receiving firm after being trained by Smart Jobs or SDF, they were more likely to find work in another industry or to leave UI covered employment in Texas altogether.

Except for those who moved from covered employment in Texas to “not located” status, there was very little change in median earnings (Table 27). There were no significant gains for those who remained with the same employer or for those who found work with a different employer in another industry. Among those who went to work for a different employer in the same industry, the change in earnings was negative, possibly because of fewer hours worked. Smart Jobs and SDF trainees exhibited virtually identical patterns of employment mobility and continuity, with almost identical consequences in terms of earnings gains or losses.

Table 27

Earnings Gains or <Losses> by Post-Training Employment Mobility

Employment mobility or continuity             Median earnings gains or <losses>
between the first post-training quarter              among trainees from
and the second post-training quarter            Smart Jobs            SDF
Same employer                                           $0            $17
Different employer, same industry                     <$88>          <$89>
Different industry                                      $19             $6


Long-term effects

For both Smart Jobs and SDF, comprehensive data on long-term employment after training is not yet available. Where data exist for certain trainees, there is a marked decline in employment rates for both Smart Jobs and SDF trainees at one year past training. The decline was more dramatic for those in SDF. Fewer trainees were with the same employer one year after training than in the first quarter after training.

Earnings gains between the first post-training quarter and the quarter one year out were modest among trainees in the initial SDF data. Gains of that size over two post-training periods are well within the range that could be accounted for by increases in hours worked, overtime pay, seniority- or longevity-based pay increases, a union’s renegotiated pay scale, or inflation and cost-of-living adjustments, rather than by pay raises commensurate with any increase in knowledge, skills, abilities or productivity resulting from the training received. A comparison of earnings in the first post-training quarter and one year out in the initial Smart Jobs data actually shows a loss.

Detailed analysis of patterns in employment and earnings after one year cannot be performed until additional quarters of UI data are compiled. Although no firm conclusions can be drawn at this time about the long-term impact of Smart Jobs and SDF, the data suggest some directions.

Initial data suggest that the most significant impact came as firms hired additional workers and then used grant dollars to train them. To a lesser extent, grant dollars were used to train professional, managerial and technical personnel at the high-wage end of the staffing pattern, who were more likely to be employees of long standing than new hires. Earnings among new hires may have increased because they worked more hours over several post-training quarters. Earnings of higher-wage professionals with more tenure at the firm may have increased through additional hours, overtime, bonuses, incentives and/or added commissions. More information about each trainee is necessary before a complete analysis can be performed.

Table 28

Tentative Findings at the One Year (Q+4) Interval

(based on a limited number of seed records)

             Number of        % found employed in     Median earnings gain or <loss>
             records run      covered job in          Pre-service         First post-exit
Program      against Q+4      Texas at Q+4            quarter to Q+4      quarter to Q+4
Smart Jobs        3,968             87.4%                     $32              <$514>
SDF              18,468             67.5%                  $1,463                $337

The further one pushes the analysis past the training end date, the more likely it is that trainees either change their place of employment in Texas or leave the Texas labor market altogether. It appears the new hires brought on board at the lower end of the staffing pattern just before training commenced were more likely to change their place of employment later, while the more senior professional, managerial and technical workers enjoyed more employment resilience, job continuity with the same firm and stable earnings at the higher end of the pay scale.


Control group comparisons

Ideally, control groups used in a quasi-experimental design of this nature are matched more precisely to a program’s trainees according to a rather exhaustive set of characteristics likely to have an impact on their post-training results. Control groups and trainees, for example, should be matched on demographic, educational background and location variables. To the extent that control groups are matched well on relevant characteristics, analysts have more confidence that differences in results for the trainees and the control group can be attributed to training received.

Control groups were selected separately for Smart Jobs and SDF. Since there is no master list of Social Security numbers for all Texans in the labor force, control groups were pulled from the UI wage record database. In the absence of demographic, educational background or locational data in the seed records or in the UI wage records, the criteria used to select control groups were based on the limited information that could be gleaned from the UI wage records of Smart Jobs and SDF trainees.
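The sketch below shows one way such a control-group pull could work: candidates are drawn from UI wage records, limited to the trainees’ industry groups, and retained only if their pre-training-quarter earnings fall near the trainees’ median. The tolerance, sample size and field names are assumptions, not the criteria CDR actually applied.

```python
# Hedged sketch of a control-group pull from UI wage records. Thresholds,
# field names and the sampling step are illustrative assumptions only.
import random

def select_controls(ui_records, trainee_ssns, target_industries,
                    median_pre_earnings, tolerance=0.25, n=2000, seed=0):
    candidates = [
        r for r in ui_records
        if r["ssn"] not in trainee_ssns                      # exclude trainees
        and r["industry"] in target_industries               # match industry group
        and abs(r["pre_earnings"] - median_pre_earnings)
            <= tolerance * median_pre_earnings               # earnings near median
    ]
    random.seed(seed)
    return random.sample(candidates, min(n, len(candidates)))
```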

Shortcomings in the matching process severely limited the project manager’s confidence in inferences about program performance drawn from comparisons of the control groups’ results to those achieved by either Smart Jobs or SDF trainees.