
Chapter 2: Customer Satisfaction Survey

House Bill 3657 directed the Texas State Comptroller to perform a biennial performance evaluation of the state’s Smart Jobs Fund and Skills Development Fund (SDF) programs. As part of the evaluation, the Legislature asked the Comptroller to survey former grant recipients and analyze their satisfaction with the programs.

The Comptroller developed a customer satisfaction survey and mailed it to all grantees in April 2000. Of the 1,300 surveys mailed, 559 were returned, a response rate of 43 percent. The survey’s margin of error is plus or minus 5 percent at a 95 percent confidence level, which is an acceptable level of accuracy for the results to be used reliably.
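For readers who want to check the stated precision, the sketch below applies the standard sample-size formula for a proportion, with a finite-population adjustment for the 1,300 grantees surveyed. The report does not describe the Comptroller’s exact calculation, so this is an illustrative approximation only.

```python
import math

# Illustrative check of the survey's stated precision; the Comptroller's exact
# method is not described in the report, so the formula below is an assumption.
N = 1300   # surveys mailed (the population of grantees)
n = 559    # surveys returned
z = 1.96   # z-score for a 95 percent confidence level
p = 0.5    # most conservative assumption about any reported proportion
e = 0.05   # desired margin of error (plus or minus 5 percent)

response_rate = n / N

# Sample size needed for the desired margin if the population were unlimited...
n0 = (z ** 2) * p * (1 - p) / (e ** 2)

# ...adjusted downward for the finite population of 1,300 grantees.
n_needed = n0 / (1 + (n0 - 1) / N)

print(f"Response rate: {response_rate:.0%}")                        # about 43%
print(f"Responses needed for +/-5% at 95% confidence: {math.ceil(n_needed)}")
print(f"Responses received: {n} (requirement met: {n >= n_needed})")
```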

Most businesses are not prepared for the amount of administrative work required under Smart Jobs contracts. Under the Skills Development Fund, this detailed administrative work is handled by the community colleges. Given this difference in administrative requirements, more positive comments could be expected for the Skills Development Fund than for the Smart Jobs program.

In addition, the timing of the survey could have influenced the responses related to Smart Jobs. The survey was completed in the spring of 2000, a period during which the Texas Department of Economic Development (TDED) began stricter enforcement of contract requirements, as recommended in the State Auditor’s report. Stricter enforcement would have influenced the survey responses.

The survey asked a series of 17 questions that addressed issues of program quality, and allowed grantees to register concerns and suggestions for improving the programs. In the case of Smart Jobs, the survey was mailed directly to the businesses that received grants.

For the Skills Development Fund, the survey was mailed to the community or technical college receiving the grant, along with a list of the businesses with which it had contracts and a packet of surveys for the college to forward to those businesses. The survey was also made available on the Internet to allow businesses to submit their responses electronically. The survey was designed to allow each respondent to remain anonymous.


Review of Survey Results

The first nine questions focused on customer service. Respondents were asked to rate:

  • the ease and timeliness of the application process,
  • the timeliness of funds distribution,
  • the helpfulness of program staff,
  • the clarity and appropriateness of reporting requirements,
  • the ease of contract close-out procedures,
  • the appropriateness of the training, and
  • the comparability of the program with other customized job training programs.

Respondents were asked to check one of five boxes: strongly agree, agree, no opinion, disagree, or strongly disagree. Results for all nine questions are contained in Table 5.

Table 5

Response to Questions 1-9

| Survey Question | SDF Strongly Agree or Agree | Smart Jobs Strongly Agree or Agree | SDF No Opinion | Smart Jobs No Opinion | SDF Disagree or Strongly Disagree | Smart Jobs Disagree or Strongly Disagree |
| --- | --- | --- | --- | --- | --- | --- |
| 1. The application for training funds was easy to complete. | 91% | 50% | 1% | 8% | 8% | 41% |
| 2. Decisions on grant applications and notification of grant actions were made on a timely basis. | 73% | 50% | 1% | 7% | 26% | 43% |
| 3. Grant funds were distributed in a manner timely for my needs. | 75% | 45% | 1% | 12% | 24% | 43% |
| 4. Staff assistance was helpful and timely. | 91% | 63% | 1% | 12% | 9% | 25% |
| 5. Reporting requirements were clear. | 81% | 53% | 2% | 11% | 17% | 36% |
| 6. Reporting requirements captured necessary information only. | 81% | 55% | 2% | 25% | 18% | 21% |
| 7. Closing out the contract was easy to accomplish. | 78% | 39% | 1% | 28% | 21% | 32% |
| 8. The training grant met my business needs. | 95% | 79% | 0% | 7% | 5% | 14% |
| 9. This program compares favorably to other publicly funded customized training programs in which my company has participated. | 91% | 21% | 3% | 69% | 6% | 10% |
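The three columns per program in Table 5 collapse the survey’s five check boxes into combined agree, no opinion, and disagree groups. The sketch below shows one way such a tally could be produced; the answers list is hypothetical, since the report does not publish response-level data.

```python
from collections import Counter

# Hypothetical raw answers to one survey question; the report does not publish
# response-level data, so these values are for illustration only.
answers = ["strongly agree", "agree", "agree", "no opinion",
           "disagree", "strongly disagree", "agree", "strongly agree"]

# Map the five check boxes onto the three column groups used in Table 5.
GROUPS = {
    "strongly agree": "strongly agree or agree",
    "agree": "strongly agree or agree",
    "no opinion": "no opinion",
    "disagree": "disagree or strongly disagree",
    "strongly disagree": "disagree or strongly disagree",
}

counts = Counter(GROUPS[a] for a in answers)
total = sum(counts.values())

for group in ("strongly agree or agree", "no opinion", "disagree or strongly disagree"):
    print(f"{group:<32} {counts.get(group, 0) / total:.0%}")
```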


Smart Jobs


Strengths

The question with the highest percentage of positive responses (that is, agree or strongly agree) for the Smart Jobs program was “The training grant met my business needs.” The highest-rated item in the “strongly agree” column was “The training grant met my business needs” (30 percent), followed by “Staff assistance was helpful and timely” (18 percent).


Weaknesses

Two questions tied for the highest negative response, both concerning the timeliness of the process: decisions on grant applications and distribution of grant funds. The highest percentage of “strongly disagree” responses, nearly one in four respondents, indicated strong dissatisfaction with the timeliness of grant fund distribution.

Another question with a less than 50-percent positive response was “Closing out the contract was easy to accomplish.” In other words, fewer than half of respondents found the closeout process easy.


Skills Development Fund


Strengths

Responses to the first nine questions of the survey pertaining to the Skills Development Fund received at least a 73-percent positive rating. Approximately three out of four respondents had a positive experience with each aspect of the Skills Development Fund program, from the application process to the contract closeout process.

The highest positive response (95 percent) was to the statement, “The training grant met my business needs.” Other statements with more than a 90-percent positive response were:

  • “The application for training funds was easy to complete.”
  • “Staff assistance was helpful and timely.”
  • “This program compares favorably to other publicly funded customized training programs in which my company has participated.”

Of all the questions, the statement that staff assistance was “helpful and timely” drew the most “strongly agree” responses.


Weaknesses

The greatest percentage of negative responses (26 percent) concerned the timeliness of grant decisions, although 73 percent of respondents were satisfied on this point. The negative responses may reflect the amount of grant dollars available ($61.7 million) compared to the proposal requests received ($209 million).


Comparing Smart Jobs and Skills Development Fund

The most marked difference in the surveys was a 41 percentage-point difference in the positive response to the first question: “The application for training funds was easy to complete.” While 91 percent of Skills Development Fund respondents agreed or strongly agreed that the application was easy to complete, only 50 percent of Smart Jobs customers felt the same.

Other areas of program operations where the Skills Development Fund received a higher rating by at least 25 percentage points were (the gaps are recomputed in the sketch following this list):

  • Grant funds were distributed in a manner timely for my needs. (30 percentage points)
  • Closing out the contract was easy to accomplish. (39 percentage points)
  • Staff assistance was helpful and timely. (28 percentage points)
  • Reporting requirements were clear. (28 percentage points)
  • Reporting requirements captured necessary information only. (26 percentage points)
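These gaps can be recomputed directly from the positive-response shares in Table 5, as the sketch below illustrates for questions 1 through 8. Question 9 is left out here because 69 percent of Smart Jobs respondents registered no opinion on it.

```python
# Positive ("strongly agree" or "agree") shares from Table 5, by question number.
SDF        = {1: 91, 2: 73, 3: 75, 4: 91, 5: 81, 6: 81, 7: 78, 8: 95}
SMART_JOBS = {1: 50, 2: 50, 3: 45, 4: 63, 5: 53, 6: 55, 7: 39, 8: 79}

# Percentage-point gap in favor of the Skills Development Fund, largest first.
gaps = sorted(((SDF[q] - SMART_JOBS[q], q) for q in SDF), reverse=True)

for gap, q in gaps:
    print(f"Question {q}: {gap} percentage points")
```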


Questions 10 Through 12

These questions asked for more specifics about the grant received and its use: the amount of the grant (question 10), the number of people trained (question 11), and the type of training provided (question 12). Too few complete responses were received to allow analysis.


Open-Ended Questions

The Comptroller’s customer satisfaction survey contained four open-ended questions designed to generate more in-depth responses from grantees. The open-ended questions were:

  • #13. If you would not use this training fund again, please explain why.
  • #14. How could the grant have better met your needs?
  • #15. Please share any other comments, complaints, or recommendations.
  • #16. What would you have done if you had not received the grant?

The responses to these questions were coded and sorted. The responses fell into the following seven general groups:

  • administration, paperwork, bureaucracy;
  • praising program;
  • flexibility;
  • grant qualification criteria;
  • staff;
  • timeliness and deadlines; and
  • other issues.

A single response to one question on one survey could contain several specific comments, and each specific comment received a separate code. Not all surveys contained responses to the open-ended questions, so the number of comments did not equal the number of surveys returned. In addition, some surveys repeated the same comment under more than one question; as a result, the analysis of the combined responses to questions 13, 14, and 15 contains duplicate responses.
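The percentages reported below are therefore shares of coded comments rather than of surveys. A minimal sketch of that tallying step follows; the records shown are hypothetical, since the underlying coding sheets are not part of the report.

```python
from collections import Counter

# Hypothetical coded comments; each tuple is (survey id, question, category code).
# One response can carry several codes, and the same concern can reappear under
# questions 13, 14, and 15, which is why comment totals exceed the survey count.
coded_comments = [
    ("survey_001", 13, "administration/paperwork"),
    ("survey_001", 13, "timeliness and deadlines"),
    ("survey_001", 15, "timeliness and deadlines"),
    ("survey_002", 14, "praising program"),
    ("survey_003", 13, "staff"),
    ("survey_003", 14, "administration/paperwork"),
]

counts = Counter(code for _, _, code in coded_comments)
total = sum(counts.values())

for code, n in counts.most_common():
    print(f"{code:<28} {n:>2}  ({n / total:.0%} of comments)")
```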


#13. If you would not use this training fund again, please explain why.


Smart Jobs Findings

By far, the most common response was “too much paperwork/administration.” More than a third of all responses (36 percent) fell into this category. The next most common comment (26 percent) had to do with time factors: respondents believed either that there were too many delays, in receiving grant approval or grant payments, or that there was not enough time to complete the training. Combined, these two categories accounted for 59 percent of all the responses to question 13.

The third most common comments concerned problems with TDED staff: too much turnover, lack of assistance, and inconsistent responses to inquiries. Complaints about staff accounted for 17 percent of the comments. While there were no comments praising staff in response to question 13, the question did not invite positive comments. Even so, two of the comments praised the program in general.

The remaining 21 percent of the comments addressed lack of flexibility (4 percent), problems with grant criteria (4 percent), problems with training providers (2 percent), a desire for another grant or more money (1 percent), and miscellaneous other responses (11 percent). It is important to note that TDED staff and the Smart Jobs program have no control over which training providers the grantees used.

Of 448 responses, 78 percent said they would use the Smart Jobs program again; 22 percent said they would not.


Skills Development Fund Findings

Of the 121 responses, 94 percent said they would use the Skills Development Fund program again; 6 percent said they would not.

Question 13 received only 14 comments: five were about problems with the training provider or the community college administering the grant; four were about the amount of paperwork and administration; another four were about delays; and one praised the program staff. There were no comments about the grant criteria or the lack of flexibility in response to question 13.


#14. How could the grant have better met your needs?


Smart Jobs Findings

The most common comment (28 percent of the 386 responses) cited the need for Smart Jobs to resolve timing issues, especially the delays in reimbursing companies for incurred expenses. Others had problems receiving timely approval of their grants or had insufficient time to complete the training.

The next most common set of recommendations (21 percent) was to simplify the application and reporting requirements. Eleven percent of the respondents said the program met their needs or that they were pleased with the program. Several survey responses contained comments highly praising the program.

Another 11 percent voiced concern about staff at TDED, and 9 percent wanted increased flexibility in the program to change their training plan as their needs changed from the time they applied to the time the training began. The remaining 22 percent of the comments covered a wide range of concerns, including the criteria for the grant, problems with training providers and a desire for additional funds.


Skills Development Fund Findings

Unlike for the Smart Jobs program, the most common comments (31 percent of the 69 responses) cited “other” issues, including the wish for another grant or more money (9 percent), problems with community college administration of the grant (4 percent), and problems with the training provider (4 percent).

The next most common set of recommendations (29 percent) concerned delays and deadlines. This proportion is about the same as for the Smart Jobs program for the same issue. The portion of Skills Development Fund customers commenting they were pleased with the program was 14 percent, slightly higher than Smart Jobs’ 11 percent.

Only 9 percent voiced concern over the complexity of the paperwork and administration of the program, compared to Smart Jobs’ 21 percent. Only 2 percent of the comments were critical of staff, compared to 11 percent for Smart Jobs.


#15. Please share any other comments, complaints, or recommendations.

This question was intentionally left broad to provide respondents an opportunity to comment on any aspect of the program.


Smart Jobs Findings

While 9 percent complained of a lack of flexibility in question 14, only 3 percent did so in question 15. Eighteen percent (versus 11 percent in question 14) voiced satisfaction with the program. A similar shift occurred concerning frustrations with TDED staff: 17 percent mentioned frustrations, versus 10 percent in question 14.

Once again, the largest share of the comments (22 percent) concerned delays in the grant process, followed closely by criticisms of the amount and complexity of the paperwork required to administer the grant (18 percent). In addition to the 18 percent who were pleased with the program, another 3 percent of the comments praised TDED staff.


Skills Development Fund Findings

While 18 percent of Smart Jobs comments praised the program in response to this question, 28 percent of the 107 comments praised the Skills Development Fund. Another 13 percent praised program or community college staff, compared to 3 percent for Smart Jobs. Only two of all 107 comments directly criticized program staff.

Another 37 percent of responses fell into the “Other” category: 18 percent of all comments were miscellaneous and did not fall into more detailed categories; 4 percent recommended moving Smart Jobs into the Skills Development Fund; 6 percent expressed frustration with the community college’s administration of the grant; and another 6 percent wished there were more money available. Twelve percent of the comments concerned delays and deadlines, compared to 22 percent for Smart Jobs.


Questions 13, 14, & 15 Combined Findings (see Table 6)

Of the 190 Skills Development Fund comments to these three questions, 21 percent praised the program and 9 percent praised staff, for a combined 30-percent positive rating. This compares to 12 percent for the Smart Jobs program.

One out of five Skills Development Fund comments concerned frustrations with deadlines and delays, compared to one out of four of the 764 Smart Jobs comments. Almost the same share expressed frustration with the paperwork and administration of the Smart Jobs grant, compared to less than 10 percent for the Skills Development Fund.

Table 6

Combined Comments for Questions 13, 14, and 15

| Category | Smart Jobs | Skills Development Fund |
| --- | --- | --- |
| Lack of Timeliness | 25% | 19% |
| Administration/Paperwork Problems | 23% | 9% |
| Other | 16% | 32% |
| Staff Complaints | 13% | 2% |
| Satisfaction/Praise for Program | 11% | 21% |
| Lack of Flexibility in Program | 6% | 5% |
| Grant Criteria Too Tough | 4% | 1% |
| Staff Praise | 1% | 9% |
| Move Smart Jobs to Skills Dev. Fund | 0% | 2% |
| TOTAL* | 99% | 100% |

* May not add to 100 percent due to rounding.


#16. What would you have done if you had not received the grant?

The intent of this question was to explore companies’ needs for each of the training programs.


Findings:

A fifth of the Smart Jobs respondents stated there would have been no training without the grant, while at the other end of the spectrum, 11 percent said they would have conducted the training anyway. The results for the Skills Development Fund are nearly the same, with 22 percent saying they would have gone without the training and 7 percent stating they would have held the training without the grant.

Another 2 percent of the Smart Jobs responses said nothing would have been different. In between were those who stated they would have had less training (the largest group, at 28 percent for Smart Jobs and 35 percent for SDF).

Others would have delayed the training (11 percent for Smart Jobs versus 6 percent for SDF). Some would have had lower-quality training or conducted the training in-house (7 percent for Smart Jobs versus 9 percent for SDF). Some would have found funds elsewhere for the training (1 percent for Smart Jobs).

A small portion (6 percent for Smart Jobs, 4 percent for SDF) responded that without the grant, their business would be less competitive.

The conclusion is that some of the companies would have had to forfeit the training altogether; however, for the most part, the grants allowed them to provide higher-quality training in a more timely manner.

Table 7 shows the categories of the responses and their relative proportions for each of the programs.

Table 7

Responses to Question #16

| Item | Skills Development Fund | Smart Jobs |
| --- | --- | --- |
| Less training | 35% | 28% |
| Forfeit/cancel training | 22% | 22% |
| Other | 18% | 11% |
| Done it anyway using own money | 7% | 11% |
| Delay training | 6% | 11% |
| Done in-house instead of externally | 5% | 2% |
| Lower quality training | 4% | 5% |
| Less competitive | 4% | 6% |
| Borrow money | 0% | 1% |
| Used own money for other purposes | 0% | 0% |
| Nothing different | 0% | 2% |
| TOTAL* | 101% | 99% |

* May not add to 100 percent due to rounding.


Closed-Ended Question

The Comptroller’s customer satisfaction survey contained one closed-ended question designed to determine a company’s dependence on program funding to provide training.

#17. My company would not have provided the employee training made possible under the Smart Jobs/Skills Development Fund program had we not received the grant. Yes or No

A “yes” response indicated the statement was true, that no training would have occurred without the grant; a “no” response meant training would have occurred anyway. In the Smart Jobs program, 66 percent of the 407 respondents said that no training would have occurred without the grant; about 33 percent said that training would have occurred anyway.

In the Skills Development Fund program, 72 percent of the 95 respondents said “yes,” meaning that no training would have occurred without the grant; about 28 percent said “no.”

Both programs showed that companies depend on these funding sources to help meet their training needs. A larger portion of companies would have gone without training had it not been for the Skills Development Fund grant than for the Smart Jobs grant; however, the difference was only 6 percentage points.