Phase 1 Report of Feasibility Study on New Hire Programs for Canada: New Hire Programs in the United States



New Hire Procedures

1. Reporting Format and Methods of Transmission

Generally, employers reported by using the federal Employee's Withholding Allowance Certificate (W‑4 form), since by federal law all employees have to complete this form when they start new jobs. Employers use the W‑4 form to determine employee income tax withholding allowances. They put it in the employee's personnel file and disclose it to federal or state governments under certain circumstances.[13] The data elements on W‑4 forms are the employee's name, address and SSN, and the company's name, address and federal ID number. Some states use forms that ask for more information, such as the date of the hire, the employee's date of birth and medical coverage information.

As explained above, employers wanted many options for sending data. In all states, employers could mail or fax (toll free) a copy of the W‑4 form, or the state equivalent. Although more employers are now using electronic transmission, the majority of employers in the early to mid‑1990s preferred to use paper methods. In Washington, for example, less than 10 percent of reports were transmitted electronically in 1994. On the other hand, four years later, only 24 percent of reports submitted in South Carolina were sent on paper; the rest came on diskette, cartridge or reel. This may be because most of the South Carolina employers reporting voluntarily are also large, technologically savvy businesses.

Recently, many employers have preferred the Internet, particularly small businesses. But states have to protect the privacy of information sent this way. In Massachusetts, each employer uses a password to gain access to the reporting portion of the Website. Massachusetts' promotional material states that this takes less than one minute per report, after which employers get a message confirming receipt of the report.

Not all new hire programs have sophisticated systems that permit online reporting. For example, until recently, the Oregon new hire program had to print out electronic reports and enter them manually. Smaller programs, like Oregon's, have had the greatest difficulty in meeting the federal PRWORA requirements.

A few states used toll‑free employer reporting lines. A Washington state contact warned that telephone reporting “took two full‑time staff members to translate the information and very often the information given was incomplete and staff would not know how to contact [employers] to get the rest of the data required.” Other states used a 1‑800 line for which a trained staff member took the calls, which worked better.

2. Matching Child Support Cases with Employer Reporting Data

Most of the original new hire programs required employers to report new hires within 15 days to one month of hire or rehire. The state's division of child support, another state department or a private contractor could then match the employee's name and SSN against open child support cases. Many SSNs were inaccurate, either because of errors in completing the form or in data entry, so some states verified these data with a special computer program.

Any matches were passed on to child support enforcement agencies. In some states, the staff member would send an employment verification letter to the employer and, if employment was confirmed, take action toward an income deduction order. In Arizona, for example, the new hire appeared on the system within two days of the receipt of the information. In cases where a court order was already in place, the caseworker called the employer, verified that the non-custodial parent was still employed and started the wage assignment process within 10 days. In Washington and Massachusetts, a fully automated system identified non-custodial parents who were in arrears and sent letters to employers automatically, without prior staff intervention or assessment.

In California, the new hire data were mailed to child support offices rather than transmitted electronically or via fax. Understandably, there was a delay of about a month before the data came to the attention of caseworkers. The California database had duplicate listings, since its records went back six months and were matched every month with child support caseloads. Until the problem was fixed, it increased workload and reduced confidence in the reporting.

3. Wage (Income) Withholding

Wage withholding accounts for between half and two thirds of all enforcement collections and produces the highest compliance rates and collections of any enforcement method (Bartfield and Meyer, 1994). Under the PRWORA, states must have procedures for withholding the wages of anyone in arrears on a support obligation, and these procedures must not require a judicial or administrative hearing (Legler, 1996:542).

If an employee owes child support, the employer is asked to withhold part of the employee's wages.[14] As mentioned above, some states have a fully automated procedure for issuing wage withholding orders, while others continue to use manual procedures. In Massachusetts, automation decreased child support enforcement costs from $9,174,000 to $777,000, an average decrease of $286 per case, from $306 to $20 (Department of Revenue, 1995). However, as with any such automated process, a review process should allow speedy correction when a wage withholding order is issued erroneously.[15]

One state respondent spotted one problem: employers with multiple business locations or different payroll departments will report only one business address, which may not be the address to which the state should send the employee's income withholding order. As such, enforcement staff cannot automatically issue a wage withholding order until they can be sure that it will reach the appropriate payroll section.


4. Destruction of Unmatched Records

Most state legislation specified a period, often 3 to 12 months, after which unmatched records were to be destroyed. However, Texas legislation specified that the state “shall not create a record regarding the employee and the information...shall be promptly destroyed.” The state could only retain the data if a support obligation was to be established or enforced. The new hire program has been criticized on the grounds of privacy invasion, and record destruction was originally the main mechanism for forestalling such criticism. The PRWORA, however, has no records destruction provision.

Measuring Program Success

One of the objectives of this research was to determine whether the different new hire models had different success rates. Meaningful across-state comparisons are difficult, as the amount and type of available information varies from state to state, so much so that we cannot be sure whether we are comparing apples and apples, or apples and oranges.

States typically quantified the success of new hire programs by looking at:

  • match rates: the number of matched child support cases as a percentage of the number of employees reported under the program; and
  • the increase in child support collections attributable to the program.

The data in Table 3 illustrate match rates, but the lack of clarity in the documentation means, for example, that we don't know if the “enforcement” category was limited to cases where the payer was in arrears or whether it contained all matches where an order existed. Most documents did not define a match, although the match rate was usually based on the matches with open cases. For example, in Massachusetts, the match rate included those persons who were already paying child support; officials interviewed said this group was important because the state might need to increase the amount of the payments.

Table 3: Match Rates between Child Support Cases and Employer Reporting Data (in Percentages)

Targeted and mandatory programs:
  Average match rate (all cases) Most recent match rate (all cases) Paternity establishment matches Support establishment matches Enforcement matches
Alaska 3.9 4.0a 0.2d n/a 4.1d
Oregon 8.7 7.7b (0.9)e (2.8)e (5.9)e
Washington 8.8 10.1c n/a n/a n/a

Targeted and voluntary program:
  Average match rate (all cases) Most recent match rate (all cases) Paternity establishment matches Support establishment matches Enforcement matches
Texas 3.7 3.1b n/a n/a n/a

All employers and voluntary program:
  Average match rate (all cases) Most recent match rate (all cases) Paternity establishment matches Support establishment matches Enforcement matches
Arizona 8.0 6.1a 1.7a 1.3a 3.1a

All employers and mandatory programs:
  Average match rate (all cases) Most recent match rate (all cases) Paternity establishment matches Support establishment matches Enforcement matches
Connecticut* n/a 0.5b - - 0.5b
Iowa n/a 11.0b n/a n/a n/a
Kentucky n/a 9.6a n/a n/a 5.1a
Florida** 5.5 6.2c n/a n/a 2.6c
Maryland*** n/a 4.0a 0.8a 0.6a 2.9a
Massachusetts 3.7 3.5a n/a n/a n/a
Missouri n/a 10.6b n/a n/a n/a
New York n/a 7.0c n/a n/a 3.9c
Virginia n/a 7.3be n/a n/a n/a


  • a = 1997;
  • b = 1995;
  • c = 1996;
  • d = 1992;
  • e = 1994;
  • n/a = not available;
  • ( ) = estimate.
  • * Connecticut matched new hire data with delinquent obligors only.
  • ** Florida's program was mandatory for employers with a workforce of more than 250.
  • *** Maryland's program was originally voluntary; no data are available for match rates during that time. The rates reported above are from the first six months of the mandatory program from July to December 1997.

The rates in Table 3 show that there was little relationship between the type of program and the degree of success in matching the names of non-custodial parents to the data received from employer reporting. One might expect that the targeted programs would have higher rates, but this does not appear to have been the case. The most recent match rates for targeted programs ranged from 4 to 10 percent. The Texas voluntary program had a match rate of about 3 percent, and the Arizona program had a rate of about 6 percent. The mandatory programs reported rates from 3.5 to 11 percent. In Connecticut, where it was clear that matching occurred only for delinquent payers, the match rate was only 0.5 percent. The other programs reported match rates for enforcement cases from 2.6 to 5.9 percent of all new hire reports.

The lack of uniformity in match rates within the program types could be due to many non-program factors, such as variations by state in the following:

  • the percentage of child support cases on public assistance, as these cases are harder to enforce;
  • the percentage of child support cases where the non-custodial parent is being sought or is in arrears on payments;
  • the residential and employment mobility of child support payers; and
  • the state of the economy and whether a large number of employers offer seasonal work.

It is possible that states were defining matches differently. In addition, the Center for Law and Social Policy said that the caseload figures of child support agencies might be suspect: “there are justifiable concerns about the accuracy of IV‑D caseload data” (Center for Law and Social Policy, 1998:7). States varied in their case definitions and procedures for opening and closing cases, and even within states, practices varied. If the definition of a case varied by state, then so would the match rates.

Several state tracking studies also showed that caution should be exercised in using the match rate as an indicator of program success.

The California parent locator service surveyed county child support agencies in 1993 and 1994 to see whether new hire listings from a newly established registry were effective for child support enforcement. All California employers submitted the names of new hires within 30 days of employment. County child support agencies explained what they did with 299 matches between child support caseloads and new hire listings.

For unknown reasons, the counties reported that only 227 of the sample listings reached case files for processing, taking an average of 33 days to do so once they had been mailed. Of the listings, 66 percent provided information unknown to the county, such as an address, while 29 percent provided no new information and 5 percent involved the wrong person, usually because of inaccurate matching on SSNs.

The analysis found that collections were made in 8 percent of the “workable” listings. Of the 150 listings that provided new information, 80 percent led to a successful employer contact. Of these 120 listings, only 43 (36 percent) involved someone still working for the employer. The state tried to enforce 34 of the 43 cases. Of the 34 cases, 28 resulted in an order for wage assignment or licence hold. The average dollar amount of the wage assignments was $440 per month. At least one collection was made in 18 of the 28 wage assignments (64 percent), and at the time of the survey 11 of the 18 non-custodial parents were still paying child support. Enforcement efforts were made, on average, 65 days after the mailing of the listings.
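The step-by-step attrition described above can be reproduced with a short calculation. The counts are taken from the survey figures quoted in this section; the variable names are ours:

```python
# Reproduce the California tracking-survey funnel from the counts quoted above.
listings_reaching_files = 227   # listings that reached case files for processing
new_info = 150                  # listings that provided new information
contacted = round(new_info * 0.80)  # 80% led to a successful employer contact
still_employed = 43             # person still working for the employer
enforced = 34                   # cases the state tried to enforce
orders = 28                     # wage assignment or licence hold issued
collected = 18                  # wage assignments with at least one collection

print(contacted)                                          # 120 listings
print(round(still_employed / contacted * 100))            # 36 (percent still employed)
print(round(collected / orders * 100))                    # 64 (percent of assignments collecting)
print(round(collected / listings_reaching_files * 100))   # 8 (percent of "workable" listings)
```

The last figure shows how the 227 "workable" listings shrank to collections in only about 8 percent of cases.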

The Florida program underwent a somewhat similar exercise, using new hire matches for the first six months of 1995 (Florida Advisory Council, 1995). The initial match rate was 5.8 percent. Of the 28,693 matches, 58 percent were “obligated cases” for which a child support obligation had already been established, and 42 percent were “unobligated cases,” apparently related to paternity establishment (only a few dozen of the latter resulted in an obligation being established).

Of the obligated cases, 91 percent required the location information. Of this group, 38 percent were “non-productive” (meaning that the employment had been terminated), 20 percent were pending and 42 percent were “productive.” Wage withholding was implemented in 89 percent of the productive cases; overall, about a third of obligated cases resulted in an income deduction order. If these ratios are applied to the initial match rate of 5.8 percent, the “success rate” falls from 5.8 percent to 1.1 percent of new hire reports (i.e. the cases where a deduction order was implemented).[16]
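The arithmetic behind this recalculation can be checked directly by chaining the percentages quoted above (the variable names are ours):

```python
# Chain the Florida percentages: the 5.8% match rate shrinks to roughly 1.1%
# of new hire reports once only implemented deduction orders count as successes.
match_rate = 0.058        # initial match rate
obligated = 0.58          # share of matches with an existing obligation
needed_location = 0.91    # obligated cases that required the location information
productive = 0.42         # of those, share that were "productive"
withheld = 0.89           # productive cases where wage withholding was implemented

# About a third of obligated cases ended in an income deduction order:
print(round(needed_location * productive * withheld, 2))  # 0.34

success_rate = match_rate * obligated * needed_location * productive * withheld
print(round(success_rate * 100, 1))  # 1.1
```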

In Ohio, a review of a random sample of matched cases revealed the following outcomes:

  • 24 percent—employer was known prior to match;
  • 21 percent—income withholding was issued;
  • 18 percent—other enforcement actions were taken;
  • 16 percent—no action was taken;
  • 8 percent—“unnecessary preliminary actions” (undefined) were taken; and
  • 12 percent—cases were excluded because they were closed, could not be located, were dismissals or were transferred to other jurisdictions.

Therefore, 39 percent of matches resulted in some type of enforcement action and about a fifth resulted in income withholding.

An internal Connecticut study found that income withholding was not issued in 56 percent of matched cases, for the following reasons:

  • in 18 percent of all matches, the employee had already left the job by the time the referral was received;
  • in 13 percent, income withholding was already in place;
  • in 11 percent, the case did not have an income withholding order and additional work was required to secure the order; and
  • in 14 percent, there were other reasons (letter from Support Enforcement Division, February 1998).

In Iowa, an analysis of a sample of child support cases reported the following outcomes after matching:

  • 31 percent—no payment was received;
  • 20 percent—regular payments were received on cases that had a history of payment gaps;
  • 18 percent—the information received helped to locate a payer so that a support order could be issued;
  • 16 percent—the payer was making payments already;
  • 11 percent—a few payments were received as a result of the match, and then employment was terminated; and
  • 4 percent—an error in the SSN caused an invalid report.

Therefore, in 31 percent of the matched cases, employer reporting led to payments.

In late 1994 and early 1995, the Virginia program drew a sample of 295 matched cases, to determine the outcomes of the matches. Of the sample, 25.3 percent resulted in collections from wage withholding, and $32,377 of the amount received could be attributed to the program (meaning that the information was available before information from quarterly wage reports). Extrapolating the findings to the first 29 months of the program, the author of the study estimated that child support collections had increased by $20.2 million.[17]

The report also showed why there was no wage withholding for certain Virginia cases: in 34 percent of the cases, wage withholding was “not appropriate”; in 20 percent, employment had been identified by other means; in 11 percent, the person was no longer at the same job; in 4 percent, the information was incomplete and unusable; and in 32 percent, no reason was provided (Virginia Division of Child Support Enforcement, 1995).

These monitoring exercises illustrate that match rates do not necessarily translate into collections. Here is why an uncritical acceptance of match rates and extrapolation of them to Canada may be unwise.

  • We do not know how much the caseloads of our child support agencies differ from those in the United States, particularly in terms of the number and proportions of cases in arrears and other factors, such as parental mobility.
  • Many states report match rates not only for those whose payments are in arrears, but for their entire caseloads, including cases where an order has yet to be established and those where paternity must be established.
  • Locating the non-custodial parent's place of employment is only the first step in collecting the amount owing.

Very often, the collection figures provided in program documents sound impressive—understandably, perhaps, because they are used as public relations and marketing tools to sell the program to employers and the general public. For example, a series of “success stories” is found in publications of the United States Office of Child Support Enforcement (1997).

  • Washington state attributed $7.8 million in total collections to the new hire program from July 1990 to January 1992.
  • Missouri estimated that its program increased collections of child support for fiscal year 1996 by $11 million.
  • An Oregon government report stated that, in the first 13 months of its program, the child support agency increased child support collections by $3.4 million. Additional payments were received from 4,800 of the 18,300 matched cases.

It is nearly impossible to compare collections across new hire programs or categories of programs because most sources provided total dollars in collections that could be attributed to the program, but did not provide the number of open child support cases or the number of cases in arrears. Ideally, we would need to calculate the average amount collected per case in arrears.

In addition, the collection data were not always consistent. For example, one Arizona annual report estimated collections were $350,000 for fiscal year 1995, but a more recent internal document increased that estimate to $1,636,675.

Something similar happened with the Florida data. The Florida 1995 report cited above estimated that the new hire program had led to orders worth $5.2 million annually, for about one million new hire reports. However, another source estimated that employer reporting produced an obligation amount of $15.2 million, a threefold difference (U.S. Office of Child Support Enforcement, Child Support Report, 1996). The differences might be due to the method of selecting which child support collections could be directly attributed to the new hire program, as opposed to other methods of enforcing child support orders.

The Exchange of Data with Other Social Programs

New hire databases were also used for purposes other than child support enforcement, such as fraud and overpayment detection for other social programs.

1. Accessing New Hire Data

Before the PRWORA, many new hire programs shared their databases with workers' compensation, unemployment security, Medicaid and Aid to Families with Dependent Children (AFDC) agencies. We looked at five states with information-sharing relationships—Georgia, Massachusetts, Missouri, Texas and West Virginia—to examine the details of their arrangements.

All the state respondents we called said that they wanted access to the new hire database to more quickly find benefit recipients who had jobs. All agencies had been accessing quarterly employer wage reports to detect unreported employment and income. These data, as noted above, could be four to six months old by the time they were cross-matched with open cases, so some overpayments went on for months before being detected.

The Texas Workforce Commission (TWC) also used the data to run the new hire names against current unemployment benefits recipients, as well as against its old overpayment caseload. If the Commission matched a previous client and a large, recent overpayment, it reactivated the case and pursued an income withholding order.

Before the PRWORA, state policy or legislation sometimes prevented agencies from getting access to the new hire database. Many states, such as Alaska, had explicitly said that the employer reporting data could only be used for child support enforcement. In these cases, and in states that did not mention outside access at all, state legislation had to be modified. In Massachusetts and Missouri, legislation supplied the impetus to approach the IV‑D agency to work out an arrangement. In the other three states, knowledge of child support initiatives came about through regular communication among departments.

2. The Logistics of Data Transfer

Many agencies had to wait before accessing the data because computer systems had to be altered and software written to enable the exchange. For example, the West Virginia IV‑A (public assistance) department waited several years until the system protected the security of the data. This agency shared the same computer system and could have had online access to the database, but this access could have threatened security, since those using the database could change records as well as read them.

In most cases, the department with the new hire database sent the new hire information to the receiving agency by magnetic tape, doing so once a week, once a month or, in one case, every day. Where the new hire reporting program was funded under the same umbrella department as state welfare programs, direct access to the system was usually possible. Online access was preferable because it avoided the costs involved in purchasing and mailing tapes, as well as the occasional problem of lost or wrinkled tapes. Texas tried to work out an arrangement for direct access to new hire data but decided against it because it was too expensive for the other agencies to change their computer systems.

3. The Effectiveness of New Hire Data for Detecting Overpayments

In the five states with which we spoke, two public assistance programs and four employment security agencies had traced the savings they had achieved by using the new hire database. The two welfare agencies identified significant savings. (See Table 4.)

Table 4: Public Assistance Savings Attributable to the New Hire Program (in U.S. Dollars)


State (reference period)        Monthly AFDC    Monthly food stamps    Monthly Medicaid    Total monthly savings    Projected annual savings
Virginia (1994 to 1995)         $443,800        n/a                    n/a                 $1,217,483               $14,609,796
Massachusetts (fiscal 1994)     n/a             n/a                    n/a                 n/a                      $15,900,000
Massachusetts (fiscal 1997)     $1,247,206      n/a                    n/a                 $1,447,771               $17,373,252

Note: Massachusetts 1997 AFDC cases include the General Assistance program and Emergency Aid to the Elderly, Disabled and Children.

Most monthly savings in Massachusetts came from AFDC cases. The agency estimated that it saved $1.25 million monthly from closing cases or reducing AFDC funding. The Virginia Department of Social Services saved more from closing food stamp cases than from closing AFDC cases. Program staff could not find any differences that could account for this variation.

The Virginia Department of Social Services used the new hire database to match recipients of public assistance, food stamps and Medicaid. An analysis of savings revealed the following:

  • There was a reduction in benefits, saving $87,000 a month, and the department saved a further $357,000 a month by closing public assistance cases;
  • It saved $220,000 a month by reducing benefits to the food stamp program. It closed other food stamp cases, resulting in monthly savings of $377,800; and
  • It saved an estimated $175,800 from Medicaid closures.

Precisely how the reductions and case closures came about is not specified.

The results are less impressive for employment security agencies than for welfare agencies. (See Table 5.)

Table 5: Employment Security Savings Attributable to the New Hire Program (in U.S. Dollars)
Agency (reference period)                                                     Total cases    Quarterly savings    Projected annual savings
Massachusetts (fiscal 1994)                                                   900            $500,000             $2,000,000
West Virginia Bureau of Employment Programs (October 1997 to February 1998)   107            $45,207              $180,828
Florida Division of Unemployment Compensation (April 1995 to August 1995)     417            $84,556              $338,224

Massachusetts saved more because it was overpaying more. In Massachusetts, the average overpayment had been approximately $556, whereas in West Virginia it was $423 and in Florida it was $203.
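These per-agency averages follow from dividing quarterly savings by the number of cases in Table 5. The figures below are from the table; our calculation rounds West Virginia to $422 where the source quotes $423, presumably because the source worked from unrounded inputs:

```python
# Average overpayment per case = quarterly savings / number of cases (Table 5).
agencies = {
    "Massachusetts": (500_000, 900),
    "West Virginia": (45_207, 107),
    "Florida": (84_556, 417),
}
for name, (savings, cases) in agencies.items():
    # Massachusetts ~ 556, West Virginia ~ 422 (report quotes $423), Florida ~ 203
    print(name, round(savings / cases))
```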

It is important to be cautious when interpreting these savings. Projections of savings to six months or one year may not accurately reflect the movement of cases on and off public assistance. Some clients may return to social assistance in the interim if they lose their jobs. Furthermore, it may have been possible to recoup overpayments through quarterly wage reports. In addition, not all the recorded overpayments have necessarily been recouped. For example, Florida recouped only 33 percent of unemployment compensation overpayments six months after it took action. It would probably have recouped even less from public assistance clients.

To determine the cost effectiveness of using new hire data, one needs both the projected benefits and the costs to the department. Only one agency, the Texas Department of Human Services (DHS), established the cost of using the new hire database to match welfare cases. In a 1996 report, Texas DHS stated that it had saved an estimated $792,000 a year. It cost an estimated $210,000 a year to match new hires within DHS, so the net annual saving was $582,000, a cost-benefit ratio of $1 : $3.77.

Texas DHS also estimated a cost-benefit ratio for a mandatory program in Texas. It projected savings of $12.7 million and costs of $3.4 million, for a projected annual net benefit of $9.3 million. The projected cost-benefit ratio for the mandatory program was $1 : $3.76, virtually identical to the ratio calculated for the existing voluntary program.
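The Texas figures can be reproduced as follows. Note that the mandatory-program ratio comes out at about $3.74 rather than the $3.76 quoted, presumably because the source worked from unrounded savings and cost estimates:

```python
# Cost-benefit arithmetic for the Texas DHS new hire match.
voluntary_savings, voluntary_cost = 792_000, 210_000
net_saving = voluntary_savings - voluntary_cost
ratio = voluntary_savings / voluntary_cost
print(net_saving)        # 582000
print(round(ratio, 2))   # 3.77, i.e. $3.77 returned per $1 spent

mandatory_savings, mandatory_cost = 12_700_000, 3_400_000
print(mandatory_savings - mandatory_cost)            # 9300000
print(round(mandatory_savings / mandatory_cost, 2))  # 3.74 (the report quotes $3.76)
```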

The author of the Texas report, however, warned readers of the difficulties in making projections of this kind. One cannot accurately project the benefits of mandatory reporting because one doesn't know the differences between those employers who report and those who do not. For example, they may hire DHS clients in different proportions (Texas DHS, 1996:11).

In summary, even before the passage of the PRWORA, other welfare programs were using new hire data to detect overpayments and fraudulent claims. The data were transferred smoothly, although state agencies needed to reformat their computer systems. As with child support enforcement, states reported substantial cost savings, especially for public assistance, with lesser savings for unemployment insurance.

State Costs of New Hire Programs

1. Start‑up Costs

It is difficult to determine program start‑up costs because, in many states, existing departmental budgets absorbed these costs. For example, Vermont's voluntary program did not get any funds and the Office of Child Support absorbed all fixed costs associated with the program, which included costs to reprogram the telephone system. Only a few states identified discrete start‑up costs for their initial employer reporting program. Between 1990 and 1992, the fixed start‑up costs of the Washington state program were $43,292 and the initial variable program costs were $351,110. In Iowa, the start‑up costs (in 1993 and 1994) were estimated at $440,424. The Florida figure was $91,300 (in 1995). In 1996, start‑up costs in Minnesota were said to be $94,000.

2. Annual Operating Costs

There is also little information in the documentation on the annual costs of operating new hire programs, although we asked states for cost-related information.

Annual costs ranged from just over $100,000 in Arizona to $500,000 in Minnesota (see Table 6). There is no apparent relationship between the type of program and annual expenditures. When we divided the annual spending by the approximate number of new hires reported in each jurisdiction, we found a large range in the overall cost per report—from $0.27 per report in Florida to $1.45 in Arizona. This range might be due to differences in salaries, overhead, automation and, possibly, privatization. Based on our experience with costing social programs, we assume that many of the differences are due to such accounting questions as how overheads were included in expenditures and how equipment costs were amortized over time.

Table 6: Annual Operating Costs by Type of Program and Number of New Hires (in U.S. Dollars)
(reference period)   Annual budget   Approx. new hires reported   Cost per report   All or targeted industries?   Mandatory or voluntary?
(fiscal 1994)        $233,795        n/a                          n/a               Targeted                      Mandatory
(fiscal 1995)        $451,000        324,300                      $1.39             Targeted                      Mandatory
(fiscal 1996)        $141,300        138,900                      $1.02             Targeted                      Voluntary
(fiscal 1994)        $104,200        72,000                       $1.45             All                           Voluntary
(n/a)                $268,600        992,000                      $0.27             Large employers               Mandatory
(fiscal 1994)        $270,850        483,300                      $0.56             All                           Mandatory
(fiscal 1995)        $499,100        1,017,000                    $0.49             All                           Mandatory

Note: n/a = information not available.
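The per-report costs in Table 6 are simply the annual budget divided by the number of new hires reported. Using the table's figures (the Arizona and Florida rows are identified in the text above; the other rows are left unlabelled, as in the table):

```python
# Cost per report = annual budget / number of new hire reports (Table 6).
rows = [
    (451_000, 324_300),    # targeted mandatory program, fiscal 1995
    (141_300, 138_900),    # targeted voluntary program, fiscal 1996
    (104_200, 72_000),     # Arizona (all industries, voluntary)
    (268_600, 992_000),    # Florida (large employers, mandatory)
    (270_850, 483_300),    # all industries, mandatory, fiscal 1994
    (499_100, 1_017_000),  # all industries, mandatory, fiscal 1995
]
for budget, reports in rows:
    print(f"${budget / reports:.2f}")  # $1.39, $1.02, $1.45, $0.27, $0.56, $0.49
```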

The resources put into data control and entry might have affected operating costs. These costs for the Washington state program ranged from $84,019 in 1992 to $258,880 in 1997—from 54 to 60 cents per new employee reported. Similarly, the New York program estimated that the per‑record cost was 52 cents in 1997; the anticipated volume in that state was an astonishing 4.8 million records. In Ohio, the cost was 43 cents per new hire and it was 17 cents in Missouri in 1996. It is possible that states placed varying degrees of emphasis on data control and cleaning, which could account for the difference in per-record costs and help explain the varying costs of the program overall.

3. Cost-benefit Ratios

Cost-benefit ratios should be more accurate measures of overall program performance than total collections, although the discussion above suggests that the expenditures included in the “cost” side of the equation probably differed by state, as did the calculations to determine the collections attributable to the program.

Despite this problem, the ratios are worth presenting for those states that provided them. The state programs that calculated collection dollars received per dollar spent were the targeted industry programs in Alaska and Washington state, the voluntary programs in Texas and Arizona, and the mandatory program in Massachusetts.

  • Alaska collected $2.00 for every dollar spent in 1992, $3.10 in 1993 and $3.20 in 1994.
  • In the first 18 months of the Washington program, the ratio of collections to agency costs was estimated at $22 to $1 (Welch, 1992:14).
  • Arizona estimated that it collected $11 for each dollar spent (Arizona DES, Division of CSE, 1995: Appendix C).
  • The ratios for Texas were $19 for each dollar spent in 1993–1994, $15 for each dollar in 1996 and $20 for each dollar in 1997.
  • In Massachusetts, between 1993 and 1994, the ratio was estimated as $4.67 collected for each dollar spent.

We cannot easily explain the difference between Texas and the other voluntary program, in Arizona. Indeed, the very high ratios for Washington and Texas compared to the other states make us suspect that they have not tallied costs and collections the same way. These data do not permit any conclusions about cost effectiveness in relation to the “type” of program (voluntary or targeted).
