The Path to Justice in a Court-Based Drug Treatment Program

3. Results: Assessments of the Quality of Justice by Individuals Graduating and Discharged From the Program

Overall, program participants gave high ratings to the Ottawa DTC, whether they graduated or were discharged from the program. Table II below shows the summary scores on each of the seven dimensions of access to justice. The scores represent points along a 5-point scale where 1 is the lowest and 5 the highest and where 2.5 would be the mid-point. The lowest scores (2.81 and 2.62) are above the 2.5 level, and about half the scores are in the 4.0 to 5.0 range. The sample size in the following analyses varies from five to 14, depending on the type of participant and the order of the interview.

Table II: Summary Scores for Access to Justice Indicators, Graduated and Discharged DTC Participants Footnote 11

                           Baseline Interviews       Mid-Point Interviews      Final Interviews
Dimension                  Graduated   Discharged    Graduated   Discharged    Graduated   Discharged
                           (n=5)       (n=14)        (n=5)       (n=5)         (n=8)       (n=10)
Procedural Justice         4.15        4.31          4.45        3.73          3.89        3.37
Restorative Justice        4.44        4.06          4.50        3.00          3.83        2.83
Interpersonal Justice      4.51        4.42          4.78        4.57          4.39        4.31
Informational Justice      4.25        4.18          4.97        4.52          4.37        4.19
Functionality of Outcome   4.63        4.08          4.63        4.33          3.79        2.94
Transparency of Outcome    3.58        4.00          4.25        3.56          3.67        2.81
Intangible Costs           2.88        3.15          3.51        3.24          3.50        2.62
Average Scores             4.07        4.03          4.44        3.85          3.92        3.30

The comparative scores of graduates and discharged participants are central to this study because we want to determine whether perceptions of justice are related to success in the program. On average, participants who graduated scored higher at each set of interviews than people who were discharged before completing the program. Further, the average scores for the discharged group decline with each interview. For graduates, the average score increases between interviews one and two; it declines at interview three but still remains higher than the average score for those who were discharged.
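
As an illustration of how the summary figures in Table II are constructed, the short Python sketch below recomputes the "Average Scores" row for the baseline interviews, assuming (as appears to be the case) that it is the unweighted mean of the seven dimension scores. The sketch is illustrative only and is not part of the study.

```python
# Illustrative sketch only: recompute the "Average Scores" row of Table II for the
# baseline interviews, assuming it is the unweighted mean of the seven dimensions.

baseline_scores = {
    # dimension:               (graduated, discharged)
    "Procedural Justice":      (4.15, 4.31),
    "Restorative Justice":     (4.44, 4.06),
    "Interpersonal Justice":   (4.51, 4.42),
    "Informational Justice":   (4.25, 4.18),
    "Functionality of Outcome":(4.63, 4.08),
    "Transparency of Outcome": (3.58, 4.00),
    "Intangible Costs":        (2.88, 3.15),
}

graduated = [g for g, _ in baseline_scores.values()]
discharged = [d for _, d in baseline_scores.values()]

# Prints 4.06 and 4.03; Table II reports 4.07 and 4.03, the small gap for graduates
# presumably reflecting rounding of the published dimension scores.
print(f"Graduated:  {sum(graduated) / len(graduated):.2f}")
print(f"Discharged: {sum(discharged) / len(discharged):.2f}")
```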

Other cost indicators were measured in addition to the intangible costs defined above: opportunity costs, time spent travelling and out-of-pocket expenses. Opportunity costs were measured on a 5-point scale, in the same way as the other dimensions of justice. Time spent travelling was measured in minutes per week and expenses in dollars per month. Table III summarizes these results. As before, the sample size varies from five to 14, depending on the type of participant and the order of the interview.

Table III: Time, Monetary and Opportunity Costs, by Interview Order

              Average Time/Week (minutes)    Average Cost/Month (dollars)    Opportunity Costs (5-point scale)
              1       2       3              1       2       3               1       2       3
Discharged    102     124     128            $23     $32     $27             1.68    1.30    2.40
Graduated     133     168     128            $98     $63     $67             1.70    1.33    1.85

Time spent travelling to the court once a week and to a treatment-related activity once or twice a week was higher for graduates at the first two interviews, but by the third interview it had evened out. Graduates spent more money per month on program-related personal expenses than discharged participants across all three interviews.

At the final interview, graduates rated lost opportunity costs (for example, foregone opportunities for employment) much lower than discharged participants, 1.85 versus 2.40 respectively (the higher the score, the greater the perception of lost opportunity). The two groups started at nearly the same level and followed the same pattern over the course of the program: perceived opportunity costs dipped at the mid-point interview and then rose between the mid-point and final interviews. The rise was much smaller for graduates, however, so that by the end of the program their perceived opportunity costs were considerably lower than those of discharged participants.

The three spider graphs below illustrate how assessments of the quality of justice changed over the course of people's participation in the drug treatment program, comparing people who graduated with those who were discharged. At the beginning (the first interview), graduates rate the quality of justice of the program at about the same level as people who are eventually discharged. However, the gap between the two groups widens at interviews two and three, the period during which they are in the program. The first interview analysis is based on all participating graduates and discharged individuals; due to attrition, the sample size decreases in the following interviews.

Figure I: Seven Dimensions of Access to Justice Comparing Graduates and Discharged Participants: Interview I

Figure I - Text equivalent

This is a radar/spider graph image, which compares the average scores of graduates and discharged participants on the seven dimensions of access to justice at the baseline interview. The radii or spokes show the average score on a scale marked in whole-number increments from 0 to 5.

  • For procedural justice, the average score for both graduates and discharged participants is above four.
  • For restorative justice, the average score for graduates is greater than four and the average score for discharged participants is four.
  • For interpersonal justice, the average score for both graduates and discharged participants is four.
  • For informational justice, the average score for both graduates and discharged participants is four.
  • For functionality, the average score for both graduates and discharged participants falls between four and five, with graduates scoring slightly higher than the discharged participants.
  • For transparency, the average score for graduates and discharged participants is the same at four.
  • For intangible cost, the average score for both graduates and discharged participants is slightly above two.

Interview 1 analyses are based on 14 discharged participants and five graduates. At the baseline interview, graduates rate the overall quality of justice at about the same level as those who are eventually discharged, although the pattern varies by dimension (see Figure I). Graduates score about the same on informational and interpersonal justice, higher on restorative justice and functionality of outcome, and lower on three measures: procedural justice, intangible costs and transparency of outcomes.

However, approximately four months later, at the second interview (based on five discharged participants and five graduates), participants who will eventually graduate rate the quality of justice higher on all seven dimensions (See Figure II).

Figure II: Seven Dimensions of Access to Justice for Graduates and Discharged Participants, Interview II

Figure II - Text equivalent

This is a radar/spider graph image, which compares the average scores of graduates and discharged participants on the seven dimensions of access to justice at the mid-point interview. The radii or spokes show the average score on a scale marked in whole-number increments from 0 to 5.

  • For procedural justice, the average score for graduates is between four and five and the average score for discharged participants is four.
  • For restorative justice, the average score for graduates is between four and five, and the average score for discharged participants is between three and four.
  • For interpersonal justice, the average score for both graduates and discharged participants is the same, at almost five.
  • For informational justice, the average score for graduates is five and the average score for discharged participants is slightly below five.
  • For functionality, the average score for graduates and discharged participants is between four and five, with graduates scoring slightly higher than discharged participants.
  • For transparency, the average score for graduates is slightly above four and for discharged participants, the average score is slightly below four.
  • For intangible cost, the average score for graduates is between two and three and for discharged participants the average score is two.

Based on the final interviews, conducted about two to three weeks after the individuals graduated from the program, graduates (n=8) rate the quality of justice considerably more positively than people who were discharged (n=10) on functionality, transparency and intangible costs (see Figure III). Graduates are also somewhat stronger on procedural and restorative justice. On interpersonal justice, the two groups score similarly and relatively strongly at the third interview.

Figure III: Seven Dimensions of Access to Justice for Graduates and Discharged Participants, Interview III

Figure III - Text equivalent

This is a radar/spider graph image, which compares the average scores of graduates and discharged participants on the seven dimensions of access to justice at the final interview. The radii or spokes show the average score on a scale marked in whole-number increments from 0 to 5.

  • For procedural justice, the average score for graduates is four and the score for discharged participants is between three and four.
  • For restorative justice, the average score for graduates is four, and the average score for discharged participants is three.
  • For interpersonal justice, the average score for both graduates and discharged participants is the same at slightly above four.
  • For informational justice, the average score for graduates and discharged participants is between four and five, with graduates slightly higher than discharged participants.
  • For functionality, the average score for graduates is greater than that for discharged participants, at almost four and three, respectively.
  • For transparency, the average score for graduates is slightly below four and for discharged participants, the average score is three.
  • For intangible cost, the average score for graduates is between one and two and for discharged participants the average score is below one.

Overall, looking at the data across the three successive interviews, graduates demonstrate a stronger sense of justice related to their experience in the program compared with individuals who are discharged before graduating. The decline for graduates at the third interview may be attributable to their having been out of the program and having to face the realities of post-program adjustment.

The following two spider graphs organize the data somewhat differently, showing scores for all three interviews on the same graph for discharged and graduated participants, rather than separately for each interview.
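
Readers who wish to reproduce figures of this kind from the tabled data could do so along the lines of the Python sketch below, which plots the graduate scores from Table II for all three interviews on a single radar chart, in the spirit of Figure IV. The plotting choices (matplotlib, label wording, styling) are assumptions for illustration rather than the original figure specification; substituting the discharged-group values from Table II yields the counterpart of Figure V.

```python
import numpy as np
import matplotlib.pyplot as plt

# Seven dimensions of access to justice (Table II), graduate scores.
labels = ["Procedural", "Restorative", "Interpersonal", "Informational",
          "Functionality", "Transparency", "Intangible costs"]
interviews = {
    "Baseline":  [4.15, 4.44, 4.51, 4.25, 4.63, 3.58, 2.88],
    "Mid-point": [4.45, 4.50, 4.78, 4.97, 4.63, 4.25, 3.51],
    "Final":     [3.89, 3.83, 4.39, 4.37, 3.79, 3.67, 3.50],
}

# One spoke per dimension, evenly spaced around the circle.
angles = np.linspace(0, 2 * np.pi, len(labels), endpoint=False).tolist()

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for name, values in interviews.items():
    # Close the polygon by repeating the first point at the end.
    ax.plot(angles + angles[:1], values + values[:1], label=name)

ax.set_xticks(angles)
ax.set_xticklabels(labels)
ax.set_ylim(0, 5)                 # scores run from 0 to 5
ax.legend(loc="upper right")
plt.show()
```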

Figure IV: Seven Dimensions of Access to Justice, Three Interviews, Graduates

Figure IV - Text equivalent

This is a radar/spider graph image, which shows the average scores of graduates on the seven dimensions of access to justice at the baseline, mid-point and final interviews. The radii or spokes show the average score on a scale marked in whole-number increments from 0 to 5.

  • For procedural justice, the average score for graduates at the baseline and mid-point interviews is between four and five, and drops to four at the final interview.
  • For restorative justice, the baseline score is slightly above four, the mid-point score is above the baseline, and the final score is below the baseline at slightly below four.
  • For interpersonal justice, the baseline score is between four and five, the mid-point score is above the baseline, and the final score is slightly below the baseline.
  • For informational justice, the baseline score is between four and five, the mid-point score is at the highest value of five, and the final interview score drops to between four and five.
  • For functionality, the baseline score is between four and five, the mid-point score is almost five, and the final score drops to slightly below four.
  • For transparency, the average score at baseline is almost four, rises to above four at the mid-point, and drops to slightly below four at the final interview.
  • For intangible cost, the average score at baseline is two, the mid-point score is between two and three, and the final interview score falls between the baseline and mid-point values.

The spider graph for graduates shows that the mid-point interview scores are higher than the baseline scores on every dimension except functionality of outcome, which is unchanged, and restorative justice, which increases only marginally. The final interview scores fall below the baseline levels for functionality of outcome, procedural justice, interpersonal justice and restorative justice. They increase slightly on informational justice and transparency of outcomes. The final interview score is about the same as the mid-point interview score for intangible costs (see Figure IV).

Figure V shows that, for discharged participants, access to justice scores decline consistently between the baseline and final interviews. The one minor exception is informational justice, where the scores are essentially the same at the baseline and final interviews.

Figure V: Seven Dimensions of Access to Justice, Three Interviews, Discharged

Figure V - Text equivalent

This is a radar/spider graph image, which shows the average scores of discharged participants on the seven dimensions of access to justice at the baseline, mid-point and final interviews. The radii or spokes show the average score on a scale marked in whole-number increments from 0 to 5.

  • For procedural justice, the average score for discharged participants is between four and five at baseline, four at the mid-point interview, and between three and four at the final interview.
  • For restorative justice, the baseline score is four, the mid-point score is between three and four, and the final score is three.
  • For interpersonal justice, the baseline score is between four and five, the mid-point score is greater than the baseline score at almost five, and the final score is slightly below the baseline.
  • For informational justice, the baseline score is between four and five, the mid-point score is greater than the baseline score at almost five, and the final score is the same as the baseline.
  • For functionality, the baseline and mid-point scores are between four and five, and the final score drops to three.
  • For transparency, the average score at baseline is four, drops to between three and four at the mid-point, and drops to three at the final interview.
  • For intangible cost, the average score at baseline is slightly above two, the mid-point score is two, and the final interview score falls below two.

Table IV summarizes the data shown in Figures IV and V. The average scores and the cumulative totals of the differences between interviews 1 and 2 and between interviews 2 and 3 show overall patterns similar to the previous analysis. Average scores drop consistently for discharged participants: the cumulative difference in scores between interviews 1 and 2 is -0.93, and the cumulative difference between interviews 2 and 3 accelerates to -3.58. The cumulative difference between interviews 1 and 3, representing the magnitude of the decline over the entire period in the program, is -5.13 for individuals who were discharged before completion.

For graduates, the cumulative difference in scores between interviews 1 and 2 is +2.34, compared with -0.93 for the discharged group. The cumulative difference between interviews 2 and 3 is about the same for both groups: -3.56 for graduates and -3.58 for discharged individuals. However, the overall cumulative difference covering the entire program period (interviews 1 to 3) is much smaller for graduates, at -1.88, than for discharged participants, at -5.13. Clearly, graduates rate their experience on the dimensions of justice much higher than discharged individuals, and their declines are much smaller.
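
For clarity, each cumulative total in Table IV is simply the sum of the corresponding per-dimension change column. A minimal Python sketch of that arithmetic, using the published figures for graduates, is shown below.

```python
# Sketch of the cumulative-total arithmetic in Table IV, using the published
# per-dimension changes for graduates. Dimension order: Procedural, Restorative,
# Interpersonal, Informational, Functionality, Transparency, Intangible Cost.
change_2_1 = [+0.30, +0.06, +0.27, +0.72, 0.00, +0.67, +0.32]
change_3_2 = [-0.56, -0.17, -0.39, -0.40, -0.84, -0.58, -0.62]
change_3_1 = [-0.26, -0.61, -0.12, +0.12, -0.84, +0.09, -0.26]

print(round(sum(change_2_1), 2))   # 2.34:  cumulative change, interviews 1 to 2
print(round(sum(change_3_2), 2))   # -3.56: cumulative change, interviews 2 to 3
print(round(sum(change_3_1), 2))   # -1.88: cumulative change, interviews 1 to 3
```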

Table IV: Detailed Data, Seven Dimensions of Justice, Discharged and Graduated Participants

Discharged
Dimensions of Justice   Interview 1   Interview 2   Change (2-1)   Interview 3   Change (3-2)   Change (3-1)
Procedural              4.31          3.73          -0.58          3.37          -0.36          -0.94
Restorative             4.06          3.00          -0.94          2.83          -0.17          -1.23
Interpersonal           4.42          4.57          +0.15          4.31          -0.26          -0.11
Informational           4.18          4.52          +0.34          4.19          -0.03          +0.01
Functionality           4.08          4.33          +0.25          2.94          -1.39          -1.14
Transparency            4.00          3.56          -0.04          2.81          -0.75          -1.19
Intangible Cost         3.15          3.24          +0.11          2.62          -0.62          -0.53
Overall Average         4.02          3.85                         3.30
Cumulative Total                                    -0.93                        -3.58          -5.13

Graduated
Dimensions of Justice   Interview 1   Interview 2   Change (2-1)   Interview 3   Change (3-2)   Change (3-1)
Procedural              4.15          4.45          +0.30          3.89          -0.56          -0.26
Restorative             4.44          4.50          +0.06          3.83          -0.17          -0.61
Interpersonal           4.51          4.78          +0.27          4.39          -0.39          -0.12
Informational           4.25          4.97          +0.72          4.37          -0.40          +0.12
Functionality           4.63          4.63          0.00           3.79          -0.84          -0.84
Transparency            3.58          4.25          +0.67          3.67          -0.58          +0.09
Intangible Cost         2.88          3.24          +0.32          2.62          -0.62          -0.26
Overall Average         4.06          4.40                         3.79
Cumulative Total                                    +2.34                        -3.56          -1.88

4. Conclusion and Methodological Considerations for Future Research

This study was an experiment in applying an approach developed in another context (i.e., civil law) to a different type of program: Drug Treatment Courts. It was also an experiment in applying that approach in a different way: rather than comparing two different programs (i.e., DTC versus non-DTC), it compared two different groups within the same program (successful versus unsuccessful participants). The results show that the TISCO methodology for measuring access to justice can be used to examine DTCs, and the approach is promising. In the early stages of the program, people who would eventually be discharged experienced a similar quality of justice to the people who eventually graduated. By the mid-point of the treatment program (when the second interviews were conducted), graduates rated the quality of justice higher than those who were discharged, on all of the dimensions measured. Perceptions of justice for both graduates and discharged drug treatment court participants declined by the end of the program. For graduates, this may be because the final interviews were carried out about two weeks after the end of the program, when former participants had begun to cope with the reality of life after treatment.

It is possible that people who graduate from drug treatment programs are predisposed toward a more positive orientation to life generally and are thus more likely to succeed. The data show that, by the final interview, graduates perceived opportunity costs to be lower than people discharged from the program. This may be an indication that they were also more likely to perceive other aspects of the program positively.

The method for assessing the cost and quality of justice used in this research is purely descriptive. It is, therefore, not possible to say that the experience of greater access to justice “predicts” success in the drug treatment program. Any further research should collect data on other factors known to be associated with success in drug treatment programs, such as having previously been in a treatment program. With this sort of data, multivariate models could be constructed to determine the extent to which perceptions of the quality of justice have a statistically significant and independent effect on success; only then could perceptions of justice be said to be predictive.
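
As a purely hypothetical illustration of what such a model might look like, the Python sketch below fits a logistic regression in which graduation is the outcome and perceived quality of justice is one predictor alongside other known correlates of success. The data file and all variable names are invented for the example and are not drawn from this study.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data frame: one row per DTC participant.
#   graduated        1 = graduated, 0 = discharged before completion
#   justice_score    average score on the seven access-to-justice dimensions
#   prior_treatment  1 = previously participated in a treatment program
#   age              participant age in years
df = pd.read_csv("dtc_participants.csv")   # hypothetical file

# Logistic regression: does perceived quality of justice have an independent
# association with graduation after controlling for other factors?
model = smf.logit("graduated ~ justice_score + prior_treatment + age", data=df)
result = model.fit()
print(result.summary())
```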

Second, conducting in-person interviews was very labour intensive. Collecting data by means of questionnaires might be considered in any further research, especially if the number of participants is much larger than was the case in this study.

Overall, it must be kept in mind that the number of respondents involved in this exploratory study was very small. The results are promising but the research should be replicated on a larger scale.
