Initiative in Support of Access to Justice in Both Official Languages Evaluation

3. Methodology

The methodology selected for this evaluation is based on five approaches described in this section.

3.1. Database Analysis and Literature Review

Key Initiative-related documentation, including the database on the projects and organizations funded, was analyzed. These data provide information on the activities undertaken in the context of the Initiative, the outputs produced and the outcomes achieved by the Initiative’s two components, as well as detailed information on the funded projects and the recipient organizations, their activities and their outputs. Analyzing the documentation helped build a solid information base for preparing the interviews with key informants and the other data collection activities.

The list of documents consulted includes:

  • the Evaluation Framework of the Roadmap, prepared by the Department of Canadian Heritage;
  • the Initiative’s planning documents;
  • the funding files (project proposals, interim and final reports);
  • the documentation on the mandate of the various committees, especially the advisory committee and the federal-provincial-territorial working group.

3.2. Interviews with Key Informants

The interviews with key informants gathered informed opinions and perceptions on the relevance, effectiveness and efficiency of the Initiative. In total, 26 individuals were consulted from the various groups directly involved in implementing the Initiative’s activities, as follows:

  • employees of the Department of Justice Canada;
  • members of the Advisory Committee on Access to Justice in Both Official Languages;
  • members of the Federal-Provincial-Territorial Working Group on Access to Justice in Both Official Languages;
  • other informants such as justice professionals, university representatives and funded organizations’ employees.

In preparation for the interviews, each informant received a guide containing the questions to be asked. These guides appear in the annex to this report. The interviews were conducted in person or by telephone in the respondents’ preferred official language. All the data gathered were analyzed using NVivo software to identify the topics connected with each evaluation question covered by this research method.

3.3. Case Studies

Case studies pertaining to the Access to Justice in Both Official Languages Support Fund were conducted in 2010-11. The purpose of this analysis was to examine two of the Support Fund’s specific activities: the core funding of the provincial associations of French-speaking jurists and their national federation, and the funding for projects relating to access to justice in both official languages. More specifically, in keeping with the objectives of the Initiative, these case studies examined the multiplier effect of the Support Fund, looking at funding activities, progress towards results, and the lessons learned and good practices implemented in the context of the Support Fund. The findings from the case studies were used to inform the assessment of the Initiative’s effectiveness and efficiency presented in this report.

3.4. Survey of Funded Organizations

An on-line survey was launched of organizations that received funding under the Initiative. This method was selected to maximize the response rate of the funded organizations. The on-line format gives respondents greater flexibility to answer the questions at a time of their choosing and to answer only those questions that pertain specifically to their situation. The survey of the funded organizations addressed topics involving the Initiative’s effectiveness and efficiency.

A questionnaire with 35 questions was prepared and distributed to all of the Initiative’s funded organizations, of which there were 29 at the time of the evaluation. The questionnaire was first tested internally before it was distributed to all recipient organizations, which were able to fill it out in the official language of their choice. On two occasions, a reminder email was sent to the organizations that had not yet filled out the questionnaire.

In total, 19 organizations filled out the questionnaire, for a response rate of 63.3%. The data gathered through this survey were analyzed with SPSS software. The on-line survey questionnaire appears in the annex to this report.

3.5. Expert Panel

An expert panel helped to provide context for the data gathered through the other sources and to qualify and validate some of the findings that emerged. A roll-up document presenting a set of findings and issues was prepared and distributed to participants in order to structure the discussion. A total of four experts from various regions of the country took part in the panel, which was conducted remotely using the WebEx tool. The questions asked during the expert panel appear in the annex to this report.

3.6. Methodological Limitations

The methodology is in line with current evaluation research standards in its use of multiple lines of evidence, with two or more lines of evidence associated with each evaluation question. Nevertheless, as with any research, the present study is subject to methodological limitations. These include the following:

  • Measuring outcomes achieved remains a challenge for a number of funded organizations. As a result, the reports submitted often describe outputs rather than outcomes achieved, which limits the amount of usable data for reporting on outcomes. However, the interview guides and the survey questionnaire were designed to cover this aspect.
  • The Support Fund case study findings reflect primarily the perspective of project funding recipients and could therefore be positively biased.
  • The same may be said of survey respondents. As funded organizations, they may be biased in favour of the continuation of the Initiative. It was not possible to target organizations whose applications to the Initiative were unsuccessful, because those organizations had other projects accepted.
  • The Initiative, as designed in 2008, has two components: the Support Fund and the new justice training component. However, it must be acknowledged that the Support Fund dates back to 2003, whereas the justice training component was announced in 2008. Consequently, each component has followed a different evolution, which must be accounted for in the data collection and in the presentation of outcomes achieved. The data collection tools used in this evaluation were therefore adjusted to properly document this double evolution, and the Initiative’s outcomes presented in this report reflect that reality.