Criminal Law Policy Function Evaluation

3. Methodology

The evaluation of the Section draws on four lines of evidence: a document review, a review of secondary and administrative data, key informant interviews with departmental officials and other stakeholders, and a review of project-specific files. This section provides further details on each research method.

The methodological approach is largely qualitative, as the information and activities associated with policy development do not lend themselves to quantification or rigid structuring. Qualitative methods permit a more open-ended approach that facilitates communication and allows flexibility in project development and data collection. This flexibility was key to the evaluation, which is the first major evaluation of a policy function within the Department under the 2009 Policy on Evaluation. The evaluators' knowledge base was built up with each successive data gathering and analysis step, so that better informed questions could be asked and answers clarified in light of the data already acquired.

The methodology was developed in consultation with the CLPS Evaluation Working Group. All data collection methods and instruments were reviewed by the Working Group.

The evaluation matrix, which identifies the evaluation questions, indicators and lines of evidence and is used to guide the study, is included in Appendix A. The data collection instruments developed to respond to the evaluation matrix are in Appendices B and C.

3.1 Document Review

A number of reports and administrative documents were reviewed to obtain insight into the mandate, operations and relevance of the Section. Internal documents reviewed included the Section’s orientation binder and file plan, the Policy Sector’s Business Plans, meeting agendas and minutes, working group terms of reference, memoranda to CLPS counsel, and financial and human resources information. Publicly available information reviewed included Budget speeches, Speeches from the Throne, Department of Justice Performance Reports and Reports on Plans and Priorities, documentation on the Parliamentary process, and guides to making federal acts and regulations.

3.2 Secondary and Administrative Data

The evaluation included a review of administrative data from the Department’s iCase database for fiscal years 2010-11 to 2012-13, which provided descriptive information about the types of files for which the Section is responsible and the level of effort (number of hours) associated with them. iCase is the Department’s integrated case management, timekeeping and billing, document management and reporting system. The evaluation also included a review of results from the 2011 Public Service Employee Survey (PSES), which provided information on demand for services, capacity, and efficiency issues.

3.3 Key Informant Interviews

Key informant interviews provided descriptive information and opinions on both relevance and performance, with an emphasis on performance. They were a key line of evidence for assessing the effectiveness of the Section’s activities. A list of potential key informants was prepared, and interview guides tailored to each key informant group were developed in consultation with the Evaluation Working Group. Interviews were conducted with a total of 77 key informants. With the exception of CLPS, where both senior and junior staff were interviewed, most interviews were conducted with personnel in managerial-level positions. Table 3 below provides a breakdown of the number of key informants interviewed by respondent group.

Table 3: Key informants

CLPS
  Managers                                           7
  Other staff                                       16
  Subtotal                                          23
External key informants
  Other Department of Justice sections              20
  Departmental Legal Services Units                  6
  Federal departments/agencies                      16
  Provincial/Territorial representatives             8
  Non-governmental organizations and other bodies    4
  Subtotal                                          54
Total                                               77

Interviews were conducted in phases, which provided an opportunity to feed understandings and insights acquired in earlier interviews into the design of subsequent data collection instruments.

Potential interviewees received an invitation to participate in an interview. Those who agreed to participate were provided with a copy of the interview guide (in the official language of their choice) prior to the interview. Each interview was conducted in the respondents’ preferred official language, and key informants were assured of the anonymity of their responses. The evaluation included a mix of telephone and in-person interviews. To ensure their accuracy, interview notes were sent to each respondent for their review upon completion of the interview.

3.4 File Review

Six project files were reviewed to allow for a more in-depth understanding of the life of a file in relation to the performance measures for CLPS. The file review also allowed the evaluation to explore whether the information obtained from key informants on how the Section conducts its work was supported by documentation on file. The file review sample was chosen in consultation with the Evaluation Working Group and was selected to illustrate the variety of the Section’s work. The evaluators reviewed two legislative files (one of which included substantial provincial/territorial involvement through a consultation process), two advisory files and two international files.

A thorough examination of the project file was in most cases followed up by discussions with the policy officer to supplement the documentation on file. These discussions provided information on the context for the work and a more in-depth understanding of the file. To ensure that comparable information was collected from each file, the file review was conducted using a file review template. The file review template, which focused on factual information available in the files, is included in Appendix C.

3.5 Methodological Limitations

The evaluation faced a few methodological limitations. These are discussed below by line of evidence.

Key informant interviews
The interviews with key informants are subject to self-reported response bias, which occurs when individuals reporting on their own activities may wish to portray themselves in a positive light, and to strategic response bias, whereby participants answer questions with the desire to affect outcomes.
File review
The sample of files was chosen with the input of the Evaluation Working Group and was considered to provide a good representation of the diversity of the Section’s work. As files were not chosen by random selection and the sample is not large, the file review sample is not a representative one. Rather, the file review was intended only to be illustrative of the Section’s approach to its different types of work.
Data review: iCase
iCase is the Department’s case management and timekeeping system. The ability to use iCase to respond to evaluation questions was constrained in several ways, primarily by the limited amount of data available for the evaluation period. As iCase was implemented in CLPS in April 2010, only three full fiscal years of data were available for the evaluation period. Because 2010-11 was the first year of implementation, the evaluators were cautioned against using its data due to potential limitations in accuracy and completeness. With only two years of more reliable data available, the evaluators could not use iCase to identify trends (e.g., in the volume of work) over the five-year period covered by the evaluation.

It was also difficult to use iCase to determine the amount of policy work versus pure advisory work done by the Section. For instance, a large portion of the Section’s international legal advisory and policy development work is entered into a single field, which does not distinguish the nature of the service provided. Additionally, some of the advisory work that occurs on policy files is subsumed under the legislative file in iCase. Although iCase contains records of CLPS’ pure advisory work, the evaluators had some difficulty profiling the nature of this work. The third most common type of advice sought by clients is miscellaneous advice, captured in iCase as “Advice by Subject”, an all-encompassing category for “one-off” advice requested by Departmental Legal Services Units (DLSUs) for their client departments.

3.5.1 Mitigation Strategy

The mitigation strategy was to use triangulation to check findings against other sources and perspectives, thereby countering the concern that the study’s findings were the result of a single method, a single source, or a single investigator’s observations and interpretation.

Triangulation of qualitative data sources

The evaluators compared and cross-checked the consistency of information obtained through different methods and from people with different perspectives. By seeking information from the full circle of CLPS stakeholders, the evaluators were able to triangulate the views of CLPS staff, clients, partners, and other stakeholders, and to check interview responses against program documents and other written evidence on file. The evaluators found consistency in the overall patterns of data from different sources.

Triangulation with multiple observers and analysts

The evaluation used multiple, rather than single, observers and analysts. Using several interviewers helped reduce the potential bias of having one person conduct all of the data collection and provided another means of directly assessing the consistency of the data obtained. The evaluation also triangulated analysts; that is, more than one person independently analyzed the same qualitative data and the findings were compared. This process helped deepen the evaluators’ understanding of the issues and strengthened their confidence in the evaluation findings.

Review by evaluation participants

The evaluators sent their draft interview notes to each key informant for review. This helped ensure that the notes taken during the interview were accurate and complete, and enhanced the overall validity of the interview data.
