Public Law Sector Evaluation
In accordance with Treasury Board evaluation policies, the Department of Justice used a risk-based approach in planning evaluations to ensure the efficient use of evaluation resources. The Department of Justice Evaluation Division assessed the risk level (low [L], medium [M], or high [H]) of planned evaluations, taking into consideration six risk factors. The risk score for the PLS for each of these factors is shown in brackets: the contingent nature of program funding (H); the complexity of the program or service (H); materiality (i.e., the level of resources involved in program delivery) (M); skills and expertise (i.e., business risks facing the Department regarding recruitment and retention, and the need for specialized skill sets) (H); time since the last evaluation (H); and information challenges (i.e., whether program information is available and accessible to fully support an evaluation) (M). The PLS was selected as one of the first legal services evaluations due to its overall high-risk rating. The methodology developed responded to the level of risk by ensuring multiple lines of evidence that would support robust findings.
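The risk-based selection logic above can be sketched in code. This is a hypothetical illustration only: the numeric weights (L=1, M=2, H=3), the averaging rule, and the cut-off thresholds are assumptions for the sake of the sketch, not the Evaluation Division's actual scoring method.

```python
# Hypothetical sketch of risk-based evaluation planning. Weights and
# thresholds are assumptions; the actual scoring method is not specified.
SCORES = {"L": 1, "M": 2, "H": 3}

def overall_risk(factor_ratings):
    """Average the factor ratings and map the result back to L/M/H."""
    mean = sum(SCORES[r] for r in factor_ratings) / len(factor_ratings)
    if mean >= 2.5:
        return "H"
    if mean >= 1.5:
        return "M"
    return "L"

# PLS ratings: funding (H), complexity (H), materiality (M),
# skills and expertise (H), time since last evaluation (H), information (M)
pls = ["H", "H", "M", "H", "H", "M"]
print(overall_risk(pls))  # "H" under these assumed weights
```

Under these assumed weights, the six PLS factor ratings (four H, two M) average to roughly 2.7, which is consistent with the overall high-risk rating reported above.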
The evaluation of the PLS draws on five lines of evidence: a document and data review, key informant interviews, a file review, case studies, and focus groups. Each of these methods is described more fully below. This section also includes a brief discussion of methodological challenges.
The methodology was developed with the PLS evaluation working group. The data collection methods and instruments were all reviewed and approved by the working group.
The evaluation matrix, which lists the evaluation questions, indicators, and lines of evidence, and is used to guide the study, is included in Appendix A. The data collection instruments developed to respond to the evaluation matrix are in Appendix B.
3.1. Document and Data Review
The document and data review was conducted both to inform the development of data collection instruments and to address the majority of the evaluation questions.
Documents reviewed were obtained from internal, external, and publicly available sources. Departmental documents reviewed included: Departmental Performance Reports; Reports on Plans and Priorities; internal audit reports; the results from Public Service Employee Surveys conducted in 2005, 2008, and 2011 (Footnote 5); and Client Feedback Survey results from 2010 (Footnote 6, Footnote 7). Internal PLS documents were also reviewed, as well as publicly available information, such as Budget speeches and Speeches from the Throne.
In addition to documents, the evaluation involved the review of iCase data from fiscal years 2007–08 to 2011–12. iCase is the Department’s integrated case management, timekeeping and billing, document management, and reporting system.
3.2. Key Informant Interviews
The key informant interviews conducted for this evaluation addressed the majority of evaluation questions, and were a key line of evidence in gathering information on the need for the PLS, as well as the effectiveness of its activities. A list of potential key informants was prepared, and interview guides tailored to each key informant group were developed in consultation with the evaluation working group. A total of 46 interviews (group and individual) were conducted with 56 key informants representing Justice counsel in DLSUs (n=19) and other areas of the Department of Justice (n=18); PLS employees (n=13); and client departments and agencies (n=6).
Potential interviewees received an invitation to participate in an interview. Key informants who agreed to participate were provided with a copy of the interview guide (in the official language of their choosing), prior to the interview. Each interview was conducted in the respondents’ preferred official language, and key informants were assured of the confidentiality and anonymity of their responses. This evaluation included a mix of telephone and in-person interviews.
3.3. File Review
A review of a selection of closed files was conducted to allow for a more in-depth understanding of the life of a file in relation to the performance measures for the PLS. This method also allowed the evaluation to explore whether the information obtained from key informants on how the PLS conducted its work was supported by a review of selected case files.
The file review involved the examination of iCase data for 29 files, the majority of which were advisory files demonstrating the Sector’s work in policy development, legal policy advice, advisory services, and litigation support (where the PLS was not the lead in litigation). Files could include more than one type of legal service. Consequently, the file review sample included files where the PLS provided legal advice (11 files); litigation support (11 files); legal policy advice (7 files); negotiation assistance (6 files); and policy advice (5 files). Two litigation files (where PLS counsel were lead litigators) were included in the file review; however, considering that the majority of PLS work is of an advisory capacity, the file review was weighted toward advisory files.
The sample of files was chosen with the input of the evaluation working group and was selected to demonstrate the broad spectrum of the work conducted by the PLS, with a particular focus on higher-profile files (in terms of legal risk and complexity) involving more than one PLS section. As files were not chosen randomly, and as the sample was not large (considering the thousands of files worked on by PLS counsel during the time period covered by the evaluation), the file review sample is not a strictly representative one. Rather, the file review was intended to be illustrative of the Sector’s approach to its work. All seven PLS sections were represented in the selection of files. File selection was roughly weighted according to the size/volume of work of each section, but consideration was also given to the variety/nature of work on each file. Eight CAILS files, five HRLS files, five ILAPS files, four JLT files, three OLLS files, two IPLS files, and two JACTPS files were reviewed.
To protect confidential information, as well as solicitor–client privilege, counsel from the PLS and staff from the Evaluation Division conducted the file review. To ensure that comparable information was collected from the files, counsel completed the standard file review template developed for the study (see Appendix B). The template collected information to respond to the evaluation matrix and focussed on factual information available in the files.
3.4. Case Studies
Ten case studies were conducted to allow for an exploration of best practices and the national approach taken by the PLS. Footnote 8 As such, cases selected were higher-profile files (i.e., those involving medium to high complexity and/or risk), and those involving a high degree of collaboration among PLS sections and/or between PLS counsel and DLSU counsel or clients representing different departments or agencies. Cases were chosen from the file review sample.
For each case study, file review templates (completed as part of the file review conducted for the evaluation) were reviewed. In addition, telephone interviews were conducted to supplement documented information, to provide context for the work, and to allow for a more in-depth assessment of how the file was handled and the effectiveness of the working relationship between the PLS and those requesting PLS services.
Twenty-five interviews were conducted with a total of 38 stakeholders. Interviews involved PLS counsel (n=20), client representatives (n=9), and other Justice counsel or DLSU lawyers (n=9) who had worked on the files chosen for the case studies. Interviewees took part in either an individual or small group interview. The approach used to schedule and conduct the interviews with case study participants was the same as the approach (described above) for scheduling and conducting the key informant interviews.
3.5. Focus Groups
Focus groups were the final line of evidence used in this evaluation. Four focus groups were conducted after the other lines of evidence were completed: one English and one French group with PLS counsel, and one English and one French group with DLSU lawyers or litigators. These focus groups were used to follow up on emerging findings and obtain additional details and insights about issues identified by the other lines of evidence.
The PLS English focus group involved five participants, and the PLS French focus group involved four participants. In each group, each participant represented a different PLS section. The DLSU lawyers and litigators (six in the English group and four in the French group) who took part in the other two focus groups were selected for their familiarity with different PLS sections.
All focus groups were recorded to ensure accuracy of the notes, but participants were assured of the confidentiality of their responses.
3.6. Methodological Limitations
The evaluation faced a few methodological limitations. These are listed below by line of evidence.
Document and data review: iCase limitations
The ability to use iCase to respond to evaluation questions was limited in several ways. This was primarily due to the limited information on key variables such as legal risk and complexity that is kept in iCase for advisory files. As noted in the 2008 Evaluation of LRM in the Department of Justice, attaching legal risk levels and complexity levels to advisory files has several conceptual difficulties, including whether legal risk ratings are only appropriate when there is a risk of litigation and when to assess legal risk. The Department continues to work on this, but is not yet at a point where iCase data can be relied upon for legal risk and complexity for advisory files. During the 2006–07 to 2011–12 period, less than one percent of PLS files had a complexity level assigned, and the result was the same for legal risk levels. This limits analysis of trends in the level of legal risk and complexity of advisory files, which would provide a better understanding of the nature of the demand for PLS legal services. In addition, the lack of legal risk and complexity information means that the evaluation cannot review efficiency and economy using standard law practice management measures, such as assignment of higher-risk, higher-complexity files to more senior counsel. Further discussion of issues to consider in conducting this type of analysis is in Section 4.3.3.
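The "less than one percent" figure reflects a simple data-completeness check over the iCase extract. A minimal sketch of such a check, assuming a hypothetical record structure and field names ("complexity", "legal_risk") that are not documented in the report:

```python
# Sketch of a field-completeness check over case-management records.
# The record structure and field names are assumptions; the real iCase
# schema is not described in the evaluation.
def completeness(files, field):
    """Percentage of file records with a non-empty value for `field`."""
    filled = sum(1 for f in files if f.get(field))
    return 100.0 * filled / len(files)

# Toy sample: one of four records has a complexity level assigned.
files = [
    {"id": 1, "complexity": "M", "legal_risk": ""},
    {"id": 2, "complexity": "", "legal_risk": ""},
    {"id": 3, "complexity": "", "legal_risk": ""},
    {"id": 4, "complexity": "", "legal_risk": ""},
]
print(completeness(files, "complexity"))  # 25.0 on this toy sample
```

Run against the full extract for 2006–07 to 2011–12, the same check would return under 1.0 for both the complexity and legal risk fields, which is what limits the trend analysis described above.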
The use of iCase data to understand demand for legal services is also affected by the lack of a consistent approach in opening advisory files. Evaluation stakeholders indicated that the iCase data on the number of files may underestimate the advisory work of the PLS on files, since PLS counsel differ in their file-opening practices; PLS counsel may or may not open new advisory files for brief or informal advice provided. Since the decision on whether to open an advisory file is left up to individual counsel, a “file” is not a consistent unit of analysis. In addition, the number of advisory files does not accurately reflect the number of requests for PLS services, since multiple requests can occur for a single file. The PLS currently does not employ a standard task-management method or process that captures the number of requests, and the number of files may not be a reliable measure of demand for advisory services.
iCase data regarding hours spent on files should also be interpreted with caution. At times, PLS counsel and clients agree on a maximum number of hours to be entered into iCase before PLS work is completed. Counsel also have an annual “budget” of 1,300 hours and, according to stakeholders, may not track hours precisely. Therefore, iCase data on hours may not always accurately reflect the amount of time actually spent by PLS counsel on files.
The evaluation has used iCase data in a few limited ways, with the caveat that the reliability of the data cannot be assessed. The PLS might want to undertake a review of how iCase fields are completed by counsel to build in more consistency, so that the data are more useful for senior management and could potentially provide more objective information for the next evaluation of the PLS.
Interviews and focus groups
The interviews and focus groups with key informants and case study participants are subject to self-reported response bias, which occurs when individuals reporting on their own activities seek to portray themselves in the best light, and strategic response bias, whereby participants answer questions with the aim of influencing outcomes.
File review
In any given year during the period covered by the evaluation (2007–08 to 2011–12), the PLS as a whole was consulted on in excess of 14,000 files. Obtaining a random sample with a reasonable margin of error would have required reviewing hundreds of files, which was not feasible. Instead, the evaluation relied on the working group members to select files that they believed reasonably represented the work of the PLS.
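The claim that a representative random sample would require hundreds of files can be checked with the standard finite-population sample-size formula (Cochran's formula with a finite-population correction). The parameter choices below (95% confidence, ±5% margin of error, p = 0.5) are illustrative assumptions, not figures from the evaluation:

```python
import math

# Finite-population sample-size calculation (Cochran's formula with
# finite-population correction). Confidence level, margin of error, and
# p = 0.5 are illustrative assumptions, not figures from the report.
def required_sample(N, z=1.96, e=0.05, p=0.5):
    num = N * z**2 * p * (1 - p)
    den = e**2 * (N - 1) + z**2 * p * (1 - p)
    return math.ceil(num / den)

print(required_sample(14000))  # ~374 files for a population of 14,000
```

Under these assumptions, a population of roughly 14,000 files per year would call for a random sample of about 374 files, more than ten times the 29 files actually reviewed, which is why a purposive, illustrative sample was used instead.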
The mitigation strategy for these methodological limitations was to use multiple lines of evidence: information was sought from the PLS and from those requesting PLS services, from both management and “front line” staff, and from a file review and a more comprehensive review of administrative data (iCase). The strategy also combined quantitative and qualitative data collection methods to answer the evaluation questions. By triangulating findings from these different sources, the evaluation was able to strengthen its conclusions.