Legislative Services Branch Evaluation
3. Evaluation Methodology
In accordance with TBS evaluation policies, the Department of Justice uses a risk-based approach in planning its evaluations to ensure the efficient use of evaluation resources. The Department of Justice Evaluation Division assessed the risk level (low [L], medium [M], or high [H]) of the evaluation of the LSB, taking into consideration six risk factors:
- Contingent nature of legal service or program funding (M);
- Complexity of the service (H);
- Materiality (i.e., the level of resources involved in program delivery) (M);
- Skills and expertise (i.e., business risks facing the Department regarding recruitment and retention, and the need for specialized skill sets) (H);
- Time since the last evaluation (H); and
- Information challenges (i.e., whether program information is available and accessible to fully support an evaluation) (M).
The LSB was selected as one of the first legal services to be evaluated due in part to its overall high risk rating.
The methodology developed for this evaluation responded to the level of risk by ensuring that multiple lines of evidence would support the findings. An evaluation matrix (see Appendix A), presenting the evaluation questions along with the measures, indicators and sources of evidence used to address them, was created to guide the development of the methodology and the evaluation. Specifically, the matrix was critical in the development of data collection tools and activities. It also served to guide the reporting of the evaluation results.
The LSB provides a unique service, and the skills and knowledge of its staff are highly specialized. Few legal counsel outside of the Branch have the appropriate knowledge to comment on the adequacy of LSB outputs (i.e., drafted bills and regulations; advice provided; existing statutes and regulations harmonized; harmonization legislation). Therefore, the evaluation methodology was designed to focus on the protocols, processes, resources and standards used to address client requests and deliver effective and responsive legislative services, rather than on an assessment of the quality of the products per se. The level of quality and performance of the LSB was assessed primarily by considering the perspectives of Branch staff and of clients and partners who work closely with the Branch. Existing documentation and administrative data served as supporting evidence to this end.
The relevance of the LSB was less challenging to assess, given its essential role within the federal government. A review of documentation and administrative data served to demonstrate this role.
Methods used to inform the evaluation consisted of a document review, an analysis of secondary and administrative data, a literature review, a file review, case studies, online surveys of LSB staff and clients, and key informant interviews. These methods are further described below.
3.1 Document Review
A number of reports and administrative documents were reviewed to obtain insight about the mandate, operations and relevance of the LSB. The documents reviewed include the following for the period covered by the evaluation:
- Budgets and financial documents;
- Department of Justice Reports on Plans and Priorities and Departmental Performance Reports;
- Speeches from the Throne;
- Memoranda of Understanding between the Department of Justice and client departments;
- Manuals and guides for drafting; and
- Other documents providing contextual information.
3.2 Secondary and Administrative Data
Existing quantitative data was also used to inform the evaluation, providing information on demand for services, performance, capacity and efficiency. Secondary and administrative data sources accessed for the evaluation included: administrative data (from the Department of Justice iCase database Footnote 29), results from the latest Public Service Employee Survey (PSES) in 2011, and web statistics for the Justice Laws and Bijurilex websites.
3.3 Literature Review/Scan
Documentation about legislative services in other jurisdictions, including the United Kingdom, Australia, New Zealand and the United States, was reviewed to explore similarities and differences to the LSB with respect to service delivery models and financial models related to the provision of legislative services within government. Technological support of the legislative function was also examined in these countries, where such information was available.
3.4 File Review
Thirty-seven (37) files closed within the evaluation period were selected for review. Eight advisory files and 29 legislation files (13 bills and 16 regulations files), representing a range of the types of requests made by clients, were selected in consultation with the LSB Evaluation Working Group. The review provided insight into the kinds of requests made for LSB services and the responsiveness of LSB staff, as well as challenges faced during completion of the files. Department of Justice Evaluation Division staff completed the file review. All file selection and review activities were conducted with the utmost care to protect solicitor-client privilege, and no information was collected or reported that would identify the client department or the specific issues within the file itself. Appendix B contains the file review templates.
3.5 Case Studies
Five case studies were conducted to collect additional information and to provide context with respect to the responsiveness, effectiveness and efficiency of the LSB. The files for the case studies were identified during the file review process and were selected to represent high priority and complex cases as well as a range of challenges, file nature and timelines. Thirteen (13) interviews were conducted with clients (n=4), LSB counsel (n=6) and other Justice counsel (n=3) associated with the selected files. Interviews were conducted either individually or in small groups. The case studies were completed by the Department of Justice Evaluation Division staff. As with the file review, all information for the case studies was collected and reported in a manner that protects solicitor-client privilege. The case study template and interview questions are contained in Appendix C.
3.6 LSB Staff and Client Surveys
An online survey was administered to LSB staff. The purpose of the survey was to collect information about the perspectives and experiences of LSB staff and management involved in the provision of drafting, advisory and revision services. Footnote 30 The staff survey questionnaire is contained in Appendix D.
A total of 93 individuals completed the survey, representing a response rate of 58% and a margin of error of ±6.6% (based on a total of 161 eligible LSB staff). Table 3.1 provides a comparative breakdown of the LSB’s human resources and survey respondents by classification. Overall, the respondent sample is fairly representative of the LSB’s counsel and management, with only the professional, non-counsel staff being somewhat underrepresented.
Table 3.1: LSB human resources and survey respondents, by classification

| Classification | LSB Staff (#) | LSB Staff (% of Total) | Survey Respondents (#) | Survey Respondents (% of Respondents) |
|---|---|---|---|---|
| Counsel or Legislative Counsel (LA-1 and LA-2A) | 90 | 56% | 56 | 60% |
| Senior Counsel, General Counsel or Senior General Counsel (LA-2B, LA-3) | 23 | 14% | 18 | 19% |
| Professional, non-counsel (EC) | 33 | 21% | 11 | 12% |
Source: LSB Budget Allocation Forecast 2012-2013, staff survey results.
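For reference, the reported margin of error of ±6.6% is consistent with a standard 95% confidence calculation that applies a finite population correction, a common practice when surveying a small, fixed population. The following is a minimal sketch; the exact formula used by the evaluators is an assumption:

```python
import math

def margin_of_error(n, N, p=0.5, z=1.96):
    """95% margin of error for a proportion, with finite population correction."""
    se = math.sqrt(p * (1 - p) / n)      # standard error, worst case p = 0.5
    fpc = math.sqrt((N - n) / (N - 1))   # finite population correction
    return z * se * fpc

# Staff survey: 93 respondents out of 161 eligible LSB staff
print(round(margin_of_error(93, 161) * 100, 1))  # → 6.6
```

Without the finite population correction the margin would be about ±10%, so the correction materially tightens the estimate for a population this small.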
An online survey was also administered to public service employees identified as having been clients of the LSB during the period covered by the evaluation. The purpose of the client survey was to collect feedback about levels of satisfaction with the services provided by the Branch. Appendix E contains the client survey questionnaire. A total of 151 clients completed the survey, representing a response rate of 65% (based on a valid sample of 233). Footnote 31 The majority (93%) of respondents to the client survey had received services from the LSB within the year previous to the survey.
Both the staff and client surveys used a 10-point scale to measure satisfaction or agreement levels. Results for these questions are reported as average (mean) scores. According to Department of Justice standards, an average score of 8 or higher is considered acceptable. High levels of agreement or satisfaction are also reported and, unless otherwise stated, represent collapsed/cumulative ratings of 8 to 10.
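The two reporting conventions described above (mean score, and the collapsed 8–10 share) can be illustrated in a few lines. The ratings below are hypothetical, not actual survey data:

```python
# Hypothetical 10-point satisfaction ratings (the real responses are not published)
ratings = [9, 8, 10, 7, 8, 6, 9, 10, 8, 5]

mean_score = sum(ratings) / len(ratings)                 # average rating
high = sum(1 for r in ratings if r >= 8) / len(ratings)  # collapsed 8-10 share

print(f"mean = {mean_score:.1f}")          # → mean = 8.0
print(f"high satisfaction = {high:.0%}")   # → high satisfaction = 70%
```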
3.7 Key Informant Interviews
Individuals from different stakeholder groups were interviewed to gather opinions about and provide context for the performance of the LSB. Individual interviews were completed over the telephone in the official language preferred by the interviewee. In total, 36 interviews were completed with the following groups:
- 12 LSB managers;
- 10 DLSU counsel;
- 7 client department representatives;
- 3 PCO/TBS representatives;
- 2 provincial/territorial legislative services representatives; and
- 2 other LSB staff members.
See Appendix F for the key informant interview guides.
3.8 Methodological Limitations
There were several challenges to evaluating the LSB. First, there had been no previous evaluation of the Branch; therefore, outcomes identified in this evaluation could not be compared to previous performance. As a result, conclusions about the current performance of the LSB relative to a baseline or previous standard often could not be made. However, findings from the current evaluation can serve as a baseline for comparisons in future evaluations and/or from which to set performance targets.
Solicitor-client privilege also created a challenge for data collection as care had to be taken in how the information was collected and analyzed. To ensure consistency in data collection across files and that no solicitor-client privileged information was collected, a file review template was created in consultation with the LSB working group. The analysis was based on the completed forms.
Further, administrative data for some indicators were not as comprehensive as originally hoped. As all bill files are designated Secret, they are not recorded in iCase other than for timekeeping purposes. In addition, the LSB does not track the legal risk or complexity Footnote 32 of its files in iCase. The Branch does, however, have a mechanism in place to closely monitor progress and risk levels on high-risk files; this information did not lend itself readily to an analysis of efficiency and economy, as these files represent an insignificant proportion of the total number of files managed by the LSB over the evaluation period. Thus, information about legal risk and case complexity was not available for most LSB files, and trends had to be inferred from other lines of evidence, such as information about case characteristics collected in the file review and from staff assessments. The limited information about case complexity in iCase also did not support the assessment of certain efficiency and economy measures (e.g., the appropriateness of the level of counsel assigned to a file based on its level of complexity/risk).
Information for the jurisdictional review was limited to what is publicly available, mainly online. The evaluation did not include resources to confer directly with justice representatives in the different countries. While the information about legislative drafting processes was not comprehensive, it was sufficient to determine whether each jurisdiction's approach was centralized, devolved or combined, which could then be compared to the model adopted in Canada.
As with all surveys, the results of the LSB staff survey are likely affected by self-selection bias, i.e., systematic differences between those who chose to respond and those who did not. Despite a pretest to ensure relevance, clarity and ease of response, and best efforts to achieve a high response rate, only 58% of those surveyed completed the questionnaire. A comparison of respondents to the total complement of LSB staff shows that Branch counsel and management are well represented in the sample, while professional, non-counsel staff are somewhat underrepresented. Some caution should therefore be taken when considering the staff survey results, particularly as representing the perceptions and experiences of all LSB staff. The client survey achieved a slightly higher response rate, but its results are subject to the same self-selection bias and should also be interpreted with some caution.
During the evaluation period, the LSB actively managed an average of more than 2,000 files per year. To obtain a random sample with a reasonable error level would require reviewing hundreds of files, which was not feasible. Instead, the evaluation included a small sample of files that the LSB Evaluation Working Group agreed reasonably represented their work. Therefore, the results of the file review should not be considered (statistically) representative of all files (i.e., the results of the file review cannot be generalized to all LSB files).
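The claim that a statistically representative sample would require reviewing hundreds of files can be checked with the standard sample-size formula for estimating a proportion; the ±5% margin and 95% confidence level used below are assumptions, not figures taken from the report:

```python
import math

def required_sample(N, e=0.05, p=0.5, z=1.96):
    """Sample size for a proportion at margin e, adjusted for population N."""
    n0 = z**2 * p * (1 - p) / e**2           # infinite-population sample size
    return math.ceil(n0 / (1 + (n0 - 1) / N))

# ~2,000 actively managed files per year; a ±5% margin at 95% confidence needs:
print(required_sample(2000))  # → 323
```

Even a single year's caseload would require a sample roughly ten times the 37 files reviewed, which supports the decision to use a purposive rather than a random sample.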
The mitigation strategy for the above methodological limitations was to use multiple lines of evidence that included both quantitative and qualitative data from a range of sources to answer evaluation questions. By using triangulation of findings from these different sources, the evaluation was able to strengthen its conclusions.