Evaluation of Public Legal Education and Information: An Annotated Bibliography

3. Government Documents

3.1 PLEI-Related Documents

A number of the documents reviewed in this section precede the five-year period mandated in the bibliography's objectives. Some older materials were included because they were believed to be relevant to current evaluation contexts. Others, such as the 1986 evaluation assessment, were included to provide a historical perspective on the types of research conducted by and for the Department of Justice. That report can be contrasted with the most recent comprehensive PLEI evaluation, from 1997: Tools for moving forward.

Abdelahad, L., Sansfacon, D. & Beaulne, A. (1989). Access to justice: research reports on PLEI, research notes. Research and Development Directorate, Department of Justice Canada.

General Overview

This compilation of "notes" summarizes twelve PLEI-related research reports written for the Department of Justice's "Access to Justice" series between 1986 and 1987. Although the summaries do not in themselves provide guidance for evaluating PLEI, four of the entries may prove useful in developing an evaluation framework. In 1984 the Department of Justice created the "Access to Legal Information Fund" along with a strategy to fund new PLEI organizations in six provinces and territories. The funding was granted for a three-year period, and toward the end of that period a call for tenders to evaluate these programs was issued. Two external consulting companies were hired to evaluate three of the six programs (Manitoba, Yukon and Newfoundland), and their reports are annotated in this document.

  • R. Hikel, C. Meredith, C. Wihak, A. Woods (ABT Associates of Canada) (1986): Evaluation of the Community Legal Education Association [Manitoba], Department of Justice, Ottawa.
  • R. Hikel, C. Meredith, C. Wihak, A. Woods (ABT Associates of Canada) (1987): Evaluation of the Yukon Public Legal Education Association, Department of Justice, Ottawa.
  • D.H. Access Research Associates Inc. (1987): Evaluation of the Public Legal Information Association of Newfoundland, Department of Justice, Ottawa.

The other summary in this collection that is especially relevant to PLEI evaluation describes the Resource Book created by Focus Consultants as a result of the proposal annotated below.

  • Currie, Janet & Roberts, Tim (1986): An Evaluation Resource Book for Public Legal Education and Information Organizations, Department of Justice, Ottawa. [See section 4.2]

Alderson-Gill and Associates. (1995). Public legal education and information program: A new vision for PLEI. Department of Justice Canada.

General Overview

This document outlines a new role for the Department of Justice within the existing PLEI network and proposes three areas for emphasis and development: (i) cultivation of partnerships; (ii) ACJNet; and (iii) a focus on education. The new "vision" was created with the intention of addressing the limited availability of resources while maximizing the national benefits of PLEI activities. Each area of emphasis identified in the report places the Department in a position to undertake the necessary evaluation components. Moreover, there is explicit recognition that assuming a leadership role in specialized research and evaluation of PLEI is a cost-effective measure for all parties participating in PLEI.

Under the mandate of cultivating partnerships, the Department can facilitate networking between PLEI providers. Through centralized coordination, providers can be certain that their activities and efforts do not unnecessarily duplicate those of others. This encourages information sharing and should ultimately lead to incremental improvements for PLEI. Examples given for the Department's role include fostering a national network of organizations, creating a national PLEI advisory process, maintaining a current telecommunications network, and coordinating national strategies. Much of this work can be facilitated by ACJNet, the second major area of emphasis recommended.

The link between evaluation and an emphasis on education is clearly anticipated in the report. By focusing on this area, Departmental contributions will increase knowledge about the relative effectiveness of different approaches to legal education through research and evaluation that is too costly for individual jurisdictions to fund in a systematic way. As the need for research and evaluation is widely acknowledged, the Department has a role to play in demonstrating leadership and coordinating partnerships in this area.

Challenges of Funders' Evaluation Requirements
  • The paper states that, due to scarce resources, funders are most interested in knowing whether PLEI services are having an impact, and that "evidence of impact is easier to come by in the case of targeted PLEI programs than it is in the general distribution of PLEI pamphlets and booklets to the public" (p.13).
  • Funders that focus on quantitative data (i.e., how many pamphlets were distributed) overlook the more active meaning of PLEI dissemination, which "incorporates a responsibility to ensure that the recipients receive the information, understand it and act on it."
Evaluation Methodologies
  • PLEI involving plain language in legislation, etc. requires testing on consumer/client groups to determine if the language is clear and if the impact of the change can be perceived
  • Use of ACJNet to facilitate ongoing consultations, both formal and informal about programs and materials

Currie, J. & Roberts, T. (1984). An evaluation resource book for PLEI organizations: A proposal. Department of Justice Canada.

General Overview

This is a proposal for the creation of an evaluation resource book for PLEI organizations. The Canadian Law Information Council ("CLIC"), which was the clearinghouse for PLEI in Canada at the time of the research, contracted this project out to Focus Consultants. The paper identifies and discusses a number of basic challenges and specific problems that inhibit PLEI evaluation. Although most groups contacted did engage in some form of evaluation, the general consensus was that current evaluations were ineffective, incomplete and poorly utilized for the desired purposes. In advocating for the creation of a Canadian PLEI evaluation resource book, the authors note that "while there is a great deal of evaluation material being produced for human services and educational organizations, [they] found little that relates specifically to PLEI. PLEI incorporates aspects of both human service and educational programming but it has a distinctive overlay because of its legal focus and involvement of legal professionals" (p.23). Moreover, the PLEI providers interviewed identified a number of evaluation skills that were lacking within their organizations. They all thought that the proposed resource book would be extremely useful, especially for ensuring a common understanding between the Government, other funders and PLEI organizations.

Most of the challenges and specific problems identified in the report can be addressed through centralization of evaluation materials and criteria by an agency such as PLEAC or a branch of the Department of Justice. Many of the obstacles to meaningful evaluation stem from a lack of clarity about the definitions and objectives of the projects and of the evaluation itself. By providing guidelines and the skills and methods required to undertake evaluation, some of the burden may be alleviated. The challenges and obstacles enumerated in this report are still relevant to PLEI evaluation today. Together with the resource book written by the same authors in 1986, this report has enduring significance and should be used in developing evaluation materials and information.

The Challenges of PLEI Evaluation
Contextual Factors
  • The findings of this extensive research revealed a number of contextual factors that present significant barriers to PLEI evaluation. These include:
    1. Lack of experience and expertise
    2. Diversity in the structure and delivery of programs. Differences in focus and format often relate to varying definitions and/or intentions of the program. For example, a program targeted at knowledge acquisition ought to be assessed differently from one that seeks to effect attitudinal changes. Similarly, some services tend to be case-flow oriented and therefore outcome-driven, while others adopt broader objectives of empowerment, equality enhancement and increased democratic participation. Again, it is necessary to understand these objectives before approaching a program for evaluation.
    3. Different emphases and goals held by funders and providers lead to distinct modes of evaluation (qualitative versus quantitative)
    4. Resistance and inability of lawyers to elicit feedback on their services
    5. Ideological opposition to reinforcing barriers between participants and the authoritative "teachers"
    6. Practical considerations include: confidentiality of client information and lack of funds
      1. Lack of funding in the system generally (evaluation = cutting programs)
      2. Lack of funds in organizations: low priority to direct resources toward evaluation
      3. Lack of significant funds from any one funder to support comprehensive evaluation
  • The cumulative effect of these contextual barriers was that evaluation was seen as "too difficult, too time consuming and too painful."
Problem Areas Discovered
  • Contextual factors are often difficult to change
  • Internal evaluations:
    1. Lack of evaluation skills in the staff and lawyers working in the organizations
    2. Lack of objectivity
    3. Lack of information sharing between PLEI organizations
    4. Resort to purely quantitative evaluation to satisfy funders
    5. Questionnaires and surveys are not sufficiently critical or in-depth to assist with increasing effectiveness
    6. Poorly articulated goals and objectives
  • External evaluations:
    1. Lack of control over type and content of evaluation
    2. Lack of commitment to using results beyond statistics collected
    3. Funders and external evaluators do not accept responsibility for programs
    4. Poorly articulated goals and objectives (interviewees indicated that goals were often too lofty or unrealistic to produce a meaningful evaluation)
Evaluation Methodologies
  • External, professional evaluations were discussed in the report and although they can be very beneficial in some circumstances, they cannot constitute the primary method of ongoing PLEI evaluation.
  • Informal methods discussed include: peer review, staff "think tanks" and community networks which involve other PLEI and non-PLEI organizations that share or overlap in terms of clientele or research areas.

Godin, J. (1994). More than a crime: A report on the lack of public legal information materials for immigrant women who are subject to wife assault. Research Section, Department of Justice Canada.

General Overview

As noted in the title, this paper confirms the lack of PLEI materials available at that time for immigrant women who were victims/survivors of wife assault. Through interviews and a literature review, the author identifies the need for PLEI materials in this area and some of the major challenges that must be recognized if and when such materials are produced. Read as a form of needs assessment, this document reiterates that comprehensive and effective evaluation requires an informative and thorough needs assessment on the front end. Only by understanding the obstacles that various populations face in accessing legal information can the provision of that information be assessed in a meaningful way.

The Challenges of PLEI Evaluation
  • Although the basic need for PLEI materials was undisputed by various organizations and agencies that dealt with immigrant women and/or victims of wife assault, the stronger message from these groups was the need for this material to be sensitive to corresponding issues faced by the target groups.
  • Due to its complexity, cultural sensitivity is very difficult to evaluate.
  • The author notes that "PLEI is not useful if it is not retained" (p.13). This is an important point for evaluators to remember, since the common post-program survey does not account for retention levels. Therefore, while some key information, such as the suitability of location and time, may be lost if evaluation is postponed, other data cannot be assessed immediately. Depending on the purpose of the evaluation, it may be pertinent to conduct follow-up interviews after one, six, or twelve months have passed.
Challenges of Funders' Evaluation Requirements
  • The objectivity gained by funders' evaluations must be offset against the unparalleled feedback that community groups can offer regarding the effectiveness of PLEI materials. Just as genuine understanding of particular circumstances is critical at the needs assessment phase, evaluation must also be tailored to reflect the cultural, social and political characteristics of the targeted group.
Evaluation Methods
  • PLEI materials in this area were being field tested in consultations with relevant communities throughout the province.

Moliner, M. (1997). Public legal education and information review: Tools for moving forward. Programs Branch, Department of Justice Canada.

General Overview

This review offers the most recent assessment of the Department of Justice's involvement in PLEI programs. The report articulates multiple reasons for the Department's continued connection and suggests that Justice Canada clarify its role in PLEI delivery, which may be accomplished by ceasing to produce PLEI materials and distinguishing Departmental "Communications" from PLEI. After looking at PLEI delivery across Canada and consulting with many providers, the Advisory Committee proposes that the Department set criteria for funding PLEI, as well as introduce a multi-year funding strategy and a funding infrastructure for a national NGO such as PLEAC. The review further advises that funding and policy initiatives be targeted in order to fill existing gaps in PLEI (both population-related and issue-related gaps) and that the Department should take a leadership role in ensuring that this "targeted approach" is adopted by core PLEI providers.

With regard to PLEI evaluation specifically, the review reaffirms that "evaluation of PLEI program delivery is necessary to ensure that it is achieving results" and that the Department of Justice should "support the creation of an evaluation tool to assess whether or not PLEI is having the desired impact." It further suggests that any evaluative framework developed be consistent with the targeted approach taken to funding and policy issues: the "needs, gaps, and priorities" analysis. Evaluation is seen here as a shared responsibility between funders and core PLEI providers. Accordingly, the role of the Department of Justice and other funders is to identify evaluation objectives, and the provider's task is to make the evaluation relevant. The single evaluation-related recommendation is found in "Proposed Direction 7": it is proposed that Justice Canada require evaluations of the audiences reached by the PLEI initiatives it funds and the impact they achieve.

The Challenges of PLEI Evaluation
  • Problems arise when evaluation purposes are not clearly defined, and more specifically when the motives, agendas, and biases informing the evaluation are not made explicit.
  • Many providers think it is too hard and too expensive to measure the impact of their services.
  • The value of PLEI work remains largely unmeasured and therefore the willingness to continue funding programs is faltering.
  • Due to the complexity of the PLEI context many critical questions go unanswered:
    1. Against what standard (internal or universal) is PLEI being judged?
    2. In any given evaluation, is it PLEI itself that is "on trial," a particular approach, or a specific activity?
    3. How are relevant comparisons made when activities and audiences vary greatly?
    4. How do you place a value on the impact realized from each separate effort or activity?
    5. How do you evaluate the evaluation (i.e., the results may be predetermined depending on whether the evaluator places greater "value" on cost efficiency as opposed to reaching a small group of multiply-disadvantaged persons)?
Connections between Goals of PLEI and its Evaluation
  • PLEI organizations understand and routinely insist that the evaluation be tied to goals and objectives.
  • The objectives cited as necessary for determining whether a PLEI program is successful were taken from a 1979 report of the "Law and the Layman Committee." Though unquestionably important, these criteria are fairly simplistic and offer little guidance on how evaluations might be conducted.
Challenges of Funders' Evaluation Requirements
  • Concern that funder-driven evaluations will be misinterpreted and lead to skewed funding decisions
  • An ongoing concern for the Department is that "the reputation of [its] PLEI program remains misunderstood if it is unable to present a business case which indicates that the services it funds are effective as well as cost-efficient" (p.39).
  • A program review initiated by the Law Foundation of Alberta resulted in a 40% funding cut to programs such as PLEI intermediary organizations, illustrating that the fear that evaluation leads to funding cuts is not unfounded.
  • Even when methods for evaluating PLEI are devised, it is difficult to attach a dollar value to the various successes or shortfalls that emerge.
Evaluation Methodologies
  • Types of "soft" evaluation methods discussed include: gathering anecdotal comments, counting heads, using evaluation forms, soliciting expert critiques, field testing and comparing efforts of one PLEI program to another to learn from the experiences of colleagues
  • One PLEI provider suggested the use of a "pre-test experimental design [that would] assess changes in knowledge, attitudes, etc" (p.42).
  • Another method suggested was to engage in participant observation (i.e. in small claims court) to observe changes in self-help skills.
  • Other methods discussed in the electronic conference were:
    1. Qualitative – anecdotal comments, focus groups, field testing;
    2. Formative – early client input, focus groups of target populations, field testing;
    3. Peer review – specific activities or informal/spontaneous feedback
  • Methods can be facilitated by provincial and national PLEI associations and it is generally agreed that experiences should be shared to a greater extent.
  • Due to a lack of resources, longitudinal studies are not realistic, and other types of evaluative interactions and efforts have not been well documented. Feedback suggests that these less formalized methods are beneficial, but that their results must be synthesized in some manner in order to track and incorporate changes more effectively.
  • The report does not indicate that PLEI providers would like to see formal evaluations substituted for other methods of assessments and impact analysis.
Useful Materials for PLEI Evaluation

One possible evaluation methodology was suggested, in which the purposes of evaluating PLEI would be to:

  1. Evaluate what PLEI needs are met by core PLEI providers. Specifically:

    1. Who is reached?
    2. How did they gain access?
    3. Did they understand the information they received?
    4. Were they able to act effectively on the basis of the legal information provided?

  2. Evaluate what gaps exist in PLEI delivery:

    1. Who is not reached and why?

  • It is anticipated that adhering to this type of evaluation would facilitate priority setting by identifying which populations are most in need and are not being adequately served by existing programs.

Public legal education and information: An evaluation assessment. (1986). Programme Evaluation Section, Department of Justice Canada.

General Overview

This document looks at the state of PLEI delivered and funded by the Department at the time and contemplates whether the operation as a whole should undergo a full-scale evaluation. The evaluation is focused on the level of support provided by the Department to other PLEI providers, and it is therefore somewhat limited in its applicability to independent PLEI organizations engaged in self-evaluation. In light of the interrelationship between definitional clarity and useful evaluation, a portion of the text is dedicated to defining the terms "public," "legal," "education," and "information." The following section provides a profile of the Department's PLEI program both pre- and post-1984 by outlining various initiatives. It is noted that there is a definite lack of policy development in this area. The paper divides major issues of evaluation and methodology into two parts: (i) program management issues (highlighting flaws in design and implementation) and (ii) impact and effectiveness issues. A series of questions accompanies each "issue."

Under the program management rubric, the questions are designed to help refine and focus the program and the manner in which funding decisions are made. The questions under the second rubric are more directly relevant to PLEI organizations or their funders who are seeking accountability. Despite the general nature of the questions, they may be useful for guiding the development of evaluation criteria and methodology. However, this assessment is not universally applicable to PLEI provider evaluation. For example, a PLEI organization would want to elicit responses that enable it to measure a program's effectiveness for a particular target group. Yet, in this study the question of "how much understanding of the information provided do the recipients display" was deemed to be "non-evaluable."

In summary, the division between management and impact/effectiveness used in this study may still be useful for Departmental assessments of the Department's role in PLEI. Beyond that, the paper has little practical utility for PLEI providers, as the basis for this particular assessment was to determine the degree of risk to the Department of making a decision about its PLEI programs with or without the results of a full-scale evaluation. Thus, where the goals and objectives of an evaluation relate to the Department's overall efficiency of resource use, the issues raised are still applicable; for purposes of ongoing monitoring and specific program evaluation, they are not critical. Finally, the costs involved in a full evaluation such as the one described are much higher than could realistically be spent by a PLEI organization, even at the provincial level. Therefore, the only PLEI partner who could benefit from the information provided in this document is the Department of Justice as it endeavours to assess its role in the Canadian PLEI landscape.

Evaluation Methodologies
  • Interviews with Department of Justice officials
  • Expert assessment of PLEI materials
  • Documentation review
  • Project review
  • Surveys (e.g. PLEI sponsors, other PLEI programs, etc.)
  • Incorporating information from other evaluations
Useful Materials for PLEI Evaluation

Some of the questions included in the impact and effectiveness section are:

  • Whether the PLEI projects funded by the Department have produced the intended effect of providing the general public and/or specified target groups with information related to their needs
  • Whether the availability of PLEI has increased the general public's and/or selected groups' knowledge and understanding of the law
  • Whether the PLEI programme has led to a change in public and/or target group attitudes to the law and the legal system
  • Whether the Department has promoted the development of new methods and techniques for PLEI delivery

3.2 Other Evaluation Documents

The Department of Justice has undertaken many evaluations of the PLEI projects it has funded. Although this practice has been less frequent in recent years, the report noted below is a good example of a governmental impact assessment.

Ellis, P. (1995). Educational programs that alter knowledge, attitudes and behaviours of youth. Research and Statistics Division and Evaluation Directorate, Department of Justice Canada.

This is a good review of the literature and discussion on changing attitudes through youth in-school programs and youth out-of-school programs, as well as adult education and attitude change in general. It looks at eight possible behaviour modification components and surveys 46 documents that study and address attitudinal change through education.
