Project Managers' Guide to Performance Measurement and Evaluation

APPENDIX 3: Tools for Project Managers - Project Level Evaluation Plan - Checklist

Project Description

Look for:

  • Project objectives
  • Target group or beneficiaries
  • Activities
  • Outputs
  • Expected results (outcomes)

Tools: Consider providing a logic model or project "road map".

Indicators of success/impact

Look for:

  • What are the indicators of success/impact?
  • Are they measurable?

Tools: Specific indicators.

Data collection

Look for:

  • Methods (qualitative and quantitative)
  • Data sources
  • Feasibility
  • Logistics
  • Timing/frequency of data collection
  • Roles and responsibilities
  • Protocols for collecting and monitoring
  • Appropriate methods that are sensitive to the situation and population (gender, culture, language, literacy, age, community, disability)

Tools: Data collection plan and protocols; ethical standards and confidentiality provisions.

Who is responsible for conducting the evaluation?

Look for:

  • Is the evaluator internal or third party?
  • Does the evaluator have the appropriate knowledge and skills, including cultural/diversity competence?
  • Are there any conflict of interest issues to consider?
  • How will privacy and confidentiality be addressed?
  • Is there good communication between the evaluator and the project sponsor?

Tools: Agreements, contracts and protocols.

Partner and stakeholder involvement

Look for:

  • How will partners be involved in the evaluation?
  • How will stakeholders (e.g. funders) be involved?

Tools: Agreements, terms of reference for committees.

Evaluation resources

Look for:

  • Are sufficient resources allocated to carry out the plan?
  • Is the evaluation cost-effective?

Tools: Evaluation budget as a % of project budget; actual and in-kind resources.

Utilization of results

Look for:

  • How will the project use the results?
  • How will DOJ use the results?

Tools: Project statement of how the results will be used to improve the project; DOJ statement of how the results will be used to inform decision-making.

Does the evaluation make sense?

Look for:

  • Is the type of evaluation planned appropriate? Realistic?
  • Is the evaluation plan practical and achievable?
  • Will results be meaningful and credible?
  • Will results be timely?

Tools: Your overall assessment; advice of others.

Considerations

  • Are there more suitable methods that would be better matched to the project?
  • Are there more cost-effective strategies?

 

An Overview of Information/Data Collection Methodologies

There are various types of information or data, and various collection methods. Here’s an overview of some of the most commonly used methods.

Quantitative data

Methods:

  • Closed-question surveys (mail-out, e-mail, web site, telephone)
  • Project records/statistical reviews (client processing information; project dissemination log analysis)

Some advantages:

  • You can gather information from many people, and you can count and measure to produce statistics.
  • You can provide a quick overview of your project's activities (e.g. how many clients you served, how many pamphlets you disseminated, costs per activity).

Qualitative data

Methods and some advantages:

  • Project file or document reviews: You can build an understanding of the context and experiential process from the project record.
  • Literature reviews: You can assess the relevance of your work within the broad stage of knowledge development in the field.
  • Policy reviews: You can situate your work within the broad stage of policy development in the field.
  • Key informant interviews: You can discover the context and meaning of people's experience with the project.
  • Case studies: You can get in-depth information, or a story of what happened and what the results were.
  • Expert panels: You can acquire further knowledge and insights.
  • Focus groups: Like a group interview, you can get collective insight on a specific topic or questions.
  • Dialogue or learning circles: You can gather stakeholders together to share experiences and identify key learnings in a culturally appropriate way.

Guidelines for Tool Development & Examples

There are many different ways to collect project evaluation information, including the compilation of basic statistical information. This appendix briefly describes several of the tools that can be used to evaluate projects – and to determine, in particular, project impacts:

  • Workshop or Conference Evaluation
  • Interviews
  • Surveys
  • Focus Groups
  • Cluster Evaluations

Workshop or Conference Evaluation Feedback Forms

Workshops and conferences bring individuals together to share their experiences, exchange ideas, develop knowledge and acquire new skills. Participant feedback from such events can provide valuable information to determine the very immediate impact of the event. You can also use evaluation feedback forms to get a sense of how people will use the knowledge or skills they acquired at the event. You would need to do further follow-up at a later point in time – such as participant interviews or a survey – to find out whether and how people have applied the knowledge and skills they acquired and how it has affected their work.

What’s Involved?

Before the event: Once you have set your agenda, design a brief feedback form and include it in the participant package. Participants should fill out this form anonymously.

At the event: Have participants fill out the form and hand them in at the end of the event.

After the event: Compile the answers to assess what worked well, what did not work so well, and participants’ suggestions for improvements and/or next steps. Use this information in future work (e.g. future workshops or conferences, follow up steps).

Overall Design

A participant feedback form should:

  • Be one page or less
  • Be printed on coloured paper to stand out
  • Be easy to read and complete
  • Provide space for additional comments
  • Indicate to whom to submit the form
  • Explain how you will use the feedback, and
  • Thank participants for completing the form.
Designing the Questions
  • Ask only a few questions that participants can read and answer quickly
  • Make sure the questions are clearly worded
  • Use either close-ended or open-ended questions depending on the topic (see definitions below).

Close-ended (or closed) questions provide individuals with a set of answers to choose from, such as a multiple choice list of answers, “yes” or “no” boxes to check, or a rating scale to complete.

Open-ended (or open) questions do not provide individuals with a set of answers to choose from – the individual is expected to formulate their own answer, in their own way.

Here are some examples of topics suited to closed questions:

  • Objectives Achievement: To what extent do you think the event’s objectives were met? (Not met, partially met, fully met)
  • Satisfaction: How satisfied were you with the presentations? (not at all satisfied, satisfied, very satisfied)
  • Usefulness: How useful did you find this event to the work you do? (not at all useful, somewhat useful, quite useful)
  • Creature Comfort: How satisfied were you with the facility? (not satisfied, satisfied, very satisfied). How satisfied were you with the food? (not satisfied, satisfied, very satisfied).

Here are some examples of topics that may require “open-ended” questions:

  • Intentions: How will you apply the [knowledge, skills] you acquired at this event?
  • Lessons learned: What was the most important thing… least important thing you learned?
  • Opinions: What do you think about issue/idea/suggestion X?
  • Comments: Do you have any additional comments about this event?

Interviews

Interviewing individuals who have been involved in – or impacted by – a project can provide in-depth and detailed information about their perspectives and experiences.

One-on-one interviews permit individuals to make anonymous comments and express their opinions freely.

Interview data can supplement – and permit a crosscheck of – information obtained from various sources.

Interviews can be conducted in-person or on the telephone.

What’s Involved?

Before conducting the interviews

  • Develop a list of those individuals who will be most knowledgeable. Think about who can best provide the information you need. It may be helpful to develop selection criteria to choose your key informants.
  • Decide what type of interviews you will conduct. The options include, for example, informal conversational interviews, interviews that focus on a list of key topics, or interviews that include a standardized set of (open and/or closed) questions.
  • Prepare an interviewer protocol to familiarize interviewers with the process to be used to contact, book, conduct and report on interviews. Confidentiality is a key issue to be addressed in an interviewer protocol.
  • For standardized interviews, develop an interview guide that contains all of the questions to be asked (include prompts where needed in the interviewer’s version).
  • Prepare an information package to send out to interviewees. The package should include information about the purpose of the interview, background information about the project, and a list of the topics (or the specific questions) that will be asked in the interview.
  • Contact potential interviewees to request their participation. Be clear about issues such as: recording of the interview, confidentiality, how the information will be used, how long the interview is likely to take, and the format (in-person or by telephone).

When conducting interviews

  • Follow the interviewer protocol closely.
  • Be prepared to handle situations such as cancellations, "no shows", and requests for additional information or for copies of the interview notes.
  • Evaluation project managers may want to monitor the first few interviews and review the resulting interview notes to ensure quality control.

After conducting the interviews…

  • Finalize the interview notes according to the protocol.
  • Review each set of notes systematically and synthesize the answers to each of the questions.
  • Analyze the overall results of the interviews.
Overall Design

Interviews should:

  • Be carefully planned
  • Focus on key issues
  • Be as time-efficient as possible
  • Follow a logical sequence, and
  • Provide interviewees with opportunities to ask questions and to provide additional comments.
Developing Interview Questions

Interview questions should be:

  • Clearly stated
  • Brief and to the point
  • Relevant, and
  • Objective.

Here are a couple of examples of interview questions that could be asked of those involved in a newsletter project:

  • How did you (or others in your organization) use the newsletter in your work? [Open-ended]
  • To what extent was the newsletter useful in your work? (Not at all useful, somewhat useful, very useful). [Close-ended]

Surveys

A survey (or questionnaire) is a set of questions that is given to a group of individuals to complete. A survey can be used in a variety of different settings to collect information about the same set of questions from many different people. Surveys may consist of a few brief questions – or they may be more detailed and lengthy.

Although surveys may include either close-ended or open-ended questions (see definitions above), they usually consist primarily of close-ended questions, because these take less time to complete, and the results are easier to analyze statistically.

A survey can be administered in a number of different ways: the questions can be printed and sent (or given) out; an electronic survey form can be emailed out or posted on a web site; or individuals can be asked to respond to a telephone survey.

What’s Involved?

Before conducting the survey….

  • Decide how you will collect the completed surveys and record and analyze the answers.
  • Design the survey (see below)
  • Pilot test the survey with a small group and obtain feedback about the clarity of the questions, the time needed to complete the survey, etc.
  • Refine the survey based on the feedback from the pilot test.

While conducting the survey…

Collect and record/keep track of all completed surveys.

After conducting the survey…

  • Organize the answers to completed surveys and input the data
  • Conduct a statistical analysis (this will require software and some technical expertise)
  • Report on the findings.
Overall Design
  • Use a clear and easy-to-read format (large enough font, enough space for answers, etc.)
  • Provide clear instructions about how to complete the questions
  • Use as few questions as possible
  • Ensure there is a logical flow to the questions
  • If appropriate, develop a coding system to make it easier to input and analyze the data (this will require some technical expertise).
Developing the Questions
  • Keep each question as brief as possible.
  • Ensure each question focuses on only one topic or issue.
  • Use plain language.
  • Avoid biased questions.
  • Provide an “other” category for answers that do not fit elsewhere.

Here are a couple of examples of survey questions (open and closed) that could be asked of those who participated in an expert consultation to develop a research plan:

  • How well does the research plan reflect the priority issues in the field? (Does not reflect the priority issues; reflects some of the priority issues; reflects most of the priority issues; reflects all of the priority issues) (Close-ended).
  • Are there other priority issues that should be reflected in the research plan? (Open-ended).
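As noted above, analyzing close-ended answers usually calls for some software and a simple coding system. The following is a minimal, hypothetical sketch in Python of what coding and tallying the first example question could look like; the numeric codes and the sample responses are invented for illustration, and a spreadsheet or any statistical package could do the same job.

from collections import Counter

# Hypothetical coding scheme for the close-ended example question:
# "How well does the research plan reflect the priority issues in the field?"
CODES = {
    1: "Does not reflect the priority issues",
    2: "Reflects some of the priority issues",
    3: "Reflects most of the priority issues",
    4: "Reflects all of the priority issues",
}

# Hypothetical coded answers from ten completed surveys (one code per respondent).
responses = [3, 4, 2, 3, 3, 4, 1, 3, 2, 4]

counts = Counter(responses)
total = len(responses)

print(f"Completed surveys: {total}")
for code, label in CODES.items():
    n = counts.get(code, 0)
    print(f"{label}: {n} ({n / total:.0%})")

Running the sketch prints a count and percentage for each answer category, which is the kind of simple frequency table most project-level survey analyses start from.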

Focus Groups

A focus group is a type of “group interview” in which a small number of people are asked to provide their perspectives on a specific topic. The group’s facilitator encourages all participants to express their views, but the group is not expected to reach consensus. For evaluators, focus groups can provide diverse perspectives and insights on an issue. The opportunity for group interaction and discussion may stimulate participants to make observations and comments that they otherwise may not have offered.

What’s Involved?
  • Determine who should participate in the focus group – usually the participants will be a group whose shared characteristics or experiences allow them to provide relevant insights and feedback on a specific issue or topic.
  • Invite participants to attend and provide them with sufficient information, e.g. an information package that describes the purpose of the group, the process that will be used, and your expectations. It is important to decide whether or not the focus group participants will be given incentives or honoraria for attending.
  • Focus on logistics, including arranging for a comfortable space, refreshments if needed, etc.
  • Find a facilitator with the right blend of expertise, experience, and skills.
  • Determine whether or not the discussion will be recorded via audio/videotape or note-taking (or both) and advise participants about confidentiality.
Overall Design
  • Determine what the group will focus on and develop the specific questions to be asked.
  • Timing – when will it be easiest for participants to participate (during the day? in the evening?)
  • Find the appropriate (accessible, comfortable) setting.
  • Restrict the number of participants (focus groups usually include 6-8 individuals).
  • Limit the duration of the discussion to 1-2 hours.
Developing Questions
  • The questions to be asked of the group should be determined beforehand.
  • Ask only a limited number of questions (to avoid rushing participants).
  • Avoid controversial or very personal issues, as participants may not be comfortable discussing these in a group.

Here are some questions that might be asked of a small group of practitioners who have been involved in implementing an amended (or new) piece of legislation on specific offences against children:

  • How has the amended/new legislation affected your capacity to address offences against children [the specific ones addressed by the legislation]?
  • How has this amended/new legislation strengthened or weakened the criminal justice system’s response to the victimization of children?
  • How has this amended/new legislation contributed to/hampered the coordination of the criminal justice system’s response to the victimization of children?

Cluster Evaluations

Cluster evaluations look at how well a collection of similar projects meets a particular objective of change. Cluster evaluations are a potential way for the Family Violence Initiative to look across projects to identify common threads, themes and impacts and to identify overall lessons learned.

Some potential goals of a cluster evaluation include to:

  • Identify innovative, good or promising practices.
  • Assess the cluster’s progress towards the stated FVI goals and objectives.
  • Enable implementation adjustments throughout the course of the FVI.
  • Provide evaluation information to inform policy development.

Cluster evaluations are not a substitute for project-level evaluations. A third-party cluster evaluator typically conducts them, and may in part rely on some data collection by project-level evaluators (see the Logic Model Development Guide and the W.K. Kellogg Foundation Evaluation Handbook, p. 17, W.K. Kellogg Foundation).

What’s Involved?
  • Determine which projects have commonalities in project design.
  • Identify what you expect to learn from a cluster evaluation.
  • Invite project participation.
  • Develop evaluation questions based on the expected impacts and outcomes of the FVI as a whole.
  • Establish – and reach agreement with stakeholders – on the terms of reference for the cluster evaluation.
  • Select a cluster evaluator to carry out the evaluation.
Overall Design
  • Determine who will conduct the cluster evaluation, how information will be collected and by whom.
  • Consider the confidentiality provisions (e.g. will projects be identified in the cluster evaluation?).
  • Consider individual project time frames and coordinate with the cluster evaluation time frame.
  • Consider bringing project recipients and evaluators together periodically, to share insights and learn from each other.

Cluster evaluation is a good method for obtaining information on projects that cumulatively are designed to bring about policy or systemic change. Such evaluations can lead to important "lessons learned". This makes cluster evaluation particularly attractive to family violence issue-oriented projects.
