Crimes Against Humanity and War Crimes Program


3.2. Program Design and Delivery (cont'd)

3.2.5. Partnership - Other Federal Departments and International Partners

Summary Findings

Document reviews and interviews conducted for the evaluation indicate that the Program has established effective partnerships with other departments and agencies of the Government of Canada. Relations between DFAIT and the Program are characterized by most interviewees as effective, despite some disagreements on the specific issue of denying entry to some former officials of designated regimes. The Program maintains a continuing dialogue with DFAIT staff on these issues.

Similarly, all data collection methods relevant to this issue (document reviews, interviews with departmental staff, interviews with international partner agencies, and the case studies of other jurisdictions) point to a high level of effectiveness in developing and maintaining partnerships with relevant international organizations.

Evaluation Evidence
Other Government of Canada Departments and Agencies

Program documents provide examples of policy and operational aspects of the Program mandate requiring program units to cooperate with other departments and agencies including DFAIT, the Department of National Defence (DND) and the Canadian Security and Intelligence Service (CSIS). Similarly, Program Annual Reports and operational documents show examples of the necessity of collaboration between the DOJ War Crimes Section, the DOJ International Assistance Group and DFAIT when dealing with, for example, extradition cases.

Similarly, the majority of those interviewed in the four program departments were of the view that the Program had been actively engaged in establishing and maintaining partnerships with other federal departments, including DND, the Public Prosecution Service of Canada (PPSC), the Public Service Commission (PSC), the Privy Council Office (PCO), DFAIT and CSIS. They characterized this engagement as mainly informal in nature, based on the need to share information on specific files or on working relationships arising from joint involvement in cases, as with the PPSC.

Where they had experience of working with the Program, officers of other federal departments often pointed to good cooperation and effective partnership (for example, in the Interdepartmental Working Groups on the ICC and the International Criminal Tribunals for Yugoslavia (ICTY) and Rwanda (ICTR)).

At the same time, a few interviewees noted that there was some scope for improvement in the Program's partnership with, in particular, DFAIT. Difficulties in the partnership have arisen when DFAIT seeks permission for entry to Canada for individuals who are former officials of governments considered to have engaged in gross human rights violations, or who are otherwise suspected of involvement in crimes against humanity or war crimes. Denial of visas to some of these persons (who may now be participating in post-reform democratic governments) may negatively affect Canada's bilateral relationship with the countries involved.

It is important to point out, however, that people who have been involved in crimes against humanity and war crimes remain inadmissible to Canada under existing legislation. Program staff also pointed out that they maintain an open dialogue with DFAIT and have provided crimes against humanity and war crimes training to DFAIT officers in the past year.

International Partners

The most direct evidence of the effectiveness of the Program in developing and maintaining partnerships with international partner agencies can be found in the responses of international interviewees. The majority (14 of 23) responded that Canada has been very effective in establishing and maintaining these partnerships and has been very proactive in doing so. The factors they cited included:

  • The ease of communicating with an integrated program;
  • The willingness of program staff to support and assist other war crimes units through sharing information, policy dialogue, training support and other interventions;
  • The visibility of the Canadian Program (including exposure through participation in working groups and international conferences); and
  • The number of Canadians, some with a background in the Program, who are working with international tribunals dealing with war crimes and crimes against humanity.

One example of the level of close international partnerships given by both departmental staff and international agency representatives was Canada's place as the only non-European Union member of the European Network of contact points in respect of persons responsible for genocide, crimes against humanity and war crimes.

The RCMP also hosted the third annual International Expert Meeting on Genocide, War Crimes and Crimes Against Humanity at its headquarters in Ottawa (jointly with Interpol) - the only time it has been held outside Europe.

Turning to the survey of program staff, 97 percent of respondents indicated the Program had been effective in providing support to international organizations. As examples they cited participation in, and hosting of, international conferences, provision of training to other country staff, provision of information in support of cases, and assignment of Canadian staff to work with ICTY and ICTR, in particular.

The case studies also provided examples of effective international partnership. Three in particular (Criminal Prosecution, Removal Under the IRPA and Denial of Status Under 35(1)(b)) involved collaboration between program staff and authorities of another government or an international tribunal.

Prosecutions require the development of good relations with police and judicial authorities in partner countries and the country of conflict in order to allow witnesses to be interviewed by program staff. The case of removal under the IRPA relied on information supplied by several foreign governments and ongoing communications with staff from Interpol, an international tribunal, the United Nations (UN) and national authorities of several countries. The denial of status case required staff of the DOJ International Assistance Group (IAG) to work closely with authorities of another country to ensure adequate documentation to support extradition.

While these cases do not provide examples of the Program supporting agencies abroad (rather the reverse), they do demonstrate that the Program is able to collaborate effectively with a range of international partners as required. To some extent this ability is based on learning lessons over time (for example, how to secure the permission of local political leaders to investigate specific cases). On the other hand, it can also be linked to goodwill generated by support the Program has provided to, for example, Interpol or the ICTY and the ICTR.

3.2.6. Effectiveness of Training and Other Tools

Summary Findings

The Program faces an important challenge in providing adequate training to staff in all four participating departments. In particular, this seems to be an important problem for staff outside headquarters. There is also an apparent need for strengthening training for staff newly appointed to positions where they deal with crimes against humanity and war crimes issues.

Program manuals and other tools have been demonstrated to be relevant and useful although there is also a need for more frequent updating of some of these tools. In particular, the MWCS has not been kept adequately up to date, mainly because of problems relating to the electronic platform used to host the database.

Evaluation Evidence

The review of selected presentations, other training tools and operational manuals showed that there was a significant level of program-specific training and information dissemination activity being undertaken in the different program departments (especially CBSA and DOJ). The operational manuals reviewed were clearly relevant to the operational requirements of the Program and some were undergoing updating and finalization at the time of the evaluation.

Similarly, during interviews, some respondents from program departments reported that they had access to manuals and guidelines that were helpful (e.g., were recently updated to include an overview of war crimes), while other program staff noted that the material required updating, which was done infrequently because of a lack of resources.

Almost all program department respondents commented that the MWCS was out-of-date and needed improvement, including strengthening links to other systems on organized crime and counter-terrorism.

Some respondents indicated that training was provided, was useful, and had recently been improved (including, for example, more training made available to DOJ staff). However, most respondents noted that there was room for improvement. They cited as examples:

  • A limited number of training sessions, with a limited number of spaces, offered to their organizational unit;
  • More training required for front-line staff (including training in screening visa applications for visa officers, training in preparing and presenting cases before the Immigration Division, training in the use of online databases and Web-search methods, and training in investigative methods for supporting immigration remedies);
  • The need for an annual conference among practitioners within the partner departments as well as DFAIT and DND;
  • The absence of a solid introductory course for new staff; and
  • The need for enhanced training in French.

The single most frequent training need mentioned was the need to provide specialized training to staff assuming positions where they are engaged with the Program for the first time, especially CIC and CBSA staff in posts abroad and CBSA staff in regional offices in Canada (also including RCMP Liaison Officers at posts abroad).

Respondents to the online survey tended to rate training as one of the weakest program elements. Over one-third (37 percent) rated training as very or somewhat inadequate. Similarly, when asked if they agreed that training was adequate to support their work in addressing war crimes, 47 percent of survey respondents disagreed.

In order to make sure that the fairly negative survey responses on training were not confined to a single department, the evaluation ran cross-tabulations of the responses to both questions by department. The cross-tabulations indicated that DOJ representatives were less likely to rate training negatively.[12]

Table 6: Cross-tabulations of Negative Responses to Adequacy of Training by Department (Q7 & Q10)
Training somewhat or very inadequate (Q7): 39%, 63%, 18%, 44%
Disagree or strongly disagree on adequacy of training (Q10): 59%, 50%, 26%, 46%

When the responses are compared between those stationed at headquarters (or in units in Ottawa) and those working in regional offices or posts abroad, there is a considerable difference. Only 28 percent of headquarters staff rated training as somewhat or very inadequate compared with 48 percent of respondents from regional offices in Canada or from posts abroad. This difference is much more noticeable than any noted across the four departments taken as a whole. It may result from the fact that many overseas and regional staff had been recently appointed to their current posts (in comparison to their colleagues at headquarters) and had not yet received training.
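The cross-tabulation step described above (flagging negative ratings and breaking them down by department) can be sketched in a few lines. The response data below is hypothetical and stands in for the survey's actual answers; it is an illustration of the method, not the evaluation's data.

```python
from collections import Counter

# Hypothetical (department, Q7 rating) pairs -- NOT the actual survey responses.
responses = [
    ("CBSA", "somewhat inadequate"), ("CBSA", "adequate"),
    ("CIC", "very inadequate"), ("CIC", "adequate"),
    ("DOJ", "adequate"), ("DOJ", "adequate"),
    ("RCMP", "somewhat inadequate"), ("RCMP", "very inadequate"),
]

# Count respondents and negative ratings per department, then express the
# negatives as a share of that department's respondents -- the cross-tab
# the evaluators describe.
totals = Counter(dept for dept, _ in responses)
negatives = Counter(dept for dept, rating in responses if "inadequate" in rating)
shares = {dept: negatives[dept] / totals[dept] for dept in totals}
print(shares)
```

The same breakdown by headquarters versus regional/overseas posting only requires swapping the department label for a location label in the pairs.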

Finally, when asked if resources for training and professional development were adequate, 58 percent of respondents (from all program departments) rated them as very or somewhat inadequate. Some interviewees linked this lack of financial resources to the absence of funds for training in the most recent Program renewal decision.

Other challenges in the area of training noted during key informant interviews or listed as comments by survey respondents included:

  • A limited number of training sessions, of limited duration, available especially for front-line staff;
  • A need for more systematic introductory training, especially for newly posted staff;
  • A tendency for war crimes issues and procedures to be included only as very brief segments or modules in broader training packages; and
  • Inconsistent training material across regional offices, even within the same department.

In contrast to the interview responses and survey results, the case studies provide examples of how manuals, guidelines and other tools developed to guide departmental investigation and immigration enforcement processes are used at different steps depending on the remedies sought. The RCMP War Crimes Section, for example, now relies on the Major Case Management System used in all complex investigations to guide its approach to war crimes investigations. Cases 3, 4 and 5 all illustrate how CBSA staff made effective use of Enforcement Manuals at different stages in the process.

3.2.7. Utilizing Performance Data for Program Management and Policy

Summary Findings

There are concrete examples of the Program making use of performance data on allegations to make changes to operational policy and managerial processes. The Annual Reports also provide much of the necessary performance data on program outcome for the remedies available. At the same time, however, there are serious gaps in the performance monitoring system relating to information on training, education and outreach activities.

There is also an opportunity to strengthen operational management through investments in improving and integrating existing CBSA/CIC databases including NCMS and MWCS to provide performance information on interim outputs of immigration remedies.

Evaluation Evidence

The Program's overall accountability structure is detailed in the Result-Based Management and Accountability Framework (RMAF) for the Crimes Against Humanity and War Crimes Program (March 2006) prepared collectively by the four participating departments and published by DOJ.

The RMAF makes it clear that the primary mechanism for program performance reporting is the Annual Report providing the Canadian public with information on activities and results each year. The Annual Report presents information from the Modern War Crimes Inventory along with a summary of WWII cases and supplementary data from CBSA and CIC on outcomes relating to immigration remedies.

The RMAF goes further, however, and identifies four key elements of program performance to be monitored:

  1. Changes in governance and their contribution to program organization and effective use of resources;
  2. Efficient coordination to be monitored through a five-year operational plan approved by PCOC;
  3. Collaborative training and outreach activities; and
  4. The allegation management process and its outcomes to be monitored to identify best practices or lessons learned which can be incorporated into policies and procedures guiding the handling of allegations.

The RMAF also sets out targets and indicators for each of the four elements of performance and assigns responsibility for data collection to PCOC and the four departments. There is no single program performance report summarizing data on all indicators on a regular basis.

On the other hand, many indicators relating to Element 4, allegation management, are reported in the Program Annual Report. Similarly, the Annual Report and PCOC meeting minutes (and concept papers reviewed by PCOC) often review information on Element 1, program governance, and Element 2, coordination. It can also be argued that issues of governance and coordination are most effectively dealt with through periodic evaluations rather than ongoing monitoring.

There are also concrete examples of the use of allegations-related data to make policy changes and to alter program processes including the File Review Policy Adjustment approved by the War Crimes Operations Committee in September 2005 and a second File Review Policy Adjustment in 2006, as discussed in Section 3.2.2 above. The stated purpose of those changes in policy and in the file review process was to streamline the allegation management process by gathering more information prior to a decision by PCOC on adding or retaining allegations to the RCMP/DOJ inventory. By applying more stringent file review criteria, the changes were also aimed at achieving a better match between limited investigative resources and high-priority cases with the potential for prosecution.

The program performance element of the RMAF with the lowest level of coverage in program documents and reports is Element 3, efficient and effective training, education, outreach and international activities. Program departments were able to give specific examples of all indicators listed for this area: manuals and tools developed, information requests met, training courses and sessions held, conferences attended and knowledge transfer activities. What is missing is an analysis of the volume of these activities and any systematic effort to track their effectiveness through, for example, pre/post course knowledge assessments or follow-up contacts with trainees.

Most interviewees (especially from the RCMP and DOJ War Crimes Sections) felt strongly that the most important program performance information is readily available in the Annual Reports. They added that it has been used to change policy to deal more rationally with the inventory of cases and to strengthen case management and the process of allocating cases to different remedies. For DOJ and RCMP staff interviewed, the available data is sufficient to allow them “to manage the inventory in a cost-effective way.”

Some CBSA and CIC staff interviewed indicated an operational need for more detailed monitoring data on case loads and interim program processes such as processing times for background checks on visa requests involving war crimes allegations. A few pointed to problems in national immigration data systems as a platform to build performance data for the Program. As an example, they cited the National Case Management System (NCMS) and indicated it did not permit them to identify and track the volume of war crimes cases efficiently at a regional level.

[12] Care must be taken in interpreting this data given the very small numbers of respondents in each cell. Since the survey was not a random sample of staff but a directed survey aimed at staff with operational expertise, statistical validity cannot be assessed.
