Informal Conflict Management System Evaluation
1. INTRODUCTION
1.1 Purpose of the Evaluation
The purpose of this study is to conduct a national evaluation of the Department of Justice’s Informal Conflict Management System (ICMS). The evaluation assesses the short-term results of the ICMS Implementation/Process including, but not limited to, whether:
- the activities have been implemented as intended;
- the ICMS has been administered efficiently and cost effectively;
- improvements are required;
- the program is well targeted and reaching the intended beneficiaries; and
- the short-term intended results have been achieved.
The evaluation questions are grouped under three issues: relevance, achievement of expected outcomes, and economy and efficiency. The specific evaluation questions under each issue are listed below.
Evaluation Issues
- Relevance
Does the program area or activity continue to serve the public interest? Is there a legitimate and necessary role for government in this program area or activity?
- Achievement of Expected Outcomes
- How effective has ICMS been in enhancing knowledge of where to obtain conflict management services? To what extent has there been enhanced accessibility and usage of conflict management services?
- How effective has ICMS been in enhancing awareness of alternative ways to manage conflict? To what extent have these alternative ways of managing conflict been applied?
- How effective has the ICMS been in working with partners to increase recognition of their roles and their visible dedication to sustaining the progress of the ICMS?
- To date, how successful has the program been in shifting toward a collaborative workplace culture that is more open to and effective in resolving conflict?
- What further progress is the program likely to achieve over the next three to five years?
- Economy and Efficiency
- Is the design and operation of the ICMS program cost effective? Are the resources used efficiently?
- What improvements could be made to program design or delivery? What alternatives exist to ICMS?
1.2 Evaluation Methodology
The evaluation methodology consisted of surveys and interviews as well as a document and literature review, as described below.
- An on-line survey for departmental employees. An e-mail letter, in both official languages, was distributed to approximately 4,500 departmental employees across Canada via JustInfo, inviting them to participate in an anonymous on-line survey. The purpose of the on-line survey was to obtain information on awareness, use and impacts of the services delivered by the ICMS. Two hundred and seventy-six respondents completed the on-line survey, representing a response rate of about 6%.
- Interviews with senior ICMS management overseeing the program. The purpose of these interviews was to obtain a further description of how various elements of the program have been implemented and to receive input on program relevance, achievement of expected outcomes, and economy and efficiency.
- Interviews with 10 ICMS Partners from a cross-section of units within the Department, including Labour Relations, Strategic Planning and Performance Management, Computing Science and Aboriginal Law. A breakdown of those interviewed by region is provided below.

| Region | Number of Partners |
|---|---|
| Headquarters | 4 |
| Alberta | 2 |
| Ontario | 2 |
| Quebec | 1 |
| Atlantic | 1 |
The purpose of these interviews was to obtain input regarding the relationships between ICMS and these partners including the roles of the partners in the program, the progress made towards achievement of the expected outcomes, and the opportunities for improvement. For the purpose of this report, the term key informant is used when the responses of ICMS partners and ICMS senior managers are reported together.
- Literature review. The purpose of this review was to obtain information regarding the need for the program, including whether it represents a legitimate and necessary role for government as well as best practices which may represent potential improvements to program design and delivery.
- Interviews with representatives of similar ICMS programs in seven other federal government departments. The purpose of these interviews was to obtain comparable data on the programs delivered by other federal government departments as well as identify best practices that could represent potential improvements to program design and delivery.
Questionnaires and interview guides are provided in Appendix A.
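As a quick check, the response rate quoted above follows directly from the figures given in Section 1.2 (a sketch, not part of the evaluation itself):

```python
# Response rate for the on-line employee survey, from the figures above.
invited = 4500    # approximate number of employees reached via JustInfo
completed = 276   # completed on-line surveys

response_rate = completed / invited
print(f"Response rate: {response_rate:.1%}")  # prints "Response rate: 6.1%"
```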
1.3 Challenges and Limitations
Three challenges and limitations that should be considered when reviewing the evaluation results are the incomplete data available on the services delivered, the limited data available on program delivery costs, and the potential non-response error associated with the employee survey. The impacts of these challenges and the steps taken to mitigate them include:
- Incomplete data collected by the program. The ICMS reported data on the number of direct services delivered but noted that not all interactions were recorded; the figures therefore under-represent true activity levels. Furthermore, no data has been collected on the number of training and awareness sessions held or the number of participants in those sessions. As a result, there is no clear measure of the program's reach. To respond to this limitation, the evaluation included questions about usage of services in the employee survey and questions about activity levels in the program representative interviews.
- Limited data available on program delivery costs. The program provided information on program budgets and staffing levels. However, the merger of the ICMS into the Office for Integrity and Conflict Management in the Workplace (OICMW) resulted in the sharing of resources and consequently, data is not available on the level of resources (human and financial) allocated to various program activities, thereby making it more difficult to assess program economy and efficiency. To respond to this limitation, qualitative questions on program efficiency were included in the key informant interviews. In addition, the available data on staff levels, program budgets and direct services delivered was used to compare the Department’s ICMS to similar programs in other federal government departments.
- Potential non-response error in the survey of Justice employees. Given the self-selected nature of the survey and a response rate of 6%, there is a concern that the characteristics of respondents may differ from those of employees who did not respond. To assess the nature of the potential non-response error, the table below compares the characteristics of survey respondents to the total population of departmental employees as reported in the March 31, 2008 Department Demographic Profile.
As indicated below, employees from the National Capital Region (NCR) were over-represented in the employee survey while members of the Law Group (LA) appear to be under-represented, although this may be a function of differences in how the job classifications were interpreted. Aboriginal peoples were over-represented, while members of visible minority groups were somewhat under-represented.
| Profile | Categories | Survey Respondents: Number | Survey Respondents: Percent | Employee Population[1]: Employed | Employee Population[1]: Percent |
|---|---|---|---|---|---|
| Area of Work | Regional Offices | 58 | 21% | 2,014 | 43% |
| | National Capital Region | 203 | 74% | 2,682 | 57% |
| | Departmental Legal Services Unit (DLSU) | 15 | 5% | N/A | N/A |
| | Total | 276 | 100% | 4,696 | 100% |
| Gender | Female | 195 | 71% | 3,170 | 67% |
| | Male | 79 | 29% | 1,526 | 33% |
| | Total | 276 | 100% | 4,696 | 100% |
| Employment Equity Designated Groups | Visible Minorities | 22 | 8% | 547 | 12% |
| | Aboriginal Peoples | 29 | 11% | 156 | 3% |
| | Persons with Disabilities | 14 | 5% | 238 | 5% |
| | Total | 65 | 24% | 941 | 20% |
| Job Classification | Administration and Foreign Service | 27 | 10% | 878 | 19% |
| | Technical/Operational | 35 | 13% | 314 | 7% |
| | Administrative Support | 51 | 19% | 853 | 18% |
| | Executive | 12 | 4% | 38 | 1% |
| | Scientific and Professional | 43 | 16% | 102 | 2% |
| | LA (Law Group) | 72 | 27% | 2,511 | 53% |
| | Other[2] | 29 | 11% | n/a | n/a |
| | Total | 269 | 100% | 4,696 | 100% |
| Are you in management? | Yes | 52 | 19% | N/A | N/A |
| | No | 217 | 78% | N/A | N/A |
| | Other[3] | 7 | 3% | N/A | N/A |
| | Total | 276 | 100% | 4,696 | 100% |
Response rates tended to be higher among employees in the NCR, where most of the program services have been delivered. Twenty-one percent of respondents were based in the British Columbia, Ontario, Quebec, Atlantic Canada and Prairie Regional Offices.
Figure 1: Response Rate of the ICMS Survey
It is expected that those who were familiar with the ICMS and had used the services would be more likely to respond. As such, the survey results overstate the use of ICMS services and likely overstate awareness of the program. To respond to this limitation, the survey results were (1) cross-tabulated by respondent characteristics and (2) compared to available information on the numbers of services provided to assess how the non-response error may have affected results for particular questions. This is further described in the Major Findings chapter (Chapter 3).
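The over- and under-representation described above can be checked directly from the table's figures. A minimal sketch (not part of the evaluation itself; the group names and counts are taken from the table above):

```python
# Compare each group's share of survey respondents to its share of the
# employee population; a ratio above 1 means the group is over-represented.
profile = {
    # group: (survey respondents, employee population)
    "National Capital Region": (203, 2682),
    "Regional Offices": (58, 2014),
    "Aboriginal Peoples": (29, 156),
    "Visible Minorities": (22, 547),
    "LA (Law Group)": (72, 2511),
}

SURVEY_N = 276       # completed on-line surveys
POPULATION_N = 4696  # departmental employees, March 31, 2008 profile

for group, (respondents, employed) in profile.items():
    survey_share = respondents / SURVEY_N
    population_share = employed / POPULATION_N
    ratio = survey_share / population_share
    print(f"{group}: survey {survey_share:.0%} vs "
          f"population {population_share:.0%} (ratio {ratio:.2f})")
```

For example, the NCR ratio works out to about 1.29 (74% of respondents versus 57% of employees), while Aboriginal peoples show a ratio of roughly 3.2 (11% versus 3%), consistent with the over-representation noted above.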
1.4 Structure of the Report
The remainder of this report is divided into four chapters:
- Chapter 2 provides a description of the objectives, activities, delivery structure and budget of the ICMS.
- Chapter 3 provides an overview of the major findings of the evaluation.
- Chapter 4 summarizes the major conclusions arising from the evaluation.
- Chapter 5 presents the recommendations and management response.