Executive Summary - Canadian Heritage


2 Methodology

2.1 Evaluation Design

2.1.1 Summative and Formative Elements

The evaluation design and questions were based in part on the Evaluability Assessment of the Aboriginal Peoples’ Program. The evaluation drew on quantitative and qualitative evidence from multiple lines of evidence to support its findings and conclusions.

2.1.2 Understanding Results, Outputs and Outcomes

The document and file reviews found that the terms results, outputs and outcomes were used with different meanings to describe APP program and project achievements. This report uses the following Treasury Board Secretariat definitions5:

  • Expected Result: An outcome that a program, policy or initiative is designed to produce. Treasury Board policies and guidelines discuss results as immediate, intermediate and final or ultimate outcomes.
  • Outputs: Direct products or services stemming from the activities of an organization, policy, program or initiative, and usually within the control of the organization itself. Examples of outputs are completed radio broadcasts, cultural awareness and sensitivity training, and the delivery of program services and support, all of which are delivered to program beneficiaries.
  • Outcome: An external consequence attributed, in part, to an organization, policy, program or initiative. Outcomes may not always be within the complete control of a single organization, policy, program or initiative; rather, an organization’s contribution to a result, or its capacity to influence others to achieve a result, may also be important. Outcomes can be qualified as immediate, intermediate or ultimate (final), expected, and/or direct.

2.1.3 Methodology and Lines of Evidence

The following multiple lines of evidence were used for the evaluation and are detailed in the sections below:

  • A document review of program documentation to gain increased familiarity with the APP, its components and programming elements;
  • A literature review of evidence-based research on Aboriginal demographic trends, urban Aboriginal people and youth issues, women’s issues, and Aboriginal languages and cultures;
  • Seventy-five (75) key informant interviews6;
  • A review of 164 project files covering all APP components and programming elements;
  • A review of the National Association of Friendship Centres’ (NAFC) databases and PCH’s APP Reporting Database;
  • Eighteen (18) case studies covering all APP components and elements; and
  • A comparative review of a comprehensive federal Aboriginal program.

2.1.4 Document Review

The document review examined a comprehensive list of documents including:

  • Official program documents, instruments and tools;
  • Independent reports of the APP initiatives;
  • Annual reports for delivery organizations;
  • Terms and Conditions of the APP and its components;
  • Evaluations and review reports of APP components and programming elements;
  • The Evaluability Assessment, including its literature review report;
  • Monitoring and reporting templates; and
  • Studies conducted or commissioned by the PCH.

The document review was successful in providing background and supporting information, as well as answering some of the evaluation questions and sub-questions. The review of previous evaluation reports, most of which predated the 2005 consolidation of the APP, did not answer the evaluation questions about the implementation of the APP.

2.1.5 Literature Review

The literature review sought to collect evidence-based research on Aboriginal demographic trends, urban Aboriginal people and youth issues, women’s issues, and Aboriginal languages and cultures, both in Canada and internationally. The literature review incorporated a previous review undertaken for the Evaluability Assessment.

While the document review and literature review addressed many of the same evaluation questions, the literature review also produced demographic information and information on Aboriginal languages. It found no published academic research on the APP or its programming elements.

2.1.6 Key Informant Interviews

PCH identified 75 key informants (individuals and organizations):

  • 15 officials from PCH7;
  • 15 Third-party delivery representatives (at national or provincial/territorial level);
  • 30 Ultimate Recipients, including one representative for each of the territorial governments; and
  • 15 Potential Ultimate Recipients8.

Interviews were completed with all 75 key informants, or with replacements when the original informants were unavailable or no longer associated with the project9.

2.1.7 Project File Review

A stratified random sample of 175 funded project files was identified. The file review was augmented by discussions with NAFC and PCH staff to obtain a full understanding of the files and differences between projects. Selected project files were distributed across the projects in each APP programming element and across the full evaluation timeframe.

The final file review included 164 of the 175 selected project files, as eleven (11) files were received too late to be included. Twenty (20) of the 80 evaluation sub-questions were addressed by the file review.
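As a purely illustrative aside, the stratified selection described above (project files distributed across programming elements and drawn roughly evenly from each of the four fiscal years) can be sketched as follows. The file records and field names here are hypothetical; the actual sampling frame and procedure used by the evaluators are not reproduced in this report.

```python
import random

# Illustrative sketch only: draw a fixed total of files, split as evenly
# as possible across fiscal years (hypothetical record structure).
def stratified_sample(files, total=175, years=(2005, 2006, 2007, 2008)):
    """Return ~total files, with the quota spread evenly across years."""
    base, extra = divmod(total, len(years))
    sample = []
    for i, year in enumerate(years):
        quota = base + (1 if i < extra else 0)  # spread the remainder
        pool = [f for f in files if f["year"] == year]
        sample.extend(random.sample(pool, min(quota, len(pool))))
    return sample
```

With 175 files over four years, each year contributes 43 or 44 files, i.e. about 25% of the sample, which is the proportion the evaluation aimed for.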

Table 6: Project File Distribution by Selected Programming Element

Programming Element                                 | Total (%) | PCH (%) | NAFC (%) | Other (%)
----------------------------------------------------|-----------|---------|----------|----------
Aboriginal Languages Initiative (ALI)               | 12.8      | 19.4    |          |
Aboriginal Women’s Programming Element (AWPE)       | 7.9       | 14.5    |          | 30.8
National Aboriginal Broadcasting (NAB)              | 3.7       | 9.7     |          |
National Aboriginal Day (NAD)                       | 1.2       | 3.2     |          |
Aboriginal Friendship Centres (AFC)                 | 22.0      |         | 48.6     |
Territorial Language Accord (TLA)                   | 1.8       | 4.8     |          |
Urban Multipurpose Aboriginal Youth Centres (UMAYC) | 50.6      | 48.4    | 51.4     | 69.2

Project files were selected to ensure that about 25% of the project files were from each of the four (4) years of the evaluation. Table 7 below shows the distribution of the selected project files across the four (4) years covered by the evaluation.

Table 7: Project File Distribution by Fiscal Year

Fiscal Year | Total (%) | PCH (%) | NAFC (%) | Other (%)
------------|-----------|---------|----------|----------
2005 / 2006 | 26.8      | 29.0    | 20.3     | 46.2
2006 / 2007 | 24.4      | 14.5    | 32.4     | 23.1
2007 / 2008 | 25.0      | 24.2    | 24.3     | 23.1
2008 / 2009 | 23.8      | 32.3    | 23.0     | 7.7

2.1.8 Database Review

The focus of the database review was to determine whether the databases contained information appropriate for PCH program-level policy and management purposes. The database review included the examination and analysis of two (2) of the National Association of Friendship Centres (NAFC) databases and PCH’s APP reporting database10.

NAFC’s AFC database houses information about the Aboriginal Friendship Centres (AFCs). NAFC is a Third-party Organization that contracts Ultimate Recipient Organizations to deliver UMAYC projects, mostly within AFCs. PCH staff developed the APP database in 2008 with the expectation that it would contain project results and other information needed to support the policy and program management of the APP. The APP database contained electronic files for 76 (46.3%) of the 164 reviewed project files.

2.1.9 On-site Case Studies

Eighteen (18) case studies helped gather illustrative information on the APP, its components and programming elements. The case study methodology sought to have the Ultimate Recipient Organizations identify their project results, best practices and lessons learned.

2.1.10 Comparative Review

The purpose of the comparative review was to compare APP to another federal government-funded comprehensive Aboriginal program. The comparison looked for similarities and differences in the regrouping of programming elements, challenges faced by the program during the transition phase, issues arising from the integration process, impact on program performance as a whole, and achievement of expected results.

The Children and Youth Cluster of Health Canada’s First Nations and Inuit Health Branch was the only cluster of federally funded Aboriginal programs identified that had recently been consolidated into a larger program in a manner similar to the APP. Discussion with the Evaluation Steering Committee resulted in an in-depth comparative review of that cluster of programs.

2.2 Methodological Limitations and Adjustments

2.2.1 Limitations

Time Frame to Realize Outcomes Achievement

The complexity of the Program, the holistic character of its approach and the length of time required to realize specific outcomes limited the summative information that could be provided for the APP. Many factors have a bearing on outcomes, including such considerations as the social and economic conditions in which participants live. As the APP continues to mature and adjust to its transition from multiple programs to a single entity, the program will have an opportunity to refine its approach to measuring its immediate, intermediate and longer-term outcomes.

Program Beneficiaries Interviews

Interviews or surveys of program beneficiaries11, Canada’s urban Aboriginal Peoples, were not part of the evaluation methodology. This decision was based on recommendations made in the Evaluability Study12, which pointed to the challenges of surveying and consolidating responses from the many types of APP recipients (e.g., Aboriginal Languages Initiative, Family Violence Initiative, Northern Aboriginal Broadcasting) and to consideration for these organizations in terms of the process involved. This decision removed a potential source of information and a direct line of evidence regarding the level of achievement of some of the expected outcomes.

2.2.2 Adjustments

Informal Discussions with APP Staff Added

The addition of formative issues to the evaluation required the gathering and analysis of additional qualitative and quantitative information. Because this had not been anticipated when the key informant interview guides were developed, informal discussions were held with a limited number of PCH key informants.

Key Informant Interviews Expanded

The list of key informant interviews was expanded with the addition of 15 potential Ultimate Recipient Organizations to address the question: Is the level of program access appropriate? Potential Ultimate Recipient Organizations are Aboriginal organizations that have not yet received funding, directly or indirectly, to deliver an APP funded project.

Expert Panel Dropped

The methodology consisted of identifying up to six (6) individuals, mostly from the academic community, to provide an additional line of evidence from an unbiased source of expert opinion. As the evaluation progressed, and with limited information for the expert panel to consider, the Evaluation Working Committee decided to forego consultations with the expert panel.

Comparative Review

The original statement of work foresaw two (2) or three (3) comparative reviews. A review of Aboriginal programming in the federal government revealed that only the First Nations and Inuit Health Branch of Health Canada had recently completed a restructuring and integration of its health programs. This one comparative review was completed in more detail than initially planned.

5 Treasury Board of Canada, Evaluation in the Government of Canada; the New Treasury Board Policy on Evaluation; Results-based Management Lexicon, April 1, 2009.

6 The number of key informant interviews was increased from 60 to 75 with a new group of unfunded organizations added to address the ‘accessibility to project funding’ evaluation sub-questions.

7 Most senior AAB officials and staff based in Ottawa, as well as PCH staff in regional offices delivering APP programming, were interviewed.

8 The original plan identified 60 key informants. The revised evaluation plan included 15 additional potential Ultimate Recipient key informants. PCH identified 25 organizations to contact, and the evaluators randomly selected from among those organizations.

9 Some of the key informants identified for the Ultimate Recipient Organizations were also interviewed a second time for the case studies. This occurred nine times.

10 The database review for NAFC covered all the evaluation period while the database review for APP covered fiscal years 2007/08 and 2008/09, the years for which project information is contained within the APP database.

11 The National Association of Friendship Centres was interviewed for this evaluation study. It serves as both a beneficiary and a third-party delivery agent.

12 Prairie Research Associates (PRA) Inc. (2009) Evaluability Assessment of the Aboriginal Peoples’ Program Final Report, Prepared for Department of PCH, January 13, 2009.
