Participant Funding Program Evaluation Final Report

Ready for QA

Oct 27, 2015

Note: Highlighted content indicates that text needs to be modified and/or approved.

Issues

Outstanding

  1. Text inside image
    1. Text content exists inside image.
    2. Affected content: Table 6
    3. Recommendation: Please supply as a table (text).
  2. Using colour to convey information
    1. This table uses colour exclusively to convey textual information. Colour should be used only to enhance the display of information.
    2. Affected content: Table 7
    3. Recommendation: The yellow, green and orange cells could include the words “planning”, “conducting” and “reporting”, respectively. The grey cells could have a symbol or text which signifies to the reader what a grey box means. In the former case, a table footnote should be added to explain what the symbol means (in words).
  3. Complex images should contain a long description
    1. These are complex images/charts/graphs. A long description is required to identify the information contained in the image that may not be available to those using assistive technologies.
    2. Affected content: Figure 1, Figure 2, Figure 3, Figure 4, Figure 5, Figure 6, the logic model flowchart
    3. Recommendation: Provide a long description (ideally a table depicting the values found in the graphs).
  4. Colour contrast issues
    1. Backgrounds make the data less readable to those with visual impairments.
    2. Affected content: Figure 2, Figure 4, Figure 5, Figure 6
    3. Recommendation: Please supply images without backgrounds.
  5. Use of emphasis – underlining
    1. Emphasis using underlining is not permitted, as it can be confused with a hyperlink.
    2. Affected content: Page 31/94
    3. Recommendation: Will be removed.

Completed

  1. Table contains no tabular data
    1. These are not true tables – tables contain tabular data
    2. Affected content: Table 1, Table 2
    3. Recommendation: Convert content to an unordered (bulleted) list

Executive summary

This report presents the findings, conclusions and recommendations of the Canadian Nuclear Safety Commission (CNSC) Participant Funding Program (PFP) evaluation. The PFP evaluation examines the program’s relevance, effectiveness, efficiency and economy during the period from fiscal year (FY) 2011–12 to 2013–14. As the program is essentially in its infancy, this evaluation focused on the achievement of immediate and intermediate outcomes as well as program delivery. The evaluation meets the requirements of the Financial Administration ActFootnote 1, the 2006 Government of Canada (GC) Policy on Transfer PaymentsFootnote 2 and the 2009 Treasury Board Secretariat (TBS) Policy on EvaluationFootnote 3, all of which require an evaluation of the relevance and performance of grants and contributions every five years. The evaluation was undertaken between September 2014 and April 2015.

Program context

Canada’s 2010 federal budget, as implemented in the Jobs and Economic Growth Act, gave the CNSC the authority to create a Participant Funding Program. The PFP was established to give the public, Aboriginal groups and other stakeholders the opportunity to request funding from the CNSC to participate in its regulatory processes.

The PFP demonstrates the CNSC’s continued commitment to meaningful public and Aboriginal participation in nuclear review processes, while strengthening regulatory performance and protecting the environment.

The PFP’s objectives are:

  • to enhance Aboriginal, public and stakeholder participation in the CNSC environmental assessment and licensing process
  • to help stakeholders bring valuable information to the Commission through informed and topic-specific interventions related to aspects of environmental assessments and licensing

The PFP is available to eligible stakeholders whose proposed activities are related to aspects of environmental assessment and/or a licensing action for major nuclear facilities (e.g., uranium mines, nuclear power plants or nuclear waste facilities).  Funding may also be available for CNSC proceedings that are of significant interest to the public or to Aboriginal groups.

The maximum amount of funding available for each proceeding/project depends on potential direct impacts and public interest, as well as a number of other related factors.

The total PFP budget and resources examined for this evaluation (FY 2011–12 to 2013–14) was $2,800,000. For this period, the PFP spent $505,944.

Methodology

This evaluation was conducted in accordance with the Treasury Board Policy on Evaluation, April 1, 2009, and addresses its core evaluation issues: consistency with federal roles and responsibilities, alignment with government priorities, continued need for the program, achievement of expected outcomes, and demonstration of effectiveness, efficiency and economy.

The evaluation used multiple lines of evidence and complementary research methods to ensure the reliability of the information and data collected. The six main lines of inquiry employed in this evaluation were:

  • document review
  • literature review
  • interviews
  • data analysis
  • benchmarking
  • surveys

Conclusions

Relevance

Document reviews indicate that the program is aligned with government direction, roles and responsibilities, as well as with the mandate and outcomes of the CNSC.

Both internal and external stakeholders agree that the program continues to meet a demonstrated need. Recipients of the program are satisfied that the program is responsive to their needs and is delivered in a timely manner.

One of the major concerns of the program’s management has been to increase the reach (take-up) of the program, particularly to scientific/academic communities. The evaluation indicated that program management has taken effective steps to increase the reach of the program. While more could be done, the resources allocated to the program are quite limited; the effect on dedicated resources should be considered in any potential expansion.

Effectiveness

Effectiveness outcomes

There is general agreement between internal stakeholder groups that the value of public participation is recognized within program objectives and priorities.

Program communication efforts are extensive, and management has revamped the program to attract more applicants. Participants generally agree that the program is accessible, responsive and fair, and that the Funding Review Committee is effective. However, the Funding Review Committee could benefit from feedback to improve its recommendations in future hearings.

Interventions provided to the Commission by participants have improved in quality over the first few years of the program. There are views that greater scientific/ academic input is needed, but in general there is consensus that the program is value-added for CNSC Commission members. Quantitatively, PFP-funded interventions foster equal or greater dialogue at CNSC hearings in comparison to non-funded interventions.

Design and delivery outcomes

The program is designed and delivered effectively; outputs are well connected to outcomes. While some suggestions for improvement were offered, no significant barriers exist to delivery that would require major changes to the Treasury Board Secretariat (TBS) approved terms and conditions.

There is some evidentiary support that expansion would improve uptake, particularly for scientific experts. There is also support for increased customization and flexibility in how the PFP is opened.

Efficiency and economy

The program operates with modest resources: the equivalent of 1 FTE and a small dollar amount to compensate Funding Review Committee members. Further reductions in effort would likely impact effectiveness.

The program is comparable to other similar programs (National Energy Board [NEB], Canadian Environmental Assessment Agency [CEAA]) in terms of efficiency indicators (unit of output/unit of input).

There is evidentiary support that the program achieves economy through the Funding Review Committee and approval processes (reductions in the amount of funding applied for and received).

Recommendations

  1. It is recommended that a long-term, strategic focus be adopted for the program, with funding opportunities made available far enough in advance to support additional value‑added activity (e.g., research).

    Rationale

    This would have the expected effect of:

    • allowing longer openings and greater access by researchers
    • integrating PFP activity with other recent initiatives (e.g., the CNSC’s outreach program)
    • allowing improved communication of long-term need by and to affected communities and target groups (as opposed to a hearing-by-hearing basis)
    • communicating potential needs to participants far enough in advance that they can position themselves effectively, and informing decisions on when and whether program opening periods should be scheduled
    • allowing projection and testing of the impact of expanded reach on resource allocation
  2. It is recommended that increased feedback be provided to the Funding Review Committee after final CNSC funding decisions are rendered.

    Rationale  

    Feedback from interviews of the Funding Review Committee members indicates that they could use clearer guidance and feedback to improve their recommendations, particularly on appropriateness of professional fees.

Management action plan

A management action plan was developed based on the two recommendations of the evaluation. The management action plan outlines the planned actions, the responsibility centre for conducting the actions, the expected date of completion and the measures of achievement. See Appendix A for details on the management action plan.

1 Introduction

This report presents the findings, conclusions and recommendations of a program evaluation of the CNSC Participant Funding Program. The evaluation examined the program’s relevance, effectiveness, efficiency and economy during the period FY 2011–12 to 2013–14. As the program is essentially in its infancy, the evaluation focused on the achievement of immediate and intermediate outcomes as well as program delivery. The evaluation was undertaken between September 2014 and April 2015.

To reflect the early stages of implementation, the evaluation was more formative (e.g., covering program design, implementation and delivery) than summative (e.g., achievement of longer-term outcomes). All of the evaluation issues were assessed in a balanced manner, and recommendations were provided to support continuous improvement of the PFP.

Initial consultation with some internal stakeholders identified a number of management areas of interest. Some of the management areas of interest were in response to the PFP management reviewFootnote 4 conducted in 2013. As of the last CNSC Management Committee updateFootnote 5 provided by the PFP program manager, all action items stemming from the PFP management review were completed.

The program evaluation report is organized as follows:

1.1 Program description

The CNSC values public and Aboriginal input into its regulatory processes. To augment the avenues available for public input, the CNSC established the Participant Funding Program in early 2011.

The PFP complements existing public participation avenues by funding eligible applicants to participate in Commission hearings, and in other CNSC proceedings that are of significant interest to the public or to Aboriginal groups, by bringing value-added information to the Commission through informed and topic-specific interventions.

The PFP’s purpose is:Footnote 6

  • to enable the exercise of the Canadian Environmental Assessment Act, 2012 (CEAA 2012) on the part of the CNSC
  • to ensure more timely processes and meaningful public engagement in project reviews
  • to enhance the quality, thoroughness and credibility of the reviews, and to reduce the risk of time-consuming and costly delays resulting from challenges to the adequacy of the process
  • to help fulfil the CNSC’s constitutional and other obligations for consultation with Aboriginal groups on projects potentially affecting their rights and interests

The PFP is intended to improve the regulatory review process for large nuclear projects. Funding is available to enhance participation and to bring value-added information to the CNSC. Table 1 highlights the PFP’s objectives.

Table 1: PFP's objectivesFootnote 7

Anyone can request to participate in CNSC public Commission proceedings but only some eligible applicants will receive PFP funding. Eligible applicants must have:Footnote 8

  • a direct, local interest in the project; for example, living or owning property near the project area
  • Aboriginal traditional knowledge and/or local community insight relevant to the proposed project
  • interests in potential project impacts on treaty lands, settlement lands or traditional territories and/or related claims and rights
  • plans to provide value-added information relevant to the CNSC’s mandate and specific matter before the Commission (“value-added information” is new, distinctive and relevant information that contributes to a better understanding of the anticipated effects of a project)

Funding prioritizes expenses associated with supporting Aboriginal participation, local concerns and the representation of many voices under one application. The program’s Terms and Conditions allow for an annual maximum fundingFootnote 9 of $550,000 to each individual recipient. However, in practice, the total amount available for each “opening” (hearing or meeting) is set much lower, and individual recipients are capped at amounts that are far less than the maximum amount payable under the program. Table 2 describes the criteria used to determine the maximum amount of funding available for each project.

Table 2: Applicant funding level assessmentFootnote 10

The intent of the funding is not to cover all the costs of a participant’s engagement in CNSC’s regulatory processes. PFP funding helps eligible applicants to cover expenses such as professional fees, travel and other expenditures.

The CNSC appoints an independent Funding Review Committee to determine the total funding for each project, review PFP applications and recommend which applicants will receive funding, the individual recipient funding amounts and eligible expenses. The Funding Review Committee members are compensated for their participation. The vice-president, Regulatory Affairs Branch, reviews the Funding Review Committee recommendations and gives final approval on all funding decisions.  

The Funding Review Committee includes up to three individuals external to the CNSC who are selected based on their knowledge and background in nuclear regulatory and environmental matters. The composition of the Funding Review Committee may change based on the subject matter. Other considerations include level of experience, availability, willingness and ability to participate in the funding review process for each project.

Contribution agreements are the vehicle for funding awards. They outline the information required by the CNSC for release of funds. It is the recipients’ responsibility to fulfill the conditions stipulated in the contribution agreements. Not meeting the contribution agreement conditions may result in the non-release or adjustment of payment. Payment is only for eligible costs incurred and is subject to the maximum contribution amount.

For further illustration on how program activities link to program outcomes, please refer to the PFP program logic model in Appendix C.

1.2 Resources

The Policy, Aboriginal and International Relations Division (PAIRD) in the Strategic Planning Directorate administers and manages the PFP. The PFP is part of the CNSC’s Class Grants and Contributions Program.

Table 3 below illustrates the budget and resources allocated to the PFP. Total budget for the program evaluation period was $2,800,000 (with an additional $1.1M for fiscal year 2014–15).

Table 3: PFP budget and resourcesFootnote 11

| Elements | FY 2011–12 | FY 2012–13 | FY 2013–14 | FY 2014–15 |
| --- | --- | --- | --- | --- |
| Budget | $600,000 | $1,100,000 | $1,100,000 | $1,100,000 |
| FTE (REG 7 and REG 5) plus communications | 1.0 | 1.0 | 1.0 | 1.0 |

The PFP budget includes compensation provided to Funding Review Committee members. The PFP budget and resources only include the PFP funding level per annum and the PAIRD program administration staff. The estimate of 1 FTE includes the additional work conducted in support of the PFP (e.g., Strategic Communications Directorate).

Table 4 presents the PFP actual spending from FY 2011–12 up to and including FY 2013–14. Detailed PFP Program expenditures are listed in Appendix B.

Table 4: PFP actual spendingFootnote 12

| Elements | FY 2011–12 | FY 2012–13 | FY 2013–14 | Total |
| --- | --- | --- | --- | --- |
| Spending | $91,818 | $116,386 | $297,740 | $505,944 |
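
As a quick arithmetic check, the annual figures in tables 3 and 4 reconcile with the evaluation-period totals cited above:

\[
\$600{,}000 + \$1{,}100{,}000 + \$1{,}100{,}000 = \$2{,}800{,}000 \quad \text{(budget, FY 2011–12 to 2013–14)}
\]

\[
\$91{,}818 + \$116{,}386 + \$297{,}740 = \$505{,}944 \quad \text{(actual spending, FY 2011–12 to 2013–14)}
\]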

1.3 Governance

The Policy, Aboriginal and International Relations Division (PAIRD) in the Strategic Planning Directorate administers and manages the PFP on behalf of the CNSC. The VP, RAB, approves all participant funding requests based on recommendations made by the Funding Review Committee.

1.4 Stakeholders

There are a number of stakeholders of the Participant Funding Program evaluation. These include individuals, community members, Aboriginal groups, not-for-profit corporations and other stakeholders who have:

  • a direct, local interest in the project; for example, living or owning property near the project area
  • Aboriginal traditional knowledge and/or local community insight relevant to the proposed project
  • interests in potential project impacts on treaty lands, settlement lands or traditional territories and/or related claims and rights
  • plans to provide value-added* information relevant to the CNSC's mandate and specific matter before the Commission

* Value-added information is defined as new, distinctive and relevant information that contributes to a better understanding of the anticipated effects of a project.

2 Evaluation scope and objectives

The objectives of this evaluation are to assess relevance, effectiveness, efficiency and economy of the Participant Funding Program (PFP). This evaluation covers fiscal years 2011–12 to 2013–14. The results from the evaluation will be used to support organizational planning for future program renewal decisions, and to identify lessons learned and best practices to improve future program decisions.

2.1 Evaluation questions

This evaluation was conducted in accordance with the Treasury Board Policy on Evaluation. It addresses the core evaluation issues of consistency with federal roles and responsibilities, alignment with government priorities, continued need for the program, achievement of expected outcomes, and demonstration of effectiveness, efficiency and economy.

During the planning phase for this evaluation, June to September 2014, the evaluation function at the Canadian Nuclear Safety Commission (CNSC) consulted with program management and the Evaluation Advisory CommitteeFootnote 13 to validate the evaluation framework, including the evaluation matrix (see Appendix D), to guide the evaluation. On September 13, 2014, the CNSC Management Committee (which functions as the Departmental Evaluation Committee) endorsed the PFP evaluation terms of reference and associated evaluation questions, which were:

Relevance

  • Question 1: Is the PFP aligned with the roles, responsibilities and priorities of government?
  • Question 2: Is the PFP aligned with the CNSC’s mandate, strategic outcome and priorities?
  • Question 3(a): Does the PFP continue to address a demonstrated need?
  • Question 3(b): Has the PFP been responsive to the needs of target groups?

Effectiveness

  • Question 4: Does the current program design (e.g., terms and conditions) and delivery (e.g., impact of the management review recommendations) support effectiveness of the PFP?

  • Question 5: Are the internal program stakeholders aligned on the objectives and priorities of the program?

  • Question 6(a): Is the process for selecting projects fair and accessible to participants?

  • Question 6(b): Are there barriers or obstacles to enhancing participation?

  • Question 7: Are the program’s communications and outreach efforts effective at reaching the desired participant groups?

  • Question 8: Is the Commission provided with value-added submissions for decision making?

  • Question 9: To what extent has the PFP enhanced participation of public, Aboriginal groups and other non-traditional stakeholders in CNSC public hearings?

Efficiency and economy

  • Question 10(a): Is the PFP delivered efficiently in comparison with similar programs?
  • Question 10(b): Are there ways to improve program delivery?
  • Question 11: Are the program resources (dedicated staff and the Funding Review Committee) appropriate/adequate to deliver the program?

    3 Evaluation approach and methodology

    The PFP evaluation is mandatory, and the evaluation issues and questions are specified by the TBS Policy on Evaluation (April 2009). However, to support the use of findings and recommendations, the evaluation included specific areas of interest to management and decision-makers. Most of the management information needs are within the parameters of the evaluation issues/questions.

    A PFP logic model (Appendix C) was developed specifically for the PFP evaluation in consultation with internal stakeholders. The logic model development was supported by: 

    • the Treasury Board submission approving the CNSC PFP terms and conditions (approved June 24, 2010)
    • the CNSC PFP management review (February 28, 2013)
    • the CNSC management response and implementation plan (August 8, 2013, e-Doc 4175191; December 6, 2013, e-Doc 413006)
    • the previous version of the CNSC PFP logic model (e-Doc 4203568) 
    • the CNSC PFP Guide (February 2011)

    The logic model covers the mandatory performance measures and reflects the PFP terms and conditions and program maturity (e.g., newly implemented program, impact of CEAA 2012). The evaluation focused on design and delivery and immediate outcomes. Typically, a well-designed program achieves its immediate outcomes within three years of implementation. Intermediate outcomes are usually achieved between three to five years, and the ultimate outcome is what the program is striving to achieve in the long term.

    The PFP was implemented just three years ago and has undergone one management review since implementation. The evaluation developed lines of evidence focusing on program design and delivery, as well as achievement of immediate outcomes (e.g., does program design support the expected results?).

    The PFP Evaluation issues are based on the TBS policy requirement. To address effectiveness, the evaluation questions are derived from the logic model and management information needs. Evaluation questions were not all weighted equally. Questions related to management information and decision‑making needs raised during the pre-evaluation planning were given priority.

    The PFP is a relatively modest program in terms of expenditure and resource use. The evaluation was conducted to reflect the program’s size; this relative weight also guided the level of effort that evaluation staff used in developing lines of evidence to support the evaluation.

    The program evaluation matrix (see Appendix D) outlines which methods were used to capture data for each of the evaluation indicators. The evaluation matrix includes the use of multiple lines of evidence and complementary research methods to ensure the reliability of the information and data collected. Six main lines of inquiry were employed in this evaluation, including both quantitative and qualitative methods: a literature review, a document review, data analysis, interviews, surveys and benchmarking data. The data sources are described below by line of inquiry.

    3.1 Data sources

    3.1.1  Document review

    An extensive document review (see Appendix G for the detailed document list) was conducted for the purposes of describing the program and its activities, outputs and mandate; assessing relevance; establishing production of outputs leading to achievement of outcomes; and collecting data on the value‑added aspects of the PFP. The documents reviewed were:

    • PFP survey reports
    • Funding Review Committee recommendation reports
    • intervener reports
    • program area reports
    • briefing notes to senior managers
    • CNSC hearing transcripts
    • CNSC record of proceedings, including reasons for decision
    • CNSC Participant Funding Program decisions    

    3.1.2 Literature review

    The purpose of the literature review was to examine similar programsFootnote 14 and the strategies used to achieve similar goals. The review focused on communication, targeting and marketing approaches used by other organizations, to assess lessons learned and best practices the PFP can build on.

    3.1.3 Interviews

    Interviews [24] were conducted with key stakeholders by the evaluation function. The key groups of stakeholders selected were:

    • CNSC staff and management directly involved in the PFP (e.g., staff from Strategic Planning, Communications, and Finance directorates)
    • directors-general and directors of regulatory programs relevant to the PFP
    • senior CNSC management (vice-presidents, Commission secretary, president)
    • Funding Review Committee (past and current members)
    Table 5: PFP interviewees

    | CNSC senior management | CNSC management | Program staff, communications and finance | Funding Review Committee | Total |
    | --- | --- | --- | --- | --- |
    | 7 | 5 | 8 | 4 | 24 |

    Interviewees were assured of their anonymity (in accordance with privacy and access to information laws). Interview findings were reported in aggregate, with no references to individual interviewees.

    A customized template was developed to record findings and conclusions from interviews. All interview notes were analyzed by indicator and respondent group to identify relevant information.

    3.1.4 Data analysis

    Data analysis included the review of quantified data, such as financial information, and of information collected from the CNSC and other organizations.

    3.1.5 Benchmarking analysis

    The benchmarking component of the PFP Evaluation is limited to the National Energy Board (NEB) and the Canadian Environmental Assessment Agency (CEAA). The purpose of the benchmarking component is to examine NEB and CEAA policies and processes, design, implementation, delivery, as well as the effectiveness of targeting, marketing and outreach efforts. To provide management with a comprehensive picture of the CNSC PFP, the benchmarking component included: interviews and surveys with program management, staff and applicants; a document review; a literature review; and data analysis.

    3.1.6 Surveys

    Given tight timelines, surveys were the most efficient means of gathering information from a large pool of potential and existing participants. The survey participants identified for the PFP evaluation were CNSC staff impacted by the PFP (e.g., licensing) and groups targeted by the CNSC PFP (e.g., current list of academics, CNSC e-Doc 3987463). Detailed results of the PFP contracted surveys are set out in Appendix F. The detailed results of the CNSC PFP recipient survey are set out in Appendix H.

    3.2 Limitations of the evaluation methodology and mitigation strategies

    The evaluation methodology was designed to provide multiple lines of evidence to identify relevant evaluation findings. The data and information were collected to respond to the evaluation questions and indicators. As in all evaluations, there are limitations and considerations that should be noted. Table 6 identifies the program evaluation risks and mitigation strategies considered.

    Table 6: Evaluation risks and mitigation strategy
    Image representing the evaluation risks and mitigation strategies.

    4 Management of the evaluation

    4.1 Roles and responsibilities

    The lead evaluator was responsible for managing all phases of the evaluation (planning, conduct and reporting) and for developing all evaluation deliverables, including the following: terms of reference, evaluation framework, data collection templates and instruments, contracts, correspondence to interviewees and survey respondents, draft evaluation reports, final evaluation report, technical support on management action plan, briefing materials to inform senior management of evaluation findings, conclusions and recommendations.

    Once the draft evaluation report was ready, a committee representing the Directorate of Power Reactor Regulation, Directorate of Nuclear Cycle and Facilities Regulation, and the Commission Secretariat was selected to review and comment on the draft.

    The Canadian Nuclear Safety Commission (CNSC) Management Committee served as the CNSC Departmental Evaluation Committee, which is responsible for the timely validation of evaluation reports and management action plans. The Departmental Evaluation Committee is supported secretarially by the Head of Evaluation (Director General, Strategic Planning Directorate) and includes the president of the CNSC, the deputy head responsible for approval of all CNSC evaluation reports and management action plans.

    Contracts

    Two surveys were contracted to Prairie Research Associates Inc. (PRA). Contracting out the surveys allowed the evaluation function to combine in-house and contracted resources to produce timely evaluation reports.

    The report (Appendix F) summarized the methods and results of two online surveys conducted as part of the evaluation of the Participant Funding Program (PFP). The CNSC undertook the evaluation, which examines the relevance, design, delivery and performance (effectiveness, efficiency and economy) of the PFP, and hired PRA Inc., an independent research company, to conduct the surveys. The two surveys conducted by PRA Inc. were the applicant survey (n=50) and the target group survey (n=269).

    The applicant survey was designed for those who have applied for funding under the PFP at least once since its inception in 2011. Having participated in the program, respondents provided opinions on the transparency and fairness of the application and selection process, the clarity of program documentation and communications, the accessibility of the program, and potential improvements to the program.

    The target group survey was intended for individuals whom the CNSC targeted as potential applicants to the PFP, but who never applied for funding under the program. The target group included academics, researchers, Aboriginal groups, associations, unions, and other experts and stakeholders. The survey assessed to what extent the target groups are aware of the PFP; what target groups see as the barriers to their participation in the program; and what communication methods or tools would improve awareness of, and participation in, the PFP.

    The survey report (by PRA Inc.) contained many useful open-ended questions for both the applicant and target groups. However, the survey response rates were low (11% and 18% respectively); therefore, caution was exercised in interpreting responses in the evaluation. That said, the applicant survey results are consistent with the surveys done by program staff, and this continuing practice by staff has been encouraged. Respondents had generally positive views of the program. The target group survey suggests that, in general, better articulation of the program’s eligibility criteria and purpose may yield additional applicants.

    Timelines – planned versus actual

    The timelines for planning and conducting this evaluation were all met, as set out in the evaluation framework. Table 7 identifies the timelines, categorized by planning phase (yellow), conducting phase (green) and reporting phase (orange).

    Table 7: PFP Evaluation timelines

    | Activity / month | Aug 2014 | Sep 2014 | Oct 2014 | Nov 2014 | Dec 2014 | Jan 2015 | Feb 2015 | Mar 2015 | Apr 2015 |
    | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
    | Review and approve evaluation framework | * |   |   |   |   |   |   |   |   |
    | Develop and approve evaluation terms of reference | * | * |   |   |   |   |   |   |   |
    | Develop contract for evaluation |   | * | * | * | * |   |   |   |   |
    | Development of data collection tools |   | * | * | * | * |   |   |   |   |
    | Collect documentation |   | * | * | * | * | * | * |   |   |
    | Issue contracts for evaluation |   |   | * | * | * |   |   |   |   |
    | Select survey participants |   |   |   | * | * |   |   |   |   |
    | Select interviewees |   |   | * | * |   |   |   |   |   |
    | Conduct document review |   |   |   | * | * | * | * |   |   |
    | Selection of contractor for evaluation |   |   | * |   |   |   |   |   |   |
    | Conduct interviews |   |   | * | * | * | * | * |   |   |
    | Obtain analysis from contractor |   |   |   |   |   |   | * |   |   |
    | Analyze information |   |   |   |   |   |   | * | * | * |
    | Draft findings and evaluation report |   |   |   |   |   |   | * | * |   |
    | Approve evaluation report |   |   |   |   |   |   |   |   | * |

    * - Timeline for activity indicated

    Challenges to implementation

    Timing

    This evaluation was expected to be completed within a tight timeline, given that the full evaluation scope and plan had not been previously developed. Without a clear plan (i.e. an evaluation framework validated with key program stakeholders) and careful project management oversight against timelines established within that plan, the evaluation would not have been delivered in its intended timeline.

    Mitigation strategy

    The evaluators met with key program stakeholders at the beginning of the evaluation to quickly identify and collect relevant background documentation, solicit opinions on perceived issues that define the scope of the evaluation, and identify intended involvement of key stakeholders throughout the evaluation process. Regular meetings with the program stakeholders were held throughout the entire evaluation time period to keep them informed and to solicit information on the PFP when required.

    An evaluation framework, including a logic model and an evaluation matrix, was subsequently developed; this plan effectively set the full scope, methodology and design, and timelines of the evaluation. Following approval of the evaluation framework, the evaluator developed and implemented a comprehensive work breakdown structure to manage the conduct of the evaluation process. The approval of the PFP terms of reference by the CNSC Management Committee occurred on September 23, 2014. As a result of careful planning and management of timelines, combined with effective communication between the evaluators and key stakeholders, the evaluation report and management action plan achieved their intended timeline.

    5 Findings and conclusions

    5.1 Relevance

    Evaluation questions explored in this relevance section include:

    • Question 1: Is the PFP aligned with the roles, responsibilities and priorities of the federal government?

    • Question 2: Is the PFP aligned with the CNSC’s mandate, strategic outcome and priorities?

    • Question 3(a): Does the PFP continue to address a demonstrated need?

    • Question 3(b): Has the PFP been responsive to the needs of target groups?

    Findings and supporting evidence

    The PFP is fully aligned with the CNSC’s mandate, strategic outcome and priorities, as demonstrated by a review of the relevant acts, regulations and the CNSC’s program alignment architecture.

  • The Jobs and Economic Growth Act, 2010 amended subsection 21(1) of the Nuclear Safety and Control Act by adding the following after paragraph (b): (b.1) to “establish and maintain a participant funding program”.

  • The PFP fell under the CNSC sub-program 1.1.4 (Stakeholder Engagement) of the Regulatory Framework Program (1.1) during the evaluation time period. The CNSC’s program alignment architecture (PAA) was modified in 2014, and PFP activities are now linked directly to the licensing and certification activities that form part of regulatory programs in the PAA.

    Evidence indicating that the PFP addresses a continued need is demonstrated by the following:

    • All stakeholder groups (applicants, CNSC staff, managers and senior managers) were supportive of the program and consistent in their views that the program was needed.
    • Efforts of program staff to improve the program “reach” have been successful. There is some support for program expansion, but under defined conditions and at a gradual pace.
    • While similar participant funding programs do not exist internationally, citizen engagement and outreach are increasingly recognized as key regulatory activities. The existence of the PFP positions the CNSC to set an example in many respects.

    The PFP was responsive to the needs of target groups (Aboriginals, NGOs and the public):

    • Aboriginal Groups, NGOs and individuals participating in the program have been provided with the opportunity to express their views about effective participation through a survey conducted by staff; program staff responded in a timely manner to applicant questions and reacted to their feedback.
    • Generally, applicant views are highly positive about the program’s ability to respond to their concerns.
    • There is a low take-up rate among scientists and researchers (which is acknowledged by the program). There is some evidence from surveys of this group and other reviews that take-up could be improved; however, this would require critical management decisions about program parameters (funding levels, timing and other factors).

    Figure 1 displays the percentage share of funds awarded by the CNSC PFP by recipient class; Aboriginal groups account for the largest share [54%], while individuals account for the smallest [2%].

    Figure 1: Percentage share of PFP expenditures by recipient class
    Figure 1 shows the percentage share of funds awarded by the CNSC PFP by recipient class: Aboriginal groups received 54%, NGOs/not-for-profits received 44%, and individuals received 2%.

    Data Source: Freebalance

    Table 8: Number of contributions issued 2011–2012 to 2014–15Footnote 15

    | Fiscal year | 2011–12 | 2012–13 | 2013–14 | 2014–15Footnote 16 | Total |
    | --- | --- | --- | --- | --- | --- |
    | Number of recipients | 10 | 6 | 22 | 31 | 69 |

    Table 8 lists the number of contributions issued over the evaluation period (partial data for 2014–15 is included). In 2011–12, 11 applicants were approved to receive PFP contributions, while there were 31 in 2014–15, indicating a sizeable increase in take-up.

    Since inception, there has been a gradual increase in program spending over the three‑year program evaluation period. The spending for fiscal year 2011–12 was $91,818, increasing to $297,740 in 2013–14. As the number of hearings per fiscal year can be variable, spending amounts can also be variable.

    5.2 Effectiveness

    Evaluation questions explored in this section include the following:

    • Question 4: Does the current design and delivery of the program support effectiveness?

    • Question 5: Are the internal program stakeholders aligned on the objectives and priorities of the program? 

    • Question 6(a): Is the process for selecting projects fair and accessible to participants?

    • Question 6(b): Are there barriers or obstacles to enhancing participation?

    • Question 7: Are the program’s communications and outreach efforts effective at reaching the desired participant groups?

    • Question 8: Is the Commission provided with value-added submissions for decision making?

    • Question 9: To what extent has the PFP enhanced participation of public, Aboriginal groups and other non-traditional stakeholders in CNSC public hearings?

    Findings and supporting evidence

    CNSC management has made continual progress in adapting and modifying the program, and all major processes and components of the program are contributing to its effective operation. There is sufficient flexibility in the existing program terms and conditions to meet anticipated needs because:

    • PFP projects have clear objectives and are generally implemented as planned
    • 90% of PFP recipients [35/39] responded that their applications were processed in a timely manner and that the PFP enabled them to prepare effectively
    • baseline service standard (45 days from the time of the funding decision to the release of funding) was met in all cases

    There is strong internal understanding of the program’s objectives and priorities. This is supported by the following:

    • Interviews indicate that there is a general understanding and agreement on the objectives of the program by major stakeholders (directors general/directors/program-related staff). However, many of those interviewed [16/20] expressed a concern that working level staff do not fully support the PFP objectives – there is a perception that the program results in additional “non-value added” workload. A majority of these interviewees [15/19] suggested increased awareness/communications would be beneficial internally.

    • There is a high degree of collaboration between the Regulatory Affairs Branch, Secretariat, and Regulatory Operations Branch staff in setting priorities for the PFP.

    The process for selecting projects is fair and accessible to participants, although a small number of PFP applicants have raised concerns about the process.

    • Most CNSC staff, managers and senior managers interviewed [17/23] consider that the process for selecting PFP applicants is fair. A majority of interviewees [16/20] consider that the Funding Review Committee is a necessary component of the PFP.

    • CNSC management accepted most Funding Review Committee (FRC) recommendations without change over the evaluation period. While CNSC management communicates the outcome of its decision clearly, the rationale is not provided to recipients. However, FRC members have suggested that increased feedback on the reasons for changing their recommendations would help improve their recommendations in the future. FRC member interviews also suggested that they could use increased guidance on the appropriateness of professional service fees, which would allow them to ensure applicants are dealt with consistently.
    • Applicants surveyed for this evaluation provided mixed views on whether the PFP application and selection process is transparent and fair; [4/11] stated that the requirements were clear and fair, while [4/11] said that there was a lack of clarity on the selection process and/or a lack of consistency by the panels that make funding decisions. It is worth noting that the response rate to the survey was low, and that previous surveys conducted by program staff generally indicated satisfaction with the process.

    The CNSC has been proactive in implementing program enhancements to increase participation. While no existing major barriers or obstacles to participation have been identified, there are some opportunities to “fine tune” the program. This is supported by the following: 

    • Program take-up has improved over time. Most internal interviewees [18/24], including Funding Review Committee members, believe program management has been successful at improving both the number and quality of submissions. Previous barriers, such as time to prepare submissions, have been addressed through program changes. The program meets or exceeds the service standards set for reviewing and approving applications.

    • Participants themselves reported general satisfaction with the program, both through program- and evaluation-administered surveys. For example, [7/11] respondents to the evaluation survey [see Q17] reported no obstacles with their PFP application. Participants were very satisfied with the response of program staff to their questions and concerns.

    • Internal interviewees suggested that the number of applications from academics and research communities should be increased. The majority of internal views argued for more flexibility and customization in program delivery (longer opening periods and open to other regulatory activities). Interviewees did caution about the increase in administrative burden associated with such changes.

    • There is some evidence that a more flexible approach will increase take‑up. Over a third of the respondents to the academic target group survey [18/48] said they might be more interested in applying to the PFP if there were changes to the subject matter or activities for which the PFP offers funding. In general, respondents to the target group survey considered the funding levels low and the timelines too short to conduct adequate research. Changes to the program terms and conditions may be necessary to increase uptake by academics (e.g., professional fees vs. salary costs).

    PFP opportunities are well advertised through a wide variety of communication and media initiatives. This is supported by the following:

    • Most CNSC staff, managers and senior managers [17/23] considered that PFP opportunities are advertised sufficiently; they indicated that a great deal is being done to ensure the program is well advertised (although they have suggestions for improvement).

    • Applicants agreed that it was clear what information was required for their PFP application, and that the PFP eligibility requirements were also clear. Though a limited sample, applicants suggested that a wide range of communication vehicles is still necessary to reach intended audiences.

    Overall, the Commission appears to be provided with value-added submissions for decision making. This is reflected by multiple lines of evidence:

    • Most CNSC staff, managers and senior managers, including Funding Review Committee members [20/24], considered that the PFP is on the right path to achieve its outcomes. The submissions added value to individual hearings, improved dialogue and built bridges with people and organizations.

    • Successful applicants in both the program- and evaluation- administered surveys responded that they were able to present their concerns effectively at the CNSC hearing.

    • The number of questions raised by Commission members was used as a proxy indicator for the “value-added” by submissions. Seven of twelve hearings at which PFP-funded interventions occurred were used to compare the number of questions asked of PFP versus non-PFP interveners, and whether those questions were directed back to the intervener, proponent or CNSC staff. The following hearings were used: the Gunnar Remediation, Darlington, Key Lake/Rabbit Lake/McArthur River, Beaver Lodge, Pickering G Station, Cigar Lake and Chalk River.  The raw data for these cases are listed in Appendix I.

    PFP-funded interventions, on average, resulted in numbers of Commission member questions that were comparable to, or greater than, those for non-PFP funded interventions. Figure 2 shows the data in terms of the number of Commission members’ questions asked in response to an intervention (Commission members may direct questions to the intervener, the applicant/licensee or CNSC staff). For PFP-funded interventions, Commission members directed an average of 6.4, 7.6 and 5.7 questions to the intervener, the licensee and CNSC staff respectively, while for non-funded interventions the corresponding averages were 2.5, 1.4 and 1.4.

    Figure 2: Average number of questions per intervention asked by Commission members (in seven cases)
    Figure 2 shows the data in terms of the numbers of Commission members’ questions asked in response to an intervention. The data shows that PFP-funded interventions result in greater numbers of Commission members questions than the non-PFP funded interventions.

    Another view of the data is shown in Figure 3, which displays the percentages of questions asked by Commission members in response to PFP and non-PFP funded interventions. Although there were 31 PFP interventions in the seven cases studied compared to 156 non-funded interventions, PFP‑funded interventions comprised 44% of the total number of questions asked by Commission members (1,467 in total).
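
    As a rough consistency check, combining the rounded per-intervention averages from Figure 2 with the intervention counts above gives a PFP share of questions close to the reported 44%:

    \[
    \frac{31 \times (6.4 + 7.6 + 5.7)}{31 \times (6.4 + 7.6 + 5.7) + 156 \times (2.5 + 1.4 + 1.4)} = \frac{610.7}{1{,}437.5} \approx 42\%
    \]

    The residual difference from the reported 44% (of 1,467 questions in total) reflects rounding in the published averages.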

    Note that the quality of the interventions was not uniform across cases. CNSC staff noted during interviews that some interventions contain technical errors or inaccurate statements. Nonetheless, the data showed that interventions sparked dialogue among Commission members, which also allowed proponents/licensees and CNSC staff to clarify or correct information so that Commission members were provided with additional information for decision making.

    Figure 3: Percentage of total questions asked by Commission members (in seven cases)
    Figure 3 is a pie chart that displays the breakdown of the percentage share of questions asked by Commission members in response to PFP and non-PFP funded interventions: 26% non-funded – intervener, 17% PFP – licensee, 15% non-funded – licensee, 15% non-funded – staff, 14% PFP – intervener and 13% PFP – staff.

    Case study: Intervention by the Canadian Environmental Law Association (CELA) at Ontario Power Generation’s Pickering Nuclear Generating Station 5‑year licence renewal (hearing held May 29–31, 2013)

    To illustrate how an intervention can influence Commission decision making, the evaluation referred to the intervention by the Canadian Environmental Law Association at the licence renewal hearing for Ontario Power Generation’s (OPG) Pickering Nuclear Generating Station. This case had been cited by staff as an example of an effective intervention during interviews.

    The Canadian Environmental Law Association applied for, and received PFP funding in the amount of $4,500. In its review of the submission, the Funding Review Committee (FRC) noted that “it was an excellent application at reasonable rates”. The Association did most of the work itself, and only asked for additional help in the form of junior law clerks. The FRC assessed that the application, which focused on emergency planning for severe accidents, would be of significant community benefit.

    OPG had applied to the Commission for a one-site licence covering both Pickering A and Pickering B, for a period of five years, prior to expiry on August 30, 2013. In its intervention, CELA expressed concern about OPG’s application and provided the following grounds of objection, among others:

    • Lack of demonstration of emergency preparedness with available and actionable emergency plans, frameworks and strategies to deal with a nuclear incident.
    • Lack of evidence to show that Pickering management worked cooperatively with municipalities and stakeholders to ensure 100% pre‑distribution of potassium iodide to all residents in the 10- to 100‑kilometre zones around Pickering, both within the Region of Durham and within the City of Toronto.
    • In the opinion of CELA, evidence showed that neither OPG nor Emergency Management Ontario and Durham Emergency Management had evacuation plans or a communication strategy in place to deal with a nuclear incident.
    • There were insufficient protective actions in place (including sheltering, family reunification and emergency drilling measures).

    In response to the intervention, the Commission generated a number of fact-finding and issue-specific questions (12 in total) to better understand the matter at hand. The nature and quality of the questions were as follows:

    • Commission members asked factual questions that required action‑oriented responses to address the concerns raised by CELA, including an ageing nuclear station, insufficient emergency preparedness, the lack of an evacuation plan, the availability of potassium iodide, and insufficient protective actions including sheltering and emergency drills.
    • The intervention added value to the Commission hearing and generated further in-depth discussion, as evidenced by the breadth and depth of questions and responses.
    • In its decision, the Commission specifically noted that CELA’s intervention “presented a thorough review of emergency management in Ontario”; much of the Commission’s subsequent direction in the area of off-site emergency management reflects the discussion that was initiated by CELA’s intervention.Footnote 17

    Overall, PFP interventions do bring value to decision making, and evidence from program operation to date shows that the program is meeting this objective. As previously mentioned, interviewees consistently expressed that increased take-up from scientific or academic researchers will bring further benefit. The principal means suggested by interviewees to increase reach/take-up is adopting a more strategic, long-term focus for the program; this will allow potential applicants to identify opportunities further in advance and to secure research funding (either through the PFP or elsewhere) in enough time to contribute meaningfully to the hearing or other need.

    In examining other programs, the National Energy Board (NEB) was found to be open for longer periods than the CNSC’s program. The NEB also creates an “issues list” that is approved by the Board when the hearing opens – the list specifies the nature of interventions that may be of interest to the Board to guide the selection and funding of interventions. Such a list could be extremely valuable in helping identify and guide experts/academics who may be interested in appearing before the Commission; it would also allow CNSC outreach activity to be targeted at communities of interest (academic conferences, workshops, etc.) that might generate PFP submissions.

    The PFP has considerably enhanced participation of the public, Aboriginal groups and other non-traditional stakeholders in CNSC public hearings since its inception. This was shown by multiple sources and is supported by:

    • a growth of participation in the program by applicants (mainly Aboriginal people, NGOs and individuals) and an increase in the number of PFP recipients over time

    • agreement among interviewees and survey respondents, across a broad range of stakeholders, that the PFP has enhanced the participation of public, Aboriginal groups and other non-traditional stakeholders in CNSC public hearings (with the exception of research academics and institutions)

    5.3 Efficiency and economy

    Under the Treasury Board 2009 Policy on Evaluation, efficiency is defined as maximizing the outputs produced with a fixed level of inputs or minimizing the inputs used to produce a fixed level of outputs; and economy is defined as “minimizing the use of resources […] to achieve expected outcomes.”Footnote 18 These elements of performance are demonstrated when:

    1. outputs are produced at minimum cost (efficiency)
    2. outcomes are produced at minimum cost (economy)

    Evaluation questions explored in this section include:

    • Question 10(a): Is the PFP delivered efficiently in comparison with similar programs?
    • Question 10(b): Are there ways to improve program delivery?
    • Question 11: Are the program resources (dedicated staff and the Funding Review Committee) appropriate/ adequate to deliver the program?

    Findings and supporting evidence

    Efforts to increase program uptake over the past three years have resulted in comparable levels of efficiency with other similar programs (NEB, CEAA). Given the program’s resource use is low (~1 FTE) and the attention paid to funding applications, it can be concluded that the program is delivered efficiently. This is supported by the following:

    • The NEB and CNSC programs are comparable to the CEAA program (on which they were modelled). An international literature review revealed that there are no international programs similar to the CNSC PFP.
    • Most [7/13] CNSC staff and managers considered that the PFP is delivered efficiently, although ways of delivering the program more efficiently were suggested [see Q10(b)]. The PFP was described as well planned; it delivered its outputs expeditiously and met its timelines. Efforts by program staff to improve delivery were also noted (including increased use of electronic documents).
    • In terms of output/input indicators (funds awarded per FTE and recipients funded per FTE), the CNSC has improved over time and in comparison to the NEB and CEAA (it processes more recipients per FTE than the NEB). See figures 4 to 6 for further details.
    • Role of the CNSC Funding Review Committee: The Funding Review Committee played a significant role in reducing the amount of funding awarded to recipients. This reduction more than offset the administrative costs of operating the Funding Review Committee. The CNSC’s Funding Review Committee recommended only 43% of the funds requested, and 94% of the recommended funds were paid; the estimated saving is well over the committee’s cost of $42,000. In comparison, the NEB’s funding review committee recommended 10% of requested funds, and only 39% of the recommended funds were paid. Note that many NEB applicants asked for much larger amounts of money than CNSC applicants; the CNSC does not experience similar pressure.
    • The total funds awarded by the CNSC for the three evaluation years are lower than the total funds awarded by either the NEB or the CEAA (see Figure 4). The range of cost types covered by the CNSC and NEB programs is about the same. See Table 9 for more details on the selected comparators for the CNSC, NEB and CEAA.
Table 9: Selected comparators of CNSC/NEB/CEAA

Comparator | CNSC | NEB | CEAA
Desired outcome | Enhance Aboriginal, public and stakeholder participation; assist the Commission in making fully informed decisions | Facilitate effective public participation | Assist eligible Aboriginal groups in participation; support meaningful public participation
Availability | As determined by the CNSC | Only for public hearings | For many EA steps
Independent Funding Review Committee | Yes | Partial – both internal and external members | Internal
Total funds paid 2011–12 to 2013–14 | $475K | $642K | $9,911K
Number of recipients 2011–12 to 2013–14 | 38 | 39 | 481
Staffing levels (FTEs) | 1.0 (reg-05 program administrator at 50%, other program and related staff for the remaining 50%) | 2.0 (2 program administrators) | 6.0 (1 program manager, 5 program officers)
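The per-FTE indicators referenced above and shown in figures 5 and 6 can be derived from the three-year totals in Table 9. The following is a minimal sketch of that arithmetic, assuming the staffing levels in Table 9 are treated as constant over the period (the figures break the same indicators out by year):

```python
# Minimal sketch: output/input indicators derived from the Table 9 totals
# (three-year aggregates, 2011-12 to 2013-14).
programs = {
    # program: (total funds paid, $K; recipients funded; program FTEs)
    "CNSC": (475, 38, 1.0),
    "NEB": (642, 39, 2.0),
    "CEAA": (9911, 481, 6.0),
}

for name, (funds_k, recipients, ftes) in programs.items():
    print(f"{name}: ${funds_k / ftes:,.0f}K paid per FTE, "
          f"{recipients / ftes:.1f} recipients funded per FTE")
```

On these figures the CNSC works out to roughly $475K and 38 recipients per FTE, ahead of the NEB (about $321K and 20 per FTE) but still below the longer-standing CEAA program (about $1,652K and 80 per FTE), which is consistent with the pattern described for figures 5 and 6.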

• In comparison to the NEB, applicants to the CNSC program requested smaller amounts relative to the total funding offered, but were paid a higher percentage of the amounts they requested and of the amounts eventually recommended. This may suggest that the CNSC's program articulates expectations more clearly (although the NEB's program may be more complex). In both cases, the data shows efforts to scrutinize and reduce the amounts awarded (offered) and eventually approved for payment (a brief sketch of how the percentage columns of Table 10 are derived follows the table).
Table 10: CNSC/NEB funding data 2011–12 to 2013–14

Program | Total offering ($K) | Total requested by applicants ($K) | Total recommended by FRC ($K) | Total paid ($K) | % FRC recommended/requested | % paid/FRC recommended
CNSC | 675 | 1,176 | 507 | 475 | 43 | 94
NEB | 3,990 | 16,003 | 1,647 | 642 | 10 | 39
CNSC/NEB, % | 17 | 7 | 31 | 74 | NA | NA
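The percentage columns and the CNSC/NEB row of Table 10 follow directly from its dollar columns; the sketch below is a minimal illustration of that calculation using the Table 10 figures (in thousands of dollars):

```python
# Minimal sketch: derived cells of Table 10, computed from its dollar columns ($K).
cnsc = {"offering": 675, "requested": 1176, "recommended": 507, "paid": 475}
neb = {"offering": 3990, "requested": 16003, "recommended": 1647, "paid": 642}

for name, row in (("CNSC", cnsc), ("NEB", neb)):
    print(f"{name}: FRC recommended {100 * row['recommended'] / row['requested']:.0f}% "
          f"of requested; {100 * row['paid'] / row['recommended']:.0f}% of recommended was paid")

# The CNSC/NEB row expresses each CNSC dollar column as a percentage of the NEB column.
for column in ("offering", "requested", "recommended", "paid"):
    print(f"CNSC/NEB {column}: {100 * cnsc[column] / neb[column]:.0f}%")
```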

• Figure 4 displays the total funds paid by the PFP programs of the CNSC, NEB and CEAA for fiscal years 2011–12 to 2013–14. The CNSC shows the lowest level of funding, the NEB the next highest, and the CEAA the highest.
Figure 4: Total funds paid: CNSC, NEB, CEAA
• Figures 5 and 6 display the total amounts awarded and the total number of recipients per program staff member, respectively. Generally speaking, the CNSC's newer program processes relatively less output than the longer-standing CEAA program. Discussions with program staff indicate that it is the number of recipients (essentially the number of contribution agreements) that drives effort, not dollar amounts (complexity). While data is limited, it does appear that the CNSC is improving with time and is nearing the output ratio achieved by the longer-standing CEAA program. Since the average dollar value of agreements processed by the CNSC is smaller than that of the other organizations, this also suggests that some expansion of the program is possible and that higher dollar value agreements (such as research-related ones) could be undertaken without impairing efficiency.
Figure 5: CNSC, NEB and CEAA - funds paid per program FTE
Figure 6: CNSC, NEB and CEAA - number of recipients per program FTE
Figure 6 displays the number of recipients per program FTE in 2011–12, 2012–13 and 2013–14 for the CNSC, NEB and CEAA. The graph shows that the CNSC's newer program processes relatively less output than the longer-standing CEAA program.

    CNSC staff and applicants made a variety of suggestions to improve program delivery. Suggestions included:

    • creating a champion for the PFP at a high executive level in combination with a five‑year plan on PFP hearings
    • providing better direction (e.g., more guidelines) to its independent panels making funding decisions (it is not clear to applicants that they approach similar hearings consistently)

All the collected information indicates that the program's resources - the 1 FTE, the Funding Review Committee members and the modest PFP budget - are adequate and appropriate to deliver the program. This is supported by:

    • information that indicates that the PFP’s 1 FTE and modest budget is adequate to deliver the program
• information that shows the increasing efficiency of the CNSC PFP (a higher number of recipients funded per FTE than the comparable NEB program)
    • some support for PFP program expansion (depending on future demand, and at a gradual pace)

    5.4 Conclusions

    Relevance

Both internal and external stakeholders agree that the program continues to meet a demonstrated need. Program management has taken effective steps to increase the reach of the program; while more can be done, the resources allocated to the program should be considered in any potential expansion. Recipients of the program are satisfied that the program is responsive to their needs and is delivered in a timely manner.

    Document reviews indicate that the program is aligned with government direction, roles and responsibilities, as well as with the mandate and outcomes of the CNSC.

    Effectiveness - achievement of outcomes

    There is general agreement between internal stakeholder groups that the value of public participation is recognized within program objectives and priorities.

    Program communication efforts are extensive, and management has revamped the program to attract more applicants. Participants generally agree that the program is accessible, responsive and fair.

Interventions have improved in quality over time. There are views that greater scientific/academic input is needed, but in general there is consensus that the program adds value for Commission members. Quantitatively, PFP-funded interventions foster equal or greater dialogue at hearings than do non-funded interventions.

    Effectiveness - design and delivery

    The program is designed and delivered effectively; program outputs are well connected to outcomes. While some suggestions for improvement were offered, no significant barriers exist to delivery that would require major changes to the TBS approved terms and conditions.

There is some evidentiary support for expansion to improve uptake, particularly among scientific experts. There is also support for increased customization/flexibility in opening the PFP. This would likely increase program administrative costs.

    Demonstration of efficiency and economy

    The program operates with modest resources - the equivalent of 1 FTE and a relatively small fiscal amount to compensate Funding Review Committee members. Further reductions in effort would likely impact effectiveness.

    The program is generally comparable to other similar programs (NEB, CEAA) in terms of efficiency indicators (unit of output/unit of input), with some expansion possible.

    There is evidentiary support that the program achieves economy through the Funding Review Committee and approval processes (reductions in the amount of funding applied for and received).

    6 Summary and recommendations

The Participant Funding Program is a relatively new program for the CNSC and is therefore limited in the amount of operational data it has collected. Although new, the program has received significant internal attention, most notably a comprehensive management review completed in 2013 (for which a detailed action plan was developed and completed). The program has also been the subject of four presentations to the CNSC's Management Committee in its short period of implementation. These efforts, taken together, point to a program that has been subject to a relatively high degree of scrutiny (for a program of its size) to ensure it is achieving its objectives (note: at the time this evaluation report was being written, the program was also undergoing an internal audit).

    Combined with evaluation evidence, this amount of attention suggests that the CNSC and program management are committed to adapting and continually improving the program. The program is solidly designed and delivered, and it is expected to achieve its long-term outcomes.

    The program also pays significant attention to ensuring economy in the disbursement of funds through the efforts of the Funding Review Committee. Overall, the program is run efficiently.

    The most significant issue for the program is assessing the benefits and costs of further increasing the take-up of the program (e.g., by scientific and technical experts) to bring additional informed interventions to the Commission for decision making. While the evaluation evidence provides support that the Commission receives better “value-added” submissions from PFP-funded interveners than from non-funded interveners, many stakeholders felt that it was feasible to attract additional interventions that offered higher value-added information.

    Recommendation 1

It is recommended that a long-term, strategic focus be adopted for the program, with funding opportunities made available far enough in advance to support additional value-added activity (such as research).

    Currently, the program is only open to receive applications a short time before a specific hearing (the program recently began advertising upcoming openings for the whole fiscal year). In practice, this means that longer-term research or studies must be already completed by participants to apply for funding. Adopting a multi‑year, look-ahead approach would have the expected effect of allowing potential participants to understand how their needs/concerns could dovetail with issues relevant to the Commission, and allow for sufficient lead time to conduct any research or coordination activity required to bring their views forward.

    A long-term, multi-year plan, which was suggested by numerous stakeholders, would also allow for:

    • integration of PFP activity with other recent initiatives (e.g., the CNSC’s outreach program)
    • improved, targeted communication of funding opportunities to scientific and technical communities (that in many cases CNSC staff are already aware of)
    • projection/testing of the impact of expanded reach on program resources

    Recommendation 2

    It is recommended that increased feedback be provided to the Funding Review Committee after final CNSC funding decisions are rendered.

    Currently, PFP applicants/recipients are notified about the amount of funding approved by the CNSC, which frequently differs from the amount requested by interveners. CNSC management reserves the right to change FRC recommendations, but the reasons for such changes do not flow to the FRC.

Feedback from CNSC Funding Review Committee interviews suggests that committee members could use clearer guidance, and feedback in cases where their advice was not accepted, particularly on the appropriateness of professional fees. This would allow them to provide better recommendations in the future, which would contribute to maintaining the perceived fairness of the program among applicants over the long term.

Appendix A – Management action plan

Recommendation 1: It is recommended that a long-term, strategic focus be adopted for the program, with funding opportunities made available far enough in advance to support additional value-added activity (such as research).
Type of recommendation: Program design
Response: Accepted
Planned actions:
1. Extend existing three-year forward plan to five years, based on Secretariat's 10-year rolling agenda. The plan will be posted publicly; however, this document may need to be amended when there are significant unforeseen scheduling changes led by the Commission.
2. Establish a process to manage longer-term funding opportunities and multi-year research proposals.
Responsibility: RAB/SPD/PAIRD
Expected date of completion: Q3 2015-16
Measures of achievement:
1. Publication of plan on CNSC website
2. Process established and documented to manage longer-term funding opportunities and multi-year research proposals

Recommendation 2: It is recommended that increased feedback be provided to the Funding Review Committee after final CNSC funding decisions are rendered.
Type of recommendation: Program design
Planned actions:
1. The FRC will be provided with a briefing after the CNSC's final funding decision and will also be provided a briefing regarding the value-added of each funding recipient.
2. PAIRD will work on developing guidance to assist the FRC in determining the appropriateness of professional fees requested by funding applicants, based on Public Works and Government Services standards and best practices.
Responsibility: RAB/SPD/PAIRD
Expected date of completion: Q2 2015-16
Measures of achievement: 1 and 2. FRC briefings for CNSC funding decisions and recipient value-added

Appendix B – PFP program expendituresFootnote 19

Fiscal years/budget item | 2011–12 | 2012–13 | 2013–14 | Grand total
Grand total | 91,818 | 116,386 | 297,740 | $505,944
Total O and M | 5,567 | 10,785 | 14,862 | 31,214
Advertising services | – | 5,196 | 7,267 | 12,463
Mgmt consulting services | – | – | 6,529 | 6,529
Other professional services | 5,567 | 3,434 | 1,067 | 4,501
Temporary help services | – | 1,655 | – | 1,655
Translation services | – | 500 | – | 500
Total grants and contributions | 86,252 | 105,601 | 282,878 | 474,730
Cont. to prov. non-profit orgs | – | 31,351 | 38,394 | 69,745
Cont-First Nations and Inuit | 13,890 | 4,975 | 23,367 | 42,232
Cont-First Nations and Inuit Assoc. | 18,850 | 18,000 | 178,278 | 215,128
Cont-local non-profit orgs | 38,009 | 33,697 | 16,102 | 87,809
Contributions to individuals | 10,103 | 0 | 0 | 10,103
Contributions to national orgs | 5,400 | 17,578 | 26,735 | 49,713

    Appendix C – PFP logic model

    Logic model component definitions

    Inputs
    Resources dedicated to or consumed by the program (e.g., money, staff and staff time, facilities)
    Activities
    What the program does with the inputs to fulfill its mission (e.g., educate the public, provide job training)
    Outputs
    The direct products of program activities
    Outcomes
    Benefits for participants during and after program activities
    The logic model flowchart
Image representing a flowchart of the logic model in three sections. The first section, the Area of Control – Internal to the Organization, is composed of two main subjects (represented as arrows): the Inputs (Resources) and the Activities.


    Ultimate outcome

    Public confidence in CNSC decisions is maintained / the public is informed on the effectiveness of the regulatory regime

    Intermediate outcomes

Enhanced public, Aboriginal groups and other desired stakeholders' participation in the CNSC regulatory process (proceedings)

    The Commission is provided with value-added submissions for decision making

    Immediate outcomes

Internal stakeholders are aligned on PFP objectives and priorities

    Desired participant groups are aware of program funding opportunities and process

    Participants perceive the program to be fair and readily accessible

    Outputs

    Program management and direction

    • Internal process, tools and documentation suite
    • Longer-term offerings plan
    • Reports for Management Committee
    • Review, audit and evaluation reports
    • Funding Review Committee Roster
    • External tools and documents

    Program delivery

    • Funding opportunities list
    • Funding advertising
    • Funding notifications
    • Funding Review Committee recommendations report
    • Decision reports
    • Contribution agreements
    • Recipient deliverables
    • Financial reports

    Program communications and outreach

    • Communication strategy and plan
    • Communication tools
    • Training and awareness tools
    • Marketing products
    • Briefing notes
    • Presentations for internal and external audiences

    Inputs

FTEs, funding, technical expertise (e.g., Aboriginal consultation group)

Appendix D – Participant Funding Program evaluation matrix

Relevance

1. Is the PFP aligned with the roles, responsibilities and priorities of the federal government? (Priority: Low)
Indicators:
• The PFP is aligned with government-wide policy and direction.
Methods: Document review
Data sources: Report on plans and priorities / Departmental performance report; Speech from the Throne / budget implementation; MRRS

2. Is the PFP aligned with the CNSC mandate, strategic outcome and priorities? (Priority: Low)
Indicators:
• The PFP is aligned with the CNSC strategic outcome through program and sub-program expected results (PAA).
Methods: Document review
Data sources: Report on plans and priorities / Departmental performance report; MRRS

3. Does the PFP continue to address a demonstrated need? Has the PFP been responsive to the needs of target groups? (Priority: High)
Indicators:
• Take-up and the number of enquiries/applications have increased over time.
• Stakeholders agree the PFP is addressing a demonstrated need.
• Stakeholders agree the PFP is responsive to the needs of target groups.
Methods: Data analysis; Interviews; Surveys
Data sources: PFP data; Interview summaries; Survey results

Design and delivery

4. Does the current program design (e.g., Terms and Conditions) and delivery (e.g., impact of the management review recommendations) support the effectiveness of the PFP? (Priority: High)
Indicators:
• The PFP is delivered as designed.
• The PFP is supported by sufficient program and process documentation.
• The PFP is designed and delivered similarly to other PFPs (e.g., NEB).
• The PFP program theory is relevant.
• Stakeholders agree program design supports PFP effectiveness.
• Stakeholders agree program delivery supports PFP effectiveness.
Methods: Data analysis; Document review; Interviews; Literature review
Data sources: PFP data/documentation; Previously conducted studies

Performance (effectiveness)

5. Are the internal program stakeholders aligned on the objectives and priorities of the program? (Priority: High)
Indicators:
• Internal stakeholders agree with and understand the PFP objectives and priorities.
• Internal stakeholders agree the PFP objectives and priorities are clear.
• Internal stakeholders agree the PFP objectives and priorities are communicated clearly and in a timely manner.
Methods: Interviews
Data sources: Interview summaries

6. Is the process for selecting projects fair and accessible to applicants? Are there barriers or obstacles to participation? (Priority: High)
Indicators:
• The PFP project selection criteria are applied consistently.
• The PFP application, selection and review processes are applied consistently.
• Stakeholders agree the PFP process is transparent and fair.
• Stakeholders agree funding recommendations and decisions are clear and fair (e.g., applicants understand why they were selected or not selected, and the level of funding).
Methods: Data analysis; Document review; Interviews; Surveys
Data sources: PFP data/documentation; Survey results; Documentation summaries; Interview summaries

7. Are the program's communications and outreach efforts effective at reaching the desired participant groups? (Priority: Medium)
Indicators:
• The PFP communication strategy is appropriate for the desired participant groups.
• Stakeholders agree PFP communication tools (e.g., web announcements, engagement strategies) are appropriate and reaching the desired participant groups.
Methods: Interviews; Surveys
Data sources: Interview summaries; Survey results

8. Is the Commission provided with value-added submissions for decision making? (Priority: High)
Indicators:
• The PFP value added is clearly defined and consistently applied.
• Stakeholders agree the Commission is provided with value-added submissions for decision making.
Methods: Interviews; Surveys; Document review
Data sources: Interview summaries; Survey results; Documentation summaries

9. To what extent has the PFP enhanced the participation of the public, Aboriginal groups and other non-traditional stakeholders in CNSC public hearings? (Priority: Medium)
Indicators:
• Stakeholders agree the PFP has enhanced the participation of the public, Aboriginal groups and other non-traditional groups in CNSC hearings.
Methods: Interviews; Surveys
Data sources: Interview summaries; Survey results

Performance (economy and efficiency)

10. Is the PFP delivered efficiently in comparison with similar programs? Are there ways to improve program delivery? (efficiency) (Priority: High)
Indicators:
• Stakeholders agree the PFP is delivered efficiently in comparison with similar programs. Suggested ways to improve program delivery are put forward.
Methods: Data analysis; Interviews
Data sources: Survey results; Interview summaries

11. Are the program resources (dedicated staff and the Funding Review Committee) appropriate/adequate to deliver the program? (economy) (Priority: Low)
Indicators:
• Stakeholders agree the program resources are appropriate/adequate to deliver the program.
Methods: Interviews; Surveys; Document review
Data sources: Interview summaries; Survey results; Documentation summaries

    Appendix E – Matrix of interview questions

    PFP interview questions

    1. What is your overall view of the PFP program? Has it been successful?
    2. What would be your criterion for judging the program’s success?
    3. What do you see as the future need for this program? Should it expand, remain constant, or reduce in size and scope? Why?
    4. Are you aware of the Wright report (PFP management review)? If yes: Do you feel the recommendations of the report were successful in improving the program? How so?
    5. Has program management been successful in increasing the quality and quantity of submissions since the launch of the PFP?
    6. Are the criteria for selecting applicant proposals correct? Should other criteria be added or examined?
    7. Are applicants notified of projects well enough in advance of the application deadline to get meaningful submissions?  
    8. Is the amount of funding provided to applicants correct?
    9. Are there groups of applicants not currently targeted by the program that should be targeted in the future?
    10. What other changes, in your view, would bring improvements to the number and type of applications received by the program?
    11. Should the CNSC’s program always be open, or should it be open only for specific hearings or regulatory activities?
    12. Should the types of activities funded by the program (meetings, hearings, environmental assessments) change?
    13. What is your understanding of the program objectives?
    14. Do you feel the roles and responsibilities of CNSC stakeholders were well defined?
    15. Do you believe that all internal stakeholders agree on the objectives? If not, why?
    16. Do you believe there is support for the program at the working level? Does staff see its value?  
    17. What should be done, if anything, to communicate or build awareness and support of the program internally?
    18. Is the reporting by program staff sufficient to understand the program’s status, achievements and forward-looking plans?
    19. Does the program result in value-added submissions to the Commission? Does the benefit exceed the cost?
    20. Do you feel that the process for selecting applicants is fair? Are changes needed?
    21. Is the external Funding Review Committee a necessary component of the program?
    22. Are there risks in the program that are not being adequately addressed?
    23. In your view, are there things which are preventing the program from getting more/better submissions? What can be done to improve the take-up of the program?
    24. Are there better ways to create awareness of the Program in the desired participant groups?
    25. Is the PFP program on the right path to achieving its desired outcomes (e.g., enhancing the participation of interveners in CNSC regulatory processes, bringing value-added information to the Commission and helping recipients gain a better understanding of CNSC regulated projects and regulatory processes)?
    26. To what extent do the applicant’s submissions match the program objectives? 
    27. In your experience, are there any mismatches of the goals of an applicant submission versus the final product?  
    28. In your experience, what is the impact of the submissions on the individual hearings (by submission type)?
      1. What is the feedback to the Funding Review Committee on the quantity or quality of value-added submissions?
      2. Is the Funding Review Committee receiving enough information regarding the project and each applicant, including the performance/value-added of repeat applicants, in order to make informed decisions?  
    29. Is the PFP delivered efficiently in your view? Why or why not?
    30. How much effort (time) do you estimate the program requires from you (or your staff)?
    31. When you think of the funding application and review process, do you believe the review process brings value to the program? Is it efficient?
    32. Are there ways of delivering the program more efficiently?
    33. Is the program adequately resourced?
    34. If the program shows increasing uptake or is opened up further and a higher rate of applicants is expected, what would the program require in terms of added resources to efficiently handle the extra work?
    35. Are the program and funding opportunities advertised sufficiently and to the proper audiences? Are there any changes that you would make to the advertising and communications strategy for the program?

    Appendix F – Results from PFP applicant and target group survey

    Participant Funding Program Evaluation (CNSC)

    Survey report

    March 9, 2015

    Prepared for:

    The Canadian Nuclear Safety Commission

    Introduction

This report summarizes the methods and results of two online surveys conducted as part of the evaluation of the Participant Funding Program (PFP). The Canadian Nuclear Safety Commission (CNSC) undertook the evaluation, which examines the relevance, design, delivery, and performance (effectiveness, efficiency, and economy) of the PFP. The CNSC hired PRA Inc., an independent research company, to conduct the surveys; PRA delivered the final report on March 9, 2015.

    The Applicant survey was designed for those who have applied for funding under the PFP at least once since its inception in 2011. Having participated in the program, respondents provided opinions on the transparency and fairness of the application and selection processes, the clarity of program documentation and communications, the accessibility of the program, and potential improvements to the program.

    The target group survey was intended for individuals whom the CNSC targets as potential applicants to the PFP, but who have never applied for funding under the program. The target group includes academics, researchers, Aboriginal groups, associations, unions, and other experts and stakeholders. The survey assessed to what extent the target groups are aware of the PFP; what target groups see as the barriers to their participation in the program; and what communications methods or tools would improve awareness of, and participation in, the PFP.

    Methodology

    Table 1 summarizes the methodology of the surveys.

Table 1: Summary of methodology (CNSC PFP surveys)

Survey dates: February 3–26, 2015
Method: Online survey
Sample selection: Provided by the CNSC
Sample size: applicant group, n=50; target group, n=269
Completions: applicant group, 11; target group, 48
Response rate: applicant group, 22%; target group, 18%
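The response rates in Table 1 are simply completions divided by sample size; a minimal sketch of that calculation, using the Table 1 figures:

```python
# Minimal sketch: response rates in Table 1 (completions / sample size).
surveys = {"Applicant group": (11, 50), "Target group": (48, 269)}

for group, (completions, sample_size) in surveys.items():
    print(f"{group}: {100 * completions / sample_size:.0f}% response rate")
```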

    Questionnaires were developed for the applicant and target group surveys in consultation with the CNSC. The questions were aligned with the evaluation matrix for the overall PFP evaluation, with emphasis on relevance (the extent to which the PFP is responsive to the needs of target groups) and effectiveness (awareness and accessibility of the program). The CNSC also provided French versions of the questionnaires. The questionnaires, including all survey questions and skip logic, were programmed using computer-assisted web interviewing (CAWI) software. PRA conducted internal tests of the surveys to ensure the skip logic worked properly. Each survey contained a mix of fixed-response questions (multiple choice) and open-ended questions (free response). Both surveys were hosted on PRA’s website.

    The CNSC provided both survey samples, including 50 applicants and 269 from the target group. On January 30, the CNSC emailed an announcement letter to both groups to inform them of the survey. The letter described the goals of the surveys, and indicated that the respondents would receive an invitation to the survey within a few days. On February 3, 2015, PRA emailed survey invitations, which contained personalized links to the surveys for both groups. PRA sent three subsequent reminders over the next few weeks to help increase response rates. Also, when the invitations received “bounce backs” (automatic responses from inactive email accounts), web searches and telephone calls were used to attempt to get updated email addresses for the respondents.

    Both the applicant and target group surveys closed on February 26, 2015. Afterward, PRA generated the frequencies for each question, and reviewed the open-ended responses. Although the sample sizes were not sufficient for more in-depth analyses (e.g., cross-tabulations), the results of the surveys are summarized in this report.

    Note

    While respondents provided many insightful comments, the number of responses for each survey (11 applicants and 48 for the target group) suggests that caution should be used when interpreting the results.

    Summary of results

The following subsections contain frequencies (tables) and open-ended responses (if applicable) for each survey question. Summaries are provided in point form. The open-ended responses have been edited to ensure that respondents are not identified. The first part summarizes the results of the applicant survey, while the second covers the target group survey.

    Applicant survey

    Status of funding application

    Just under half (5) of the respondents applied to the PFP only once.

    Three respondents applied twice, and three applied three or more times.

Table 2: Q1. How many times have you applied for PFP funding? (n=11)
Once: 5 (45%)
Twice: 3 (27%)
Three or more times: 3 (27%)

    Only one applicant had their PFP application denied.

    Six respondents (55%) had approved PFP applications, but only received partial funding.

    Three others (27%) received full funding, while one was uncertain about the status of their application.

Table 3: Q2. Please select the statement that best describes your most recent funding application. (n=11)
Application denied: 1 (9%)
Application approved, and I received all the funding I requested: 3 (27%)
Application approved, but I received only part of the funding I requested: 6 (55%)
Do not know / no response: 1 (9%)

    All (11) respondents applied for funding for professional fees, while seven also applied for travel expenses.

Table 4: Q3. For which types of activities did you seek funding in your application? (n=11)
Professional fees: 11 (100%)
Travel expenses: 7 (64%)
Other costs (such as room rental, photocopying, meeting supplies): 4 (36%)

    Note: Respondents could provide more than one answer; totals may sum to more than 100%.

    Five out of nine respondents indicated that they were not able to undertake all the activities for which they sought funding. The types of activities that participants could not undertake, even with funding, were varied. Such activities include hiring a consultant, travelling, conducting interviews, and miscellaneous activities.

Table 5: Q4. Were you able to undertake all the activities for which you sought funding? (n=9)
No: 5 (56%)
Yes: 4 (44%)

    Open-ended responses

    • Could not get enough funding for a consultant and other activities
    • Could not travel to site
    • Lacked funds for travel and interviews
    • Funding for a meeting was denied
    • In process

    Three respondents said the types of eligible activities for PFP funding should change, while four said they should not change.

Table 6: Q5. Should the types of eligible activities for funding in the PFP be changed at all? (n=11)
No: 4 (36%)
Yes: 3 (27%)
Do not know / no response: 4 (36%)

    Open-ended responses

• There is no rational way to divide a single pot of money between the two main components of the PFP (i.e., consulting with Aboriginal groups vs. informed and topic-specific interventions related to aspects of environmental assessments and licensing). Aboriginal consultations received the lion's share of funding for recent projects for which decision reports are posted. If a dedicated amount were allocated to each component, non-Aboriginal partners would have a better idea of how to realistically structure their proposals. Or, if the goal is to encourage Aboriginal and non-Aboriginal groups to apply jointly, this should be made clear.
    • It would be great to obtain more input from the public. Public engagement is triggered by many activities. The hearings become more meaningful if fewer restrictions are implemented.
    • Include more digital media submissions (e.g., short documentary)

    Awareness of CNSC hearings and the PFP

    Applicants reported finding out about a CNSC hearing through various means, including from the CNSC website (4 respondents), from a CNSC representative (4), from a colleague (3), from personal communications (2), and from a mailing list (1).

    No applicants reported hearing about a CNSC hearing through other websites, radio, newspaper (print or online), social media (Facebook, Twitter) or posters.

Table 7: Q6. How did you first find out about the CNSC hearing that you wanted to attend? (n=11)
CNSC website: 4 (36%)
From a CNSC representative: 4 (36%)
From a colleague: 3 (27%)
Personal telephone call, email, or letter: 2 (18%)
Other (including a mailing list): 1 (9%)

    Note: Respondents could provide more than one answer; totals may sum to more than 100%.


    Applicants reported finding out about the PFP through various means, including from a CNSC representative (4 respondents), from the CNSC website (3), from a colleague (3), and from a mailing list (1). One respondent knew about the PFP when it was created.

    No respondents reported hearing about the PFP through other websites, radio, newspaper (print or online), social media (Facebook, Twitter), posters or personal communications (telephone call, email, or letter).

Table 8: Q7. How did you first hear about the CNSC's PFP? (n=11)
CNSC website: 3 (27%)
From a CNSC representative: 4 (36%)
From a colleague: 3 (27%)
Other (including a mailing list): 2 (18%)

    Note: Respondents could provide more than one answer; totals may sum to more than 100%.


    PFP documentation and requirements

    The majority of the applicants (9) either agreed or strongly agreed that it was clear what information was required for their PFP application.

    Only one respondent strongly disagreed that this information was clear. One other respondent neither agreed nor disagreed.

    Most participants (10) either agreed or strongly agreed that the eligibility requirements of the PFP are clear. Only one respondent neither agreed nor disagreed. No respondents disagreed on this point.

Table 9: Q8. Please rate your level of agreement with the following statement: It was clear from the PFP documentation (PFP guide, funding application form, etc.) what information was required for my application. (n=11)
Strongly disagree: 1 (9%)
Neither agree nor disagree: 1 (9%)
Agree: 4 (36%)
Strongly agree: 5 (45%)

Q9. Please rate your level of agreement with the following statement: The eligibility requirements of the PFP were clear. (n=11)
Neither agree nor disagree: 1 (9%)
Agree: 5 (45%)
Strongly agree: 5 (45%)

    Communication with the CNSC

    The majority of participants (9) asked the CNSC at least one question during their application process. Three respondents raised at least one concern during this process.

Table 10: Q10. At any point during your application, did you contact a representative of the CNSC to raise a concern or ask a question about the funding application? (n=11)
No: 1 (9%)
Yes, to raise a concern: 3 (27%)
Yes, to ask a question: 9 (82%)
Do not know / no response: 1 (9%)

    Note: Respondents could provide more than one answer; totals may sum to more than 100%.

    Of the nine respondents who asked the CNSC a question, eight said they felt listened to, while one said they felt only somewhat listened to.

Table 11: Q11. Did you feel listened to when you contacted the CNSC? (n=9)
Yes: 8 (89%)
Somewhat: 1 (11%)

    All three participants who raised a concern said that the CNSC was timely in responding to their concern.

    Of the three respondents who raised a concern with the CNSC during their application process, none said they were satisfied with how the CNSC addressed their concern. One said they were very unsatisfied, while two said they were neither satisfied nor unsatisfied.

Table 12: Q12A. How timely was the CNSC in responding to your concern(s)? (n=3)
Timely: 3 (100%)

Q13A. How satisfied were you with how the CNSC addressed your concern(s)? (n=3)
Very unsatisfied: 1 (33%)
Neither satisfied nor unsatisfied: 2 (67%)

    The majority of participants (8) who asked the CNSC a question indicated that the CNSC was either timely or very timely in its response. One was uncertain.

    Most respondents (7) said they were satisfied or very satisfied with the way in which the CNSC addressed their questions. Two said they were neither satisfied nor unsatisfied.

Table 13: Q12B. How timely was the CNSC in responding to your question(s)? (n=9)
Timely: 5 (56%)
Very timely: 3 (33%)
Do not know / no response: 1 (11%)

Q13B. How satisfied were you with how the CNSC addressed your question(s)? (n=9)
Neither satisfied nor unsatisfied: 2 (22%)
Satisfied: 6 (67%)
Very satisfied: 1 (11%)

    Of the nine successful applicants, six stated that they received the decision on their application with enough time to prepare for the hearing, while two mentioned they would have liked more time to prepare. One was uncertain.

Table 14: Q14. Did you receive the decision on your application with enough time left to prepare effectively for your involvement in the hearing? (n=9)
Yes, I had plenty of time left to prepare: 6 (67%)
Yes, but I would have liked more time to prepare: 2 (22%)
Do not know / no response: 1 (11%)

    Of the seven applicants who did not receive the amount of funding they requested, three indicated that they received an explanation, three indicated that they did not receive an explanation, and one was uncertain.

Table 15: Q15. Did you receive an explanation as to why you did not receive the amount of funding you requested? (n=7)
No: 3 (43%)
Yes: 3 (43%)
Do not know / no response: 1 (14%)

    Of the three applicants who indicated that they received an explanation as to why they did not receive the amount of funding they requested, two stated that the explanation was not sufficiently justified, while one was uncertain.

Table 16: Q16. In your opinion, was the explanation sufficiently justified? (n=3)
No: 2 (67%)
Do not know / no response: 1 (33%)

    Open-ended responses

    • We did not get enough funding to hire a consultant, even though we were willing to find ways to lower the cost of a consult.
    • No. This is beyond the control of the CNSC. There is a significant problem of consistency with the independent panels that award funding.

    Challenges with the PFP application and selection process

    Seven out of eleven respondents reported no challenges with their PFP application, while three reported some challenges. One person gave no response.

Table 17: Q17. What challenges, if any, did you experience when completing your PFP application? (n=11)
No challenges: 7 (64%)
Other: 3 (27%)
Do not know / no response: 1 (9%)

    Note: Respondents could provide more than one answer; totals may sum to more than 100%.

    Open-ended responses

    • Finding out other potential applicants and discussing partnerships with them, so as not to duplicate work. Getting good quotes for consultants' fees. Accessing the full licence application, including appendices, to allow consultants to better describe their proposed work and calculate their fees.
    • There was no example of how a citizen representative could apply for funding. I had no experience with how much time is involved in requesting and reviewing all the mandatory and additional documents. Since I was not aware of the real time commitment, I could not estimate the cost of my own time and wage loss.
    • Still in process. Volunteer-based organization.

    Respondents provided mixed views on whether the PFP application and selection process is transparent and fair.

    Those who stated the process was fair (4) expressed that the requirements were clear and fair, and contact information was available, should questions arise.

    Those who did not agree that the process was fair (4) stated that there was a lack of clarity on the selection process, and/or a lack of consistency by the panels that make funding decisions.

Table 18: Q18. Was the PFP application and selection process transparent and fair? (n=11)
No: 4 (36%)
Yes: 4 (36%)
Do not know / no response: 3 (27%)

    Open-ended responses for “Yes”

    • Everything was clearly laid out with timeframes, funding parameters and contact information if issues arose.
    • It appeared to be.
    • Questions were answered, and fairness was shown.
    • We received enough funding to hire an expert to help us present our views to the Commission.

    Open-ended responses for “No”

    • It would be helpful to let applicants know that the panel wants to see CVs for the professionals mentioned in a proposal. Also, if our funding was cut severely because a lot had to be awarded to Aboriginal groups, then a fundamental change is needed.
    • I can't tell whether it was fair, as there was no explanation of the selection process, who applied, what each applicant requested.
    • I felt rushed to reply to accept the PFP offer. Then there was a long delay in the entire process, with no explanation at first.
    • There is a lack of consistency by the independent panels making funding decisions. An appeal process of their decisions is not apparent.

    Out of nine applicants, six said that they would not have been able to participate in the CNSC hearing without funding from the PFP. Three others were uncertain.

Table 19: Q19A. Would you still have been able to participate in the CNSC hearing without funding from the PFP? (n=9)
No: 6 (67%)
Do not know / no response: 3 (33%)

    Of the nine successful applicants in the survey, six said that they were able to present their concerns at the CNSC hearing either somewhat effectively (4) or very effectively (2).

Table 20: Q20. How effectively do you believe you were able to present your concerns at the CNSC hearing? (n=9)
Somewhat effectively: 4 (44%)
Very effectively: 2 (22%)
Do not know / no response: 3 (33%)

    Only one respondent could think of any other programs that are similar to the PFP: previous funding programs through the Canadian Environmental Assessment Agency (CEAA).

Table 21: Q21. Can you think of any other programs that are similar to the PFP? (n=11)
No: 6 (55%)
Yes: 1 (9%)
Do not know / no response: 4 (36%)

    Note: Respondents could provide more than one answer; totals may sum to more than 100%.

    Open-ended responses

    • Previous funding programs through CEAA

    Only one respondent reported that they previously received funding from another program for similar activities.

Table 22: Q22. Did you receive funding from another program for similar activities? (n=7)
No: 6 (86%)
Yes: 1 (14%)

    Background questions

    Of all the applicants who responded to the survey, eight were interested in a hearing about a project in Ontario, while three were interested in a project in Saskatchewan. No other provinces or territories were mentioned.

Table 23: Q23. Please think back to the project that was the subject of the CNSC hearing you wanted to participate in. In which province or territory was the project located? (n=11)
Saskatchewan: 3 (27%)
Ontario: 8 (73%)

    Participants were asked to select one response that best describes how they became interested in CNSC hearings. About a third (4) mentioned they were a member of a non‑profit organization, while another third (4) said they were a member of an Aboriginal organization or community. Two were community association members, and one lived near a nuclear facility.

    No respondent mentioned being a subject matter expert, a union member, a member of an academic institution, or an interested stakeholder in the general public.

Table 24: Q24. Please select one from the following that best describes you in terms of how you became interested in CNSC hearings. Are you interested as… (n=11)
A member of a non-profit organization: 4 (36%)
A community association member: 2 (18%)
A member of an Aboriginal organization or community: 4 (36%)
Other: 1 (9%)

    Open-ended responses

    • I live near a nuclear facility

    Conclusion

    Three respondents provided additional comments about the PFP (please see Table 25).

Table 25: Q25. Do you have any other comments about the PFP? (n=11)
No: 8 (73%)
Yes: 3 (27%)

    Open-ended responses

    • Delay in receiving materials was frustrating, although CNSC staff tried to be helpful. Several changes in the date of the hearings made planning difficult.
    • I hope that more stakeholders will be made aware of this opportunity and can benefit from the program in the future.
    • The CNSC needs to provide better direction to its independent panels making funding decisions. It is not clear that they approach similar hearings consistently. Hence, funding can be provided for one set of hearings and denied for another, even though the subject matter in both is essentially the same.

    Target group survey

    Awareness of the PFP

    Among the target group respondents, only about one quarter (23%, or 11 respondents) had heard of the PFP before they received the survey invitation.

Table 26: Q1. Had you heard of the PFP before receiving this survey? (n=48)
No: 37 (77%)
Yes: 11 (23%)

    Interest in the PFP

    Half of the respondents (24) said they might consider applying to the PFP in the future, while about a third (16) did not know if they would apply. Only two said they would definitely consider applying, while six said they would definitely not apply.

Table 27: Q2. Based on what you know about the PFP, would you consider applying to the program for funding in the future? (n=48)
I definitely would consider applying: 2 (4%)
I might consider applying: 24 (50%)
I definitely would not consider applying: 6 (13%)
Do not know / no response: 16 (33%)

    Barriers and changes to the PFP

    The reasons the targeted individuals gave for not applying to the PFP in the past were mixed. Among the open-ended responses, some were not participating in any hearings or had seen no hearings of interest, while another indicated they would need help with other kinds of expenses. Some indicated that they already had other sources of funding, while one respondent was unsure if they were eligible to apply as a government employee.

Table 28: Q3. Are there any reasons why you have not applied to the PFP in the past? (n=11)
I did not know I was eligible for the PFP: 2 (18%)
The PFP does not cover the fees with which I would need help: 2 (18%)
The PFP application process seemed too onerous: 2 (18%)
The documents or websites that describe the PFP were unclear: 1 (9%)
I do not need the funding to participate in Commission hearings: 2 (18%)
Other: 6 (55%)

    Note: Respondents could provide more than one answer; totals may sum to more than 100%.

    Open-ended responses

    • At the moment, if there is a significant need to participate, I would be able to get other organizations to help with the costs. The travel costs for me would usually not be significant. If there was preparation work and research work to do, it may be of interest in the future (or if I lost current sources of financing).
    • Wasn't participating at the time.
    • There were no instances when I considered applying.
    • I have participated in CNSC and other public hearings in the past, but at no cost. My expenses were always covered. However, that does not preclude me from applying for funding in the future, should the need arise.
    • I am a member of local government - I am not sure if government bodies are eligible for funding.
    • I have seen no notice of CNSC events of interest.

    Among those who provided reasons they might not apply to the PFP in the future, seven respondents said it was not clear whether they were eligible for funding, while six said that they did not have the time to prepare for Commission hearings. Three said they were not interested in participating in Commission hearings.

    Others said that they did not have the expertise, or were unsure whether they had the expertise, to attend Commission hearings.

Table 29: Q4. What are some reasons that you might not apply to the PFP? (n=21)
It is not clear to me whether I am eligible for funding: 7 (33%)
The PFP does not appear to cover the fees with which I would need help: 1 (5%)
The PFP does not appear to provide enough funding to make it worthwhile: 1 (5%)
I do not have enough time to prepare for Commission hearings: 6 (29%)
I am not interested in participating in Commission hearings: 3 (14%)
Other: 5 (24%)
Do not know / no response / no reason: 3 (14%)

    Note: Respondents could provide more than one answer; totals may sum to more than 100%.

    Open-ended responses

    • My expertise is not really relevant to the subject matter of a Commission hearing.
    • It is not clear that my expertise is applicable.
    • I have no expertise in topics that might be eligible for funding. They just don’t apply to my research.
    • I don’t know enough about nuclear facilities or mines to have a useful contribution to any hearings. I do not think there are any such facilities in my province. I have a full-time job and do not have the time to attend hearings anyway.
    • I would only be interested in certain hearings.

    If they were to apply to the PFP in the future, many respondents (19) stated that they would be likely to apply for travel expenses under the PFP, while some (8) stated they would apply for professional fees. Seven said they would not apply, while two said they would apply for other types of costs (please see Table 30).

Table 30: Q5. If you were to apply to the PFP, which types of funding would you be likely to apply for? Would you apply for... (n=48)
Nothing, I would not apply: 7 (15%)
Professional fees: 8 (17%)
Travel expenses: 19 (40%)
Other: 2 (4%)
Do not know / no response: 12 (25%)

    Open-ended responses

    • Potentially travel and research costs.
    • I would apply for both professional fees and travel expenses. I would be particularly interested in hearing from people with experience on the issue from other jurisdictions.

    Seven respondents provided suggestions for other fees or expenses that the PFP should cover, including legal fees for Aboriginal groups, student funding, and preparation costs (researching and developing briefs).

Table 31: Q6. Are there any other expenses that you believe should be covered under the PFP to encourage people to apply for funding? (n=48)
No: 8 (17%)
Other: 7 (15%)
Do not know / no response: 33 (69%)

    Open-ended responses

    • Legal fees for Aboriginal groups.
    • Travel.
    • Student funding.
• Travel costs are small compared to the time commitment to review documents and research issues. Those costs would have an impact for those who do not have funding behind them.
    • In my case, there could be out-of-pocket expenses in producing a suitable brief.

    Respondents indicated many different ways of discovering opportunities similar to the PFP, most commonly from the program website (21), from colleagues (21), and from personal communication such as telephone calls, emails, or letters (16). Twelve, or 25%, stated they find out about these opportunities through newspapers (print or online).

    Only three people (6%) stated they find out about similar opportunities through social media (Facebook, Twitter).

Table 32: Q7. How do you typically find out about opportunities similar to the PFP? (n=48)
Program website: 21 (44%)
Other website: 8 (17%)
From a CNSC representative: 7 (15%)
From a colleague: 21 (44%)
Radio advertisement: 2 (4%)
Newspaper (print or online): 12 (25%)
Social media (Facebook, Twitter): 3 (6%)
Posters: 1 (2%)
Personal telephone call, email or letter: 16 (33%)
Other: 3 (6%)
Do not know / no response: 8 (17%)

    Note: Respondents could provide more than one answer; totals may sum to more than 100%.

Open-ended responses

    • CNSC website “subscription” and occasionally from the University research office.
    • Had not previously heard of these opportunities.
    • Never hear about them and/or don’t seek out info on the programs.

    Respondents provided many suggestions for increasing awareness of the PFP among the targeted groups. The most common suggestions were emails, advertisements or public announcements, and targeted information to those who are likely to be interested (e.g., non-profit organizations, universities, communities, and other interest groups).

    Several respondents mentioned that awareness of the PFP could be improved through proactively notifying communities when there is a project that might affect or concern them, and by explaining to the communities that funding is available through the PFP.

    A few respondents mentioned that a larger social media presence could help raise awareness of the PFP.

    Table 33: Q8. What, if anything, could the CNSC do to increase awareness of the PFP? (n=48)

    Nothing: 5 (10%)
    Other: 20 (42%)
    Don’t know / no response: 23 (48%)

    Note: Respondents could provide more than one answer; totals may sum to more than 100%.

    Open-ended responses

    • Emails (several respondents mentioned this).
    • Advertisements and public announcements (several respondents mentioned this).
    • Send information through appropriate channels to those who are likely to be interested, e.g., non-profit organizations, universities, affected/concerned communities, and other interest groups (several respondents mentioned this).
    • Social media (a few respondents mentioned this).
    • Greater visibility among universities.
    • Announce it well in advance of any scheduled hearings, and announce it with the notice for public hearings.
    • I think a regular annual call for proposals with a logo (through email) would raise awareness.
    • Local community newspapers.
    • Choose different forms of distribution to potential interest groups.
    • More publicity in media.
    • Provide information to target recipients, e.g. emails and other promotional materials. Be sure to include social sciences and humanities academics on your lists.

    Over a third of respondents (18) stated they might be more interested in applying to the PFP if there were changes to the subject matter or activities for which the PFP offers funding. Also, nine respondents mentioned that they might be more interested if the communications strategy changed. However, eight respondents stated that no changes to the PFP could make them more interested in applying for funding.

    Table 34: Q9. What changes, if any, could be made to the PFP to make you more interested in applying for funding? Changes in… (n=46)

    No changes: 8 (17%)
    Eligibility requirements: 3 (7%)
    Program timelines: 5 (11%)
    The amount of funding made available: 4 (9%)
    The subject matter or activities for which funding is offered: 18 (39%)
    The communications strategy: 9 (20%)
    Other: 2 (4%)
    Do not know / no response: 15 (33%)

    Note: Respondents could provide more than one answer; totals may sum to more than 100%.

    Open-ended responses

    • Maybe some examples of what kind of funds have been allocated to have a sense of whether or not it is worth the effort to go through the application process.
    • I would only apply for funding if I strongly felt that the Commission should hear my views. Changes to the PFP are unlikely to influence that.

    Background

    Of the 48 respondents, 31 reported never having attended a regulatory hearing before. The other 17 respondents had attended at least one hearing before, with nine having attended a CNSC hearing, and 10 having attended a hearing by another regulatory organization.

    Table 35: Q10. Have you attended a regulatory hearing before, whether it was held by CNSC or another regulatory organization? (n=48)

    No: 31 (65%)
    Yes, a CNSC hearing: 9 (19%)
    Yes, a hearing held by another regulatory organization: 10 (21%)

    Note: Respondents could provide more than one answer; totals may sum to more than 100%.

    Among the 17 respondents who stated they had attended a regulatory hearing in the past—either a CNSC hearing or another hearing—10 (59%) said they did not receive external funding to attend the hearing. Others received funding from an academic institution (3), a provincial government (3), or the federal government (2).

    Table 36: Q11. Did you receive any external funding to attend a hearing in the past? (n=17)

    No: 10 (59%)
    Yes, from an academic institution (university, college): 3 (18%)
    Yes, from the federal government: 2 (12%)
    Yes, from a provincial government: 3 (18%)
    Other: 2 (12%)

    Note: Respondents could provide more than one answer; totals may sum to more than 100%.

    Open-ended responses

    • As part of a Licensee.
    • My employer covered expenses in the past.

    Of the 17 respondents who had attended at least one hearing in the past, 16 said they would consider attending a CNSC hearing, while one was uncertain.

    Table 37: Q12. Based on any hearings you attended in the past, would you consider attending a CNSC hearing? (n=17)

    Yes: 16 (94%)
    Do not know / no response: 1 (6%)

    Respondents identified themselves as part of several different groups, most commonly academic institutions (43), not-for-profit organizations (14), and subject matter experts (14).

    Other groups included interested stakeholders in the general public (10), unions (8), community associations (4) and Aboriginal organizations or communities (3).

    Table 38: Q14. Are you a member of any of the following groups? (n=48)

    Not-for-profit organization: 14 (29%)
    Community association: 4 (8%)
    Aboriginal organization or community: 3 (6%)
    Interested stakeholder in the general public: 10 (21%)
    Academic institution (university, college): 43 (90%)
    Union: 8 (17%)
    Subject matter expert: 14 (29%)
    Do not know / no response: 1 (2%)

    Note: Respondents could provide more than one answer; totals may sum to more than 100%.

    Conclusion

    No respondents provided additional comments about the PFP.

    Table 39: Q15. Do you have any other comments about the PFP? (n=48)

    No comments: 44 (92%)
    Do not know / no response: 4 (8%)

    Appendix G – Document list

    • Canadian Nuclear Safety Commission public meeting transcript on utilizing the resources of the Participant Funding Program to meet with the Mississauga First Nation Community to discuss Cameco’s Blind River refinery, October 1, 2014.
    • Canadian Nuclear Safety Commission Participant Funding Program decision on meeting with Mississauga First Nation regarding Cameco’s Blind River refinery, December 19, 2014.
    • Canadian Nuclear Safety Commission briefing note to Director, PAIRD, on Ontario Power Generation’s application to renew the power reactor operating licence for the Darlington Nuclear Generating Station, October 15, 2013. e-Doc 4199126.
    • Canadian Nuclear Safety Commission briefing note to VP, Regulatory Affairs Branch, from DG, SPD, in the matter of Ontario Power Generation’s application to renew the power reactor operating licence for the Darlington Nuclear Generating Station, May 23, 2014. e-Doc 4436897.
    • Canadian Nuclear Safety Commission record of proceedings, including reasons for decision in the matter of Ontario Power Generation’s application to renew the power reactor operating licence for the Darlington Nuclear Generating Station.
    • Canadian Nuclear Safety Commission Participant Funding Program decision in the matter of Ontario Power Generation’s application to renew the power reactor operating licence for the Darlington Nuclear Generating Station.
    • Canadian Nuclear Safety Commission Participant Funding Program decision in the matter of Meeting with Kineepik Métis Local Inc. #9 Regarding the Environmental Assessment for Cameco’s Key Lake Extension Project, May 2, 2014.
    • Canadian Nuclear Safety Commission public hearing transcript on the application on meeting with Kineepik Métis Local Inc. #9 regarding the environmental assessment for Cameco’s Key Lake Extension Project, October 2, 2013.
    • CNSC staff meeting with Kineepik Métis Local Inc. #9 regarding the environmental assessment for Cameco’s Key Lake Extension Project, July 16, 2014.
    • Canadian Nuclear Safety Commission, licence, Cameco’s proposed Millennium Mine Project licence application.
    • Canadian Nuclear Safety Commission Participant Funding Program Finance Review Committee recommendations in the matter of Cameco’s proposed Millennium Mine Project licence application, March 14, 2014. e-Doc 4383718.
    • Canadian Nuclear Safety Commission Participant Funding Program Finance Review Committee recommendations report, VP-RAB briefing, April 4, 2014. e-Doc 4409830.
    • Canadian Nuclear Safety Commission briefing note to VP, Regulatory Affairs Branch, from DG, SPD, in the matter of Cameco’s proposed Millennium Mine Project licence application, May 7, 2014. e-Doc 4428545.
    • Canadian Nuclear Safety Commission Participant Funding Program decision in the matter of Cameco’s proposed Millennium Mine Project licence application.
    • Canadian Nuclear Safety Commission, licence, Saskatchewan Research Council’s licence application for the Gunnar Remediation Project.
    • Canadian Nuclear Safety Commission Participant Funding Program Finance Review Committee recommendations in the matter of the Saskatchewan Research Council’s licence application for the Gunnar Remediation Project, e-Docs 4319623, 4328939, 4319640, 4319344 and 4319371.
    • Canadian Nuclear Safety Commission briefing note to Director, PAIRD, on the Saskatchewan Research Council’s licence application for the Gunnar Remediation Project, November 18, 2014. e-Doc 4579471.
    • Canadian Nuclear Safety Commission briefing note to VP, Regulatory Affairs Branch, from DG, SPD, in the matter of the Saskatchewan Research Council’s licence application for the Gunnar Remediation Project, April 15, 2014. e-Doc 4417112.
    • Canadian Nuclear Safety Commission Participant Funding Program decision in the matter of the Saskatchewan Research Council’s licence application for the Gunnar Remediation Project, May 2, 2014.
    • Canadian Nuclear Safety Commission record of proceedings, including reasons for decision in the matter of the Saskatchewan Research Council’s licence application for the Gunnar Remediation Project, November 6, 2014.
    • Canadian Nuclear Safety Commission public hearing transcript on Cameco Corporation’s application for the renewal of the Class IB Nuclear Fuel facility operating licence for the Blind River refinery. Hearing date: October 6, 2011.
    • Canadian Nuclear Safety Commission public hearing transcript on Cameco Corporation’s application for the renewal of the Class IB Nuclear Fuel facility operating licence for the Blind River refinery. Hearing date: January 19, 2012.
    • Canadian Nuclear Safety Commission record of proceedings, including reasons for decision in the matter of Cameco Corporation’s application to renew its nuclear fuel facility operating licence for the Blind River refinery. Public hearing dates: November 3, 2011 and January 19, 2012.
    • Canadian Nuclear Safety Commission Participant Funding Program decision in the matter of Cameco Corporation licence renewal for the Blind River refinery, December 14, 2011.
    • Canadian Nuclear Safety Commission, licence, Bruce Nuclear Generating Station A and B.
    • Canadian Nuclear Safety Commission Participant Funding Program Finance Review Committee recommendations in the matter of Bruce Power’s applications to renew the power reactor operating licences for Bruce Nuclear Generating Stations A and B, February 17, 2014. e-Doc 4310869.
    • Canadian Nuclear Safety Commission briefing note to Director, PAIRD, on Bruce Power’s applications to renew the power reactor operating licences for Bruce Nuclear Generating Stations A and B, October 15, 2013. e-Doc 4199086.
    • Canadian Nuclear Safety Commission briefing note to Director, PAIRD, on Bruce Power’s applications to renew the power reactor operating licences for Bruce Nuclear Generating Stations A and B, February 12, 2014. e-Doc 4329195.
    • Canadian Nuclear Safety Commission Finance Review Committee recommendations and funding rationale report, February 2014. e-Doc 4310869.
    • Canadian Nuclear Safety Commission record of proceedings, including reasons for decision in the matter of Bruce Power’s applications to renew the power reactor operating licences for Bruce Nuclear Generating Stations A and B, April 24, 2014. e-Doc 4423908.
    • Canadian Nuclear Safety Commission Participant Funding Program decision in the matter of Bruce Power’s applications to renew the power reactor operating licences for Bruce Nuclear Generating Stations A and B, December 11, 2014.
    • Canadian Nuclear Safety Commission record of proceedings, including reasons for decision in the matter of Atomic Energy of Canada Limited application to renew its nuclear research and test establishment operating licence for the Chalk River Laboratories. Public hearing dates: June 8 and October 4, 2011.
    • Canadian Nuclear Safety Commission Participant Funding Program decision on AECL’s proposed 5-year licence renewal for its Chalk River Laboratories, July 21, 2011. File No.: PFP 2011-CRL01-FRCREP.
    • Canadian Nuclear Safety Commission public hearing transcript on AECL’s proposed 5-year licence renewal for its Chalk River Laboratories. Hearing date: October 4, 2011.
    • Chalk River Laboratories (CRL) application for operating licence renewal – 2011, 2010 September 30. File No.: CRL·ACNO·I0·0048·L. 
    • Canadian Nuclear Safety Commission Participant Funding Program decision on Cameco Corporation’s licence renewal for the Cigar Lake Uranium Mine in Northern Saskatchewan, February 2013.
    • Canadian Nuclear Safety Commission, licence, Darlington Nuclear Generating Station proposed refurbishment and continued operation.
    • Canadian Nuclear Safety Commission Participant Funding Program Finance Review Committee recommendations in the matter of the environmental assessment of the Darlington Nuclear Generating Station refurbishment and continued operation, April 27, 2012. File No.: PFP 2012-DRL02-FRC REC. e-Doc 3910390.
    • Canadian Nuclear Safety Commission briefing note to Director, PAIRD, on the environmental assessment of the Darlington Nuclear Generating Station refurbishment and continued operation, July 26, 2012. e-Doc 4055369.
    • Canadian Nuclear Safety Commission briefing note to Director, PAIRD, on the environmental assessment of the Darlington Nuclear Generating Station refurbishment and continued operation, December 19, 2012. e-Doc 4055369.
    • Canadian Nuclear Safety Commission public hearing transcript on the environmental assessment of the Darlington Nuclear Generating Station refurbishment and continued operation. Public hearing dates: December 3–6, 2012.
    • Canadian Nuclear Safety Commission record of proceedings, including reasons for decision in the matter of the environmental assessment of the Darlington Nuclear Generating Station refurbishment and continued operation. Public hearing dates: December 3–6, 2012
    • Canadian Nuclear Safety Commission Participant Funding Program decision in the matter of the environmental assessment of the Darlington Nuclear Generating Station refurbishment and continued operation, May 2012.
    • Canadian Nuclear Safety Commission, licence, Cameco’s licence renewals for Key Lake, Rabbit Lake uranium mine and mill, and McArthur River uranium mine.
    • Canadian Nuclear Safety Commission Participant Funding Program Finance Review Committee recommendations in the matter of Cameco’s licence renewals for Key Lake, Rabbit Lake uranium mine and mill, and McArthur River uranium mine, April 29, 2013. e‑Doc 4112385.
    • Canadian Nuclear Safety Commission briefing note to Director, PAIRD, in the matter of Cameco’s licence renewals for Key Lake, Rabbit Lake uranium mine and mill, and McArthur River uranium mine, July 2013. e-Doc 4165656.
    • Canadian Nuclear Safety Commission briefing note to VP, Regulatory Affairs Branch, from DG, SPD, in the matter of Cameco’s licence renewals for Key Lake, Rabbit Lake uranium mine and mill, and McArthur River uranium mine, e-Doc 4131199.
    • Draft Qs & As relating to Participant Funding for re-licensing hearing for Key Lake, Rabbit Lake and McArthur River uranium mine, September 11, 2013. e-Doc 4196588.
    • Canadian Nuclear Safety Commission public hearing transcript in the matter of Cameco Corporation’s application for a 10-year licence renewal for: Key Lake Uranium Mill, Rabbit Lake Uranium Mine and Mill, and McArthur River Uranium Mine. Public hearing dates: October 1–3, 2013.
    • Canadian Nuclear Safety Commission summary record of proceedings, including reasons for decision in the matter of Cameco Corporation - application for the renewal of the licence for Rabbit Lake Operation, public hearing dates October 1-3, 2013. 
    • Canadian Nuclear Safety Commission summary record of proceedings, including reasons for decision in the matter of Cameco Corporation - application for the renewal of the licence for Key Lake Operation. Public hearing dates: October 1-3, 2013.
    • Canadian Nuclear Safety Commission summary record of proceedings, including reasons for decision in the matter of Cameco Corporation - application for the renewal of the licence for McArthur River Operation. Public hearing dates: October 1-3, 2013.
    • Canadian Nuclear Safety Commission Participant Funding Program decision on Cameco Corporation’s application for a 10-year licence renewal for: Key Lake Uranium Mill, Rabbit Lake Uranium Mine and Mill, and McArthur River Uranium Mine, July 2013.
    • Canadian Nuclear Safety Commission, Licence, Cameco’s application to renew waste facility operating licence at decommissioned Beaverlodge mine and mill, April 3-4, 2013.
    • Canadian Nuclear Safety Commission public hearing transcript in the matter of Cameco Corporation’s 10-year licence renewal application for the decommissioned Beaverlodge mine and mill site, April 3, 2013. 
    • Canadian Nuclear Safety Commission summary record of proceedings, including reasons for decision in the matter of Cameco Corporation’s 10-year licence renewal application for the decommissioned Beaverlodge mine and mill site. Public hearing dates: April 3-4, 2013.
    • Canadian Nuclear Safety Commission briefing note to VP-Regulatory Affairs Branch from DG, SPD in the matter of Cameco’s 10-year licence renewal application for the decommissioned Beaverlodge mine and mill site. e-Doc 4061130.
    • Canadian Nuclear Safety Commission decision on the allocation of participant funding for Cameco Corporation’s 10-year licence renewal application for the decommissioned Beaverlodge mine and mill site, February 2013.
    • Canadian Nuclear Safety Commission, licence, Cameco’s application for a licence to allow operation of the uranium mine at the Cigar Lake project, 2012.
    • Canadian Nuclear Safety Commission Participant Funding Program Finance Review Committee recommendations in the matter of Cameco’s licence renewal for the Cigar Lake uranium mine, December 17, 2012. File No.: PFP2012-CGL01-FRC REC. e‑Doc 4051107.
    • Canadian Nuclear Safety Commission briefing note to VP-Regulatory Affairs Branch from DG, SPD in the matter of Cameco’s licence renewal for the Cigar Lake uranium mine. e-Doc 4083184.
    • Canadian Nuclear Safety Commission public hearing transcript in the matter of Cameco Corporation’s licence renewal for the Cigar Lake uranium mine. Public hearing date: April 3, 2013.     
    • Canadian Nuclear Safety Commission summary record of proceedings, including reasons for decision in the matter of Cameco Corporation’s licence renewal for the Cigar Lake uranium mine, April 3, 2013.
    • Canadian Nuclear Safety Commission decision on the allocation of participant funding for Cameco Corporation’s licence renewal for the Cigar Lake uranium mine.
    • Canadian Nuclear Safety Commission, licence, Pickering Nuclear Generating Station A and B.
    • Canadian Nuclear Safety Commission transcript of the public hearing (day one) on the application by Ontario Power Generation for the renewal of the licence for the Pickering Nuclear Generating Station, February 20 and May 29–31, 2013.
    • Canadian Nuclear Safety Commission record of proceedings, including reasons for decision in the matter of Ontario Power Generation’s application to renew the power reactor operating licence for the Pickering Nuclear Generating Station. Public hearing dates: February 20 and May 29–31, 2013.
    • Canadian Nuclear Safety Commission Participant Funding Program decision on Ontario Power Generation’s licence renewal of the Pickering Nuclear Power Reactor operating licence, February 2013.
    • Canadian Nuclear Safety Commission Participant Funding Program Finance Review Committee recommendations in the matter of the Pickering Nuclear Power Reactor operating licence, December 17, 2012. e-Doc 4051301.
    • Canadian Nuclear Safety Commission, licence, Point Lepreau Nuclear Generating Station.
    • Canadian Nuclear Safety Commission briefing note to Director, PAIRD, on Point Lepreau 5-year renewal of operating licence, December 12, 2011. e-Doc 3866430.
    • Canadian Nuclear Safety Commission Participant Funding Program decision on New Brunswick Power Nuclear Corporation’s licence renewal for the Point Lepreau Nuclear Generating Station, September 30 and October 14, 2011.
    • Canadian Nuclear Safety Commission public hearing transcript on NB Power Nuclear Corporation’s request for approval to reload fuel and restart the Point Lepreau Nuclear Generating Station. Public hearing dates: October 6 and December 1–2, 2011.
    • Canadian Nuclear Safety Commission record of proceedings, including reasons for decision in the matter of New Brunswick Power Nuclear Corporation’s request for approval to reload fuel and restart the Point Lepreau Nuclear Generating Station, and application to renew the power reactor operating licence for the Point Lepreau Nuclear Generating Station. Public Hearing Date: October 6, 2011 and December 1 and 2, 2011.
    • Canadian Nuclear Safety Commission, licence, Cameco Fuel Manufacturing Inc. in Port Hope.
    • Canadian Nuclear Safety Commission, Licence, Cameco Corporation licence renewal: Conversion facility in Port Hope, Ontario, October 27, 2011.  File No.: PFP2011-PHCF01-FRCREP. e-Doc 3821347.
    • Canadian Nuclear Safety Commission briefing note to Director, PAIRD, on Cameco Port Hope Conversion Facility, January 23, 2012. e-Doc 3867771.
    • Canadian Nuclear Safety Commission Participant Funding Program summary record of proceedings and decision in the matter of Cameco Corporation’s application for the renewal of the operating licence for Port Hope, Ontario. Public hearing dates: November 3, 2011 and January 18–19, 2012.
    • Canadian Nuclear Safety Commission Participant Funding Program decision on Cameco Corporation application for the renewal of the operating licence for Port Hope, Ontario, December 14, 2011.
    • Canadian Nuclear Safety Commission public hearing transcript in the matter of Cameco Corporation application for the renewal of Class IB Nuclear Fuel Facility operating licence in Port Hope. Hearing date: November 3, 2011. 
    • Canadian Nuclear Safety Commission public hearing transcript in the matter of Cameco Corporation application for the renewal of Class IB Nuclear Fuel Facility operating licence for Port Hope Conversion Facility. Hearing date: January 18-19, 2012.
    • Canadian Nuclear Safety Commission, Licence Conditions Handbook, Bruce Nuclear Generating Station A and B.
    • Canadian Nuclear Safety Commission, Licence Conditions Handbook, Darlington Nuclear Generating Station, 2014.
    • Canadian Nuclear Safety Commission, Licence Conditions Handbook, Cameco’s Millennium project uranium mine licence. e-Doc 4406211 (Word) and 4413126 (PDF).
    • Canadian Nuclear Safety Commission, licence application for Cameco’s Millennium uranium mine project. e-Doc 4315539 (Word) and 4412049 (PDF).
    • Canadian Nuclear Safety Commission, Licence Conditions Handbook, Gunnar Remediation Project.
    • Canadian Nuclear Safety Commission, Licence Conditions Handbook, Key Lake Uranium Mill, Rabbit Lake Uranium Mine and Mill, and McArthur Uranium Mine.
    • Canadian Nuclear Safety Commission, Licence Conditions Handbook, Beaverlodge Mine/Mill.
    • Canadian Nuclear Safety Commission, Licence Conditions Handbook, Cigar Lake Uranium Mine.
    • Canadian Nuclear Safety Commission, Licence Conditions Handbook, Pickering Nuclear Generating Station A.
    • Canadian Nuclear Safety Commission, Licence Conditions Handbook, Darlington Nuclear Generating Station, 2012.
    • Canadian Nuclear Safety Commission, Licence Conditions Handbook, Port Hope Conversion Facility.
    • Canadian Nuclear Safety Commission, Licence Conditions Handbook, Blind River Refinery.
    • Canadian Nuclear Safety Commission, Licence Conditions Handbook, Point Lepreau Nuclear Generating Station.
    • Canadian Nuclear Safety Commission, Licence Conditions Handbook, Chalk River Laboratories.
    • Canadian Nuclear Safety Commission Guide, February 2011.

    Appendix H - Canadian Nuclear Safety Commission’s Participant Funding Program recipient survey covering fiscal years 2011–12 to 2013–14

    Summary

    1. Number of PFP recipients from 2011–14 = 69
    2. Number of Survey Respondents (sample) = 39
    3. Response rate = 56.52%

    Each response choice below is shown with the number of respondents and the percentage of the 39 survey respondents.

    1. How did you hear about the Participant Funding Program (PFP)?

    Direct contact with CNSC: 23 (58.87%)
    Word of mouth: 0 (0%)
    CNSC website: 5 (12.82%)
    Interest in the subject matter: 11 (28.20%)
    Other, please specify ---------------- (other organizations): 0 (0%)
    N/A: 0 (0%)

    2. If you had not received PFP assistance, would you have been able to participate in the proceedings?

    Yes: 1 (2.57%)
    Somewhat/partially: 8 (20.51%)
    No: 30 (76.92%)
    Do not know: 0 (0%)
    N/A: 0 (0%)

    3. Was your application processed in time to enable you to prepare effectively for your involvement in CNSC proceedings?

    Yes: 35 (89.74%)
    No: 4 (10.26%)

    PFP recipients’ comments

    • My application was processed within the indicated timelines, but I would have preferred to have had more time to work on the project.
    • It would have served us well if we had been notified of the PFP earlier - that would have allowed us to participate in the site tour that took place while we were waiting for the application to be approved. Given the timeframe, we are satisfied that we did an excellent job.
    • Yes, the finances were in place in a timely fashion that allowed [us] to plan and implement the input and involvement of our First Nations people from our northern communities.
    • Ample time for notice and time to start preparing the information, reading relevant documentation and holding feedback meetings within the communities.
    • The PFP group was very helpful in attending and responding to all my questions. In addition, the Commission facilitated the process very well.
    • We propose a 90-day review of the Commission documents (environmental assessment report, Licence Conditions Handbook (LCH) and Commission member documents (CMDs)) instead of a 30-day review period.
    • Yes, our application was processed in good time, but a series of hearing postponements and a lack of availability of documents made time planning very difficult.
    • Yes, we were notified quickly of the funding decision, and were able to begin to prepare and write the required report immediately after having been approved.
    • It was not processed in good time. Due to time lag, we lost valuable contribution and input from some experts and members of the community who decided to take on other challenges in their lives, i.e., work in other places.

    4. Do you feel the Commission heard your concerns as shown through the intervention you prepared and through your oral submission at the hearing?

    Yes: 26 (66.66%)
    No: 2 (5.12%)
    Cannot tell (Commission hearing postponed): 8 (20.51%)
    Uncertain: 3 (7.69%)

    PFP recipients’ comments

    • “I felt that the Commission heard the concerns of the Metis Nation’s people since we have a great role to play in the hearings […]. Commission members sought my opinion on our attachment to Mother Land. The local environment provides the Aboriginal peoples all their needs for survival: a hunting and gathering culture necessitates a thorough and intimate knowledge of the environment, the plants, the animals, the weather patterns and the land. I was pleased to answer Commission questions”
    • Yes, I do. I was pleased with the Commission’s attention to the findings of the research that I presented as well as their thoughtful feedback.
    • Yes, but the time allocated to oral presentation was extremely short (10 minutes). We are hopeful that the Commission members will read our written submission.  However, we received very positive feedback on our presentation from members of the Commission.
    • Yes, the Commission addressed our main concerns regarding fish impingement and future decommissioning plans.
    • We believe that some Members of the Commission were concerned about the issues we had presented before them. But when it came to decision time, it did not matter - our concerns were not addressed at all.
    • Yes, the Commission heard our concerns via the intervention we had prepared and presented before it.
    • Yes, I believe the Commission heard “my voice”. I am confident the Commission took my submission and considered it in making its decision in the Cigar Lake and Beaverlodge applications.
    • Absolutely, this is a great program. The only concern would be the inadequate amount of monies allocated. Commission members appeared to have been attentive and engaged during our presentation and the question and answer period that followed. The Commission asked questions that were topic specific and reflective of key issues we had identified. However, the Commission decision will tell if, in fact, our concerns will be considered.
    • The exchange between the Commission and intervenors is polite and respectful. However, the Commission makes decisions based on its mandate, as enshrined in the CNSC Act and other acts.
    • It is important to note that the scope of the environmental assessment is too narrow.
    • A great number of concerns we raised were not addressed by the Commission.
    • I believe our concerns were heard. But whether the Commission will make its decision based on our intervention is uncertain.
    • We do feel the Commission listened to our concerns. However, a great number of our Aboriginal people are skeptical that our concerns and input as the natural stewards and caretakers of our Ancestral Lands were heeded.

    5. Were the materials provided (funding application form, PFP Guide, final financial report, website, etc.) useful? Please comment:

    Yes: 39 (100%)
    No: 0 (0%)

    6. Please share any other comments about our program so we can make it more effective and efficient

    PFP recipients’ comments

    • Not all necessary documents were provided in a timely manner. For example, Commission Members’ Documents were provided in a timely manner, but corresponding or related reference Commission documents were not included with the first batch sent to applicants.
    • The Commission is likely to consider and approve 10 licences in a year, some of which will be granted approval for a 10-year licence. To improve the management and delivery of the Participant Funding Program, the CNSC could do the following:
    • The PFP needs to accommodate intervenors during interim reporting to the Commission. Currently, the PFP is available to facilitate licence renewal or new licences for facilities/mines/mills.
    • A 10-year licence period is too long. The public, Aboriginal groups and other stakeholders need a voice every few years, not every 10 years. The PFP is currently available only for licence renewals or new licences. The CNSC needs to expand it to include funding for the interim reporting that will likely occur during any given licence period. Currently, there is no mechanism for this, which is unfortunate.
    • Increase the PFP funding envelope to enable intervenors to seek and use expert knowledge and skills to prepare necessary materials, including reports and report summaries. This would also enable intervenors to meet the expert hourly charge rather than having to “nickel-and-dime”.
    • We find the PFP to be well designed and delivered. However, there are some difficulties with the timing of the release of application documents, which are needed in sufficient time to allow us to evaluate the licensee’s proposal well in advance of submitting the PFP application.
    • To make the PFP more effective, the CNSC needs to revamp other parts of its operation including the means and timing of releasing requested information. 
    • It would benefit interveners if they were provided with the names and titles of individuals representing the proponents before the Commission hearing.
    • To make it more effective, the administrator of the PFP could allow access to documents associated with the funding process earlier. The period between the announcement of the offer and deadline to submit applications is very short.
    • More people from the Aboriginal communities must be given a chance to attend the Commission hearings in order to provide a deeper understanding of traditional and cultural knowledge to the Commission. If more Elders and children were to attend the Commission hearings, they would witness the process first hand and then be able to share their experience with a wider community within the Aboriginal groups.
    • It takes too long to process payment; i.e., the time between submission of the final report and payment is too long. Perhaps a fraction of the total amount approved should be paid up front or on the delivery of the first deliverable.

    Appendix I – Data for seven CNSC hearing cases

    Table 11 – Data for seven CNSC hearing cases

    Name of project | Number of PFP-funded interveners | PFP-funded Commission questions to interveners | Number of non-PFP-funded interveners | Non-PFP-funded Commission questions to interveners | PFP-funded Commission questions to licensees | Non-PFP-funded Commission questions to licensees | PFP-funded Commission questions to CNSC staff | Non-PFP-funded Commission questions to CNSC staff
    Gunnar Remediation Project environmental/licence application 2014 | 4 | 55 | 0 | 0 | 43 | 0 | 19 | 0
    Cameco's Key Lake, Rabbit Lake and McArthur River licence renewal 2013 | 7 | 50 | 15 | 79 | 67 | 28 | 50 | 17
    OPG's Darlington Nuclear Generating Station refurbishment and continued operation 2012 | 6 | 18 | 71 | 134 | 35 | 112 | 26 | 114
    Beaverlodge 10-year licence application for decommissioned mine/mill 2012 | 4 | 26 | 3 | 9 | 25 | 6 | 18 | 2
    Cameco's Cigar Lake 10-year relicensing application 2012 | 4 | 35 | 6 | 28 | 15 | 11 | 12 | 14
    OPG Pickering Nuclear Generating Station licence renewal 2012 | 5 | 11 | 54 | 96 | 36 | 49 | 38 | 70
    AECL Chalk River Laboratories 5-year relicensing application 2011 | 3 | 15 | 7 | 37 | 30 | 8 | 25 | 4
    Total | 33 | 210 | 156 | 383 | 251 | 214 | 188 | 221
    Average | – | 6.4 | – | 2.5 | 7.6 | 1.4 | 5.7 | 1.4
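
    The figures in the “Average” row appear to be Commission questions per intervener: each column total divided by the corresponding total number of interveners (33 PFP-funded, 156 non-PFP-funded). As a quick check of the first two averages, assuming that reading:

    \[
    \frac{210}{33} \approx 6.4 \qquad \text{and} \qquad \frac{383}{156} \approx 2.5
    \]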

    List of acronyms

    CEAA – Canadian Environmental Assessment Agency
    CELA – Canadian Environmental Law Association
    CNSC – Canadian Nuclear Safety Commission
    EA – Environmental assessment
    FTE – Full-time equivalent
    FY – Fiscal year
    GC – Government of Canada
    MRRS – Management, Resources and Results Structure
    NEB – National Energy Board
    NGO – Non-governmental organization
    PAA – Program alignment architecture
    PAIRD – Policy, Aboriginal and International Relations Division
    PFP – Participant Funding Program
    PRA – Prairie Research Associates Inc.
    RAB – Regulatory Affairs Branch
    ROB – Regulatory Operations Branch
    RRED – Regulatory Research and Evaluation Division
    TBS – Treasury Board Secretariat
    TOR – Terms of reference

    Footnotes

    Footnote 1

    See section 42.1 of the Financial Administration Act.

    Footnote 2

    See section 6.5 of the Treasury Board’s Policy on Transfer Payments (2006).

    Footnote 3

    See Treasury Board’s Policy on Evaluation

    Footnote 4

    Wright, H. Management Review of the Participant Funding Program at the Canadian Nuclear Safety Commission. February 28, 2013. Internal document: Canadian Nuclear Safety Commission.

    Footnote 5

    Source: e-Doc 4135006, December 6, 2013.

    Footnote 6

    Source: Treasury Board Submission (835713), approved June 24, 2010.

    Footnote 7

    Source: The CNSC’s Participant Funding Program Guide (February 2011)

    Footnote 8

    Source: The CNSC’s Participant Funding Program Guide (February 2011)

    Footnote 9

    Source: Treasury Board Submission (835713), Section 39, page 58.  Approved June 24, 2010

    Footnote 10

    Source: The CNSC’s Participant Funding Program Guide (February 2011)

    Footnote 11

    Source: CNSC financial staff

    Footnote 12

    Source: e-Doc 4497340 CNSC document: PFP actual spending.

    Footnote 13

    Composed of CNSC directors general from the Regulatory Policy Directorate, Directorate of Assessment and Analysis, and the Strategic Planning Directorate (Head of Evaluation)

    Footnote 14

    Similar programs, for the purpose of the PFP Evaluation, are defined as programs with the same purpose/objectives/goals (e.g., participation in regulatory processes, increased understanding of technical information), but not necessarily the same funding/supporting mechanisms.

    Footnote 15

    Recipient data provided by program staff

    Footnote 16

    Data from 2014–15 is partial as it only covers April 1, 2014 to September 30, 2014

    Footnote 17

    Canadian Nuclear Safety Commission Public Hearing Transcript of Day-One Public Hearing on the application by Ontario Power Generation for the renewal of the licence for the Pickering Nuclear Generating Station. February 20 and May 29–31, 2013. pp 55–60.

    Footnote 18

    Treasury Board of Canada Secretariat, Policy on Evaluation, April 1, 2009.

    Footnote 19

    Data source: PFP program financials provided by FAD/FreeBalance
