Archived - Audit of 2012-2013 Departmental Performance Report

April 2014

1.0 Introduction

The Treasury Board's Policy on Management, Resources and Results Structures (Policy on MRRS) reinforces the federal government's commitment to strengthen the management and accountability of public expenditures. Recognizing that decision-making, particularly in the federal government, is informed by a number of complex factors, the Policy on MRRS seeks to ensure a common understanding of the nature of government programs, the goals of those programs, and the best available measures for assessing and interpreting the results achieved.

The Canada Border Services Agency (CBSA or Agency) embraces the MRRS concepts and has established a Program Alignment Architecture (PAA) and a Performance Measurement Framework (PMF) as a foundational structure for its Report on Plans and Priorities (RPP) and Departmental Performance Report (DPR). The PAA and PMF provide objective information to assist the Agency in allocating and reallocating resources to achieve optimal results for Canadians.

The Report on Plans and Priorities (RPP)Footnote 1 sets out individual expenditure plans for each department and agency (excluding Crown corporations). The RPP provides increased levels of detail over a three-year period on an organization's main priorities by strategic outcome(s), programs, and planned/expected results, as defined in the established PAA and PMF. The RPP is tabled in Parliament on or before March 31.

The Departmental Performance Report (DPR)Footnote 2 consists of individual department and agency accounts of results achieved against the commitments set out in the RPP. The DPR, which covers the most recently completed fiscal year, is tabled in Parliament in the fall.

Both the RPP and DPR are tabled by the President of the Treasury Board on behalf of the ministers who preside over the appropriation-dependent departments and agencies, as identified in Schedules I, I.1 and II of the Financial Administration Act.

2.0 Significance of the Audit

Both the RPP and the DPR play an important role in the Government of Canada's planning and resource management processes.Footnote 3 These reports provide a mechanism for ministerial accountability to Parliament: each minister is responsible for his or her reports and must respond to questions concerning their content and format.

The annual DPR is both an external and an internal accountability document. It enables the Agency to assess its performance in the previous fiscal year against the strategic outcomes, expected results and performance targets committed to in the RPP. The results of the Agency's programs are evaluated by measuring the performance indicators identified in the PMF. As such, if the performance indicators are not appropriate, or if the results presented in the annual DPR are not reflective of the data collected by the Agency, there is a risk that the DPR may become ineffective in supporting the CBSA's senior management in making decisions and ensuring accountability over Agency programs and activities.

The audit's objective was to provide assurance that the Agency has adequate processes and controls in place to ensure that performance indicators are appropriately selected and that performance results presented in the 2012-2013 DPR were reflective of the data collected by the Agency.

3.0 Statement of Conformance

The audit conforms to the Internal Auditing Standards for the Government of Canada, as supported by the results of the quality assurance and improvement program. The audit approach and methodology followed the International Standards for the Professional Practice of Internal Auditing as defined by the Institute of Internal Auditors and the Internal Auditing Standards for the Government of Canada, as required by the Treasury Board's Policy on Internal Audit.

4.0 Audit Opinion

The Agency had formal and informal processes in place to support the selection of performance indicators and the reporting of performance results in the CBSA's 2012-2013 DPR. However, the audit found gaps in the design of controls for both the selection and reporting processes. These gaps result in a medium-high risk exposure for the Agency.

5.0 Key Findings

The Agency does not have an enterprise-wide data governance framework in place that sets out the policies, roles, processes, standards, accountabilities, definitions, metrics and activities for managing its data assets in support of the DPR process.

The Agency had formal and informal processes in place to support the selection of performance indicators and reporting of performance results in the CBSA's 2012-2013 DPR. However, neither process was adequately documented to ensure that all stakeholders understood the requirements, roles and responsibilities, and timelines for the review and challenge of both the selected indicators for the PMF and the data produced for the 2012-2013 DPR. In addition, the Agency's head of the evaluation function was not consulted on the 2012-2013 PAA and PMF, as required by the Treasury Board's Policy on MRRS and Policy on Evaluation.

Broad roles, responsibilities and accountabilities of Programs, Operations and Corporate Affairs branches for Agency performance measurement and management have been defined and documented. There is, however, a need to provide further clarity and guidance to key stakeholders at a more detailed level to reduce the risk of confusion and duplication.

Of the 14 performance indicators, the Agency's 2012-2013 DPR did not contain performance results for two of them. Furthermore, the Report did not contain an indicator to measure the performance of one of its seven programs (i.e. Revenue and Trade Management). The audit also found that standardized documented procedures were not developed for the remaining 11 DPR performance indicators. The 2012-2013 DPR results were produced using informal, undocumented protocols to extract data and perform manual calculations. All DPR results were reviewed and challenged informally by the Operations, Programs, and Corporate Affairs branches.

Because data production procedures were not specifically developed for the DPR performance indicators, and because informal protocols were used to obtain the 2012-2013 DPR data, the audit team was unable to replicate the data production processes to reproduce the performance information in the 2012-2013 DPR. Consequently, the CBSA was not able to provide evidence that performance results presented in the 2012-2013 DPR were reflective of the data collected by the Agency.

The absence of an enterprise-wide data governance framework, clear processes for indicator selection and performance reporting, standardized data production procedures and substantiation of performance data exposes the Agency to the risk that the DPR results are insufficient or unsuitable for decision-making, priority-setting, accountability reporting and year-to-year comparability.

6.0 Summary of Recommendations

This audit makes the following three recommendations:

  • Develop and implement an enterprise data governance framework to support the consistent entry, extraction, and monitoring of data that meet the needs of the Agency's Departmental Performance Reports.
  • Define and document the roles, responsibilities and accountabilities of all key stakeholders in support of performance indicator selection and departmental performance reporting processes.
  • Coordinate the development of data production procedures by all data owners for all performance indicators included in the Performance Measurement Framework; and document the validation process for all performance results included in the Departmental Performance Report. 

7.0 Management Response

The Corporate Affairs Branch (CAB) agrees with the recommendations of this audit. CAB will work in collaboration with key stakeholders to strengthen guidance, understanding and controls on Report on Plans and Priorities (RPP) and Departmental Performance Report (DPR) governance and accountabilities, as well as business processes and procedures. The work to be undertaken by CAB aims at ensuring: 1) sound decision-making as it pertains to the selection of key performance indicators for the RPP and DPR; and 2) RPP and DPR process integrity. This work will be undertaken in consultation with the Programs and Operations branches. These elements, taken together, form the basis of a sound governance framework within which work on the RPP and DPR will be directed and managed over the next cycle. The governance and accountabilities will be developed to adhere to and complement the CBSA Information Management (IM) Policy.

Actions will include strengthening RPP and DPR governance by leveraging the existing MRRS Directors General Steering Committee, implementing the CBSA Performance Measurement Policy, and documenting and communicating RPP and DPR business processes, standard operating procedures and tools. These measures will be fully implemented by December 2014.

8.0 Audit Findings

8.1 Data Governance

Audit Criteria:

  • The Agency has data governance processes and controls in place to support consistent production of data that meet the reporting needs of the DPR.
  • The Agency has adequate monitoring controls in place for ensuring data integrity, and taking corrective measures as needed.

8.1.1 Data Governance Framework

According to the Data Governance Institute, "data governance is a system of decision rights and accountabilities for information-related processes, executed according to agreed-upon models which describe who can take what actions with what information, and when, under what circumstances, using what methods."Footnote 4

Similarly, the Gartner Research Group, an authoritative industry source, defines information governance as, "the specification of decision rights and an accountability framework to ensure appropriate behavior in the valuation, creation, storage, use, archiving and deletion of information. It includes the processes, roles and policies, standards and metrics that ensure the effective and efficient use of information in enabling an organization to achieve its goals."Footnote 5

A data governance framework should therefore encompass the policies, roles, processes, standards, accountabilities, definitions, metrics and activities required to create a consistent, enterprise-wide view of an organization's data to:

  • Promote information sharing
  • Improve quality, confidence and trust in the data used in decision-making
  • Make information accessible, understandable, and reusable
  • Reduce cost and duplication
  • Improve data security and privacyFootnote 6

In an organization such as the CBSA, which collects large amounts of data from multiple locations for input into numerous systems, a data governance framework is imperative to ensure that controls and processes support the consistent entry, extraction and monitoring of data that meet the needs of the Agency's Departmental Performance Reports.

The audit found that, recognizing the need for data governance, three Directorates of the Agency have been exploring different data governance initiatives. However, there is currently no enterprise-wide data governance framework in place to promote a common vision and the consistent management of the data collected across the Agency in support of the DPR.

In February 2013, the Corporate Governance and Accountability Directorate (CGAD) of the Corporate Affairs Branch launched a Performance Data Integrity Project intended to enhance the integrity of performance measurement data throughout the Agency by developing a data quality framework and measures (e.g. reviewing data sources and data collection methods, building data integrity checks into the data management cycle, and instituting data integrity evaluation procedures) that would standardize and streamline the Agency's approach to managing performance measurement data.
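
For illustration only, the following sketch shows the kind of automated data integrity check that such a data quality framework might standardize. The record fields, rules and fiscal-year bounds are hypothetical and are not drawn from the Agency's systems.

```python
# Hypothetical sketch of a standardized integrity check on performance
# measurement records. Field names and rules are illustrative only.
from dataclasses import dataclass
from datetime import date

@dataclass
class PassageRecord:
    record_id: str
    passage_date: date
    port_code: str
    trusted_trader: bool

def check_integrity(records: list[PassageRecord]) -> list[str]:
    """Return human-readable integrity issues found in an extract."""
    issues = []
    seen_ids = set()
    for r in records:
        # Completeness: every record needs an identifier and a port code.
        if not r.record_id or not r.port_code:
            issues.append(f"incomplete record: {r!r}")
            continue
        # Uniqueness: duplicate identifiers suggest double entry or extraction.
        if r.record_id in seen_ids:
            issues.append(f"duplicate record id: {r.record_id}")
        seen_ids.add(r.record_id)
        # Validity: dates must fall within the reporting period being
        # extracted (here, the 2012-2013 fiscal year).
        if not (date(2012, 4, 1) <= r.passage_date <= date(2013, 3, 31)):
            issues.append(f"record {r.record_id} outside reporting period")
    return issues
```

Checks of this kind, applied at entry and again at extraction, could give the review and challenge function an auditable record of data quality.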

In April 2013, the Enterprise Architecture/Information Management Directorate of the Information, Science and Technology Branch produced a draft strategy document entitled Enterprise Data Governance, which suggests that data should be managed as an asset.

In July 2013, the Strategic Risk and Modernization Directorate (SRMD) of the Programs Branch presented a Strategic Overview document to the Executive Committee, identifying data collection and access, data integrity and analytics, telling the performance story, and governance as key strategic challenges with respect to program performance reporting and analysis. A Program Performance Action Plan was drafted in November 2013 and presented to the Strategic Policy and Program Committee in December 2013. A Steering Committee and its supporting Working Group were created to oversee the development and implementation of the action plan.

In addition, the Agency has identified "Information Integrity" as one of its top 20 corporate risks that must be mitigated to an acceptable level. As per the CBSA Enterprise Risk Profile (ERP) 2013, the Agency defined numerous risks and grouped them under different internal vulnerabilities, including data quality, challenge function, information management, systems, and policies, standards and structure.

Although the above-mentioned initiatives are very important elements of a data governance framework, they were not undertaken in a collaborative way to promote the development of a consistent, enterprise-wide understanding of roles, processes, standards, accountabilities, definitions, metrics and activities related to data management in support of the DPR.

8.1.2 Data Quality Monitoring

Without a consistent understanding of data management requirements across the Agency, the CBSA does not have documented and agreed-upon controls for ensuring the quality and consistency of its DPR data. Consequently, key stakeholders do not monitor data quality for DPR in a structured and systematic way. 

Interviews indicated that data quality monitoring was integrated into various performance reporting processes on an informal basis, such as, but not limited to, the DPR and the Agency Performance Summary (APS). It was generally agreed that Operations Branch is responsible for entering data into the systems and, as such, is responsible for monitoring whether data were entered properly. All Program AreasFootnote 7 noted that they performed data trend and variance analysis on the data extracted from the systems through an informal review and challenge process. 
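
For illustration only, the sketch below shows a simple form of the trend and variance analysis described above: a new quarterly value is compared against prior quarters and flagged for review and challenge if it deviates sharply. The threshold and figures are hypothetical.

```python
# Hypothetical sketch of an informal trend/variance check on extracted
# performance data. The 20% threshold is illustrative only.
def flag_for_review(history: list[float], new_value: float,
                    threshold: float = 0.20) -> bool:
    """Return True if new_value deviates from the historical mean by more
    than the threshold, signalling a need for review and challenge."""
    if not history:
        return True  # no baseline yet: always review
    mean = sum(history) / len(history)
    if mean == 0:
        return new_value != 0
    return abs(new_value - mean) / abs(mean) > threshold

# Example: four quarterly values for an indicator, then a suspicious one.
quarters = [96.2, 95.8, 97.1, 96.5]
print(flag_for_review(quarters, 70.4))  # True: deviates well beyond 20%
```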

The audit found that roles, responsibilities and accountabilities vis-à-vis data ownership and stewardship were not clearly defined and documented. In addition, there is no data quality monitoring guideline to clarify processes, roles and responsibilities in support of consistent quality monitoring of the DPR data.  

Recommendation 1:

The Vice-President of the Corporate Affairs Branch should develop and implement an enterprise data governance framework to support the consistent entry, extraction, and monitoring of data that meet the needs of the Agency's Departmental Performance Reports.

Management Response
The Corporate Affairs Branch agrees with this recommendation. It will develop a governance framework to support the consistent entry, extraction, and monitoring of data that meet the needs of the Agency's Departmental Performance Reports.
Completion date: September 2014

8.2 Performance Data Selection and Reporting

Audit Criteria:

  • The Agency has adequate processes and controls in place to ensure that performance indicators and measures for DPR (aligned with RPP) are appropriately developed, selected and reviewed on a periodic basis and updated as required.
  • Roles, responsibilities, and accountabilities of key stakeholders have been clearly defined and communicated.
  • The Agency has appropriate review, challenge and approval processes in place to validate the quality of performance data reported in the DPR.

8.2.1 Foundation of Reporting Structure

The PAA is intended to be a structured inventory of the Agency's programs. The objective of the PAA is to arrange these programs in a hierarchical manner to depict the logical relationship between Agency programs and how each program's expected results contribute to achieving the Agency's strategic outcome and mandate. The PMF sets out expected results to be achieved by each program and identifies performance indicators to be used for assessing the outcomes against the expected results. In essence, it outlines the systematic and ongoing process of collecting and analyzing performance information used to report against the PAA. The PMF contains performance indicators both at the program level and below the program level. The PAA and PMF are submitted annually to Treasury Board for review and approval.
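
As a simple illustration of this hierarchy, the sketch below models how one PAA program and its PMF content relate. The representation itself is hypothetical; the program name and expected result are taken from this report, and the "..." placeholders stand for details not reproduced here.

```python
# Illustrative sketch (not an Agency artifact) of the PAA/PMF hierarchy:
# programs nest under the strategic outcome, and PMF indicators attach
# both at and below the program level.
paa = {
    "strategic_outcome": "...",  # the Agency's strategic outcome
    "programs": [
        {
            "name": "Revenue and Trade Management",  # one of seven programs
            "expected_result": ("Duties and taxes owed to the Government of "
                                "Canada are collected in accordance with "
                                "trade policies."),
            "program_level_indicators": ["..."],  # reported in the DPR
            "sub_programs": [
                {"name": "...", "indicators": ["..."]},  # lower-level PMF content
            ],
        },
        # ... the six other programs (see Footnote 8)
    ],
}
```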

The Agency uses the approved PAA and PMF as the foundation of its reporting structure: it plans in the annual RPP and, against that plan, reports the previous year's performance in the annual DPR. As per the 2012-2013 DPR, the Agency had seven programs, structured to support the achievement of its mandate and strategic outcome.Footnote 8 The CBSA reported on the performance of its seven programs through 14 program-level performance indicators, developed and selected as part of its 2012-2013 PAA and PMF.Footnote 9

The audit found that CGAD had documented some of the PAA and PMF process through its Critical Path. However, the Critical Path is a corporate-level document, and is not sufficiently granular to address the needs of all stakeholders. The process enables CBSA's Program Areas to choose indicators on an annual basis; however, there is no overarching document that gives direction and guidance on the process; roles, responsibilities and accountabilities; and timelines for indicator selection at all levels of the Agency. The audit noted confusion about key roles and responsibilities, expectations and timelines. 

The Agency also produces an internal quarterly Agency Performance Summary (APS) to measure approximately 60 performance indicators, below the program level, as part of the approved PAA/PMF. The APS is an executive summary of the program performance report and includes a combination of graphs, data and analysis. The audit observed that the Agency uses the same data for both the APS and the DPR, thereby introducing some efficiency into the process.

As the PAA and PMF are very important for the Agency's external (e.g. DPR) and internal (e.g. APS) accountability reporting, it is expected that the Agency's head of the evaluation function be consulted on the Performance Measurement Framework (PMF) embedded in the Agency's PAA, as required by the Treasury Board's Policy on MRRS and Policy on Evaluation.

The audit found that CGAD did not seek and obtain advice from the Agency's head of the evaluation function for the 2012-2013 PAA and PMF. During the audit, CGAD was informed of this requirement and obtained advice from the head of the evaluation function on its 2014-2015 PAA and PMF. CGAD must continue to seek and obtain advice from the evaluation function on all subsequent PAAs and PMFs.

8.2.2 Roles, Responsibilities and Accountabilities

The audit found that broad roles, responsibilities and accountabilitiesFootnote 10 of Programs, Operations and Corporate Affairs branches for Agency performance measurement and management have been defined and documented. There is, however, a need to provide further clarity and guidance to key stakeholders at a more detailed level to reduce the risk of confusion and duplication.

The Treasury Board of Canada Secretariat (TBS) initiates the DPR reporting process by issuing a call letter. Within the Agency, the DPR process is led by and coordinated through CGAD. In support of this process, CGAD has documented some key timelines and the approval process in a Critical Path document. Once the TBS call letter is received, CGAD requests that all Program Areas produce the performance data needed for DPR reporting. In the Programs Branch, the request is coordinated through SRMD.

Depending on the Program Area, the recipient of the call letter, the manner in which the information is disseminated, and the source of the data differ. As per interviews and walkthroughs, some Program Areas have the capacity to produce their own performance data for DPR reporting. Other Program Areas have no such capacity and, consequently, rely on the Performance Reporting Unit (PRU) of the Operations Branch to produce their performance data.

Based on the informal process in place, all Program Areas review, challenge and approve performance data for their respective programs before sending them to CGAD. In the Programs Branch, Program Areas are required to send performance data through SRMD, which reviews and challenges performance data received. Once satisfied, SRMD sends the performance data to CGAD for further review and challenge prior to inclusion in the DPR.

Overall, interviewees who were asked about the roles of CGAD and SRMD indicated confusion about the roles and responsibilities of each directorate and perceived a duplication of effort between the two. This duplication added an extra layer to both the performance indicator selection and DPR processes; consequently, Program Areas had less time to develop, extract, review, challenge and approve their performance indicators and data for DPR reporting. Although both Directorates have been mandated to provide horizontal coordination (CGAD at the enterprise level and SRMD at the Programs Branch level), a detailed description of their respective roles in producing the indicators in the DPR has not yet been developed or communicated.

Without clearly defined and documented roles, responsibilities, and accountabilities of every key stakeholder (such as, but not limited to, CGAD, SRMD, Program Areas, and PRU), the Agency is at risk of having inefficient and ineffective performance indicator selection and DPR reporting processes.

8.2.3 Review and Challenge of Performance Results

Appropriate review, challenge, and approval of the performance data included in the DPR are essential to support the reliability of the performance data reported. Clear policies, directives, and guidelines help to ensure that the quality of the DPR performance results is reviewed in a standardized manner.

As with the PAA and PMF process, the audit found that CGAD has documented some key challenge and approval processes through its Critical Path. However, the Critical Path is not sufficiently granular to address the needs of all stakeholders. Consequently, the Program Areas and PRU have developed their own informal processes to review, challenge and approve the DPR performance results.

The review and challenge process occurring in the Program Areas and the PRU was described and understood in the same way by interviewees and walkthrough participants. The process is based on trend analysis, subject matter expertise in the data, or actual double data extraction, but it is not formally defined or documented. In addition, there is no document that gives direction and guidance on the review and challenge process, timelines, and roles, responsibilities and accountabilities at all levels of the Agency. The audit also noted confusion about the roles of CGAD and SRMD with regard to review and challenge, and about the requirements for challenge and approval of performance results produced for the DPR. Without clear direction, the Agency is at risk of presenting inconsistent or inappropriate performance results in its DPR.

Recommendation 2:

The Vice-President of Corporate Affairs, in collaboration with Programs and Operations branches, should clearly define and document the roles, responsibilities and accountabilities of all key stakeholders in support of performance indicator selection and departmental performance reporting processes.

Management Response
The Corporate Affairs Branch agrees with this recommendation. It will address the audit finding concerning the need to further clarify the roles, responsibilities and accountabilities for Agency performance measurement and management by finalizing the draft CBSA Performance Measurement Policy. This policy clearly defines and documents the roles, responsibilities and accountabilities at all levels of the Agency and provides guidance on performance measurement to support the Agency's compliance with the Treasury Board's Policy on Management, Resources and Results Structures (MRRS) and other related policy instruments.
Completion date: September 2014

8.3 Integrity of Performance Data

Audit Criterion:

  • Controls and processes were in place to ensure performance results presented in the 2012-2013 DPR were reflective of the data collected by the Agency and substantiated with evidence-based information.

The Agency has seven programs structured to support the achievement of its mandate and strategic outcome. In the 2012-2013 DPR, the CBSA reported on the performance of these seven programs through 14 program-level performance indicators, developed and selected as part of its 2012-2013 PAA and PMF.Footnote 11

Of the 14 performance indicators, the Agency's 2012-2013 DPR did not contain performance results for two performance indicators: for the Risk Assessment program, the percentage of threats that led to a result; and for the Secure and Trusted Partnerships program, the percentage of trusted trader passages out of all passages. The DPR indicated that this was due to systems constraints.

Furthermore, the 2012-2013 DPR had no performance indicator in place to measure the performance of one of its seven programs, namely the Revenue and Trade Management program. The expected result of this program is that "duties and taxes owed to the Government of Canada are collected in accordance with trade policies."Footnote 12 Instead of reporting on compliance with this expectation, the Agency reported the total duties and taxes collected in 2012-2013: $25.6 billion.

The Agency measured and reported on the remaining 11 performance indicators in the 2012-2013 DPR. It was expected that the data would be produced using standardized, documented procedures. The audit found that documented procedures were not developed for these 11 performance indicators. The 2012-2013 DPR results were produced using informal, undocumented protocols to extract data and perform manual calculations. All DPR results were reviewed and challenged informally by the Program Areas, the PRU, SRMD and CGAD.
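
For illustration only, a standardized data production procedure for a percentage-type indicator could be as simple as a scripted, rerunnable calculation like the sketch below, in place of one-off extraction and manual arithmetic. The input layout and column name are assumptions, not the Agency's actual protocol.

```python
# Hypothetical sketch of a documented, repeatable data production step for
# a percentage-type DPR indicator. The CSV extract layout is illustrative.
import csv

def indicator_percentage(csv_path: str, flag_column: str = "met_target") -> float:
    """Compute the percentage of extracted records whose flag column is 'yes'."""
    total = positive = 0
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            total += 1
            if row[flag_column].strip().lower() == "yes":
                positive += 1
    if total == 0:
        raise ValueError("no records found in extract")
    return 100.0 * positive / total

# Recording the source system, extraction date, query and script version
# alongside the result would let a reviewer re-run the calculation and
# reproduce the figure reported in the DPR.
```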

Interviewees consistently stated that the data produced for the DPR were the best available to the Agency, given its systems constraints. However, due to the lack of data production procedures for the 2012-2013 DPR and the use of informal protocols, the audit team was unable to replicate the data production processes to reproduce the performance results in the 2012-2013 DPR.

Finally, the Agency did not substantiate the performance results produced for the 2012-2013 DPR. Without appropriate substantiation, it could not be demonstrated that the performance information presented in the 2012-2013 DPR was appropriate and reflective of what was captured in the Agency's systems.

The absence of standardized and documented data production procedures and substantiated DPR results exposes the Agency to the risk that the DPR results are insufficient or inappropriate for decision-making, priority-setting, accountability reporting and year-to-year comparability.

Recommendation 3:

The Vice-President of the Corporate Affairs Branch should:

  • Coordinate the development of data production procedures by all data owners for all performance indicators included in the Performance Measurement Framework; and
  • Document the validation process for all performance results included in the Departmental Performance Report.

Management Response
The Corporate Affairs Branch agrees with this recommendation. It will address the audit finding on the absence of documented, standardized data collection, extraction, analysis and reporting of the performance measurement information used to feed the RPP and DPR by establishing an Agency-wide, director-level working group that will oversee the development of standard operating procedures (SOPs) and appropriate tools.
Completion date: October 2014

Appendix A – About the Audit

Audit Objectives and Scope

The objective of this audit was to provide assurance that the Agency has adequate processes and controls in place to ensure that performance indicators are appropriately selected and that performance results presented in CBSA's 2012-2013 Departmental Performance Report are reflective of the data collected by the Agency.

The audit examined the processes and controls that the Agency has in place to ensure that performance indicators are appropriately selected and that performance results presented in the 2012-2013 DPR were reflective of the data collected by the Agency.

The audit did not provide assurance on the following:

  • the relevance of performance indicators presented in the 2012-2013 DPR (i.e. whether the "right" things are being measured);
  • whether the Agency's performance management framework is properly implemented; and
  • the financial information presented in the 2012-2013 DPR, published as part of the Public Accounts.

Risk Assessment

The risk assessment conducted during the planning phase identified a key risk: if the Agency's processes and controls are not functioning as intended to ensure the quality of the performance data presented in the DPR, the Agency may be unable to demonstrate, in a meaningful and reliable manner, the achievement of its mandate to Parliament and the public.

Approach and Methodology

The examination phase of this audit was performed using the following approach:

  • Review legislation, policies, procedures, guidelines, reports, performance information, and other relevant documentation;
  • Review the process for establishing and selecting the performance measures for accountability reporting;
  • Assess controls and quality assurance processes to ensure that quality, appropriate, and standardized information is provided in public accountability reporting;
  • Examine standard operating procedures in place for consistent data production for DPR reporting;
  • Select performance data presented in the 2012-2013 DPR for detailed walkthroughs; and
  • Conduct in-depth interviews with various stakeholders within the Corporate Affairs, Programs, and Operations branches.

Audit Criteria

Given the preliminary findings from the planning phase, the following criteria were chosen:

Lines of Enquiry and Audit Criteria
1. Data Governance
  • 1.1 The Agency has data governance processes and controls in place to support consistent production of data that meet the reporting needs of the DPR.
  • 1.2 The Agency has adequate monitoring controls in place for ensuring data integrity and taking corrective measures as needed.
2. Performance Data Selection and Reporting
  • 2.1 The Agency has adequate processes and controls in place to ensure that performance indicators and measures for the DPR (aligned with the RPP) are appropriately developed, selected and reviewed on a periodic basis and updated as required.
  • 2.2 Roles, responsibilities, and accountabilities of key stakeholders have been clearly defined and communicated.
  • 2.3 The Agency has appropriate review, challenge and approval processes in place to validate the quality of performance data reported in the DPR.
3. Integrity of Performance Data
  • 3.1 Controls and processes were in place to ensure performance results presented in the 2012-2013 DPR were reflective of the data collected by the Agency and substantiated with evidence-based information.

Appendix B – List of Acronyms

Agency or CBSA
Canada Border Services Agency
APS
Agency Performance Summary
CGAD
Corporate Governance and Accountability Directorate (Corporate Affairs Branch)
DPR
Departmental Performance Report
ERP
Enterprise Risk Profile
MRRS
Management, Resources and Results Structures
PAA
Program Alignment Architecture
PMF
Performance Measurement Framework
PRU
Performance Reporting Unit (Operations Branch)
RPP
Report on Plans and Priorities
SRMD
Strategic Risk and Modernization Directorate (Programs Branch)
TBS
Treasury Board of Canada Secretariat

Footnotes

Footnote 1

TBS (2013) Expenditure Management, RPP, http://www.tbs-sct.gc.ca/rpp/index-eng.asp

Footnote 2

TBS (2013) Expenditure Management, DPR, http://www.tbs-sct.gc.ca/dpr-rmr/index-eng.asp

Footnote 3

TBS, Guide to the preparation of Part III of the 2012–13 Estimates: 2012–13 Departmental Performance Report, Introduction, 3rd paragraph, Page 1.

Footnote 4

Data Governance Institute, http://www.datagovernance.com/adg_data_governance_definition.html

Footnote 5

Gartner Research, IT Glossary, http://www.gartner.com/it-glossary/information-governance

Footnote 6

CBSA (2013) Draft Enterprise Data Governance: Strategy Document

Footnote 7

Program Areas refers to the Directorates having the lead roles, responsibilities and accountabilities for designing, planning, monitoring and reporting on CBSA programs. There are four Program Areas within the Programs Branch: (1) the Intelligence and Enforcement Directorate, (2) the Pre-Border Programs Directorate, (3) the Border Programs Directorate, and (4) the Post-Border Programs Directorate. For the purposes of this report, Program Areas also refers to the Recourse Directorate of the Corporate Affairs Branch.

Footnote 8

CBSA's seven programs are (1) Risk Assessment, (2) Secure and Trusted Partnerships, (3) Admissibility Determination, (4) Criminal Investigation, (5) Immigration Enforcement, (6) Recourse, and (7) Revenue and Trade Management. These programs, combined with their respective sub-programs, form the CBSA's Program Alignment Architecture (PAA) (for more information on the PAA, please refer to the CBSA's DPR 2012-2013).

Footnote 9

The Agency has 14 performance indicators at the program level in place to measure and report performance in the 2012-2013 DPR. The Agency has numerous other performance indicators, below the program level, in place to measure the performance of its sub- and sub-sub-programs. Taken together, these performance indicators form the Agency's PMF.

Footnote 10

Responsibility is defined as the duty to perform, and accountability as the obligation to report on the fulfillment of the responsibility (source: OCG, 2011, Audit Criteria related to the Management Accountability Framework)

Footnote 11

The Agency has 14 performance indicators at the program level in place to measure and report performance in the 2012-2013 DPR. The Agency has numerous other performance indicators, below the program level, in place to measure the performance of its sub- and sub-sub-programs. Taken together, these performance indicators form the Agency's PMF.

Footnote 12

CBSA, 2012-2013 Departmental Performance Report
