Circular A-129 Appendix D

Appendix D: Effective Reporting for Data-Driven Decision Making

Introduction

Effective data reporting as required under Section I.D.9 of this Circular is necessary to support strong credit program management and oversight. Effective reporting also provides parties at all levels of the organization with accurate, timely information on program performance, early warning of issues that may arise, and analytics to drive decision making. Agencies should coordinate reporting parameters with OMB to make sure the unique needs of each program are addressed. This includes the parameters for the quarterly summary reports of program performance that agencies are required to provide to senior level officials and OMB per Section III.B.2.a of this Circular.

The nature of a program’s reporting will depend on its specific policy goals, key risks and cost drivers, operational structure, and other program-specific factors. However, there are common objectives that reporting structures must meet. This appendix provides an overview of key objectives and considerations for effective data reporting, descriptions of common types of reports, and some examples of program reports.

Key Objectives and Considerations for Effective Reporting

Reporting systems should convey necessary information to the appropriate people in a timely fashion to make sure that decision makers and parties at all levels of the organization understand program performance and other information needed to proactively manage the program. Reporting should facilitate the flow of information and discussion across the organization, and the escalation of emerging issues to the appropriate level, including senior officials within the program agency or other Federal stakeholders. While reporting should be tailored to programs’ particular circumstances, it should provide concise, standardized reporting of key information, trends, and findings, including material areas of divergence between projected and actual performance. Reporting systems should also have sufficient flexibility to adapt as necessary to new developments and emerging issues.

While effective program reporting will vary across agencies and programs, it should cover all aspects of program performance. Key objectives of effective reporting include but are not limited to:

  • Targeted Reporting. Reports should be tailored for the audience by including data that is relevant to the authorities and responsibilities of that audience. Reports for program staff should generally be more frequent and detailed than reports to senior leadership or other Federal stakeholders.
  • Findings and Proposed Actions. Strong reporting clearly conveys the main message of the data—whether program performance is in accordance with expectations and where there are emerging issues. Graphics, tables, and trend analysis that compare performance over time and against expectations, along with other contextual information, can provide critical context for understanding program performance. Where appropriate, the reporting may also include brief explanations of significant findings that offer insight into actions that could be taken to improve program performance.
  • Policy Goals. Reporting should cover performance indicators, typically outcome-based metrics that track the program’s progress toward achieving policy goals. For example, in a loan guarantee program that seeks to address a market failure due to lack of private market experience, the performance indicator reporting may include tracking borrower graduation to fully-private financing.
  • Portfolio Performance Risks. Agencies should identify programmatic and financial risks, and report information the agency needs to manage risks. This may include reporting on portfolio concentrations in geographic or technology areas, market risks that can affect credit performance, or other external factors that can affect the program such as market shifts. Reporting may include scoring or rating systems for agencies to segment and evaluate program performance. For example, an agency may develop a credit scoring approach for portfolio loan programs to identify portfolio segments at higher risk of default to target loss mitigation actions, or segments where a lower degree of subsidy may be sufficient to achieve policy goals. Similarly, for an infrastructure loan program a risk rating system can be used to evaluate the overall riskiness of a portfolio across diverse transactions, and individual deals in the context of the program’s established policy goals and risk thresholds.
  • Administrative Risks. Reporting should cover administrative risks specific to the program, including operational risks. This may include trend reporting on costs of origination, servicing, and managing the portfolio, or reporting on any operational interruptions. It may also include the status of key contracts (such as servicing or collection) to make sure that they do not expire.
  • Special Reporting. Where a certain function, loan, or loan type that merits greater management attention is not covered by existing reports, a program may need to develop new reports to make sure program staff and leadership are appropriately informed.
Common Types of Reports

A reporting system will generally be made up of a variety of types of reports. The following are common types of reports:

  • High-level Dashboards. Such a dashboard should use quantitative and qualitative information to summarize performance in meeting policy goals and address key risks. The dashboard should generally be no more than a few pages and should be tailored for the program’s characteristics. Key statistics should include relevant information on program activity, current performance trends, and forward-looking indicators of risk. In addition, it should include a high-level qualitative discussion noting areas that merit increased management focus. For example, a program that provides a small number of large loans might have a dashboard that provides metrics on the performance of the existing loans, a list of loans that are at heightened risk of financial distress, a summary of the status of current applications, and metrics on external factors such as particular market trends that may affect the portfolio’s health. Section III.B.2.a of this Circular requires programs to provide high-level summary performance data to agency leadership and OMB, in a dashboard or similar form. The dashboard should be distributed at least quarterly, but more frequent reporting may be required.
  • More Detailed Dashboards. Program staff will likely need more frequent and detailed reporting to effectively manage their areas of responsibility. The frequency, content, and nature of these reports will depend on the particular program and the responsibilities of the report’s recipient, though reports should be automated and generated using the same data that feeds into high-level dashboards. For example, a servicing manager might have a weekly report with payment processing, loan status, or other statistics that affect the servicing operations, whereas a budget analyst may receive quarterly aggregated information necessary to better project future costs.
  • Pipeline Reports. Programs originating new loans may wish to develop reports providing key information on open applications, including characteristics, quantity and timing. For example, a credit program that provides small business financing may have a daily report tracking current year originations against available appropriations authority and against historical averages. An infrastructure program may have a pipeline report that tracks the requested loan amount, project type, application status, and other characteristics for each application, against program policy targets.
  • Watch Lists. Programs may maintain watch lists identifying loans or borrowers at heightened risk of default or other financial distress, so that monitoring and loss mitigation efforts can be focused where they are most needed.
  • Program Operations Reports. Programs may develop reports on their operations that could include metrics on customer service, employee productivity, and administrative spending. In addition, programs that are being established or are facing significant operational changes may wish to report on the progress towards the implementation plan milestones and risks. For example, a program that provides many small loans and is focused on customer service might have a monthly dashboard for program management that provides information on call center performance (such as a breakdown of call time, dropped calls, reasons that individuals call, and accuracy of information provided), community outreach (such as number of people attending sessions providing loan introduction, or communications that result in successful applications), and overall applicant satisfaction (such as customer service scores, or complaint resolution metrics).
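The watch list and risk-segmentation ideas described above can be sketched in code. The following Python sketch is illustrative only: the `Loan` fields, the 1–5 rating scale, and the flagging thresholds are invented for this example and do not reflect any particular agency’s rating system.

```python
from dataclasses import dataclass

# Hypothetical loan record; fields and scales are illustrative only.
@dataclass
class Loan:
    loan_id: str
    balance: float        # outstanding principal, in dollars
    risk_rating: int      # 1 (lowest risk) through 5 (highest risk)
    days_delinquent: int

def build_watch_list(loans, min_rating=4, min_days_delinquent=60):
    """Return loans meriting heightened monitoring, riskiest first."""
    flagged = [
        ln for ln in loans
        if ln.risk_rating >= min_rating
        or ln.days_delinquent >= min_days_delinquent
    ]
    # Sort so the highest-rated (riskiest), most delinquent loans lead.
    return sorted(flagged, key=lambda ln: (-ln.risk_rating, -ln.days_delinquent))

portfolio = [
    Loan("A-101", 2_500_000, 2, 0),
    Loan("A-102", 4_000_000, 5, 90),
    Loan("A-103", 1_200_000, 4, 0),
]
watch = build_watch_list(portfolio)
```

A report built this way can feed both a detailed staff-level listing and the high-level dashboard’s count of loans at heightened risk, keeping the two consistent because they draw on the same underlying data.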

Examples of Reports

Although reports will vary based on program characteristics, each report should be designed for a clear purpose. In designing specific reports, agencies should be guided by the answers to the following questions.

  • Who is the target audience, and why do they need this report?
  • What information does the target audience need to effectively identify issues, provide input, or take action?
  • How can the information be best conveyed? How detailed should the information be? Should the information be presented quantitatively and/or qualitatively? Are historical trends needed?
  • When should the relevant information be reported to the target audience so that they can take appropriate action?
  • How often does the target audience need the report?

The Attachment to this Appendix includes three example reports for a hypothetical infrastructure loan program: a high-level dashboard; a pipeline report; and a watch list. Each report is targeted towards senior leadership and other high-level decision makers. The program has been operating since fiscal year 2006 and mainly provides financing for new construction and renovation projects. A central policy objective of the program is to support economic development through financing projects in underserved communities such as high-poverty or rural areas, where borrowers have difficulty obtaining financing on reasonable terms. Awards are made through a competitive process.
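As a minimal illustration of the pipeline report such a program might produce, the following Python sketch aggregates open applications against remaining budget authority. The field names, status values, and dollar amounts are invented for this example.

```python
def pipeline_summary(applications, budget_authority):
    """Summarize open applications against remaining budget authority.

    `applications` is a list of dicts with an 'amount' (requested
    dollars) and a 'status' (e.g., 'received', 'under_review',
    'approved'); these fields are hypothetical.
    """
    total_requested = sum(a["amount"] for a in applications)
    by_status = {}
    for a in applications:
        by_status[a["status"]] = by_status.get(a["status"], 0) + 1
    return {
        "applications": len(applications),
        "total_requested": total_requested,
        "remaining_authority": budget_authority - total_requested,
        "by_status": by_status,
    }

apps = [
    {"amount": 10_000_000, "status": "under_review"},
    {"amount": 25_000_000, "status": "received"},
    {"amount": 5_000_000, "status": "approved"},
]
report = pipeline_summary(apps, budget_authority=100_000_000)
```

Tracking requested amounts against available authority in this way supports the kind of daily or weekly pipeline monitoring the Appendix describes, and the same aggregates can roll up into the quarterly high-level dashboard.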
