
Insights on Effectiveness

Aligning the Bank’s Work to the Needs of its Partners

Country strategies are the roadmaps that guide the IDB’s engagement with each borrowing member country, setting out the strategic objectives and expected outcomes that will guide the Bank’s operational support, as mutually agreed between the borrowing member country and the IDB. They are typically designed for a period of four to six years, depending on the country’s political cycle. Whenever a new national government is elected, a comprehensive dialogue with the new authorities, the private sector, civil society, and academia is launched and sustained. The process of preparing a country strategy starts with a diagnostic of the most binding constraints to development and growth, which informs the dialogue with the incoming government and the potential areas to be prioritized in the country strategy, as agreed with the parties involved. This process also contributes to stronger coordination between the IDB and the borrowing member country, as well as with other bilateral and multilateral organizations, which in turn allows for the systematic identification of synergies and helps avoid duplication of efforts. From 2012 to 2015, the Board of Directors approved 22 country strategies.

Figure D1
Operational Effectiveness and Efficiency Indicators: Country Strategy Effectiveness

Strategies are carefully tailored to help countries address the development challenges they face in the areas where the IDB can provide the greatest added value, taking into consideration the government’s priorities. As shown in Figure D1, all targets regarding the effectiveness of country strategies were surpassed, both with respect to the evaluability dimensions established in the Country Strategy Development Effectiveness Framework and the other aspects that are validated at the end of the country strategy design phase, before its submission to the Board.

Country strategies also support accountability to stakeholders in member countries (both borrowing and non-borrowing) and strengthen the vertical logic of a good number of IDB-supported interventions. From 2012 to 2015, 85 to 90 percent of sovereign guaranteed loan operations were aligned with at least one strategic objective identified in the corresponding country strategy.

Development Effectiveness Matrix (DEM) Highlights

As with country strategies, all projects supported by the IDB are assessed before approval for their potential to be evaluated for development results once completed.

As part of a number of process improvements arising from IDB-9, the Board of Executive Directors mandated in 2011 that all projects reach a minimum score of five on the Development Effectiveness Matrix (DEM) before being submitted for approval (see Chapter 2 of the 2014 DEO to learn more about the DEM). This means that beginning in 2011, all projects had to meet this satisfactory evaluability score before Board consideration; hence, the targets for indicators 4.2.1 and 4.2.5, regarding the evaluability of sovereign guaranteed (SG) and non-sovereign guaranteed (NSG) loans respectively, have been fully met.

As Figure A.3, “Development Effectiveness Matrix Scores by Evaluability Assessment Category,” in Appendix A shows, average DEM scores for both the Bank’s sovereign guaranteed and non-sovereign guaranteed loan portfolios remained consistently high over the 2012–15 period. Higher evaluability standards are generating a growing number of project evaluations, which in turn are helping to close knowledge gaps in the Region. On average, 44 percent of approved projects included an impact evaluation plan at approval during the 2012–15 period (see Figure 3.1 in Chapter 3), and an average of 46 sovereign guaranteed projects had an impact evaluation over the same period.

The Bank’s refinement of the development effectiveness toolkit for non-sovereign guaranteed loan operations allowed for an increase in the number of impact evaluations for these projects as well. Furthermore, an NSG evaluability checklist developed in 2014 introduced the question of whether evaluation results and/or relevant lessons learned from similar projects were reflected in new loan proposals. This has provided an incentive for project teams to refer to the self-evaluation results of similar past projects, and consider the lessons they present.

Figure D2
Operational Effectiveness and Efficiency Indicators: Loan Effectiveness

Progress Monitoring Report (PMR) Highlights

The Progress Monitoring Report (PMR) is used to rate the execution of the Bank’s sovereign guaranteed loan operations. Once a project is approved and starts to be implemented, it is critical to monitor its progress to ensure that the project remains on track with respect to its objectives. The Bank’s performance classification as of the end of 2015 showed that 69 percent of projects were in satisfactory status, 14 percent were in alert status, and 16 percent were in problem status.10

Analysis of the individual factors that influence the classification of projects yields a number of insights. First, delays between approval and the start of implementation occur mostly before a project becomes legally effective, rather than after that point, when operations become eligible for disbursements.

Second, once projects are being implemented, the most common problem relates to delays in delivering outputs as originally planned. The main issues affecting the overall management of projects are changes in administration and/or national priorities, and capacity constraints in implementation.

Third, the percentage of operations in satisfactory status in 2015 (69 percent) was the same as in 2014. However, the figure marks a decline from the 75 percent of operations classified as satisfactory in 2013, the first year in which operations were evaluated according to the revised PMR methodology. Although the difference can be attributed in part to actual performance factors, it also reflects measurement effects: because older operations in the portfolio had less data available than newer ones, the new methodology classified them using only three performance indicators, while newer operations, with more data available, are classified using five performance indicators.

Since its conceptualization, the PMR and the system that supports it have undergone several enhancements (see Box 4.2). Methodologically, the PMR has become a more rigorous tool with the inclusion of quantitative variables specific to each stage of an operation’s life cycle. When combined with qualitative analysis, these quantitative variables help provide a more comprehensive assessment of a project’s performance.

The current review and validation process of the PMR incorporates all relevant parties in a project’s assessment, including specialists, team leaders, division chiefs, chiefs of operations, and country representatives. This makes the PMR a powerful decision-making tool for project teams and management alike.

Project Completion Report (PCR) Highlights

The performance of SG loan projects upon completion can also be assessed through the project completion report (PCR), another important component of the Bank’s Development Effectiveness Framework (DEF). Historically, an average of 80 PCRs were produced each year, assessing how effective IDB-supported projects were in delivering their expected outputs and reaching their development goals. The 2012–15 period has seen a sustained improvement in the percentage of projects with a satisfactory rating on development results upon completion, which at 88.7 percent has exceeded the target of 60 percent set for 2015 (Indicator 4.2.4 in Figure D2).

The methodology to prepare PCRs was modified in 2014 to make it more evidence-based, objective, and analytical, and more in line with efforts to further improve monitoring and evaluation through the DEF (see Chapter 4 of the 2014 DEO for details about the new PCR methodology). As shown in Figure 4.1, only seven PCRs were completed under the new methodology in 2015. This decrease relative to previous years is explained by the fact that, under the Bank’s PCR guidelines, preparation of the PCR begins when a project’s disbursements reach 95 percent; this led project teams to request extensions in order to start preparing PCRs only once disbursements reached 100 percent.

Although its application has been slower than expected, the results coming out of the new PCRs appear promising. The quality and depth of analysis in these completion reports have improved noticeably. PCRs written under the new methodology provide more robust evidence of the IDB’s contributions to development results, as well as findings and recommendations drawn from project design and implementation. Figure 4.1 provides the list of 2015 PCRs prepared under the new methodology.

Figure 4.1
Project Completion Reports (PCRs) Approved

Self-evaluation of IDB-supported private sector operations has been in place since 2005, following the Evaluation Cooperation Group’s Good Practice Standards (ECG-GPS) for these types of projects. Since then, completion reports for 64 projects have been prepared based on the Expanded Project Supervision Report (XPSR).

The Bank has worked closely with the Office of Evaluation and Oversight (OVE) to redesign the self-evaluation guidelines for NSG projects, adopting a higher standard and allowing for greater harmonization between the NSG and SG methodologies. The new evaluation framework has been applied to the latest evaluation exercise on a pilot basis and is expected to be operational by mid-2016.

Effective Safeguards: A Vehicle to Ensure the Sustainability of IDB Operations

Thanks to the Bank’s enhanced focus on sustainability and on project supervision, 89 percent of SG projects and 91 percent of NSG projects identified at approval as having high environmental and social risks were rated satisfactory in the implementation of mitigation measures during execution, exceeding the corresponding CRF target of 85 percent by 2015 (Indicators 4.2.2 and 4.2.6 in Figure D2).

This means that projects that were originally deemed highly risky from an environmental or social standpoint have benefited from the Bank’s thorough application of safeguards, which have often led to improved sustainability outcomes for the affected populations and/or ecosystems (see Box 4.2).

As part of a new three-year strategic plan for environmental and social safeguards, the Bank will ensure that medium-risk projects are also rated on their implementation of mitigation measures.

Development Results through Technical Cooperation

From 2012 to 2015, significant improvements were made in the evaluability of the Bank’s Technical Cooperation (TC) operations, especially after the 2011 TC Operational Guidelines made the inclusion of results matrices mandatory in the design of TC operations.

In 2015, the percentage of TC operations with results that can be validated reached 91 percent; that figure rises to 96 percent when considering only those TC operations approved after the 2011 guidelines took effect.

Improvements in design have also led to better results: 73 percent of TC operations completed in 2015 achieved satisfactory results, well above the established target of 65 percent.

However, reporting on TC results has been a somewhat cumbersome process because the Bank lacks a systematized and automated TC monitoring and reporting platform. The absence of such a platform has made it challenging to reach robust and comprehensive conclusions about the effectiveness of the 1,777 TC operations approved during 2012–2015.

To address this, the IDB has completed the development of its first automated Technical Cooperation Monitoring and Reporting System (TCM), which is expected to be fully operational in the second half of 2016. The TCM was originally designed as an independent, stand-alone module. However, once the Bank decided to begin integrating all IDB systems into one platform (known as “Convergence”), the TCM’s original design had to be slightly modified. A more comprehensive system is now in place, through which the main stages of the technical cooperation operation life cycle (identification, preparation, approval, and execution) have been integrated more efficiently for users’ benefit.

Figure D3
Operational Effectiveness and Efficiency Indicators: Technical Cooperation