Despite recent internal reforms, new evidence raises questions about Bank quality control.
A recent evaluation by the Inter-American Development Bank's Office of Evaluation and Oversight (OVE) reviewed all 160 projects approved by the Bank in 2009 to assess management's claim in the 2010 Development Effectiveness Overview (DEO) that project design quality has improved in recent years.
In direct contrast to IDB management claims, the OVE evaluation found that the actual evaluability of projects has deteriorated markedly when measured with a consistent methodology applied to three cohorts of projects: those of 2001, 2005 and 2009. For 2009, OVE found that 88% of approved projects were not rated adequate on any evaluability dimension.
The internal dispute set off by the OVE evaluation, over the extent of persistently poor project quality and how best to remediate it, dragged on for months before the Bank disclosed the long-delayed report in early February.
To read the full OVE report in English, see Evaluability Review of Bank Projects 2009 (RE-379).
Accountability for Capital Increase Mandate to Strengthen Effectiveness
The motivation for emphasizing project quality was the very low levels of evaluability that OVE observed in prior evaluations of IDB projects, which in turn have prevented the IDB from demonstrating evidence of positive, sustainable development results.
The IDB strives to uphold high standards of quality while meeting the diverse and often unpredictable demands of its clients in the region. Evaluability is one of the bedrock guarantees of knowing what effects the Bank’s loans and technical assistance are having and will have in the future. Without adequate evaluability, the IDB will have less assurance of delivering results – another persistent and well-documented challenge for the Bank.
Evaluability typically involves whether a proposed operation is designed with:
- a good identification and diagnosis of the problem to be solved
- a good explanation of why the proposed operation is the best solution to the problem
- adequate treatment of the associated assumptions and risks
- clear definition of project objectives
- a robust monitoring and evaluation framework that provides good output and outcome indicators, baselines and methods
On the eve of the capital increase decision in March 2010, IDB management reported a significant improvement in evaluability for the 2009 project portfolio, based on the results of its own measure of evaluability, the Development Effectiveness Matrix (DEM).
As part of the IDB-9 capital increase negotiations, Bank Governors issued clear and detailed instructions which place evaluability assessments at the heart of the project review process:
Governors endorse a further strengthening of the Operations Policy Committee (OPC) and the programming process, by the President of the Bank and Senior Management, to ensure that projects meet minimum evaluability thresholds. In this respect, Management will amend operational procedures, by end of Q3 of 2010, according to the following criteria: (i) all SG and NSG projects must be rated for evaluability; (ii) the evaluability score includes only the dimensions of evaluability of the DEM; (iii) SPD will support teams in meeting evaluability standards from project profile to project proposal, and will validate the final evaluability score for OPC consideration; RES will review the existing methodologies for scoring evaluability to determine any required improvements; OVE will report annually to the Board of Executive Directors on project evaluability (ex-ante), as well as validate achieved results in completed projects (ex-post); (iv) a minimum evaluability threshold of 5 will be required for all operations to be submitted to the Board of Executive Directors. (GCI-9 Report on the 9th Capital Increase, May 2010)
A DEM score of 5 came to be defined as the minimum threshold for adequate project evaluability, a threshold subsequently incorporated into the GCI-9 agreement. The 2010 DEO suggested a dramatic improvement in project evaluability, using the DEM as the basis for the claim that 75% of 2009 projects had scores of 5 (partially satisfactory) or higher, compared to 26% in 2008.
“Overall, DEM scores improved between 2008 and 2009. On a scale of 1 to 10, 10 being the highest, the simple average of the seven dimensions measured was 4.05 while the median was 3.76 in 2008; in 2009, the mean was 5.53 and the median 5.61. In terms of core DEM components, there were significant improvements in program logic from 5.39 to 6.74, in monitoring and evaluation from 4.00 to 5.00 and in ex-ante economic analysis from 1.99 to 3.96. The ratings also improved in country strategy development objectives. In sum, although values are low and need to be improved, there was a measurable increase in 2009.” (DEO, 2010: 32)
IDB management drove home the perception of improved project quality with the proposed Global Capital Increase institutional strategy results framework, which indicated a dramatic leap in quality at entry of Bank projects (measured by the percentage of projects scoring above 5 on the DEM's 10-point scale) from 26% in 2008 to 76% in 2009.
This reported improvement in project design quality was intended to underscore the Bank’s commitment to delivering and reporting results, which has been a systematic problem for the IDB.
New Evaluation Raises Questions About IDB Capacity to Ensure Project Quality
The new Bank evaluation, “Evaluability Review of Bank Projects 2009,” examines management's claims of quality improvements. The review, prepared by OVE, an independent evaluation office, is part of a periodic tracking of the IDB systems intended to guarantee high-quality operations and, in turn, a higher probability of knowing the results of Bank investments. Evaluability refers to the ability of an intervention to demonstrate in measurable terms the results it intends to deliver. OVE has tracked IDB project evaluability by reviewing each approved IDB project in 2001, 2005 and in the record lending year of 2009 (160 projects for nearly $16 billion).
In 2005, OVE found significant problems with the quality of Bank operations in terms of weak to non-existent evaluability. In part motivated by problems observed by OVE in the last Evaluability report (2005), IDB President Luis Alberto Moreno ushered in a profound organizational realignment, emphasizing a commitment to results and project quality as a guiding principle. The Realignment Document cited by OVE, although not publicly disclosed, validated the perceptions of poor project quality:
“The analysis of current practices has identified a pervasive confusion between the roles of quality control and quality enhancement as a key problem in the Bank’s present structure and procedures. […] This “cohabitation” of quality control and operational support responsibilities generates a cascade of undesirable consequences: accountability of the operational units for project quality is diluted; the independence of the quality control unit is compromised by their subordination to the pressures of project approval and disbursement; the process of reaching a consensus between operational and safeguards units is very time consuming and focuses on changing document texts rather than on risk management on the field.” (IDB Realignment Documents, 2006, GA-230, §5.6 and §5.7, GA-232)
Project quality standards were promised in the 2006 Bank realignment. Furthermore, Bank management and Governors agreed to a new institutional results framework that set a target of 85% of projects with satisfactory evaluability by 2015, starting from a low baseline of 27% (GCI-9, Annex A).
Yet, despite the creation of a special new department (the Office of Strategic Planning and Development Effectiveness, SPDE) and new methodologies (the DEM, a project quality screening tool), OVE found that in 2009 project evaluability at the IDB had deteriorated even further from the already low 2005 levels.
OVE argues that the SPDE findings of high evaluability reported in the DEO are strongly at odds with OVE's own evaluability assessments. Figure 4.2 of the report illustrates the divergence between OVE and SPD assessments of a critical dimension of project quality: problem diagnosis and project logic. OVE scores of 1 and 2 represent highly unsatisfactory and unsatisfactory evaluability in this area, while scores of 1, 2 and 3 represent the same for SPD. The results are mirror images of each other.
For OVE, the underlying issue is whether project teams recognize the need to specify the problem to be solved or to explain why the project represents the best option for solving it. While SPD endorses weak justifications for many projects (according to OVE), this may also reflect the lack of space to actually change predetermined projects identified by clients, regardless of the IDB’s knowledge about the sector or country.
“The large amount of lending in 2009, coupled with the evaluability deficiencies documented above, creates a portfolio which will be difficult to evaluate in the future. Projects for which the relationship between problems and solutions are not clear, projects with poor problem diagnostics, and projects with inadequate sets of indicators and monitoring frameworks will limit what the Bank can learn regarding development effectiveness from its main asset: its approved portfolio.” (10)
In general, OVE found that the DEM has a strong bias toward positive findings, due in part to over-reliance on binomial indicators for what are inherently multinomial and qualitative categories, the frequent lack of evidence, and unreliability in matching evidence to final scores. Figure 4.1 summarizes the disparity found by OVE between DEM scores and an independent assessment of the evidence of project quality. In some areas, close to three-fourths of Bank DEM scores are inflated.
Moreover, OVE found that both the DEM screening tool and the internal quality control process were severely compromised by possible conflicts of interest (something singled out in earlier evaluations, purportedly addressed in the 2007 organizational reforms, but persisting as a problem at the IDB today).
The report found that the Bank’s oversight system does not consider evaluability in its review processes. Through a review of meeting minutes, OVE shows that Bank managers ignore quality analysis and control at key decision points in the project review process. Observed participation in these quality control checkpoints during project preparation was low in general, uneven among relevant Bank departments, and ultimately “ineffective as a mechanism of systematically assessing project evaluability.” [1] Of the 160 projects approved in 2009, only 10 were discussed by the Operations Policy Committee, the highest management instance of quality control, and even those discussions produced almost no consideration of evaluability issues or the DEM.
Identifying the development problem to be solved, and the intention to solve it, is a particular weakness of non-sovereign guaranteed (NSG) projects. Private sector projects have also diverged from OVE standards for quality by adopting a DEM screening methodology that excludes much of the public sector focus on project logic, risk management, and monitoring and evaluation. NSG loan documents now lack logical frameworks and results matrices, which precludes the definition of “specific development objectives or logical linkages between project components and desired final outcomes.”[2] Few NSG projects had adequate monitoring frameworks, and none received an adequate OVE rating. Few NSG projects properly calculated economic rates of return (ERRs).
The OVE findings have been debated by Bank management since May of 2010, with various actions to challenge, diminish or ignore the core observations.
The Development Effectiveness Framework has been an evolving process that must work if the IDB is to comply with its GCI-9 commitments. The primary reasons project team leaders gave for the OVE findings were (1) policy or resource constraints and (2) minimum requirements. Staff reported perceptions that their projects did not require evaluability assessments (62% of responses), that the standards were excessively high, or that they lacked the time, incentives or space to include such analysis in the project documents.
Development Effectiveness Priorities Facing IDB Governors in Calgary
Ultimately, a commitment by management and the Board is necessary to resolve the apparent disagreement over what represents improvement in the quality of Bank investments. Strengthening development effectiveness is at the heart of the Cancun Declaration and the GCI-9 agreement. The OVE findings on evaluability are a sobering reminder of the task at hand.
While apparently accepting the OVE evaluation, management has not indicated how it will accommodate the report’s recommendations, a practice all too common with past, ignored OVE recommendations. The main issues that will require clarification in any GCI-9 progress report to the IDB Board at the March Annual General Meeting of Governors include:
1. Fixing the DEM:
On December 9, 2010 the IDB Board Committee on Policy and Evaluation received and “took note” of the OVE evaluation on the DEM and evaluability. While no Committee Chair Report has been disclosed, the Committee’s informal advice to the Board has resulted in an overhaul of the DEM that could incorporate much of OVE’s criticism. A revised version of the DEM was to be presented to the Policy and Evaluation Committee on Feb. 17.
The central challenge to improving operational quality through the DEM is reflected in the low ratings for problem diagnosis and project logic. The IDB frequently accepts projects identified by the client that leave little room for change, rather than engaging in dialogue about the optimal solution to a development problem. This disconnect between upstream planning and downstream investment is particularly evident for sustainability mainstreaming, as noted in a recent review of the Bank’s safeguard policy, which found that IDB Country Environmental Assessments had little bearing on the priorities reflected in Country Strategies or subsequent programming. [3] Time and resource constraints for project preparation have only intensified this tension between satisfying client demands and Bank standards.
2. Fixing the Conflict of Interest within SPDE to Ensure Quality Control
OVE specified the various lapses in quality control in the new project preparation process. Recommendation 5 of the report states: “The Realignment document described clearly the problems associated with “cohabitation” between quality control and operational support functions, yet a clear distinction has not been maintained between these two functions in recent Bank practice. This may be an important contributor to the problems found in the project review process noted in this report. OVE recommends that the division of responsibilities indicated in the Realignment document should be observed, with Vice President of Sectors & Knowledge (VPS) taking responsibility for supporting project teams and developing high quality projects, and SPD having only responsibility for quality review and control.”
The office of the Executive Vice-President sits on the Operations Policy Committee and holds the highest operational authority for ensuring quality control. Julie Katzman was appointed as the new Executive Vice-President in December, following Dan Zelikow’s resignation in July 2010. When asked about OVE’s recommendation to separate quality enhancement (VPS) and quality control (SPDE), Katzman stated her disagreement with OVE’s findings and instead endorsed SPDE’s proposal for a procedural separation of quality enhancement and quality control functions or teams under the same overall reporting line within SPDE.
3. Results Accountability in the Context of the GCI-9 Agreement
To strengthen development effectiveness by 2015, as promised in the GCI-9, the IDB needs to ensure that results get delivered. Accountability for results at the IDB remains the top concern for the Congressional leaders who must still approve the GCI authorization. The OVE evaluation gives even more reason to question whether the delivery of results will be possible in the medium term, which makes this issue the cornerstone of the 2013 GCI-9 mid-term evaluation now being planned.
Righting the ship on quality should be the top priority of the new Executive Vice-President. As the Calgary AGM approaches, and with it many of the benchmarks of the GCI-9, Bank management will be expected to report on progress on the 13 core conditions of the capital increase agreement. The first GCI-9 progress report will be a good indication of how Bank management perceives compliance with the GCI conditions. Beyond procedural milestones, the substantive evidence of the OVE evaluability report should suggest more than that the baseline for project quality is lower than the DEO contended last year. Structural change is paramount. Is IDB management willing to make this change?
[1] IDB-OVE, Evaluability Review of Bank Projects 2009, RE-379, p. 18.
[2] RE-379, p. 15.
[3] Independent Advisory Group on Sustainability, Final Report to the IDB, January 2010.