2015/Internal/Office of U.S. Foreign Assistance Resources' Implementing the Managing for Results Framework

Below is the Executive Summary.

EXECUTIVE SUMMARY

The Managing for Results (MfR) framework represents the U.S. Department of State’s (hereafter referred to as the Department) integrated approach to planning, budgeting, managing, and measuring programs and projects. Currently, the Office of U.S. Foreign Assistance Resources (F) and the Bureau of Budget and Planning (BP), in coordination with the U.S. Agency for International Development (USAID), share responsibility for the framework. F maintains responsibility for MfR processes that relate to the management of the Department’s Foreign Assistance (FA) programs and resources. After four years of implementation, F’s Planning and Performance Management Division (F/PPS/PPM) initiated an independent evaluation to assess the effectiveness of the framework as it relates to the management of FA programs and resources.

Methodology

The evaluation team reviewed MfR training, guidance, and communications as well as management documents and prior studies to gain a baseline understanding of the MfR processes and products. The evaluation team used electronic surveys, stakeholder interviews, and focus groups to collect stakeholder feedback from F leaders and staff, Bureau and Mission leaders, Bureau Planners (i.e., regional and functional bureau staff members who are responsible for developing MfR products), and external partners (e.g., USAID, Foreign Service Institute (FSI), BP). Finally, the team analyzed MfR products from two regional and two functional bureaus to determine how the output of one process could be used to inform subsequent processes. Data collection limitations for this evaluation included low participation in surveys and in the initially scheduled Bureau Planner focus groups. To mitigate these limitations, the evaluation team extended the deadline for both surveys and scheduled additional Bureau Planner focus group sessions. Other limitations are discussed on pages 14-17 of the evaluation report.

Findings & Conclusions

The following sections highlight the evaluation team’s findings and conclusions aligned to the primary questions F sought to address when commissioning the MfR evaluation.

Evaluation Question 1: Complementary Processes

F seeks to understand the extent to which the processes of the MfR framework complement one another in practice. For example, what barriers prevent effective linkages between processes? And are any critical processes or components missing that would improve effectiveness?

1.1: Leadership Engagement. Department leadership does not consistently participate during MfR processes or use MfR products (e.g., strategic plans, performance reports).

Inconsistent leadership engagement at each level within the organization is a barrier that continues to hinder the adoption of the MfR framework. Stakeholders responsible for performing MfR responsibilities are not motivated to integrate processes into their management routines when they perceive leadership as disinterested in the MfR framework. Activities that foster leadership engagement and collaboration are largely absent from the current MfR framework. Thus, the evaluation team concluded that F should consider new ways to promote dialogue between leaders and staff.

1.2: FA Program/Project Design and Performance Management.[1] The MfR framework does not include processes or products that promote FA program/project design and performance management.

While the Department’s responsibilities for managing FA resources have grown, corresponding policies and guidance have not expanded to promote FA program/project design and performance management. 18 FAM 005.1-7(G) directs F, working with others as appropriate, to “establish standard guidance and oversight mechanisms for incorporating best practices in program design, management, and monitoring and evaluation.” The evaluation team concluded that absent comprehensive program/project design and performance management guidance, standards, and tools, F is not well-positioned to be an effective steward of FA programs and resources.

1.3: Evaluation Outcomes. Few mechanisms exist to promote the implementation of program evaluation findings and recommendations.

While the Department recently revised its Evaluation Policy, the current policy does not establish guidance to promote the implementation or monitoring of program evaluation outcomes. Thus, the evaluation team concluded that no formal methods exist to encourage evaluation sponsors to take action on evaluation outcomes. Resource constraints also make it difficult for Bureau Evaluation Coordinators to effectively perform their duties. Finally, systems limitations prevent F and other stakeholders from utilizing data to perform meaningful analysis of evaluation activities to inform future decisions and promote learning across the Department.

1.4: Strategic Planning. The current strategic planning process is constrained by timing challenges, inconsistent or ambiguous higher-level policy guidance, and competing country-level planning models.

Evaluation participants identified multiple factors that make it difficult to achieve meaningful alignment between Department, Bureau, and Mission strategic plans. Thus, the evaluation team concluded that, at a minimum, the Department would benefit from formalizing its internal approach and communications surrounding the development of the Joint Strategic Plan (JSP) and the Quadrennial Diplomacy and Development Review (QDDR). Finally, the lack of complete integration between Department and USAID planning processes complicates these processes for leaders and staff who want to embrace the MfR framework.

Evaluation Question 2: Awareness, Understanding, and Implementation Gaps

F seeks to identify gaps in awareness, understanding, and implementation among all stakeholders of the MfR framework. For example, is F communicating effectively with its stakeholders about the MfR framework? Are current training activities effective? And to what extent does the lack of resources in key areas affect the implementation of the MfR framework?

2.1: Training Focus. Current MfR training and guidance predominately focuses on the implementation of individual processes or products rather than the implementation of the entire MfR framework.

F’s current training and guidance predominately focuses on helping stakeholders develop MfR products, but does not sufficiently reinforce linkages between MfR processes. Thus, the evaluation team concluded that, to increase usage, the Department will need to better educate stakeholders on how to implement and use MfR products within their daily environments.

2.2: Change and Communications Management. F lacks a comprehensive change and communications plan to guide its responsibilities for facilitating MfR processes as a whole.

The evaluation team was unable to identify a comprehensive, coordinated change and communications management strategy that guides F’s responsibilities for facilitating MfR processes. Multiple stakeholders contribute to knowledge management portals such that information is duplicated between some websites and absent from others. Thus, the evaluation team concluded that the current approach to content and information sharing reinforces the perception that MfR processes are not sufficiently integrated. These factors complicate F’s ability to effectively communicate with stakeholders.

2.3: Bureau and Mission Staffing. Bureaus and Missions have not consistently staffed their organizations to perform MfR processes.

The evaluation team concluded that the Department has not developed a comprehensive human resource management strategy to ensure appropriate staffing and staff incentives exist to promote MfR implementation and usage. Without changes to the Department’s human resource practices, leaders believe they will not be able to hire and develop staff who possess the necessary skills and experience to effectively perform MfR responsibilities. Thus, F should leverage its shared responsibility identified in 18 FAM 005.1-8(A)[2] to strengthen training, recruitment, and performance incentives.

Evaluation Question 3: Mission/Bureau Objectives

F seeks to understand whether the Mission Objectives (MOs) and Bureau Objectives (BOs) from Mission and Bureau strategies are effectively incorporated into each process of the MfR framework. For example, where are there gaps that need to be addressed? And how does the duration or shelf life of a MO/BO affect how well it works as a unit of analysis in each process?

3.1: Mission/Bureau Objective Performance Reporting. F and BP facilitate different performance reporting processes for Diplomatic Engagement (DE) and FA programs; the MfR framework does not currently enable stakeholders to report overall performance against Mission/Bureau Objectives.

Thus, the evaluation team concluded that the separation of DE and FA performance reporting processes limits visibility into the overall performance of Mission/Bureau Objectives that comprise both funding streams.

3.2: Mission/Bureau Objective Development. Stakeholders are advised to develop specific, actionable Mission/Bureau Objectives; however, stakeholders are motivated to develop broad objectives to accommodate the duration of the federal budget cycle and to avoid off-cycle changes to strategic plans.

MOs/BOs will continue to be broad and vague until more flexibility is permitted for updating them mid-cycle, or unless budget requests and performance narratives no longer need to be aligned to MOs/BOs. Stakeholders report not wanting to be “locked in” to specific, concrete MOs/BOs to which they must align their budget requests two and three years into the future (i.e., F requires Missions to align Mission Resource Requests (MRR) to MOs, and BP requires Bureaus to align Bureau Resource Requests (BRR) to BOs annually). The evaluation team concluded that this contributes to the perception that strategic plans are too broad or inflexible to be meaningful or to serve as useful management tools; as a result, some stakeholders are less inclined to use their strategic plans. Emphasizing that Action Plans can and should be updated will help stakeholders as they conduct strategic reviews, determine when sub-objectives have been accomplished, and hone performance metrics to enhance performance monitoring.

Recommendations

The evaluation team identified a number of recommendations along with suggested implementation actions within the evaluation report. Recommendations are structured around four main themes.

Part 1: Framework Integration

Stakeholders often perceive the MfR framework as a series of independent processes and reporting exercises. Thus, opportunities exist to strengthen the integration between MfR processes and products.

Recommendation 1.1: Develop guidance for conducting strategic performance reviews.

Mission and Bureau leaders do not consistently use MfR products or engage in MfR activities. Strategic performance reviews, whereby Bureaus and Missions review progress against their strategic plans, would strengthen performance information as leaders and staff learn what information is needed, how best to collect it, and how frequently it should be collected to appropriately monitor results. F should develop guidance to implement the “senior-level” reviews recommended within the 2015 QDDR. Guidance should include the benefits of conducting strategic performance reviews, participant roles, timing and frequency, inputs, intended outputs, and uses of the information. Completion of this review should coincide with the development of MRRs/BRRs.

Recommendation 1.2: Explore opportunities to provide a complete picture of performance against strategy.

While MOs/BOs represent broad organizational objectives identified during the planning process, downstream budgeting and measuring activities separate FA and DE resource and performance discussions. This makes it difficult to communicate a complete picture of organizational progress towards the achievement of individual MOs/BOs. F should work with BP to explore changes or additions to current performance reporting activities in order to provide a more complete picture of performance against MOs/BOs. Opportunities include the strategic performance reviews or adjustments to current MfR products.

Recommendation 1.3: Strengthen evaluation outcomes by revising evaluation guidance and exploring opportunities to strengthen FACTS Info evaluation reporting and analysis.

Few mechanisms exist to promote and monitor the implementation of program evaluation findings and recommendations. Current guidance only directs the documentation and dissemination of evaluation findings. Several Bureau leaders reported not knowing offhand the outcome of evaluations or whether recommendations were implemented. F could update evaluation guidance to encourage evaluation sponsors to develop and submit action plans to monitor evaluation outcomes. Guidance could also encourage Bureaus to strengthen the Bureau Evaluation Coordinator role by designating at least one individual whose primary duty is to select, plan, and monitor evaluations and subsequent action plans. F should explore whether FACTS Info, the FA budget formulation and performance reporting system, can be used to track action plan progress, or to flag evaluations by sector to alert operating units (OUs) starting programs in the same sector that they could benefit from an active or completed program evaluation.

Recommendation 1.4: Explore opportunities to achieve greater alignment between Department, Bureau, and Mission strategic planning activities.

While current strategic plan sequencing helps align Joint Regional Strategies (JRS) and Integrated Country Strategies (ICS), there is no systematic method for aligning the JRS, Functional Bureau Strategies (FBS), or ICS to the JSP or the QDDR. F should consider altering or allowing more flexibility in the timing, duration, and sequencing of Bureau and Mission strategic planning activities to ensure they contribute to Department-level goals.

Recommendation 1.5: Assess implementation of the entire MfR framework within individual Missions and Bureaus.

Current After Action Reviews of F products and processes focus on reviewing individual MfR processes once they have concluded. Currently, no process exists that assesses the full, end-to-end implementation of the MfR framework, which limits visibility into the effectiveness of MO/BO integration, as well as implementation activities within Missions and Bureaus. To provide greater visibility into implementation successes and challenges, F should periodically review the complete implementation of all MfR processes for a sample of Missions and Bureaus.

Part 2: FA Program/Project Design and Performance Management

No processes in the MfR framework currently focus on the design and performance management of FA programs or projects. The absence of strong program/project design and performance management guidance increases the risk of ineffectively run programs and of an inability to show positive results or value achieved for expended budgets, and it jeopardizes the quality of downstream performance monitoring and evaluation activities.

Recommendation 2.1: Develop FA program/project design and performance management policy with associated guidance and tools.

MfR processes do not currently encompass program-level planning that considers how programs/projects collectively can or should contribute to broader strategic objectives, nor does F provide program/project design and performance management guidance. A policy is warranted given the funding magnitude of FA programs/projects. The gap in this area adversely affects the quality of MfR activities, increasing the risk of poorly managed programs/projects and of ineffective performance reporting and program evaluation activities that depend on sound program/project design and performance management. Providing this guidance upfront can help Bureaus and Missions better organize and plan for monitoring and evaluation at program/project start-up, reducing the challenges currently faced by FA programs that have limited baseline data and/or insufficient metrics to demonstrate program success. F should create a policy to govern FA program/project design and performance management with accompanying guidance, training, tools, and templates. While there should be one standard policy, the tools (i.e., guidance, training, and templates) associated with implementing it should be scalable and tailorable for programs of differing sizes.

Part 3: Change and Communications Management

Stakeholders do not share a common understanding of the MfR framework’s primary purpose, customers, benefits, or the intended use of each product.

Recommendation 3.1: Design and launch a customer-focused communications campaign plan.

Multiple entities (e.g., F, BP, and USAID) communicate about MfR products based on their specific equities. As a result, communications are generally designed around specific processes or products rather than customer needs and at times contradict one another. Messaging emphasizes MfR products’ ability to influence budget decisions, without discussion of the benefits that are more likely to resonate with customers. F should design and launch a customer-focused communications campaign plan to communicate the framework’s purpose, concrete benefits, and success stories from the field. The Department should formalize its internal approach and communications surrounding the development of the JSP and QDDR and the relationship between the two strategic documents.

Recommendation 3.2: Create a one-stop website for MfR communications, guidance, tools, templates, and training.

Many knowledge management systems and portals (e.g., SharePoint, Communities@State) are used to share information and content with stakeholders. This scenario makes it difficult for F to effectively manage change and confusing to customers who are trying to better understand MfR products and processes. MfR communications, guidance, tools, templates, and training should be accessible via a one-stop website that allows stakeholder interaction and engagement, such as Communities@State. Such a website should allow for two-way engagement between F and end users of MfR products and processes.

Recommendation 3.3: Enlist a network of change champions within the Department to promote MfR framework, processes, products, and practices.

F has promoted MfR success stories but has not fully tapped into the potential of the leaders who share these stories. F can maximize the impact of key messaging around customer benefits through testimonials. Leaders who have found success with the MfR framework should be tapped to promote the benefits, share their successes and lessons learned, and educate other staff on MfR benefits and best practices. This would build MfR capability organically using spokespeople who believe in the framework.

Recommendation 3.4: Revise planning guidance and training materials for clarity.

Stakeholders shared that they receive conflicting guidance regarding the appropriateness of making mid-cycle changes to strategic plans. For example, Missions receive differing guidance from F, BP, or the Bureau Planners working in their respective Regional Bureaus. Staff and leaders may also have conflated guidance from the two guidance documents when discussing them in interviews and focus groups. F should work with BP to promote consistency between guidance for the FBS/JRS and the ICS, while recognizing differences between these organizations, and should emphasize that the sub-objectives, indicators, and milestones that comprise action plans can and should be updated following strategic performance reviews.

Recommendation 3.5: Increase collaboration between F/PPS/PPM and F/RG.

Bureau staff reported that F/PPS/PPM and F-POCs are not always consistent or in agreement with one another, which could undermine F’s credibility with customers. F leadership should make greater unity and collaboration between these two offices a top organizational priority. Syncing these organizations will optimize F’s ability to support MfR customer implementation. F leadership should support and reinforce the importance of unified messaging to customers. To support this change, F leadership should consider the following options for increasing collaboration between F/PPS/PPM and F-POCs: a leadership offsite, periodic sync meetings, a staff rotation program, and a leader buddy program.

Part 4: Training and Capacity Building

Missions and Bureaus have been implementing MfR processes and products, but without sufficient capacity, training, or a reward structure, implementation remains inconsistent.

Recommendation 4.1: Create and execute an MfR learning strategy and training plan.

Current MfR communications, training, and guidance are process-specific, and few focus on how to holistically implement MfR processes, products, and practices within a Mission or Bureau. In addition, the Department’s broader training and professional development strategy has not kept pace with growing responsibilities for implementing FA as an instrument of Foreign Policy. FSI has not fully incorporated MfR or FA program/project performance management training into its core curriculum. F should develop an MfR framework training plan to guide curriculum design, development, and management for the FSI course Managing Foreign Assistance Overseas, for which F may assume responsibility. Training could be designed to influence attitudes and behaviors of adult learners by focusing on performance management concepts and how to apply these concepts within a Foreign Policy context.

Recommendation 4.2: Continue to transition F/PPS/PPM to a more consultative role while pursuing longer-term changes to the Department’s employee training and development practices.

Missions and Bureaus continue to face increasing strategic planning, budgeting, managing, and measuring requirements as FA increases, without a corresponding increase in State Department FTEs. While tools and templates are useful, technical assistance is reported to have a greater impact. F/PPS/PPM could collaborate with D-MR annually to identify high-priority Missions and Bureaus that would benefit from technical assistance in strategic planning, program design, performance monitoring, and/or evaluation for FA. This could be done on a small scale using existing F staff (i.e., with no new resources).

Recommendation 4.3: Promote knowledge transfer in evaluation to build capacity.

Bureau leaders report difficulty recruiting and hiring skilled evaluation experts into the USG. F could include standard language within Department evaluation Statements of Work (SOW) to provide for training and mentoring of Bureaus’ evaluation staff to maximize knowledge transfer.

Recommendation 4.4: Advocate for incorporating MfR concepts into FSI training, Foreign Service precepts, and the FS Management Cone structure.

Incorporating MfR concepts into FSI training, Foreign Service precepts, and the FS Management Cone structure would help the Department fulfill responsibilities outlined within 18 FAM 005 pertaining to improving recruitment, professional incentives, and training for MfR concepts. F should consider establishing a working group consisting of participants from BP, FSI, and HR in order to identify longer-term opportunities for expanding training and professional development opportunities for FA program/project performance management and evaluation staff.

----------------------------

Footnotes

1. Program/project design specifies how specific activities within the program/project will be coordinated to achieve the desired result. Design activities include, but are not limited to, preparing a concept paper defining the proposed program/project outline; developing a logical framework; developing a monitoring and evaluation plan; preparing a cost estimate, financial plan, and implementation plan; and preparing formal approval documents. Program/project management is a methodical approach to guiding execution of, and adding a measure of control to, project processes from start to finish. Program/project management activities include, but are not limited to, managing a program or project schedule, scope, cost, communications, human resources, risks, and procurements (e.g., contracts, grants). Performance management is the collection and use of metrics to inform program/project-management-related decisions. Performance management activities include, but are not limited to, selecting and defining indicators, setting targets, collecting or overseeing the collection of baseline and ongoing monitoring data, and using monitoring data to inform decisions related to project execution and management.

2. 18 FAM 005.1-8(A) “F and RM, working with regional and functional bureaus, will develop models for the skill sets and structures that will best support integrated policy, analysis, strategic planning, budgeting, and performance evaluation in Washington and in the field…F, RM, and the DG will determine how the Department can best incorporate these development specialists and those with unique development backgrounds into key positions in Washington and in the field, including in our budgeting and planning structures, and how to improve training, recruitment, and performance incentives...”