EVM Process

The EVM process can be broken down into thirteen fundamental activities, outlined and described in this section:

  1. define the scope of effort with a WBS;
  2. identify who in the organization will perform the work;
  3. schedule the work to a timeline;
  4. estimate resources and authorize budgets;
  5. determine objective measures of earned value;
  6. develop the performance measurement baseline;
  7. execute the work plan and record all costs;
  8. analyze EVM performance data and record variances from the performance measurement baseline (PMB) plan;
  9. forecast estimates-at-completion (EACs) using EVM;
  10. conduct an integrated cost-schedule risk analysis;
  11. compare EACs from EVM (9) with EAC from risk analysis (10);
  12. take management action to respond to risks; and
  13. update the performance measurement baseline as changes occur.

1: Define the Scope of Effort with a WBS

The WBS is a critical component of EVM that defines the work to be performed. It should be the basis of the cost estimate and the program schedule. In the schedule, activities traceable to the WBS elements are linked to one another with logical relationships and lead to the end product or final delivery. The WBS progressively deconstructs the deliverables of the entire effort through lower-level WBS elements. Figure 25 shows a breakdown of the overall program plan.

Figure 25: Work Breakdown Structure Integration of Cost, Schedule, and Technical Information

Note: CDR = critical design review.

The hierarchical WBS ensures that the entire statement of work accounts for the detailed technical tasks and facilitates communication between the customer and supplier on cost, schedule, technical information, and the progress of the work. It is important that the WBS is comprehensive enough to represent the entire program at a level of detail sufficient to manage the size, complexity, and risk associated with the program. Furthermore, there should be only one WBS for each program. It should match the WBS used for the cost estimate and schedule so that actual costs can be fed back into the estimate and schedule. While costs are usually tracked at lower levels of the WBS, what is reported in an EVM system is usually summarized at a higher level. Because of its hierarchical structure, the WBS can be expanded to different degrees of detail so that problems can be identified and tracked at various levels.
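Because the WBS is hierarchical, costs collected at lower levels can be summarized upward to any reporting level. The following minimal sketch illustrates that roll-up; the element numbers and cost values are hypothetical and used only for illustration.

```python
# Minimal sketch: rolling actual costs up a hypothetical WBS hierarchy.
# Element IDs and dollar values are illustrative only.

wbs_parent = {            # child element -> parent element
    "1.1": "1", "1.2": "1",
    "1.1.1": "1.1", "1.1.2": "1.1",
    "1.2.1": "1.2",
}

# Actual costs are collected at the lowest (work package) level.
actual_costs = {"1.1.1": 250_000, "1.1.2": 175_000, "1.2.1": 310_000}

def rollup(costs: dict[str, float], parent: dict[str, str]) -> dict[str, float]:
    """Summarize lower-level costs into every ancestor WBS element."""
    totals: dict[str, float] = dict(costs)
    for element, cost in costs.items():
        node = element
        while node in parent:            # walk up to the top-level element
            node = parent[node]
            totals[node] = totals.get(node, 0.0) + cost
    return totals

totals = rollup(actual_costs, wbs_parent)
print(totals["1.1"])   # 425000: summary of its two child work packages
print(totals["1"])     # 735000: total actual cost for the whole program element
```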

2: Identify Who in the Organization Will Perform the Work

Once the WBS has been established, the next step is to assign someone to do the work. An organizational breakdown structure (OBS) is used to show who is assigned each task. To ensure that someone is accountable for every WBS element and its associated tasks, it is useful to determine levels of accountability, or control accounts, at the points of intersection between the OBS and the WBS. The control account becomes the management focus of an EVM system and the focal point for performance measurement.

It is at the control account level that actual costs are collected and variances from the baseline plan are reported in the EVM system. Figure 26 shows how control accounts are determined. The WBS is shown at the top, including program elements, contract reporting elements, and detailed elements. The left-hand side of the figure shows the OBS. The control accounts lie in the center of the figure, where the WBS and OBS intersect. As the box at the far right of the figure indicates, each control account is further broken down into work packages and planning packages. Each control account has a control account manager who is assigned responsibility for managing and completing the work.

Figure 26: Identifying Responsibility for Managing Work at the Control Account

Note: WBS = work breakdown structure.

A control account manager is responsible for managing, tracking, and reporting earned value data within each control account. Thus, control accounts are the natural control point for EVM planning and management.

Work packages are detailed tasks, typically 4 to 6 weeks in duration, that specify who authorizes the task and how the work will be measured and tracked. Work packages reflect near-term, specific effort needed to meet control account objectives. Planning packages represent far-term work and are usually planned at higher levels. Budgets for direct labor, overhead, and material are assigned to both work and planning packages so that total costs to complete the program are identified at the outset. As time passes, planning packages are broken down into detailed work packages in a process called "rolling wave" planning, described later in the chapter and sketched below.
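As a simple illustration of this structure, the sketch below shows a hypothetical control account whose total budget is the sum of its near-term work packages and far-term planning packages, and how rolling wave planning converts a planning package into work packages without changing the control account total. All names and values are assumptions made for the example.

```python
# Minimal sketch: a hypothetical control account composed of work packages
# (near-term, detailed) and planning packages (far-term, higher level).

control_account = {
    "work_packages":     {"WP-01 detailed design": 120_000,
                          "WP-02 prototype build": 180_000},
    "planning_packages": {"PP-01 qualification testing": 250_000},
}

total_budget = (sum(control_account["work_packages"].values())
                + sum(control_account["planning_packages"].values()))
print(f"Control account budget at completion: {total_budget:,}")   # 550,000

# Rolling wave planning: as far-term work approaches, a planning package is
# detailed into work packages; the control account total does not change.
pp_budget = control_account["planning_packages"].pop("PP-01 qualification testing")
control_account["work_packages"].update({
    "WP-03 qual test setup":     100_000,
    "WP-04 qual test execution": pp_budget - 100_000,
})

assert (sum(control_account["work_packages"].values())
        + sum(control_account["planning_packages"].values())) == total_budget
```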

In planning the baseline, programs ought to consider allocating risk into the baseline up front, especially when addressing rework and retesting. Experts have noted that, to set up a realistic baseline, anticipated rework can be included as a separate work package. Doing so accounts for a reasonable amount of rework while preserving the ability to track variances. Under this approach, programs include rework in the budget baseline for effort that is bound to involve revision, such as design.

3: Schedule the Work to a Timeline

Developing a schedule provides a time sequence for the program’s activities. A program schedule also provides the vehicle for developing a time-phased budget baseline. The typical method of scheduling is the critical path method, implemented in standard scheduling software packages. The critical path method is used to derive the critical activities—that is, activities that cannot be delayed without delaying the end date of the program.50
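As a minimal illustration of the critical path method described above, the sketch below performs a forward and backward pass over a small, hypothetical activity network to derive each activity's total float and the critical path. The activities, durations, and dependencies are assumptions made for the example.

```python
# Minimal critical path method sketch with hypothetical activities.
# Durations are in working days; predecessors use finish-to-start logic.
activities = {
    "A": {"duration": 5,  "predecessors": []},
    "B": {"duration": 10, "predecessors": ["A"]},
    "C": {"duration": 7,  "predecessors": ["A"]},
    "D": {"duration": 4,  "predecessors": ["B", "C"]},
}
order = list(activities)          # already listed in dependency order for this example

# Forward pass: earliest finish dates.
early_finish = {}
for act in order:
    es = max((early_finish[p] for p in activities[act]["predecessors"]), default=0)
    early_finish[act] = es + activities[act]["duration"]

project_finish = max(early_finish.values())

# Backward pass: latest finish dates, then total float.
late_finish = {act: project_finish for act in activities}
for act in reversed(order):
    for p in activities[act]["predecessors"]:
        late_finish[p] = min(late_finish[p], late_finish[act] - activities[act]["duration"])

total_float = {act: late_finish[act] - early_finish[act] for act in activities}
critical_path = [act for act in order if total_float[act] == 0]

print(f"Project duration: {project_finish} days")   # 19 days
print(f"Total float:      {total_float}")           # C has 3 days of float
print(f"Critical path:    {critical_path}")         # ['A', 'B', 'D']
```

Activities on the critical path have zero total float, which is why a slip to any of them delays the program's finish date, consistent with the total float discussion later in this section.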

Because many program elements, such as labor, supervision, and rented equipment and facilities, cost more when the program takes longer, a schedule can contribute to an understanding of the cost impact if the program does not finish on time. The program's success also depends on the quality of its schedule. A high-quality schedule shows the logical relationships between program activities and includes activity resource requirements and realistic durations. The schedule shows when major events are expected, as well as the completion dates for all activities leading up to them, which can help determine whether the schedule is realistic and achievable. A detailed schedule can also be used to identify existing and potential problems. Moreover, if the schedule is kept up to date (well statused) as changes occur, it will aid in analyzing how the changes affect the program.

A schedule is key in managing program performance and is necessary for determining what work remains and the expected cost to complete it. As program complexity increases, so must the schedule’s sophistication. We have identified the following ten best practices associated with a high-quality and reliable schedule:

  • Capturing all activities. The schedule should reflect all activities as defined in the program’s work breakdown structure (WBS), which defines in detail the work necessary to accomplish a program’s objectives, including activities both the owner and contractors are to perform.

  • Sequencing all activities. The schedule should be planned so that critical program dates can be met. To do this, activities must be logically sequenced and linked—that is, listed in the order in which they are to be carried out and connected to other activities to show schedule dependencies. In particular, a predecessor activity must start or finish before its successor. Date constraints and lags should be minimized and justified. This helps ensure that the interdependence of activities that collectively lead to the completion of activities or milestones can be established and used to guide work and measure progress.

  • Assigning resources to all activities. The schedule should reflect the resources (labor, materials, travel, facilities, equipment, and the like) needed to do the work, whether they will be available when needed, and any constraints on funding or time.

  • Establishing the duration of all activities. The schedule should realistically reflect how long each activity will take. When the duration of each activity is determined, the same rationale, historical data, and assumptions used for cost estimating should be used. Durations should be reasonably short and meaningful, and should allow for discrete progress measurement. Schedules that contain planning and summary planning packages as activities will normally reflect longer durations until broken into work packages or specific activities.

  • Verifying that the schedule can be traced horizontally and vertically. The schedule should be horizontally traceable, meaning that it should link products and outcomes associated with other sequenced activities. Such links are commonly referred to as “hand-offs” and serve to verify that activities are arranged in the right order for achieving aggregated products or outcomes. The schedule should also be vertically traceable—that is, data are consistent between different levels of a schedule. When schedules are vertically traceable, lower-level schedules are clearly consistent with upper-level schedule milestones, allowing for total schedule integrity and enabling different teams to work to the same schedule expectations.

  • Confirming that the critical path is valid. The schedule should identify the program’s critical path—the path of longest duration through the sequence of activities. Establishing a valid critical path is necessary for examining the effects of any activity’s slipping along this path. The program’s critical path determines the program’s earliest completion date and focuses the team’s energy and management’s attention on the activities that will lead to the program’s success.

  • Ensuring reasonable total float. The schedule should identify reasonable total float (or slack)—the amount of time a predecessor activity can slip before the delay affects the program’s estimated finish date—so that the schedule’s flexibility can be determined. The length of delay that can be accommodated without the finish date slipping depends on the number of date constraints within the schedule and the degree of uncertainty in the duration estimates, among other factors, but the activity’s total float provides a reasonable estimate of this value. As a general rule, activities along the critical path have the least total float. Unreasonably high total float on an activity or path indicates that schedule logic might be missing or invalid.

  • Conducting a schedule risk analysis. A schedule risk analysis starts with a good critical path method schedule. Data about program schedule risks are incorporated into a statistical simulation to predict the level of confidence in meeting a program’s completion date; to determine the contingency, or reserve of time, needed for a level of confidence; and to identify high-priority risks. Programs should include the results of the schedule risk analysis in constructing an executable baseline schedule.

  • Updating the schedule using actual progress and logic. Progress updates and logic provide a realistic forecast of start and completion dates for program activities. Maintaining the integrity of the schedule logic is necessary for the schedule to reflect the true status of the program. To ensure that the schedule is properly updated, people responsible for the updating should be trained in critical path method scheduling.

  • Maintaining a baseline schedule. A baseline schedule is the basis for managing the program scope, the time period for accomplishing it, and the required resources. The baseline schedule is designated the target schedule and is subjected to a configuration management control process. Program performance is measured, monitored, and reported against the baseline schedule. The schedule should be continually monitored so as to reveal when forecasted completion dates differ from baseline dates and whether schedule variances affect downstream work. A corresponding basis document explains the overall approach to the program, defines custom fields in the schedule file, details ground rules and assumptions used in developing the schedule, and justifies constraints, lags, long activity durations, and any other unique features of the schedule.

For further discussion of these scheduling best practices, see GAO’s Schedule Assessment Guide.51

4: Estimate Resources and Authorize Budgets

As part of the EVM process, budgets should be authorized along with the resources needed to do the work. In activity 3, we discussed how the schedule is resource loaded. Resources should not be limited to labor and material costs; all required resources should be accounted for, such as the costs for special laboratories, facilities, equipment, and tools. These resource estimates feed directly into the EVM process and should tie back to the cost estimate methodology.

Management reserve should be included in the budget to cover uncertainties such as unanticipated effort resulting from accidents, errors, technical redirections, or contractor-initiated studies. When a portion of the management reserve budget is allocated to one of these issues, it becomes part of the performance measurement baseline that is used to measure and control program cost and schedule performance. Management reserve provides management with flexibility to allocate budget to mitigate problems and control programs. However, it can be applied only to in-scope work and cannot be used to offset or minimize existing cost variances.

Programs with greater risk, such as development programs, usually require higher amounts of management reserve than programs with less risk, such as programs in production. Two key issues associated with management reserve are how much should be provided to the program and how it should be controlled. Research has found that programs typically set their contract value so they can set aside 5 to 10 percent as management reserve. This amount may not be sufficient for some programs and may be more than others need. One way to derive the amount of management reserve needed is to conduct a risk analysis for schedule (to determine the schedule reserve needed) and for cost (to determine the management reserve for cost). Risk and uncertainty analysis should be used to specify the probability that work will be performed within budget. The likelihood of meeting the budget can then be increased by establishing sufficient management reserve budget.
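A minimal sketch of the risk-based approach described above appears below: a cost risk simulation is used to size management reserve as the difference between the budget at a chosen confidence level and the point estimate. The cost elements, triangular distributions, and 80 percent confidence level are illustrative assumptions, not values prescribed by this guide.

```python
import random

# Minimal sketch: sizing management reserve from a cost risk simulation.
# Three hypothetical cost elements with triangular (low, likely, high) uncertainty, in $M.
cost_elements = {
    "air_vehicle": (40, 50, 75),
    "software":    (20, 25, 45),
    "support":     (10, 12, 18),
}

point_estimate = sum(likely for _, likely, _ in cost_elements.values())

random.seed(1)
trials = [sum(random.triangular(lo, hi, likely)      # argument order: low, high, mode
              for lo, likely, hi in cost_elements.values())
          for _ in range(10_000)]
trials.sort()

confidence = 0.80                                     # illustrative confidence level
budget_at_confidence = trials[int(confidence * len(trials)) - 1]
management_reserve = budget_at_confidence - point_estimate

print(f"Point estimate:             ${point_estimate:.1f}M")
print(f"{confidence:.0%} confidence estimate:    ${budget_at_confidence:.1f}M")
print(f"Management reserve implied: ${management_reserve:.1f}M")
```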

Controlling management reserve is also important. Typically held at a high level, the management reserve budget may be controlled directly by the program manager or distributed among functional directors or team leaders. In any case, it must be identified and accounted for at all times.

5: Determine Objective Measures for Earned Value

Performance measurement is key to earned value because performance represents the value of work accomplished. Before any work is started, the control account managers or teams should determine which performance measures will be used to objectively determine when work is completed. These measures are used to report progress in achieving milestones and should be integrated with technical performance measures. Examples of objective measures are requirements traced, reviews successfully completed, software units coded satisfactorily, and number of units fully integrated. Table 21 describes several acceptable, frequently used methods for determining earned value performance.

Table 21: Typical Methods for Measuring Earned Value Performance
Fixed formula (0/100, 50/50, 25/75, etc.): A specified percentage of the earned value is assigned to the start milestone of the work package, and the remaining earned value is assigned when the work is complete. This method is used for smaller work packages planned to start and end within two reporting periods. The 0/100 technique should be used only on work packages planned to start and end within one reporting period; it is commonly used for receipt of materials.

Percent complete: Performance is measured by an estimate of the percentage of work completed, based on objective and quantifiable evidence of work completion. The percent complete for each work package is the cumulative value of the work accomplished to date divided by the total budget for the work package.

Weighted milestone: The work package is divided into measurable segments, each ending with an observable milestone, and a weighted value is assigned to the completion of each milestone. This method is better suited to longer duration work packages that have intermediate and tangible results. For the most effective use, the method requires at least one interim milestone for each reporting period.

Physical measurement: Measurement can include any units that can be explicitly related to the completion of work, such as length of cable laid or quantity of concrete poured.

Source: Project Management Institute, Inc. Practice Standard for Earned Value Management, Second Edition, 2011. | GAO-20-195G

No single method for measuring earned value status is perfect for every program, and different WBS elements may use different methods. What matters is that the method chosen is the most objective approach available for measuring true progress.
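The sketch below illustrates, with hypothetical work packages and budgets, how earned value would be credited under three of the methods in Table 21. The function names, milestone weights, and dollar figures are assumptions made for the example.

```python
# Minimal sketch: crediting earned value under three methods from Table 21.
# Budgets, milestones, and progress figures are hypothetical.

def fixed_formula(budget, started, finished, start_share=0.5):
    """Fixed formula (e.g., 50/50): a share at start, the remainder at completion."""
    return budget * (start_share if started and not finished else 1.0 if finished else 0.0)

def percent_complete(budget, pct):
    """Percent complete: cumulative work accomplished divided by total budget."""
    return budget * pct

def weighted_milestone(milestone_values, completed):
    """Weighted milestone: sum the values of the milestones completed so far."""
    return sum(value for name, value in milestone_values.items() if name in completed)

print(fixed_formula(budget=40_000, started=True, finished=False))      # 20,000 earned
print(percent_complete(budget=100_000, pct=0.35))                      # 35,000 earned
print(weighted_milestone({"design review": 30_000,
                          "test readiness": 20_000,
                          "final report": 10_000},
                         completed={"design review"}))                 # 30,000 earned
```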

Two other methods used to measure earned value are level of effort and apportioned effort. Both are subjective, however, and should be used only when none of the methods in the table above is applicable. Level of effort credits earned value for activities, such as program management, that are tied merely to the passage of time and have no physical products or defined deliverables. Level of effort should be used sparingly; a program that reports a high proportion of level of effort is not providing objective data, is not implementing EVM as intended, and will fall short of the benefits EVM can offer. As a general rule, if more than 15 percent of a program's budget is classified as level of effort, the amount should be scrutinized. While the 15 percent benchmark is widely accepted as a trigger point for analysis, no given percentage should be interpreted as a hard threshold because the nature of work on some programs and contracts does not always lend itself to more objective measurement.

Apportioned effort is work that by itself is not readily divisible into short-span work packages but is related in direct proportion to an activity or activities with discrete measured effort. Apportioned effort work packages can be as discretely defined as individual work packages, but apportioned effort tasks are unique because they are closely dependent on another distinct work package. Examples include quality control responsibilities associated with pipefitting or pouring concrete. These quality control activities should span their dependent activities and their earned value should be based on the related activities’ earned value.

As work is performed, its value is earned in the same units in which the work was planned, whether dollars, labor hours, or other quantifiable units. The budget value of the completed work is credited as earned value, which is then compared to the actual cost and planned value to determine cost and schedule variances. Figure 27 shows how this works.

Figure 27: Earned Value, Using the Percent Complete Method, Compared to Planned Costs

Figure 27 displays how planned effort is compared with work accomplished. It also shows how earned value represents the budgeted value of the work completed and directly relates to the percentage complete of each activity.

When earned value is compared to the planned value for the same work and to its actual cost, management has access to program status. This provides management with a better view of program risks and better information for understanding what resources are needed to complete the program.
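The comparison works out as in the sketch below, which uses the standard EVM variance and index formulas with hypothetical cumulative values.

```python
# Minimal sketch of basic EVM comparisons, using hypothetical cumulative values ($K).
BCWS = 1_000   # planned value (budgeted cost for work scheduled)
BCWP = 900     # earned value (budgeted cost for work performed)
ACWP = 1_050   # actual cost of work performed

cost_variance     = BCWP - ACWP          # negative: work cost more than budgeted
schedule_variance = BCWP - BCWS          # negative: less work done than planned
cpi = BCWP / ACWP                        # cost performance index
spi = BCWP / BCWS                        # schedule performance index

print(f"CV = {cost_variance}K, SV = {schedule_variance}K, "
      f"CPI = {cpi:.2f}, SPI = {spi:.2f}")
# CV = -150K, SV = -100K, CPI = 0.86, SPI = 0.90
```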

6: Develop the Performance Measurement Baseline

The performance measurement baseline represents the cumulative value of the planned work over time. It takes into account that program activities occur in a sequence, based on finite resources, with budgets representing those resources spread over time. The performance measurement baseline is the resource consumption plan for the program and forms the time-phased baseline against which performance is measured. Deviations from the baseline identify areas where management should focus attention. Figure 28 shows how the performance measurement baseline integrates cost, schedule, and technical effort into a single baseline.

Figure 28: The Genesis of the Performance Measurement Baseline

Note: BCWS = budgeted cost for work scheduled; BAC = budget at complete; PMB = performance measurement baseline; WBS = work breakdown structure.

The performance measurement baseline includes all budgets for resources associated with completing the program, including direct and indirect labor, material, and other direct costs associated with the authorized work.

The performance measurement baseline includes any undistributed budget. Undistributed budget is used as a short-term holding account for new work; it is distributed to a control account once the work is planned in detail. To ensure timely performance measurement, it is important that undistributed budget be distributed to specific control accounts as soon as practicable. Some sources we reviewed stated that undistributed budget should be distributed within 60 to 90 days of acquiring the new funds or authorization.

The performance measurement baseline does not include management reserve or any fee and thus does not equal the program contract value. Because the budget for management reserve is accounted for outside the performance measurement baseline, it cannot be associated with any particular effort. Once a risk is realized and recovery actions identified, then the management reserve is distributed to the appropriate control account. The management reserve and performance measurement baseline values together make up the contract budget base, which in turn represents the total cost of the work. Fee is added to the contract budget base to reflect the total contract price.
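A small numerical sketch of these relationships, using hypothetical values, is shown below: time-phased budgets sum to the performance measurement baseline, management reserve is added to form the contract budget base, fee is added to reach the contract price, and the cumulative baseline traces the S-curve shape discussed next.

```python
# Minimal sketch of the budget relationships described above ($M, hypothetical).
from itertools import accumulate

monthly_bcws = [2, 4, 6, 8, 8, 6, 4, 2]            # time-phased budgets (BCWS)
pmb = sum(monthly_bcws)                            # performance measurement baseline = 40

management_reserve = 3
contract_budget_base = pmb + management_reserve    # CBB = PMB + management reserve = 43

fee = 4
contract_price = contract_budget_base + fee        # total contract price = 47

cumulative_pmb = list(accumulate(monthly_bcws))    # the time-phased cumulative baseline
print(pmb, contract_budget_base, contract_price)   # 40 43 47
print(cumulative_pmb)                              # [2, 6, 12, 20, 28, 34, 38, 40]
```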

Figure 29 depicts a typical time-phased cumulative performance measurement baseline that follows the shape of an S curve. It portrays a gradual build-up of effort in the beginning, followed by stabilization in the middle, and finally a gradual reduction of effort near program completion.

Figure 29: The Time-Phased Cumulative Performance Measurement Baseline

Note: BCWS = budgeted cost for work scheduled; CBB = contract budget base; PMB = performance measurement baseline.

Common problems in developing and managing the performance measurement baseline are:

  • It may be front-loaded—that is, a disproportionate share of budget has been allocated to early tasks. In this case, budget is typically insufficient to cover far-term work. Front-loading tends to hide problems until it is too late to correct them, putting the program at risk of severe overrun in later phases.

  • The performance measurement baseline can become a rubber baseline—that is, a continual shift of the baseline budget to match actual expenditures in order to mask cost variances. This results in deceptive baselines by covering up variances early in the program, delaying insight until they are difficult, if not impossible, to mitigate.

  • The performance measurement baseline can become outdated if changes are not incorporated quickly. As a result, variances do not reflect reality, which hampers management in realizing the benefits of EVM.

7: Execute the Work Plan and Record All Costs

For this activity, program personnel execute their tasks according to the performance measurement baseline and the underlying detailed work plans. Actual costs are recorded by the accounting system and are reconciled with the value of the work performed so that effective performance measurement can occur. A program cost-charging structure must be set up before the work begins to ensure that actual costs can be compared with the associated budgets for each active control account. In particular, material costs should be accurately charged to control accounts using recognized and acceptable techniques to keep variances due to accounting accrual issues to a minimum.

8: Analyze EVM Performance Data and Record Variances from the Performance Measurement Baseline Plan

Because all programs carry some degree of risk and uncertainty, cost and schedule variances are normal. Variances provide management with essential information on which to assess program performance and estimate cost and schedule outcomes. Cost and schedule variances should be examined periodically with management’s focus on variances with the most risk to the program. This means that EVM data should be regularly reviewed if they are to be of any use. In addition, management must identify solutions for problems early if there is any hope of averting degradation of program performance.

9: Forecast Estimates at Completion Using EVM

Managers should rely on EVM data to generate EACs at least monthly. EACs are derived from the cost of work completed along with an estimate of what it will cost to complete all unaccomplished work. A best practice is to continually reassess the EAC; however, some organizations will also conduct periodic bottom-up estimating.
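Several index-based EAC formulas are commonly used for this purpose; the sketch below shows three of them as an illustration rather than as the guide's prescribed method, with hypothetical values.

```python
# Minimal sketch: common index-based estimates at completion (EACs), $K, hypothetical.
BAC, BCWP, ACWP, BCWS = 5_000, 900, 1_050, 1_000

cpi = BCWP / ACWP                                   # cost performance index
spi = BCWP / BCWS                                   # schedule performance index

# EAC = actual cost to date + remaining work adjusted by a performance factor.
eac_cpi      = ACWP + (BAC - BCWP) / cpi            # assumes current cost efficiency continues
eac_cpi_spi  = ACWP + (BAC - BCWP) / (cpi * spi)    # also penalizes schedule inefficiency
eac_budgeted = ACWP + (BAC - BCWP)                  # assumes remaining work goes as budgeted

print(f"CPI-based EAC:     {eac_cpi:,.0f}")         # 5,833
print(f"CPI*SPI-based EAC: {eac_cpi_spi:,.0f}")     # 6,365
print(f"Budget-rate EAC:   {eac_budgeted:,.0f}")    # 5,150
```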

10: Conduct an Integrated Cost-Schedule Risk Analysis

A schedule can be used, in combination with risk analysis data (often including traditional 3-point estimates of duration or the impact of risk drivers) and Monte Carlo simulation software, to estimate schedule risk and the EAC. Risk analysis uses data that represent the probability that risks will occur and estimates of the risks’ impact on the schedule and cost. Although historical data can be used, much of the risk analysis data is derived from interviews and workshops.

Using the results of the schedule risk analysis, the cost elements that relate to time uncertainty (such as labor, management, and rented facilities) can be linked directly to the uncertainty in the schedule. The schedule risk analysis quantifies risk and uncertainty related to time-dependent cost elements, in addition to estimating when the program may finish and identifying key risk drivers. These results can be exported to a spreadsheet where cost models and estimates are often developed and stored. The cost risk and uncertainty analysis then uses these schedule risks to link the uncertainty in cost to the uncertainty in schedule. This approach models the way labor cost is actually determined: it converts time into a cost estimate by applying labor rates to the time-dependent effort and adding any material costs.
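The sketch below illustrates these mechanics on a toy example: three-point duration estimates feed a Monte Carlo simulation, and time-dependent cost is derived by applying a monthly labor rate to each simulated duration. The activities, distributions, rates, and confidence level are hypothetical, and a real analysis would run against the full schedule network with risk-driver data.

```python
import random

# Minimal sketch: integrated cost-schedule risk analysis on a toy two-activity chain.
# Durations are in months with three-point (low, likely, high) estimates; costs in $M.
activities = {
    "development": {"three_point": (10, 12, 18), "monthly_burn": 2.0},
    "testing":     {"three_point": (4, 5, 9),    "monthly_burn": 1.5},
}
material_cost = 10.0            # time-independent cost, added to every trial

random.seed(7)
finish_months, total_costs = [], []
for _ in range(5_000):
    duration = cost = 0.0
    for act in activities.values():                  # activities assumed sequential
        lo, likely, hi = act["three_point"]
        d = random.triangular(lo, hi, likely)        # argument order: low, high, mode
        duration += d
        cost += d * act["monthly_burn"]              # time-dependent cost scales with duration
    finish_months.append(duration)
    total_costs.append(cost + material_cost)

finish_months.sort()
total_costs.sort()
p80 = int(0.8 * len(total_costs)) - 1
print(f"80% confidence finish: {finish_months[p80]:.1f} months")
print(f"80% confidence cost:   ${total_costs[p80]:.1f}M")
```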

The GAO Schedule Assessment Guide has more details on performing a schedule risk analysis.52

11: Compare EACs from EVM with EAC from Risk Analysis

This activity demonstrates the integration of EVM and risk management processes. The integrated cost-schedule risk analysis produces a cumulative probability distribution for the program’s cost. This estimate can be compared to the estimate using EVM extrapolation techniques. The comparison is valuable because it is performed on EACs created with quite different approaches. If different approaches produce results that are in general agreement, their EAC forecasts are probably sound. If not, one or the other method (or both) should be reviewed for changes and revisions.

12: Take Management Action to Respond to Risk

Management should integrate the results of information from activities 8 through 11 with the program’s risk management plan to respond to emerging and existing risks. Management should focus on responses and identify ways to manage cost, schedule, and technical scope to meet program objectives. It should also keep track of all risks and analyze EVM data trends to identify future problems.

13: Update the Performance Measurement Baseline as Changes Occur

While the 32 EIA-748 guidelines share the overarching goal of maintaining the integrity of the baseline and the resulting performance measurement data, changes are likely throughout the life of a program. It is imperative that changes be incorporated into the EVM system as soon as possible to maintain the validity of the performance measurement baseline. When changes occur, both budgets and schedules are reviewed and updated so that the EVM data stay current.

Furthermore, the EVM system should outline procedures for maintaining a log of all changes and for incorporating the changes into the performance measurement baseline. A detailed record of the changes made to the performance measurement baseline makes it easy to trace them to the program. This also lessens the burden on program personnel when compiling information for internal and external program audits, EVM system surveillance reviews, and updates to the program cost estimate. If changes are not recorded and maintained, the program’s performance measurement baseline will not reflect reality. The performance measurement baseline will become outdated and the data from the EVM system will not be meaningful.

Some changes may be simple, such as modifying performance data to correct for accounting errors or other issues that can affect the accuracy of the EVM data. Other changes can be significant, such as when major events or external factors beyond the program manager’s control result in changes that will greatly affect the performance measurement baseline. Key triggers for change include:

  • contract modifications, including engineering change proposals;

  • shifting funding streams;

  • restrictions on funding levels;

  • major rate changes, including overhead rates;

  • changes to program scope or schedule;

  • revisions to the acquisition plan or strategy; and

  • executive management decisions.

Because the performance measurement baseline should always reflect the most current plan for accomplishing authorized work, incorporating changes accurately and in a timely manner is especially important for maintaining the effectiveness of the EVM system.

Case study 23 highlights a program in which a performance measurement baseline was not representative of a program’s external commitments.

Case Study 23: Performance Measurement Baseline Data, from Space Launch System, GAO-15-596

The Space Launch System (SLS) is the National Aeronautics and Space Administration's (NASA) first heavy-lift launch vehicle for human space exploration in over 40 years. For development efforts related to the first flight of SLS, NASA established its cost and schedule commitments at $9.7 billion and November 2018. The program, however, had continued to pursue more aggressive internal goals for cost and schedule.

NASA was using contractor earned value management (EVM) data as an additional means to monitor costs for SLS, but the EVM data remained incomplete and provided limited insight into progress toward the program’s external committed cost and schedule baselines. Program officials indicated that the SLS contractor performance measurement baselines—which established the program scope, schedule, and budget targets to measure progress against—were based on the program’s more aggressive internal goal for launch readiness for EM-1 in December 2017 and not its external committed date of November 2018.

Both contractor and program-level EVM data were only reported relative to the December 2017 date, according to program officials. The potential impact of cost and schedule growth relative to the program’s external committed cost and schedule baseline of November 2018 was neither reported nor readily apparent, and rendered the EVM data less useful in support of management decisions and external oversight. Major NASA programs are required by statute to report annually to certain congressional committees on changes that occurred over the prior year to the programs’ committed cost and schedule baselines, among other things. As this report reflects cost and schedule overruns that have already occurred, it does not serve as a mechanism to regularly and systematically track progress against committed baselines so that decision-makers have visibility into program progress and can take proactive steps as necessary before cost growth or schedule delays are realized.

By pursuing internal launch readiness dates that were unrealistic, the program left itself and others in a knowledge void wherein progress relative to the agency’s commitments was difficult to ascertain. As the EVM system only tracked progress toward the program’s internal goals, the program lacked a mechanism to track progress to its external cost and schedule baseline commitments. Without such a tracking tool, stakeholders lacked early insight into potential overruns and delays.


  50. GAO, Schedule Assessment Guide: Best Practices for Project Schedules, GAO-16-89G (Washington, D.C.: December 2015), 6.

  51. GAO, Schedule Assessment Guide: Best Practices for Project Schedules, GAO-16-89G (Washington, D.C.: December 2015).

  52. GAO, Schedule Assessment Guide: Best Practices for Project Schedules, GAO-16-89G (Washington, D.C.: December 2015).