Cost Estimating Challenges

Reliable cost estimates are important for program approval and for continued funding. However, cost estimating is challenging. Limited time and resources often prevent the development of the perfect cost estimate; it is proper and prudent to complete the estimate with the best available information at the time while also documenting the estimate’s shortcomings. To develop a sound cost estimate, estimators must possess a variety of skills and have access to high-quality data. Moreover, credible cost estimates take time to develop—they cannot be rushed. These challenges increase the possibility that programs will fall short of cost, schedule, and performance goals. Recognizing and planning for these challenges early in the process can mitigate the risks.

Even in the best of circumstances, cost estimating can be difficult. The cost estimator typically faces many challenges. These challenges often lead to unreliable estimates—for example, estimates that contain poorly defined assumptions, have no supporting documentation, are accompanied by no comparisons to similar programs, are characterized by inadequate data collection and inappropriate estimating methodologies, are sustained by irrelevant or out-of-date data, provide no basis or rationale for the estimate, or adhere to no defined process for generating the estimate. Figure 1 illustrates some of the challenges a cost estimator faces and some of the ways to mitigate them.

Figure 1: Cost Estimating Challenges and Mitigations

Some cost estimating challenges are common. For example, deriving high-quality cost estimates depends on the quality of historical databases. It is often not possible for the cost analyst to collect the kinds of data needed to develop cost estimating relationships (CERs) and other estimating methods. In most cases, better data enables the estimator to create a better estimate. Because much of a cost analyst’s time is spent collecting and normalizing data, experienced and well-trained cost analysts are necessary. Too often, individuals without these specialized skills are tasked with performing a cost analysis to meet a pressing need.

In addition, limited program funding and available time often hinder broad participation in cost estimation processes and force the analyst (or cost team) to reduce the extent to which trade-off, sensitivity, and even uncertainty analyses are performed.

Many cost estimating challenges can be traced to over-optimism. Cost analysts typically develop their estimates from technical baselines provided by program offices. Recognizing the uncertainty in a program’s technical baseline can help form a better understanding of where problems will occur in the execution phase. For example, if a software program baseline states that its total source lines of code will be 100,000 but the eventual total is 200,000, the cost will be underestimated. Or, if the baseline states that the new program will reuse 80,000 lines of code from a legacy system but can eventually reuse only 10,000, the cost will be underestimated.
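
The size of such errors can be illustrated with a simple, hypothetical cost-per-line calculation. The sketch below is illustrative only; the cost rates are assumptions, not actual cost estimating relationship values.

```python
# Illustrative sketch only: the cost-per-line rates below are assumed, not real CER values.
COST_PER_NEW_LINE = 150.0     # assumed cost ($) to develop one new source line of code
COST_PER_REUSED_LINE = 30.0   # assumed cost ($) to adapt one reused legacy line

def software_cost(new_sloc: int, reused_sloc: int) -> float:
    """Return a notional software development cost from new and reused SLOC."""
    return new_sloc * COST_PER_NEW_LINE + reused_sloc * COST_PER_REUSED_LINE

# Baseline assumption: 100,000 total lines, 80,000 of them reused from a legacy system.
baseline_cost = software_cost(new_sloc=20_000, reused_sloc=80_000)

# Actual outcome: 200,000 total lines, only 10,000 of which could be reused.
actual_cost = software_cost(new_sloc=190_000, reused_sloc=10_000)

print(f"Baseline estimate: ${baseline_cost:,.0f}")
print(f"Actual cost:       ${actual_cost:,.0f}")
print(f"Cost growth:       {actual_cost / baseline_cost:.1f}x the baseline")
```

Even with these notional rates, the combined size and reuse errors multiply the cost several times over, which is why optimistic technical baselines so often translate directly into cost growth.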

Similarly, program proponents often postulate the availability of a new technology, only to discover that it is not ready when needed, which then increases program costs. Proponents also often make unrealistic assumptions about the complexity or difficulty of new processes, such as first-time integration efforts. In both instances, the additional time and effort leads directly to greater costs, as case study 1 demonstrates.

Case Study 1: Using Realistic Assumptions, from Space Acquisitions, GAO-07-96
In five of the six space system acquisition programs GAO reviewed, program officials and cost estimators assumed, when cost estimates were developed, that critical technologies would be mature and available. They made this assumption even though the programs had begun without a complete understanding of how long they would run or how much it would cost to ensure that the technologies could work as intended. After the programs began, and as their development continued, the technology issues proved more complex than initially believed. For example, for the National Polar-orbiting Operational Environmental Satellite System (NPOESS), DOD and the U.S. Department of Commerce committed funds for developing and producing satellites before the technology was mature. Only one of 14 critical technologies was mature at program initiation, and one technology was found to be less mature after the contractor conducted more verification testing. GAO found that the program was later beset by significant cost increases and schedule delays, partly because of technical problems such as those encountered in developing key sensors.

Program stability presents another serious challenge to cost analysts. Budget decisions drive program schedules and procurement quantities. If development funding is reduced, the schedule can stretch and costs can increase. If production funding is reduced, the quantity procured will typically decrease, causing average unit procurement costs to increase. Projected savings from initiatives such as multiyear procurement—contracting for purchase of supplies or services for more than one program year—may not be realized. Case study 2 shows how program instability can lead to cost overruns.

Case Study 2: Program Stability Issues, from Federal Real Property, GAO-14-648

As of 2014, DHS and GSA were managing an estimated $4.5 billion construction project at the St. Elizabeth’s Campus in Washington, D.C. The project was designed to consolidate DHS’s executive leadership, operational management, and other personnel at one secure location rather than at multiple locations throughout the Washington, D.C., metropolitan area. GAO was asked to examine DHS and GSA management of the headquarters consolidation, including the development of the St. Elizabeth’s Campus.

In 2007, DHS and GSA estimated that the total cost of construction at St. Elizabeth’s was $3.26 billion, with construction to be completed in 2015 and potential savings of $1 billion attributable to moving from leased to owned space. However, according to DHS and GSA officials, the lack of consistent funding had affected cost estimates, estimated completion dates, and savings. For example, in 2006, DHS and GSA projected that the U.S. Coast Guard (USCG) would move to St. Elizabeth’s in 2011, but the move was delayed because sufficient funding for Phase 1 of the project was not available until fiscal year 2009. In 2009, DHS and GSA updated the projected completion date to the summer of 2013. The majority of funding for the St. Elizabeth’s consolidation project through fiscal year 2013 had been allocated to the construction of a new consolidated USCG headquarters on the campus.

According to DHS and GSA officials, the gap between the funding requested and the funding received from fiscal years 2009 through 2014 was over $1.6 billion. According to these officials, this gap had escalated estimated costs by over $1 billion—from $3.3 billion to $4.5 billion—and delayed scheduled completion by over 10 years, from an original completion date of 2015 to the then-current estimate of 2026. However, GAO found that DHS and GSA had not conducted a comprehensive assessment of current needs, identified capability gaps, or evaluated and prioritized alternatives to help them adapt consolidation plans to changing conditions and address funding issues, as reflected in leading practices. DHS and GSA reported that they had begun to work together to consider changes to their plans, but as of August 2014, they had not announced when new plans would be issued or whether the plans would fully conform to leading capital decision-making practices to help guide project implementation.

Stability issues can also arise when expected funding is cut. For example, if budget pressures cause breaks in production, highly specialized vendors may no longer be available or may have to restructure their prices to cover their risks. When this happens, unexpected schedule delays and cost increases usually result. A quantity change, even if it does not result in a production break, is a stability issue that can increase costs by affecting workload.
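
The effect of a quantity cut on unit cost can be seen with a simple split between fixed and variable production costs. The sketch below uses hypothetical figures chosen only to illustrate the arithmetic.

```python
# Hypothetical figures: fixed production costs (tooling, vendor qualification, overhead)
# are spread across however many units are actually bought.
FIXED_PRODUCTION_COST = 500_000_000   # assumed fixed cost ($) independent of quantity
VARIABLE_COST_PER_UNIT = 2_000_000    # assumed recurring cost ($) per unit

def average_unit_cost(quantity: int) -> float:
    """Average procurement cost per unit for a given buy quantity."""
    return FIXED_PRODUCTION_COST / quantity + VARIABLE_COST_PER_UNIT

planned = average_unit_cost(500)   # planned buy of 500 units
reduced = average_unit_cost(300)   # buy cut to 300 units after a funding reduction

print(f"Planned average unit cost: ${planned:,.0f}")
print(f"Reduced-buy unit cost:     ${reduced:,.0f}")
print(f"Unit cost growth:          {(reduced / planned - 1):.0%}")
```

Because the fixed costs do not shrink with the buy, cutting the quantity by 40 percent in this example raises the average unit cost by roughly 22 percent, even before any vendor repricing or production-break effects.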

Significantly accelerating a development schedule also presents risk. In such cases, technology tends to be incorporated before it is ready, tests are reduced or eliminated, or logistics support is not in place. The result can be a reduction in costs in the short term but significantly increased long-term costs as problems are discovered, technology is back-fit, or logistics support is developed after the system is in the field.

In developing cost estimates, analysts often fail to adequately address risk, especially risks that are outside the estimator’s control or that were not expected. This can result in point estimates that give decision-makers no information about their likelihood of success, or give them misleading estimate confidence levels. A risk and uncertainty analysis should be part of every cost estimate, but it should be performed by experienced analysts who understand the process and know how to use the appropriate tools. On numerous occasions, GAO has encountered cost estimates with meaningless confidence levels because the analysts did not understand the underlying mathematics or tools.
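
A simple Monte Carlo simulation illustrates why a point estimate alone says little about the likelihood of success. The sketch below uses three notional cost elements with assumed triangular distributions; the element names and dollar values are illustrative only, not drawn from any actual program.

```python
import random

# Notional cost elements with triangular uncertainty (low, most likely, high), in $M.
# All values are assumed for illustration; a real analysis would base them on data.
COST_ELEMENTS = {
    "hardware":    (80.0, 100.0, 160.0),
    "software":    (40.0, 60.0, 130.0),
    "integration": (20.0, 30.0, 70.0),
}

def simulate_total_cost() -> float:
    """Draw one possible total cost by sampling each element's triangular distribution."""
    return sum(random.triangular(low, high, mode)
               for low, mode, high in COST_ELEMENTS.values())

random.seed(1)
trials = sorted(simulate_total_cost() for _ in range(10_000))

point_estimate = sum(mode for _, mode, _ in COST_ELEMENTS.values())  # sum of most likely values
confidence = sum(t <= point_estimate for t in trials) / len(trials)

print(f"Point estimate (sum of most likely values): ${point_estimate:.0f}M")
print(f"Probability the program costs that or less: {confidence:.0%}")
print(f"Cost at the 80th percentile:                ${trials[int(0.8 * len(trials))]:.0f}M")
```

Because each element’s distribution is skewed toward higher costs, the sum of the most likely values falls well below the simulated mean, so the point estimate carries a low probability of being sufficient. Reporting that probability, rather than the point estimate alone, is what gives decision-makers a meaningful confidence level.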

A risk analysis should be used to determine a program’s contingency funding.5 All development programs should have contingency funding because it is unreasonable to expect a program not to encounter problems. Program managers need ready access to funding in order to resolve problems without adversely affecting programs (for example, by stretching the schedule). Unfortunately, budget cuts often target contingency funding, and in some cases such funding is not allowed by policy. Decision-makers and budget analysts should understand that eliminating contingency funding limits program managers’ ability to respond to program risks.

Too often, organizations encourage goals that are unattainable because of over-optimism. A 2012 report by NASA’s Office of Inspector General found that a culture of optimism helps when developing and procuring state-of-the-art, cutting-edge technological products, but it can also lead management to overestimate the organization’s ability to deliver such products within cost and schedule baselines. It can also result in an underestimation of risks, which can lead to the development of unrealistic cost and schedule estimates. While program managers believe they build risk into their plans, they often do not sufficiently account for it.6

Optimistic program managers believe in the original estimates for the plan without adequately allowing for changes in scope, schedule delays, or other elements of risk. In addition, in a competitive environment, contractor program managers may overestimate what their company can do compared to their competition.

To mitigate this optimism, it is important to have an independent view of the program. This function can be performed by analysts inside or outside the organization, but if the organization is unwilling to address and understand the risks its programs face, it will have little hope of effectively managing and mitigating them. An “honest broker” approach helps bring to light issues that could limit the organization’s ability to succeed. Therefore, program managers and their organizations must understand the value of risk management, address risk proactively, and have a plan for responding to risks.

Earned Value Management Challenges

OMB requires that major acquisition programs manage risk by applying earned value management (EVM), among other ways. Reliable EVM data usually indicate how well a program is performing in terms of cost, schedule, and technical matters. This information is necessary for proactive program management and risk mitigation. EVM systems represent a best practice if implemented correctly, but an unreliable EVM system will produce unreliable results. (See case study 3.)
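
As background for the case study, the sketch below shows the basic earned value calculations an analyst would apply to a single reporting period. The dollar values are hypothetical, and the estimate-at-completion formula shown is one common variant, not the only one.

```python
# Hypothetical status for one reporting period, in $M.
planned_value = 50.0          # PV: budgeted cost of work scheduled to date
earned_value = 40.0           # EV: budgeted cost of work actually performed to date
actual_cost = 55.0            # AC: actual cost of the work performed to date
budget_at_completion = 200.0  # BAC: total budgeted cost of all work

cpi = earned_value / actual_cost    # cost performance index (<1 means over cost)
spi = earned_value / planned_value  # schedule performance index (<1 means behind schedule)

# One common estimate-at-completion formula: remaining work burdened by current cost efficiency.
eac = actual_cost + (budget_at_completion - earned_value) / cpi

print(f"CPI = {cpi:.2f}, SPI = {spi:.2f}")
print(f"Estimate at completion = ${eac:.0f}M vs. budget of ${budget_at_completion:.0f}M")
```

In this hypothetical period the program has earned less value than it planned and spent more than it earned, so both indices fall below 1.0 and the projected completion cost grows well beyond the original budget. Reliable reporting of these quantities is what makes EVM useful; case study 3 shows what happens when the underlying data cannot be trusted.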

Case Study 3: Applying EVM, from Nuclear Waste Cleanup, GAO-19-223

The Department of Energy’s (DOE) Office of Environmental Management (EM) manages most of its cleanup of nuclear waste (77 percent of its fiscal year 2019 budget) under a category that EM refers to as operations activities, using less stringent requirements than are used for its capital asset projects. EM’s mission is to complete the cleanup of nuclear waste at 16 DOE sites and to work to reduce risks and costs within its established regulatory framework. In December 2018, DOE reported that it faced an estimated $494 billion in future environmental cleanup costs.

GAO’s analysis of EM contractors’ EVM systems for operations activities found that EM had not followed best practices for a reliable EVM system. The EVM data for contracts covering operations activities contained numerous unexplained anomalies in all of the reports GAO reviewed, including missing or negative values for some of the completed work to date. Negative values should occur rarely, if ever, in EVM reporting because they imply the undoing of previously scheduled or performed work. In addition, GAO found problems with the estimate at completion in all 20 contractors’ EVM systems. More specifically, GAO found (1) many instances where the actual costs exceeded the estimates at completion even though a significant amount of work remained; (2) several occasions where the estimates at completion were less than half of the original budget at the beginning of the project; and (3) several instances where contractors reported estimates at completion of zero dollars even though their original budgets were for hundreds of millions of dollars. These problems indicated that the EVM systems were not being updated in a timely manner or were not well monitored, because the estimate at completion values were overly optimistic and highly unlikely.

Even though EM requires most of its contractors for operations activities to maintain EVM systems, EM’s 2017 policy generally does not require that EVM systems be maintained and used in a way that follows EVM best practices. Until EM updates its cleanup policy to require that EVM systems be maintained and used in a way that follows EVM best practices, EM leadership may not have access to reliable performance data to make informed decisions in managing its cleanup work and to provide Congress and other stakeholders reliable information on billions of dollars’ worth of cleanup work every year.
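
Simple data checks can flag the kinds of anomalies GAO found in the case study. The sketch below screens a few hypothetical contract records; the field names and values are illustrative, not an actual EVM reporting schema.

```python
# Hypothetical contract records, in $M; field names are illustrative, not a real EVM schema.
contracts = [
    {"name": "Contract A", "earned_value": -5.0, "actual_cost": 120.0,
     "estimate_at_completion": 300.0, "budget_at_completion": 310.0},
    {"name": "Contract B", "earned_value": 150.0, "actual_cost": 180.0,
     "estimate_at_completion": 160.0, "budget_at_completion": 400.0},
    {"name": "Contract C", "earned_value": 90.0, "actual_cost": 95.0,
     "estimate_at_completion": 0.0, "budget_at_completion": 250.0},
]

def screen(record: dict) -> list[str]:
    """Return plain-language flags for the anomaly patterns described in the case study."""
    flags = []
    if record["earned_value"] < 0:
        flags.append("negative earned value (implies undoing completed work)")
    if (record["actual_cost"] > record["estimate_at_completion"]
            and record["earned_value"] < record["budget_at_completion"]):
        flags.append("actual costs already exceed the estimate at completion with work remaining")
    if record["estimate_at_completion"] == 0 and record["budget_at_completion"] > 0:
        flags.append("estimate at completion of zero against a nonzero budget")
    return flags

for record in contracts:
    for flag in screen(record):
        print(f"{record['name']}: {flag}")
```

Checks like these do not fix an unreliable EVM system, but routinely running them would make stale or implausible estimates at completion visible to program leadership before the data are used for decisions.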

Perhaps the biggest challenge in using EVM is the tendency to rebaseline programs. This happens when the current baseline is not adequate to complete all the work, causing a program to fall behind schedule or run over planned costs. A new baseline serves an important management purpose when program goals can no longer be achieved because it gives perspective on the program’s current status. However, auditors should be aware that comparing the latest cost estimate with the most recent approved baseline provides an incomplete perspective on a program’s performance because a rebaseline shortens the period of performance reported and resets the measurement of cost growth to zero.
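
A simple comparison shows why the choice of baseline matters when measuring cost growth. The figures below are hypothetical.

```python
# Hypothetical program figures, in $M.
original_baseline = 1_000.0     # estimate approved at program start
rebaselined_estimate = 1_400.0  # new baseline approved after early cost growth
current_estimate = 1_500.0      # latest cost estimate

growth_vs_rebaseline = (current_estimate - rebaselined_estimate) / rebaselined_estimate
growth_vs_original = (current_estimate - original_baseline) / original_baseline

# Measured only against the most recent baseline, growth looks modest (about 7 percent);
# measured against the original baseline, the program has grown 50 percent.
print(f"Growth vs. latest approved baseline: {growth_vs_rebaseline:.0%}")
print(f"Growth vs. original baseline:        {growth_vs_original:.0%}")
```

Auditors who track performance only against the latest approved baseline would see the 7 percent figure, which is why comparisons to the original baseline are needed for a complete picture of a program’s cost history.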

All of the challenges discussed above make it difficult for cost estimators to develop accurate estimates. Therefore, it is very important that agencies’ cost estimators have adequate guidance and training to help mitigate these challenges. In chapter 20, we discuss audit criteria related to cost estimating and EVM.


  5. For our purposes in this Cost Guide, contingency represents funds held at or above the government program office for “unknown unknowns” that are outside a contractor’s control. In this context, contingency funding is added to an estimate to allow for items, conditions, or events for which the state, occurrence, or effect is uncertain and that experience shows are likely to result in additional costs. Management reserve funds, in contrast, are for “known unknowns” that are tied to the contract’s scope and managed at the contractor level. Unlike contingency, which is funding related, management reserve is budget related. The value of the contract includes these known unknowns in the budget base, and the contractor decides how much money to set aside. We recognize that other organizations may use the terms differently.

  6. “NASA’s Challenges to Meeting Cost, Schedule, and Performance Goals,” NASA Office of Inspector General, Report IG-12-021, Washington, D.C., September 27, 2012.