Assumptions, Sensitivity, and Risk Analysis

Every estimate is uncertain because assumptions must be made about future projections. Sensitivity analysis examines how changes to key assumptions and inputs affect the estimate and can help mitigate uncertainty. Best practice cost models incorporate sensitivity analyses without altering the model so that the effect of varying inputs can be quickly determined (more information is in chapters 11 and 12). For example, a decision-maker may challenge the assumption that 5 percent of the installed equipment will be needed for spares and ask that the factor be raised to 10 percent. A sensitivity analysis would show the cost impact of this change. The cost estimator should always perform a sensitivity analysis that portrays the effects of an invalid assumption on cost and schedule. Such analysis often provides management with an invaluable perspective on its decision making.
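The spares-factor excursion described above can be sketched as a simple one-variable sensitivity calculation. All figures below are hypothetical, chosen only to illustrate the mechanics; no values are drawn from an actual program.

```python
# Illustrative sensitivity analysis: cost impact of raising the spares
# factor from 5 percent to 10 percent. All dollar figures are hypothetical.

def total_cost(equipment_cost: float, spares_factor: float) -> float:
    """Total cost = installed equipment plus spares as a fraction of it."""
    return equipment_cost * (1 + spares_factor)

equipment_cost = 10_000_000  # assumed installed-equipment cost ($)

baseline = total_cost(equipment_cost, 0.05)   # original 5 percent assumption
excursion = total_cost(equipment_cost, 0.10)  # decision-maker's 10 percent

print(f"Baseline (5% spares):   ${baseline:,.0f}")
print(f"Excursion (10% spares): ${excursion:,.0f}")
print(f"Cost impact:            ${excursion - baseline:,.0f}")
```

Because the model computes total cost from the input factor rather than hard-coding the result, the excursion can be run without altering the model, which is the best practice the text describes.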

In addition to sensitivity analysis, factors that will affect the program’s cost, schedule, or technical status should be clearly identified, including political, organizational, or business issues. Because assumptions themselves can vary, they should always be inputs to program risk analyses of cost and schedule. Often, risk analysis emphasizes the breadth of factors that may be uncertain. In a risk identification exercise, the goal is to identify all potential risks stemming from a broad range of sources. A good starting point would be to examine the program’s risk management database to determine which WBS elements these risks could affect. Another option would be to examine risks identified during a program’s integrated baseline review—a risk-based assessment of the program plan to see whether the requirements can be met within cost and schedule baselines.

Regardless of the method used to identify risk, it is important to examine more than just cost, schedule, and technical risks. For example, budget and funding risks, as well as risks associated with start-up activities, staffing, and organizational issues, should be considered. Indeed, risks from all sources, including external factors, organizational issues, and program management practices, in addition to the technical challenges, need to be addressed.

Well-supported assumptions should include documentation of an assumption’s source and should discuss any weaknesses or risks. Solid assumptions are measurable and specific. For example, an assumption that states “transaction volume will average 500,000 per month and is expected to grow at an annual rate of 5 percent” is measurable and specific, whereas “transaction volumes will grow greatly over the next 5 years” is not as helpful. By providing more detail, cost estimators can perform risk and sensitivity analysis to quantify the effects of changes in assumptions.
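A measurable assumption like the one above can be projected directly, which is what makes it useful for risk and sensitivity analysis. The short sketch below projects the stated assumption of 500,000 transactions per month growing 5 percent annually; the five-year horizon simply mirrors the example in the text.

```python
# Projecting the measurable assumption from the text: 500,000 transactions
# per month, growing at an annual rate of 5 percent. Year 1 is the base year.

base_monthly_volume = 500_000
annual_growth = 0.05

for year in range(1, 6):
    # Compound growth from the base year
    volume = base_monthly_volume * (1 + annual_growth) ** (year - 1)
    print(f"Year {year}: {volume:,.0f} transactions per month")
```

The vague alternative, "transaction volumes will grow greatly," offers nothing an analyst could compute with, which is precisely the contrast the text draws.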

Assumptions should be realistic and valid, meaning that historical data should back them up to minimize uncertainty and risk. Understanding the level of certainty around an estimate is imperative to knowing whether to keep or discard an assumption. Assumptions tend to be less certain early in a program and become more reliable as more information becomes available. A best practice is to collect all assumptions in a single location so that risk and sensitivity analysis can be performed efficiently and quickly.

Certain ground rules should always be tested for risk. For example, the effects of a program schedule slip on both cost and schedule should always be modeled and the results reported to management. This is especially important when the schedule is known to be aggressive or has not been assessed for realism. Too often, we have found that when schedules are compressed, for instance to address a potential requirements gap, the optimism in the schedule does not hold, and the result is greater costs and schedule delays.
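One common way to model the cost effect of a slip is to charge each month of delay at the program's level-of-effort burn rate, sometimes called the "standing army" cost. The sketch below uses a hypothetical burn rate; a real model would draw the rate from the program's own time-phased estimate.

```python
# Illustrative cost effect of a schedule slip: each month of slip adds the
# program's monthly level-of-effort burn rate. The rate below is hypothetical.

monthly_burn_rate = 2.5  # $M per month of standing-army costs (assumed)

def slip_cost(months_of_slip: int) -> float:
    """Added cost from extending level-of-effort activities."""
    return months_of_slip * monthly_burn_rate

for slip in (3, 6, 12):
    print(f"{slip}-month slip: +${slip_cost(slip):.1f}M")
```

Reporting excursions like these to management makes the consequence of an aggressive schedule visible before the slip occurs.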

Cost estimators and auditors should be wary of overly optimistic technology forecasts. It is well known that program advocates tend to underestimate the technical challenges facing the development of a new system. (For more information see GAO’s Technology Readiness Assessment Guide).20 Estimators and auditors alike should always seek to uncover the real risk by performing an uncertainty analysis. In doing so, it is imperative that cost estimators and auditors meet with engineers familiar with the program and its new technology to discuss the level of risk associated with the technical assumptions. Only then can they realistically model risk distributions using an uncertainty analysis and analyze how the results affect the overall cost estimate.

Technology maturity assumptions also tend to be optimistic. Having reviewed the experiences of DOD and commercial technology development, GAO found that programs that relied on technologies with a demonstrated high level of maturity were in a better position to succeed than those that did not. Simply put, the more mature technology is at the start of a program, the more likely it is that the program will meet its objectives. Technologies that are not fully developed represent a significant challenge and add a high degree of risk to a program’s schedule and cost. Programs typically assume that the technology required will arrive on schedule and be available to support the effort. While this assumption allows the program to continue, the risk that it will prove inaccurate can greatly affect cost and schedule. Case study 11 provides an example of the impact of underestimating technology maturity.
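An uncertainty analysis of the kind described above is often implemented as a Monte Carlo simulation over engineer-elicited distributions. The sketch below uses triangular distributions, a common choice for elicited low/most-likely/high values; all parameters are hypothetical stand-ins for inputs that would come from engineers familiar with the program.

```python
# Minimal Monte Carlo uncertainty sketch. Triangular parameters (low, high,
# mode, in $M) are hypothetical stand-ins for engineer-elicited values.
import random

random.seed(1)  # reproducible draws for illustration

N = 10_000
totals = []
for _ in range(N):
    hardware = random.triangular(8.0, 14.0, 10.0)
    software = random.triangular(4.0, 9.0, 5.0)
    integration = random.triangular(1.0, 4.0, 2.0)
    totals.append(hardware + software + integration)

totals.sort()
point_estimate = 10.0 + 5.0 + 2.0  # sum of most-likely values
p50 = totals[N // 2]
p80 = totals[int(N * 0.80)]
print(f"Point estimate:  ${point_estimate:.1f}M")
print(f"50th percentile: ${p50:.1f}M")
print(f"80th percentile: ${p80:.1f}M")
```

Because each elicited distribution is skewed toward the high side, the simulated median exceeds the sum of most-likely values, which is exactly the optimism the analysis is meant to expose.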

Case Study 11: Technology Maturity, from Columbia Class Submarine, GAO-18-158

Additional development and testing were required to demonstrate the maturity of several Columbia class submarine technologies that were critical to performance, including the Integrated Power System, nuclear reactor, common missile compartment, and propulsor and related coordinated stern technologies. As a result, it was unknown whether they would work as expected, be delayed, or cost more than planned. Any unexpected delays could postpone the deployment of the lead submarine past the 2031 deadline.

GAO found that the Navy underrepresented the program’s technology risks in its 2015 Technology Readiness Assessment (TRA) when it did not identify these technologies as critical. Development of these technologies was key to meeting cost, schedule, and performance requirements. A reliable TRA serves as the basis for realistic discussions on how to mitigate risks as programs move forward from the early stages of technology development. Not identifying these technologies as critical meant Congress may not have had the full picture of the technology risks and their potential effect on cost, schedule, and performance goals as increasing financial commitments were made. The Navy was not required to provide Congress with an update on the program’s progress, including its technology development efforts, until fiscal year 2020—when $8.7 billion for lead ship construction would have already been authorized. Periodic reporting on technology development efforts in the interim could have provided decision-makers assurances about the remaining technical risks as the Navy asked for increasing levels of funding.

Consistent with GAO’s identified best practices, the Navy intended to complete much of the submarine’s overall design prior to starting construction to reduce the risk of cost and schedule growth. However, the Navy awarded a contract for detail design while critical technologies remained unproven—a practice not in line with best practices that led to cost growth and schedule delays on other programs. Proceeding into detail design and construction with immature technologies can lead to design instability and cause construction delays. The Navy planned to accelerate construction of the lead submarine to compensate for an aggressive schedule, which may have led to future delays if the technologies were not fully mature before construction started in 2021.

Once the risk and uncertainty and sensitivity analyses are complete, the cost estimator should formally convey the results of changing assumptions to management as early and as far up the line as possible. The estimator should also document all assumptions to help management understand the conditions on which the estimate was based. When possible, the cost estimator should request an updated technical baseline in which the new assumptions have been incorporated as ground rules.


  20. GAO, Technology Readiness Assessment Guide: Best Practices for Evaluating the Readiness of Technology for Use in Acquisition Programs and Projects, GAO-20-48G (Washington, D.C.: January 2020).