Assessing the Reliability of a Cost Estimate

In this chapter, the cost estimating best practices and the four characteristics of a high-quality cost estimate are presented as criteria—that is, the required or desired state or expectation with respect to a program or operation. Auditors can use these criteria to assess the reliability of a cost estimate and determine the thoroughness of an organization’s cost estimating guidance.42 In addition, non-audit organizations can use the best practices and four characteristics to validate the quality of a program’s cost estimate.

Cost Estimate Audit Criteria

Auditors should identify criteria. Criteria provide a context for evaluating evidence and understanding the findings, conclusions, and recommendations included in an audit report. According to Government Auditing Standards, criteria represent the laws, regulations, contracts, grant agreements, standards, specific requirements, measures, expected performance, defined business practices, and benchmarks against which performance is compared or evaluated.43

Auditors should use criteria that are relevant to the audit objectives and permit consistent assessments of the subject matter. By using the process task lists and best practices described in this Guide, auditors and others charged with determining the quality of a cost estimate can:

  • assess the reliability of a cost estimate,
  • evaluate the extent to which an organization’s processes and procedures address best practices, and
  • independently validate a cost estimate.

Auditors develop statements about the quality of a cost estimate by determining the extent to which the estimate reflects each best practice. Best practice assessments are then summarized at the characteristic level to determine the extent to which the estimate meets the four characteristics. For example, a cost estimate that completely addresses the components of a sensitivity analysis and a risk and uncertainty analysis, includes methodology cross-checks, and is compared to an independent cost estimate can be considered a credible cost estimate. A cost estimate that fully reflects the comprehensive, well documented, accurate, and credible characteristics is considered a reliable cost estimate.
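
To illustrate the roll-up, the following sketch shows one way such a summary could be mechanized in Python. The five-point rating scale, the grouping of best practices under each characteristic, the averaging rule, and the requirement that every characteristic be at least substantially met are illustrative assumptions made for this sketch; they are not the Guide's prescribed scoring method, and an actual assessment relies on auditor judgment rather than a formula.

```python
from statistics import mean

# Illustrative five-point ordinal scale (higher is better); the Guide's actual
# rating terms and roll-up rules may differ.
SCALE = {
    "not met": 1,
    "minimally met": 2,
    "partially met": 3,
    "substantially met": 4,
    "fully met": 5,
}
LABELS = {score: label for label, score in SCALE.items()}

# Hypothetical grouping of best practices under the four characteristics,
# abbreviated for illustration.
CHARACTERISTICS = {
    "comprehensive": [
        "includes all life cycle costs",
        "documents the technical baseline",
        "uses a product-oriented work breakdown structure",
    ],
    "well documented": [
        "describes the estimating methodology",
        "documents data sources and cost model inputs",
    ],
    "accurate": [
        "uses timely, relevant data",
        "is free of calculation errors",
    ],
    "credible": [
        "includes a sensitivity analysis",
        "includes a risk and uncertainty analysis",
        "cross-checks the estimating methodology",
        "is compared with an independent cost estimate",
    ],
}


def characteristic_rating(ratings: dict[str, str], practices: list[str]) -> str:
    """Average the best-practice ratings assigned to one characteristic."""
    average = mean(SCALE[ratings[practice]] for practice in practices)
    return LABELS[round(average)]


def assess(ratings: dict[str, str]) -> dict[str, str]:
    """Summarize best-practice ratings at the characteristic level and apply
    an illustrative overall decision rule."""
    summary = {
        name: characteristic_rating(ratings, practices)
        for name, practices in CHARACTERISTICS.items()
    }
    # Illustrative rule: reliable only if every characteristic is at least
    # substantially met.
    reliable = all(SCALE[label] >= SCALE["substantially met"]
                   for label in summary.values())
    summary["overall"] = "reliable" if reliable else "not reliable"
    return summary
```

Calling assess with a rating for each listed best practice returns a label for each of the four characteristics plus an overall determination; for example, rating every credibility-related practice as partially met yields a partially met credible characteristic and, under the illustrative rule above, an overall result of not reliable.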

Case study 21 provides an example of the extent to which an agency’s cost estimate met the four characteristics of a reliable cost estimate.

Case Study 21: Unreliable Cost Estimates, from Ford-Class Aircraft Carrier, GAO-17-575

The Navy is investing over $43 billion to develop three Ford-Class nuclear-powered aircraft carriers. This class of ships is intended to feature an array of cutting-edge technologies to improve combat capabilities and create operational efficiencies by increasing the rate of aircraft launches and reducing the number of personnel needed to operate the ship. The Navy expects to achieve these improvements while simultaneously reducing acquisition and life cycle costs. However, this expectation has not been borne out. Costs to construct the lead ship Gerald R. Ford (CVN 78) increased from $10.5 billion to $12.9 billion (nearly 23 percent), and promised levels of capability have been reduced. GAO assessed the extent to which the CVN 79 follow-on ship’s cost estimate was reliable and provided a reasonable basis for meeting the cost cap given known cost risks from the performance of the lead ship.

GAO’s review of the CVN 79 cost estimate found that the $11.4 billion the Navy budgeted to construct the ship was likely insufficient. The CVN 79 estimate was substantially comprehensive in that it included all life cycle costs, mostly defined the program, and had a product-oriented work breakdown structure. However, GAO found several weaknesses in the other best practices indicating that $11.4 billion was not a realistic program estimate. In particular, GAO found that the estimate was only partially well documented and lacked analysis to support the cost savings derived from CVN 78. The cost estimate documentation also did not describe the estimating methodology, and there was limited documentation to support the cost model inputs. The estimate was partially accurate because of its optimistic assessment of the labor hours needed to construct the ship and because the estimators did not use timely data to ensure that the cost estimate reflected the costs most likely to be incurred. Finally, the estimate was partially credible because it did not sufficiently account for program risks. As a result, the cost estimate did not provide a reliable basis for important program decisions, such as developing annual budgets, making requirement trade-offs, and gauging shipbuilder progress.

Relevance of Cost Estimating Criteria

As detailed in appendix I, in developing this Guide, we researched legislation, regulations, policy, and guidance for the criteria most pertinent to cost estimating and EVM. We intend this Guide to serve as a starting point for auditors to identify criteria. For each new engagement, auditors should exercise due diligence to determine what new legislation, regulations, policies, or guidance, if any, apply.

Auditors also need to decide whether criteria are valid. Circumstances may have changed and the criteria may no longer conform to sound management principles or reflect current conditions. In such cases, auditors need to select or develop criteria that are appropriate for the engagement’s objectives. Table 16 lists salient legislation and regulations as sources of criteria related to cost estimating and EVM.

Table 16: Select Cost Estimating and EVM Criteria for Federal Agencies: Laws and Regulations
Year | Title | Applicable agency | Notes
2009 | Weapon Systems Acquisition Reform Act of 2009, as amended | DOD | Limits weapon system cost overruns and strengthens oversight and accountability. The act established four offices within DOD: Systems Engineering; Developmental Test and Evaluation; Cost Assessment and Program Evaluation; and Performance Assessments and Root Cause Analyses. The act also requires DOD to ensure that the acquisition strategy for major defense acquisition programs includes measures to ensure competition, or the option of competition, throughout the program life cycle.
1982 | Unit Cost Reports ("Nunn-McCurdy"), 10 U.S.C. § 2433 | DOD | Establishes the requirement for DOD to prepare unit cost reports on major defense acquisition programs or designated subprograms. If a program exceeds cost growth thresholds specified in the law, commonly referred to as a Nunn-McCurdy breach, DOD is required to report the breach to Congress and, if applicable, submit a certification to Congress in order to continue the program, in accordance with 10 U.S.C. § 2433a.
1994 | Federal Acquisition Streamlining Act of 1994, § 5051(a), 41 U.S.C. § 3103 | All civilian agencies | Established congressional policy that agencies should achieve, on average, 90 percent of the cost, performance, and schedule goals established for their major acquisition programs. Requires an agency to approve or define cost, performance, and schedule goals; to determine whether there is a continuing need for programs that are significantly behind schedule, over budget, or not in compliance with performance or capability requirements; and to identify suitable actions to be taken.
1996 | Clinger-Cohen Act of 1996, 40 U.S.C. §§ 11101–11704 | All | Requires agencies to base decisions about information technology investments on quantitative and qualitative factors associated with their costs, benefits, and risks and to use performance data to demonstrate how well expenditures support program improvements.
2006 | Federal Acquisition Regulation (FAR), Major System Acquisition, 48 C.F.R. part 34, subpart 34.2, Earned Value Management System | All | Earned Value Management System policy was added by Federal Acquisition Circular 2005-11, July 5, 2006, Item I, Earned Value Management System (EVMS) (FAR Case 2004-019).
2008 | Defense Federal Acquisition Regulation Supplement; Earned Value Management Systems, 73 Fed. Reg. 21,846 (April 23, 2008), codified in pertinent part at 48 C.F.R. §§ 234.201 to 234.203 and §§ 252.234-7001 and 252.234-7002 | DOD | DOD's final rule (1) amending the Defense Federal Acquisition Regulation Supplement (DFARS) to update requirements for DOD contractors to establish and maintain EVM systems and (2) eliminating requirements for DOD contractors to submit cost/schedule status reports.
2010 | GPRA Modernization Act of 2010, Pub. L. No. 111-352, 124 Stat. 3866 (Jan. 4, 2011) | All | Significantly enhances the Government Performance and Results Act (GPRA) of 1993, Pub. L. No. 103-62, 107 Stat. 285 (Aug. 3, 1993). Requires agencies to prepare (1) multiyear strategic plans describing mission goals and methods for reaching them and (2) annual program performance reports to review progress toward annual performance goals.
2017 | American Innovation and Competitiveness Act, Pub. L. No. 114-329, 130 Stat. 2969, 2989 (Jan. 6, 2017), codified in pertinent part at 42 U.S.C. § 1862s-2(a)(2)(D) | National Science Foundation | When engaging in oversight of a major multi-user research facility project, the Director of the National Science Foundation is required to ensure that policies for estimating and managing costs and schedules are consistent with the best practices described in the GAO Cost Estimating and Assessment Guide, among other guidance.

Source: GAO. | GAO-20-195G


  42. From an auditing perspective, reliability means that data are reasonably complete and accurate, meet the intended purposes, and are not subject to inappropriate alteration. For more information, see GAO, Assessing the Reliability of Computer-Processed Data, GAO-09-680G (Washington, D.C.: July 2009).

  43. GAO, Government Auditing Standards, GAO-18-568G (Washington, D.C.: July 2018).