Survey of EVM
Process Tasks
Conduct an integrated baseline review that validates the performance measurement baseline
Receive contract performance reports and conduct monthly EVM analysis
Use EVM data to analyze performance (the core calculations are sketched below):
- validate the data,
- determine what variances exist,
- probe schedule variances to see if activities are on the critical path,
- develop historical performance data indexes,
- graph the data to identify any trends, and
- review the Format 5 variance analysis for explanations and corrective actions.
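The following is a minimal sketch, in Python, of the monthly variance and index calculations listed above (cost variance, schedule variance, and the cost and schedule performance indexes). The function name and dollar figures are illustrative only; they do not come from any particular contract performance report.

    # Illustrative variance and performance index calculations from cumulative EVM data.
    def evm_metrics(bcws, bcwp, acwp):
        """bcws: budgeted cost for work scheduled (planned value)
        bcwp: budgeted cost for work performed (earned value)
        acwp: actual cost of work performed"""
        cv = bcwp - acwp    # cost variance: negative indicates a cost overrun
        sv = bcwp - bcws    # schedule variance: negative indicates work behind plan
        cpi = bcwp / acwp   # cost performance index: below 1.0 is unfavorable
        spi = bcwp / bcws   # schedule performance index: below 1.0 is unfavorable
        return {"CV": cv, "SV": sv, "CPI": round(cpi, 2), "SPI": round(spi, 2)}

    # Example: $1,000 planned, $900 earned, and $1,100 spent to date
    print(evm_metrics(bcws=1000, bcwp=900, acwp=1100))
    # {'CV': -200, 'SV': -100, 'CPI': 0.82, 'SPI': 0.9}

Graphing these variances and indexes over successive reporting periods supports the trend identification described above.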
Use EVM data to project future performance (an illustrative EAC calculation follows this list):
- identify the work that remains,
- calculate a range of EACs and compare the results to available funding,
- determine if the contractor’s EAC is feasible, and
- calculate an independent date for program completion.
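A minimal sketch of how a range of independent EACs might be calculated from cumulative EVM data, assuming three common performance-factor formulas: remaining work at the budgeted rate, at the cumulative CPI, and at a composite CPI x SPI factor. All names and figures are illustrative.

    # Illustrative range of estimates at completion (EAC) from cumulative EVM data.
    def eac_range(bac, bcwp, acwp, bcws):
        remaining_work = bac - bcwp   # budgeted cost of the work remaining
        cpi = bcwp / acwp             # cumulative cost performance index
        spi = bcwp / bcws             # cumulative schedule performance index
        return {
            "EAC (budget rate)": acwp + remaining_work,              # remaining work performed to plan
            "EAC (CPI)": acwp + remaining_work / cpi,                # past cost efficiency continues
            "EAC (CPI x SPI)": acwp + remaining_work / (cpi * spi),  # cost and schedule efficiency combined
        }

    # Example: $10,000 total budget; $4,500 earned, $5,000 spent, and $5,000 planned to date
    for label, value in eac_range(bac=10000, bcwp=4500, acwp=5000, bcws=5000).items():
        print(f"{label}: {value:,.0f}")   # 10,500 / 11,111 / 11,790

Comparing such a range to available funding and to the contractor’s own EAC supports the feasibility judgment described above.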
Continue EVM until the program is complete
Ensure management is kept informed on updates to EACs and other EVM data
Ensure the EVM system was validated for compliance with the EIA-748 guidelines
Conduct regular EVM system surveillance to ensure the contractor’s effective management of cost, schedule, and technical performance and compliance with EIA-748 guidelines
Best Practices
Establish a comprehensive EVM system
- The program has an EVM system that is certified to be compliant with the 32 EIA-748 EVM system guidelines.
- Documentation identifies when the certification was performed and who did the certification.
- An IBR verified that the baseline budget and schedule captured the entire scope of work, risks were understood, and available and planned resources were adequate.
- An IBR was conducted as part of EVM implementation on the program.
- The IBR identified risks and verified that the baseline’s budget and schedule are adequate for performing the work.
- The schedule reflects the work breakdown structure, the logical sequencing of activities, and the necessary resources.
- There is evidence that the program has scheduled the authorized work in a way that identifies the program WBS and describes the sequence of work and the time-phased budget.
- EVM surveillance reviews are conducted regularly by independent and qualified staff.
Ensure that the data resulting from the EVM system are reliable.
- EVM data do not contain any anomalies.
- EVM data are validated and reviewed for anomalies (basic checks of this kind are sketched after this list).
- EVM data are consistent among the various reporting formats.
- EVM data are reported to management and stakeholders in program briefings and are traceable to the EVM reporting formats.
- The estimate at completion is realistic.
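As an illustration of the kind of anomaly review described above, the sketch below flags a few common data problems in a single WBS-element record. The field names, figures, and specific checks are assumptions for illustration; actual validation rules would be tailored to the reporting formats in use.

    # Illustrative anomaly checks on one WBS element from a monthly EVM data submission.
    def check_anomalies(record):
        flags = []
        if min(record["bcws"], record["bcwp"], record["acwp"]) < 0:
            flags.append("negative planned, earned, or actual value")
        if record["bcwp"] > 0 and record["acwp"] == 0:
            flags.append("earned value reported with no actual costs")
        if record["bcwp"] > record["bac"]:
            flags.append("earned value exceeds budget at completion")
        if record["bcws"] > record["bac"]:
            flags.append("planned value exceeds budget at completion")
        return flags

    sample = {"wbs": "1.2.3", "bcws": 800, "bcwp": 900, "acwp": 0, "bac": 850}
    print(sample["wbs"], check_anomalies(sample))
    # 1.2.3 ['earned value reported with no actual costs', 'earned value exceeds budget at completion']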
Ensure that the program management team is using earned value data for decision-making purposes.
- EVM data, including cost and schedule variances, are reviewed on a regular basis, and analysis is conducted on EVM trends and metrics (a simple trend check is sketched after this list).
- Management uses EVM data to develop corrective action plans.
- The performance measurement baseline is updated to reflect changes.
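A simple sketch of the trend review mentioned in the first item above: computing the cumulative CPI for several recent reporting periods and checking whether cost performance is declining. The monthly figures are illustrative only.

    # Illustrative cumulative CPI trend across four reporting periods.
    def cpi_trend(monthly_cumulative):
        # monthly_cumulative: list of (bcwp_cumulative, acwp_cumulative) tuples by month
        return [round(ev / ac, 2) for ev, ac in monthly_cumulative]

    history = [(1000, 1020), (2100, 2200), (3150, 3400), (4100, 4550)]
    trend = cpi_trend(history)
    print(trend)   # [0.98, 0.95, 0.93, 0.9]
    if trend[-1] < trend[0]:
        print("CPI is trending down; review variance drivers and corrective action plans.")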
Likely Effects if Criteria Are Not Fully Met
- Unless EVM is implemented at the program level rather than solely at the contract level, the program may not have an effective means to measure how well the government and its contractors are meeting a program’s approved cost, schedule, and performance goals.
- Without continuous planning through program-level EVM, program managers may not be able to adequately plan for the receipt of material, for example government-furnished equipment, to ensure that the contractor can execute the program as planned.
- Unless EVM is implemented at the program level rather than solely at the contract level, program managers may have difficulty identifying key decision points up front that should be integrated into both the contractor’s schedule and the overall program master schedule so that significant events and delivery milestones are clearly established and communicated.
- If a program reports a high proportion of level of effort for measuring earned value, it may not be providing objective data, and the EVM system will not perform as expected. When level of effort is used excessively for measuring status, the program is not implementing EVM as intended and will fall short of the benefits EVM can offer.
- A continual shift of the baseline budget to match actual expenditures in order to mask cost variances—a rubber baseline—results in deceptive baselines by covering up variances early in the program, delaying insight until they are difficult, if not impossible, to mitigate.
- If changes are not incorporated quickly, the performance measurement baseline can become outdated. As a result, variances do not reflect reality, which hampers management in realizing the benefits of EVM.
- Unless changes are incorporated into the EVM system as soon as possible, the validity of the performance measurement baseline will not be maintained.
- If changes are not recorded and maintained, the program’s performance measurement baseline will not reflect reality. The performance measurement baseline will become outdated and the data from the EVM system will not be meaningful.
- If an IBR is not conducted, management will lack confidence that the performance measurement baseline provides reliable cost and schedule data for managing the program and that it projects accurate estimated costs at completion.
- Using poor estimates to develop the performance measurement baseline will result in an unrealistic baseline for performance measurement.
- If the performance measurement baseline is not validated through an IBR, there will be less confidence in the accuracy and soundness of monthly EVM reporting.
- If contract performance report data do not accurately reflect how work is being planned, performed, and measured, they cannot be relied on for analyzing actual program status.
- If variance analysis thresholds are not periodically reviewed and adjusted, they may not provide management with the necessary view on current and potential problems.
- If the contract performance report is not detailed enough, cost and schedule trends and their likely effects on program performance will not be transparent.
- If EVM data are not analyzed and reviewed at least monthly, problems may not be addressed as soon as they occur. As a result, cost and schedule overruns may not be avoided, or at least have their effect lessened.
- Unless past performance captured in a contract performance report is analyzed, management may lack insight into how a program will continue to perform and important lessons learned.
- If contract performance report data are not validated, existing errors will not be detected and the data will be skewed, resulting in erroneous metrics and poor decision making.
- If the contract performance report data contain anomalies, the performance measurement data may be inaccurate.
- Unless EVM data are graphed to determine trends, management may lack valuable information about a program’s performance, which is important for accurately predicting costs at completion.
- Unless management knows the reasons for variances, they may not be able to make informed decisions about the best course of action.
- Unless a contractor’s estimate at completion (EAC) is compared to independent estimates at completion and to trend data, management may lack insight into its reasonableness. In addition, requests for additional funds, if necessary, may lack credibility.
- Unless EACs are created not only at the program level but also at lower levels of the WBS, areas that are performing poorly will be masked by other areas doing well.
- Unless management has accurate progress assessments of program status, it may not be able to make informed decisions that lead to greater success; additionally, the ability to act quickly to resolve program problems will be hampered.
- Unless management knows whether the activities that are contributing to a schedule variance are on the critical path or may ultimately be on that path if mitigation is not pursued, it will not be able to project when a program will finish.
- If EVM measures such as variances or indexes are used as award fee criteria, emphasis will be put on the contractor’s meeting a predetermined number instead of achieving program outcomes. It may encourage the contractor to behave in undesirable ways, such as overstating performance or changing the baseline budget to meet variance thresholds and secure potential profit.
- Unless the contractor’s (and subcontractors’) EVM system is validated, there will be a lack of assurance that it complies with the agency’s implementation of the EIA-748 guidelines; that it provides reliable data for managing the program and reporting its status to the government; and that it is actively used to manage the program.
- Unless the contractor’s EVM system is subjected to periodic surveillance, the government will lack assurance that it:
- summarizes timely and reliable cost, schedule, and technical performance information directly from its internal management system;
- complies with the contractor’s implementation of EIA-748 guidelines;
- provides timely indications of actual or potential problems by performing spot checks, sample data traces, and random interviews;
- maintains baseline integrity;
- depicts actual conditions and trends;
- provides comprehensive variance analyses at the appropriate levels, including corrective actions for cost, schedule, technical, and other problem areas;
- ensures the integrity of subcontractors’ EVM systems;
- verifies progress in implementing corrective action plans to mitigate EVM system deficiencies; and
- discusses actions taken to mitigate risk and manage cost and schedule performance.
- If a program requests several overtarget budgets, there may be a severe underlying management problem that should be investigated before a new budget is implemented.