Developing a Corporate Surveillance Plan
A corporate-level surveillance plan should contain a list of programs for review. The plan’s objective is to address, over the course of the year, the question of whether the contractor is applying the full content of its EVM system relative to the 32 guidelines. The surveillance organization should select candidate programs by the risk associated with completing the remaining work, so that surveillance can be value-added. To facilitate selection, it is important to evaluate the risks associated with each program. Table 26 outlines some factors that may warrant program surveillance.
Table 26: Sample Factors in Selecting Projects
Factor | Description
---|---
Contract value | The contract value is viewed in relative terms for the organization. The higher the dollar value of the contract, the greater the likelihood that the program will be selected for a review.
Type and phase of contract | The type and phase of a program may provide good indications of risk. Development and notable customer contracts (e.g., Department of Defense (DOD) ACAT I/ACAT II programs) are typically larger, with more discrete effort using earned value management (EVM) scheduling and work/budget practices, whereas production and operations and maintenance contracts are considered lower risk because of the repetitive or level of effort nature of the work. High dollar firm fixed price (FFP) contracts primarily hold significant risk for the contractor and may contain EVMS clauses with reporting requirements on schedule performance. The contract phase may determine the type of program, for example, when transitioning from development to production. Development programs benefit from reviews of work definition, budget, and authorization practices, whereas production programs may lend themselves more readily to assessment of manufacturing scheduling and material management and control.
Value and nature of remaining work | The higher the dollar value of the remaining work, the greater the probability that a program will be selected for a review. The technical content of remaining work is also reviewed to determine the level of performance risk on the contract.
Experience of organization program office | The program office’s experience with implementing and using EVM processes may influence the selection of projects for surveillance. A lack of EVM experience among program office personnel might allow program baseline planning to be accomplished without following documented procedures, increasing the risk of a poor application with unreliable program data. Conversely, program offices that are more experienced with EVM applications and data use are better suited to maintain the data integrity required for program reporting, thus lowering risk.
Internal surveillance | Some program teams engage in internal surveillance. In these instances, the organization may take into account the frequency and quality of, and its confidence in, the program team’s internal surveillance when determining the frequency and selection of the program for surveillance.
Current or cumulative cost or schedule variances or variances at completion | A program experiencing difficulty in maintaining cost or schedule control is more likely to be selected for a review. Variances may be indicators of possible issues and may be further investigated within work/budget, scheduling, managerial analysis, or change management practices.
Baseline volatility, resets or changes | The frequency of baseline resets or changes, especially when accompanied by the elimination of cumulative cost or schedule variances, may be indicative of a number of situations: poor original baseline planning, a change in work approach, make or buy determinations, or significant schedule/technical changes. A program reflecting a significant number of baseline resets is more likely to be selected for a review.
Schedule risk analysis confidence level | The program schedule is a foundational element of the EVMS. Low confidence in the quality, analysis, or executability of the schedule, as well as questionable outcomes from schedule risk assessments, increases the probability that the program will be selected for a review.
Risk and opportunity assessment | The management and maintenance of the risk and opportunity management process needs to be considered, including (1) the quality of the risk and opportunity assessment and the related risk and opportunity handling plans; (2) the extent of risk and opportunity management integration with the EVMS; and (3) whether management reserve is adequate to address risks and opportunities not included in the PMB. Other factors to consider are the confidence level of the PMB and the program’s risk and opportunity trends.
Findings or concerns from prior reviews | Past results may indicate the need for adjusting the frequency of the reviews. Latency in closing previous findings/action items could be a concern and may cause a program to fall out of compliance. |
Subcontractor considerations | The inclusion of subcontractors on the program can influence the selection process. Example considerations: the number of subcontractors, the degree of experience with EVMS, EVMS contractual requirements (e.g., formal EVMS flow-down, integrating the subcontractor into the prime’s EVMS, reporting only).
Customer or management interest | The degree of customer or management concerns or interest in the program may be a factor influencing the selection process.
Source: ©2018 National Defense Industrial Association (NDIA) Integrated Program Management Division (IPMD), Surveillance Guide (Arlington, VA: November 2018). | GAO-20-195G
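Where an organization wants to apply factors like those in Table 26 consistently across candidate programs, it can combine them into a simple weighted score. The sketch below is one illustrative way to do so; the factor names, weights, and 1-to-5 ratings are assumptions for demonstration, not values prescribed by this guide.

```python
# Illustrative weighted scoring of surveillance selection factors (Table 26).
# Factor names, weights, and the 1 (low risk) to 5 (high risk) ratings are
# assumptions for demonstration only.

FACTOR_WEIGHTS = {
    "contract_value": 0.20,
    "remaining_work": 0.20,
    "cost_schedule_variances": 0.15,
    "baseline_volatility": 0.15,
    "program_office_experience": 0.10,  # rate low experience as higher risk
    "schedule_confidence": 0.10,        # rate low confidence as higher risk
    "prior_review_findings": 0.10,
}

def surveillance_priority(ratings):
    """Return a weighted score; higher totals suggest earlier or more frequent review."""
    return sum(weight * ratings.get(factor, 1)
               for factor, weight in FACTOR_WEIGHTS.items())

# Hypothetical development program with large remaining work and growing variances
example = {
    "contract_value": 4,
    "remaining_work": 5,
    "cost_schedule_variances": 4,
    "baseline_volatility": 3,
    "program_office_experience": 2,
    "schedule_confidence": 3,
    "prior_review_findings": 2,
}
print(round(surveillance_priority(example), 2))  # 3.55 on the 1-to-5 scale
```

Programs scoring above an organization-defined threshold would be candidates for more frequent or more detailed surveillance.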
Senior management may ask the surveillance organization to focus its review on specific procedures arising from government program office concerns, interest in a particular process application, or risks associated with remaining work. This enables the surveillance organization to concentrate on processes that are the most relevant to the program phase. For example:
- a surveillance review of the change incorporation process would be more appropriate for a program in which a new baseline had recently been implemented than for a program that had just started and had not undergone any changes;
- a surveillance review of the EAC process would yield better insight into a development program in which technological maturation was driving growing EAC trends than into a production program with stable EAC trends (see the sketch following this list for the standard variance and EAC calculations such a review draws on); and
- although the goal is to review all 32 EIA-748 guidelines each year, if a program were almost complete, it would not make sense to focus on work authorization because this process would no longer be relevant.
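As a reference point for the variance and EAC examples above, the following minimal sketch shows the standard EVM relationships a surveillance team can recompute when checking a program’s reported data; the dollar values are hypothetical.

```python
# Minimal sketch of the standard EVM variance and EAC calculations a
# surveillance team might recompute from a program's reported data.
# The dollar values below are hypothetical.

bcws = 1_000_000   # budgeted cost for work scheduled (planned value)
bcwp = 900_000     # budgeted cost for work performed (earned value)
acwp = 1_050_000   # actual cost of work performed
bac  = 5_000_000   # budget at completion

cv  = bcwp - acwp  # cost variance: negative means a cost overrun
sv  = bcwp - bcws  # schedule variance: negative means behind schedule
cpi = bcwp / acwp  # cost performance index
spi = bcwp / bcws  # schedule performance index

# One common independent EAC: remaining work is assumed to continue at the
# cumulative cost efficiency (CPI) achieved to date.
eac_cpi = acwp + (bac - bcwp) / cpi

print(f"CV={cv:,.0f}  SV={sv:,.0f}  CPI={cpi:.2f}  SPI={spi:.2f}  EAC={eac_cpi:,.0f}")
```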
Developing a Program Surveillance Plan
The surveillance team designated to perform program reviews should consist of experienced staff who fully understand the contractor’s EVM system and the processes being reviewed. The surveillance organization should appoint the team leader and ensure that all surveillance team members are independent. They should not be responsible for any part of the programs they assess.
Key activities on the surveillance team’s agenda include reviewing documents, addressing government program office concerns, and discussing prior surveillance findings and any open issues. The team should allocate sufficient time to complete all these activities. The documents for review should give the team an overview of the program’s implementation of the EVM process. Recommended documents include:
- at least 2 months of program EVM system reports;
- EVM variance analyses and corrective actions;
- program schedules;
- risk management plan and database;
- program-specific instructions or guidance on implementing the EVM system;
- WBS with corresponding dictionary;
- organizational breakdown structure;
- EAC and supporting documentation;
- correspondence related to the EVM system;
- contract budget baseline, management reserve, and undistributed budget log;
- responsibility assignment matrix identifying control account managers;
- work authorization documentation;
- staffing plans;
- rate applications used; and
- findings from prior reviews and status.
Additionally, if there are any concerns regarding the validity of the performance data, it is recommended that the government program office be notified. Finally, inconsistencies identified in prior reviews should be discussed to ensure that the contractor has rectified them and continues to comply with its EVM system guidelines.
Executing the Program Surveillance Plan
Surveillance should be approached in terms of mentoring or coaching the contractor on where there are deficiencies or weaknesses in its EVM process and offering possible solutions. The contractor can then view the surveillance team as a valuable and experienced resource that helps it demonstrate that it is continuing to use the accepted EVM system to manage the program.
Successful surveillance is predicated on access to objective information that verifies that the program team is using EVM effectively to manage the contract and complies with company EVM procedures. Objective information includes program documentation created in the normal conduct of business.
Besides collecting documentation, the surveillance team should interview control account managers and other program staff to determine whether they can describe their compliance with EVM policies, procedures, or processes. Interviews enable the surveillance team to gauge the EVM knowledge of the program staff and their awareness of and practice in complying with EVM guidelines. This is especially important because control account managers are the source of much of the information on the program’s EVM system. Interviews also help the surveillance team determine whether the control account managers see EVM as an effective management tool. The following subjects should be covered in an interview:
- work authorization;
- organization;
- EVM methodologies, knowledge of the EVM process, use of EVM information, and EVM system program training;
- scheduling and budgeting, cost and schedule integration, and cost accumulation;
- EACs;
- change control process;
- variance analysis;
- material management;
- subcontract management and data integration; and
- risk assessment and mitigation.
During interviews, the surveillance team should ask interviewees to verify their responses with objective program documentation such as work authorizations, cost and schedule status data, variance analysis reports, and back-up data for any estimates at completion.
When all the documentation has been reviewed and interviews have been conducted, the surveillance team should provide appropriate feedback to the program team. The surveillance team leader should present all findings and recommendations to the program staff so that any misunderstandings can be clarified and corrected. Specifically, surveillance team members and program personnel should clarify any questions, data requests, and responses to be sure everything is well understood.
Once program personnel have provided their feedback, a preliminary report should be prepared that addresses findings and recommendations. Findings fall into two broad categories: (1) compliance with the accepted EVM system description and (2) consistency with the EVM system guidelines. Some practices may comply with the system description yet still fall short of the intent of an EVM guideline because of discrepancies in the system description itself. If findings cannot be resolved, confidence in program management’s ability to effectively use the EVM system will be reduced, putting the program at risk of not meeting its goals and objectives. Open findings may also result in the withdrawal of advance agreements and of acceptance of the company’s EVM system.
Team members may recommend EVM implementation enhancements, such as sharing successful practices or tools. Unlike findings, however, recommendations need not be tracked to closure.
In addition to findings and recommendations, the final team report should outline an action plan that includes measurable results and follow-up verification to resolve findings quickly. It should present the team’s consensus on the follow-up and verification required to address findings resulting from the surveillance review. An effective corrective action plan must address how program personnel should respond to each finding, and it must set realistic dates for implementing corrective actions. The surveillance review is complete when the leader confirms that all findings have been addressed and closed.
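One way a surveillance organization might track such an action plan is with simple structured records. The sketch below is a hypothetical arrangement of the elements described above (finding, planned response, responsible party, realistic due date, verification method, and status), not a prescribed format.

```python
# Hypothetical structure for tracking corrective action plan items to closure,
# reflecting the elements described above. Field names and sample content are
# illustrative, not prescribed by this guide.
from dataclasses import dataclass
from datetime import date

@dataclass
class CorrectiveAction:
    finding_id: str
    description: str          # the surveillance finding being addressed
    planned_response: str     # how program personnel will respond
    responsible_party: str
    due_date: date            # realistic date for implementing the correction
    verification_method: str  # how closure will be verified at follow-up
    status: str = "open"      # e.g., open, in work, verified, closed

actions = [
    CorrectiveAction(
        finding_id="F-01",
        description="Work authorizations issued after work began",
        planned_response="Revise authorization workflow and retrain control account managers",
        responsible_party="Program controls lead",
        due_date=date(2020, 3, 31),
        verification_method="Sample new work authorizations at the next review",
    ),
]

open_items = [a for a in actions if a.status != "closed"]
print(f"{len(open_items)} open corrective action(s)")
```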
Managing System Surveillance Based on Program Results
After a program’s surveillance is complete, the results are collected and tracked in a multi-program database. This information is transformed into specific measures for assessing the overall health of a contractor’s EVM system process. These measures should be designed to capture whether the EVM data are readily available, accurate, meaningful, and focused on desirable corrective action. The types of measures may vary from contractor to contractor, but each one should be well defined, easily understood, and focused on improving the EVM process and surveillance capability. They should have the following characteristics:
- results measures that identify deviations from documented EVM application processes, and
- process measures that indicate whether the surveillance plan is resolving systemic issues.
To develop consistent measures, individual program results can be summarized by a standard rating system.
Summarizing individual program findings by a standard measure can help pinpoint systemic problems in a contractor’s EVM system and can therefore be useful for highlighting areas for correction. This may result in more training or changing the EVM system description to address a given weakness by improving a process. Without the benefit of standard measures, it would be difficult to diagnose systemic problems; therefore, it is a best practice to gather and review them often.
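A minimal sketch of such a standard rating rollup follows, assuming a simple green/yellow/red scale and hypothetical findings data; an organization’s actual process areas, thresholds, and rating scale would come from its own surveillance procedures.

```python
# Illustrative rollup of surveillance findings from multiple programs into a
# standard rating per EVM process area, so recurring findings that may indicate
# systemic problems stand out. Process areas, thresholds, and sample data are
# assumptions for demonstration only.
from collections import Counter

# Open findings recorded as (program, process area) pairs -- hypothetical data
findings = [
    ("Program A", "scheduling"),
    ("Program B", "scheduling"),
    ("Program C", "scheduling"),
    ("Program A", "change control"),
    ("Program B", "variance analysis"),
]

def rate(count):
    """Assumed rating scale: 0 findings = green, 1 = yellow, 2 or more = red."""
    return "green" if count == 0 else "yellow" if count == 1 else "red"

by_area = Counter(area for _, area in findings)
for area in ("scheduling", "change control", "variance analysis", "work authorization"):
    print(f"{area:20s} {by_area[area]} finding(s) -> {rate(by_area[area])}")
```

In this hypothetical output, scheduling would be rated red across programs, flagging it as a possible systemic weakness warranting training or a change to the EVM system description.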