INTRODUCTION

In FY 1995, the INS OIRM began activities in support of 12 automation initiatives. The goal of the 12 initiatives was to improve INS' effectiveness and productivity by developing accurate, timely, and integrated databases for its automated systems. In FY 1996, OIRM reorganized these 12 initiatives into 8 functional program areas, including OIRM Operations, as shown in the following table. (See Appendix IV for a detailed description of the eight automation programs.) Except for OIRM Operations, each of the functional program areas consists of separate automation projects.2 According to the most recent INS funding profile, the automation programs through FY 2001 and beyond will cost just over $2.8 billion.

 

REORGANIZATION OF INITIATIVES INTO PROGRAMS

INITIATIVE NO.   FY 1995 INITIATIVES                            FY 1996 PROGRAMS                PROGRAM NO.
1, 2             Infrastructure; Video Teleconferencing         Infrastructure                  1
3, 4             Asylum Reform; Benefits Systems                Examinations Systems            2
5                Control Admissions at Ports of Entry           Inspections Systems             3
6, 7             ENFORCE; Electronics Support                   Enforcement Systems             4
8                Biometrics Identification Systems              Biometrics Systems              5
9, 10            Verification Systems; Corporate Database       Corporate Information Systems   6
11               Financial & Administrative Management Systems  Management Systems              7
12               Planning                                       OIRM Operations                 8

The INS produces two periodic reports that display the status of certain automation projects. The first is the OIRM Monthly Progress Review, which is produced for internal review by INS management. The second is the Quarterly Reporting of INS Automation Initiatives, which is provided to the Justice Management Division and the Office of the Inspector General for their oversight reviews.

FINDINGS AND RECOMMENDATIONS

  1. STATUS OF THE AUTOMATION PROJECTS

The INS could not sufficiently track the status of its automation projects to determine whether progress was acceptable given the amount of time and funds already spent. As a result, INS continued to spend hundreds of millions of dollars on automation projects without adequate budgeted costs or explanations of how the funds were spent. In addition, projects were running behind schedule with no documented explanations of what was causing the delays. We also found serious deficiencies in OIRM compliance with the system development life-cycle process. As a result, INS had no assurance that systems would meet performance and functional requirements.

In our initial audit, we reported that after spending almost $500 million on its automation projects during FYs 1995 and 1996, INS had not sufficiently tracked its projects. As a result, INS could not determine if progress towards the completion of its automation programs was acceptable given the amount of time and funds spent. In addition, INS had not developed comprehensive performance measures for 16 of the 22 projects in support of the automation programs that were tracked on an ongoing basis by INS. Consequently, INS had no assurance that these projects would meet INS' overall information technology goals.

The INS responded by stating that it would link planned to actual activities and the related funding in the revised quarterly reports it sends to the Justice Management Division. The quarterly information would be derived from improved project management tracking practices among project managers. In addition, INS stated it would establish comprehensive performance measures for all major projects managed by OIRM.

In our current audit, we determined that after spending approximately $813 million on its automation programs during FYs 1995 through 1997, INS still cannot sufficiently track the status of its projects to determine whether progress is acceptable. As a result, during FY 1997, project costs continued to spiral upward with no baselines against which actual costs could be compared. Also, INS staff were unable to adequately explain how funds were spent. In addition, over a 14-month period, at least seven automation projects experienced significant, unexplained delays. Further, planned project tasks were not adequately monitored to ensure their timely completion, and monthly progress reports were incomplete, unclear, and untimely.

Additionally, our review of project files for the Enforcement Case Tracking System (ENFORCE) version 1.2 and Intelligent Computer Agent Dispatching (ICAD) version II automation projects disclosed that project managers did not adhere to the INS system development life-cycle process. Consequently, INS does not have reasonable assurance that these projects will meet performance and functional requirements. Finally, INS still has not implemented processes to ensure that performance measurements are established for ongoing projects. Without these performance measurements, INS will be unable to determine whether deployed projects are meeting intended goals.

A. Efforts to Monitor and Control Project Costs

To control project costs, INS managers must develop and implement processes to compare actual project costs against projected cost estimates. The frequency of these comparisons is often based on the complexity and cost of the automation projects. If a project is over cost, INS managers must make timely decisions regarding whether the project should be continued, modified, or canceled, and take mitigating steps quickly to address any procedural or operating deficiencies.

Based on our review and analysis of the FY 1997 monthly progress reviews, we found no evidence that OIRM managers prepared annual cost estimates from which actual costs could be monitored and controlled. Instead, a monthly spending plan was shown for each of the projects tracked in the monthly reviews, with no explanation of what the spending plan was based on or what it comprised. As each month passed, adjustments were made to either increase or decrease each project's spending plan. For instance, the spending plans for all 22 projects tracked in the November 1996 monthly review totaled $81 million. By the end of FY 1997, the spending plans had been increased to $279 million, with no justification as to why the additional funds were required.

With no estimates against which to compare actual costs, OIRM managers had no objective means for determining the reasonableness of actual costs incurred. Nor could OIRM managers be held accountable for cost overruns since there was little likelihood that cost overruns would ever be identified or reported. As an illustration, if an automation project incurred excess costs of $5 million in any given month, OIRM managers could merely increase the individual project's spending plan by $9 million and show a favorable $4 million in funds remaining, thereby creating the inaccurate impression that the project's actual spending was well within some predetermined plan.
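To make the arithmetic in the illustration above concrete, the following sketch shows how the same month of spending produces an overrun against a fixed baseline but a surplus against an adjusted spending plan. The dollar figures and variable names are hypothetical and chosen only to mirror the example.

    # Hypothetical illustration, in millions of dollars, of variance reporting
    # against a fixed baseline versus an adjustable spending plan.
    original_plan = 20.0                 # baseline spending plan for the month
    actual_cost = original_plan + 5.0    # $5 million of excess costs incurred
    revised_plan = original_plan + 9.0   # spending plan quietly increased by $9 million

    # Comparing against the fixed baseline exposes the overrun.
    print(f"Variance against fixed baseline: {original_plan - actual_cost:+.1f}M")  # -5.0M

    # Comparing against the revised plan reports a "favorable" $4 million remaining.
    print(f"Variance against revised plan:   {revised_plan - actual_cost:+.1f}M")   # +4.0M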

Through this reporting methodology OIRM managers did, in fact, create the impression that annual spending for automation projects was well within some predetermined plan. For instance, the end-of-year progress reviews for FYs 1996 and 1997 reported that total expenditures fell short of spending plans by $15 million and $23 million respectively. Given the lack of discipline, control or review that we observed with respect to the way these figures were determined, we decline to give them any credence as evidence of a program accomplishment.

B. Progress Toward Timely Project Completions

Each month, OIRM prepares a monthly progress review for internal management that is intended to track the progress of its automation programs. For each project, the monthly reviews provide recipients with information such as the project's monthly spending plan, planned completion dates shown on Gantt charts, and a caption for areas of concern and critical issues.

Our review and analysis of the monthly progress reviews (over a 14-month period) disclosed that unexplained delays occurred in meeting scheduled completion dates; project tasks were not adequately monitored to ensure timely completion; and monthly progress reports were incomplete, unclear, and untimely.

1. Unexplained Project Delays

Between October 1996 and December 1997, INS automation projects experienced significant, unexplained delays in meeting scheduled completion dates. We reviewed and analyzed scheduled project completion dates as reported in monthly progress reviews. Based on our review, we determined that 7 of the 22 automation projects being tracked were delayed by 9 months to 3 years, with the average delay being 24 months. Moreover, we were unable to determine the status of five other projects since the monthly reports for these projects no longer included Gantt charts showing the latest planned completion dates.

Although the monthly reviews contained a caption titled Areas of Concern and Critical Issues for each project, none of the reviews clearly identified or explained reasons for the project delays. For example, the September 1997 progress review showed a planned completion date of September 2000 for the Policy, Planning, Standards & Quality Management Project. However, one month later, the October report showed a September 2003 completion date and "none to report" under the Areas of Concern and Critical Issues caption.

Our review further disclosed that OIRM managers did not retain established milestone dates as a benchmark by which progress could be measured. Instead, as project delays were anticipated, the previously reported milestone dates were changed to reflect OIRM managers' latest estimates. By changing milestone dates to reflect the latest estimates, OIRM managers created the perception that their automation projects were progressing on schedule when, in fact, at least 7 projects had experienced significant delays in only 14 months' time. In contrast, by reporting the latest estimates against fixed milestones, project managers would have alerted senior OIRM managers of the delays, prompting them to initiate corrective actions to ensure projects remain on schedule.
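The value of retaining fixed milestones can be shown with a brief sketch. The dates below are hypothetical, patterned on the Policy, Planning, Standards & Quality Management example: measuring the latest estimate against a retained baseline surfaces the slippage, whereas overwriting the baseline with each new estimate always reports zero delay.

    from datetime import date

    def schedule_slippage_days(baseline_milestone: date, latest_estimate: date) -> int:
        """Slippage measured against the retained baseline milestone, in days."""
        return (latest_estimate - baseline_milestone).days

    baseline = date(2000, 9, 30)   # originally established completion date
    latest = date(2003, 9, 30)     # current estimated completion date

    print(schedule_slippage_days(baseline, latest))   # 1095 days of slippage
    # If the baseline is overwritten with each new estimate, the comparison
    # becomes schedule_slippage_days(latest, latest), which is always 0.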

2. Uncompleted Project Tasks

The OIRM managers did not adequately monitor FY 1997 planned project tasks. Our analysis of project plans for the 22 projects listed in the September 1997 Monthly Progress Review indicated that a total of 103 tasks were scheduled for completion during FY 1997. However, of these 103 tasks, 29 (28 percent) were incomplete by the fiscal year end. Moreover, there were no explanations as to why these tasks were not completed.

Because the progress reviews did not contain monthly and end-of-year task summaries, OIRM managers were unaware that 28 percent of their planned FY 1997 tasks were not completed. While OIRM managers stated that the monthly reviews were never intended to be used to assess the overall status of project tasks, OIRM managers had no other reporting system to track the progress of planned tasks.

Periodic reviews to determine progress toward completing planned tasks enable managers to assess the accuracy of their own reporting mechanisms as well as pinpoint operational weaknesses that may be causing project delays. For example, when we told OIRM managers that 28 percent of their planned tasks were not completed, they responded by examining their own reporting system, and found a number of tracking deficiencies. For instance, planned tasks and their completion status were not reported until the last month of the fiscal year, but should have been reported throughout the year so that progress could be tracked continuously. In addition, changes made to planned tasks during the fiscal year were never documented, and planned tasks were not always categorized correctly. According to OIRM managers, 12 of the tasks identified for completion during FY 1997 were miscategorized and should have been placed in the on-going (multi-year) task category.
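A monthly or end-of-year task summary of the kind described above requires little more than categorizing each planned task and counting completions. The sketch below uses hypothetical task records; the point is that the summary makes incomplete and miscategorized tasks visible during the year rather than after it ends.

    # Hypothetical task records: (description, category, completed)
    tasks = [
        ("Deploy release to district offices", "fy_scheduled", True),
        ("Complete user acceptance testing",   "fy_scheduled", False),
        ("Operate and maintain help desk",     "ongoing",      True),
        # ... remaining tasks follow the same pattern
    ]

    scheduled = [t for t in tasks if t[1] == "fy_scheduled"]
    incomplete = [t for t in scheduled if not t[2]]

    pct = 100 * len(incomplete) / len(scheduled)
    print(f"{len(incomplete)} of {len(scheduled)} scheduled tasks "
          f"({pct:.0f} percent) incomplete at fiscal year end")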

3. Incomplete, Late, and Unclear Progress Reporting

The monthly progress reviews did not encompass all of INS' automation projects. The 22 projects tracked in the reviews included only 28 of the 57 line item allocations included in the FY 1997 OIRM budget. The remaining 29 line item allocations that were excluded totaled $47 million, or 15 percent of OIRM's $320 million annual budget. We were unable to determine why certain automation projects were excluded from the monthly reviews; however, without full disclosure of what they are reporting, OIRM managers provide readers with an incomplete, or even misleading, assessment of their automation programs.

Monthly progress reviews were not prepared in a timely manner. The August and September 1997 reviews were not completed until January 1998, and the October, November, and December 1997 reviews were still in draft form as of February 1998. Unless these reports are prepared promptly, management does not have the up-to-date information it needs to make meaningful reviews and to identify and remedy project delays.

The reporting format for the monthly progress reviews was unclear and difficult to understand.

C. Life-Cycle Management

The INS life-cycle process provides a structured and time-tested approach to designing information systems. One of the many benefits of life-cycle management is that it provides clear measures of project progress and status, thereby enabling managers to identify problem areas and initiate corrective actions if needed.

Project life-cycles are divided into phases. For each phase, key documents (typically referred to as deliverables) are generated to communicate status and direction. Project managers collect deliverables at the end of each phase and post them to the project workbook. The number of phases and deliverables a project has depends on the life-cycle work pattern being employed. The standard work pattern consists of eight phases with a maximum of 38 deliverables. Project managers can also choose from four alternate work patterns that allow them to tailor projects to meet specific needs and still meet life-cycle standards.

At the end of each project phase, OIRM managers must conduct a series of comprehensive life-cycle reviews. Essentially, these reviews ensure that all deliverables created during project phases meet system functional and performance requirements. To avoid project delays, project managers can progress to the phase beyond the one currently being reviewed; however, such progress is limited to 20 percent of the total time and resources planned for the succeeding phase. The 20 percent limit is intended to allow some progress while minimizing risk in the event of product disapproval. If required approvals are not forthcoming and project schedules are in jeopardy, project managers may obtain waivers to progress restrictions. Such waivers, however, must be documented, signed, and posted to the project workbook.
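The 20 percent progression limit and its waiver provision lend themselves to a simple check. The sketch below is a hypothetical illustration, not the INS procedure itself; the field names and the use of hours as the measure of effort are assumptions.

    PROGRESS_LIMIT = 0.20   # maximum share of the succeeding phase's planned effort

    def progression_allowed(planned_effort_hours: float,
                            effort_charged_hours: float,
                            waiver_on_file: bool) -> bool:
        """True if work on the succeeding phase stays within the 20 percent limit,
        or if a documented, signed waiver is posted to the project workbook."""
        if waiver_on_file:
            return True
        return effort_charged_hours <= PROGRESS_LIMIT * planned_effort_hours

    # Hypothetical example: 1,000 hours planned for the next phase, 250 hours
    # already charged, and no waiver on file -- the work should not proceed.
    print(progression_allowed(1000.0, 250.0, waiver_on_file=False))   # False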

The most crucial deliverable of the life-cycle management process is the project plan. Created in the planning phase, the project plan describes how each life-cycle phase will be accomplished and is used to provide direction to many of the project life-cycle activities.

1. Importance of Life-Cycle Documentation

Effective communication and coordination of activities throughout the life-cycle depends on the complete and accurate documentation of activities, decisions, and events leading up to decisions. Undocumented or poorly documented events and decisions can cause significant confusion or wasted efforts and can intensify the adverse impact of project staff turnover. Project activities should not be considered complete, nor decisions made, until there is tangible documentation of the activity or decision.

2. Review of Automation Projects

To evaluate OIRM compliance with the INS life-cycle management process, we selected the ENFORCE version 1.2 and the ICAD version II automation projects for our review.3 According to INS records, both systems were developed using the seven-phase rapid application development work pattern, requiring a total of 22 deliverables.

With a development cost of about $32 million, ENFORCE 1.2 is designed to provide INS officers with a user-friendly case-processing tool that automates many administrative tasks, thereby allowing officers to focus more on collecting and analyzing investigative data. According to the ENFORCE project manager, system deployment to over 170 locations began in July 1997.

The ICAD II is an INS alarm and dispatch system with facilities for accessing law enforcement databases to support the general functions of the Border Patrol and other INS law enforcement functions.4 According to the INS Information Resources Desk Reference, system deployment to the Southwest and Northern border sites was completed early in FY 1996.

Our review of project files for both the ENFORCE 1.2 and ICAD II automation projects disclosed significant deficiencies in OIRM compliance with the life-cycle process.

D. Outcome-Based Performance Measurements

Once projects have been fully developed and deployed, actual results are compared against pre-established performance measurements to: (1) assess the project's impact on mission performance, (2) identify any changes or modifications to the project that may be needed, and (3) revise the automation investment process based on lessons that were learned. According to OIRM personnel, project managers should establish performance measurements fairly early in the system development life-cycle process.
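In essence, an outcome-based performance measurement pairs a target established early in the life-cycle with an actual result observed after deployment. The sketch below is purely illustrative; the measure names, targets, and results are hypothetical.

    # Hypothetical measures: (description, target, actual, higher_is_better)
    measures = [
        ("Average case processing time (days)",           10, 14, False),
        ("Lookout queries answered within 5 seconds (%)", 95, 96, True),
    ]

    for name, target, actual, higher_is_better in measures:
        met = actual >= target if higher_is_better else actual <= target
        print(f"{name}: target {target}, actual {actual} -> {'met' if met else 'not met'}")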

We found that INS still had not established processes to ensure that comprehensive performance measures were established for ongoing automation projects. Our discussions with OIRM managers disclosed that INS did not even have an agreed upon directory of automation projects for which performance measures should be established. During our initial audit, we identified 22 projects for which performance measures should have been established. However, during our current audit, the Acting Director, Systems Policy and Planning Branch, disagreed with our conclusion because she believed some of the 22 projects were, in effect, support systems for other automation projects.

Using a list of 96 projects provided to us in January 1998 by the INS Associate Commissioner for Management, the Acting Director identified a total of 34 projects for which, in her opinion, performance measures should have already been established. However, in interviewing OIRM personnel, we determined that performance measurements had been established for only 15 of the 34 projects (44 percent). As a result, INS will still not be able to evaluate whether a significant number of its completed automation projects will meet established goals.

E. Causes for Lack of INS Awareness Over Project Status

We identified three contributing causes for INS' lack of awareness of project status. First, the OIRM organization did not have a uniform directory of automation projects by which it manages its investment technology portfolio. Second, project status data was not readily available for review. Third, INS did not implement basic control processes to ensure its automation projects were adequately managed.

1. Lack of a Uniform Project Listing

To successfully manage its complex and extensive automation programs, INS must develop and maintain a uniform directory of the automation projects that make up its automation programs portfolio. The importance of developing and maintaining such a directory cannot be overstated, since it is the baseline by which OIRM should manage its automation programs. Without such a directory, there is no common baseline within the OIRM organization upon which team members and project managers can focus their collective efforts. With such a baseline, INS managers could ensure that ongoing projects are on time and under cost, will meet established functional and performance requirements, and have performance measurements established to ensure that, once deployed, they meet intended goals.
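A uniform project directory does not require elaborate tooling; the essential point is a single, agreed-upon record for each project that every other list and report is reconciled against. The sketch below shows one hypothetical minimal structure, not INS' actual data model.

    from dataclasses import dataclass

    @dataclass
    class ProjectRecord:
        """One entry in a uniform directory of automation projects (hypothetical fields)."""
        project_id: str
        name: str
        program: str                     # one of the eight functional program areas
        is_support_system: bool          # distinguishes support systems from projects proper
        tracked_in_monthly_review: bool

    directory = [
        ProjectRecord("P-001", "ENFORCE 1.2", "Enforcement Systems", False, True),
        ProjectRecord("P-002", "ICAD II",     "Enforcement Systems", False, True),
    ]

    # Budget line items, quarterly reports, and desk references would then be
    # reconciled against this single baseline rather than maintained independently.
    print(sum(1 for p in directory if p.tracked_in_monthly_review), "projects tracked")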

We found that OIRM did not develop and maintain a uniform directory of the automation projects making up its automation programs portfolio. Instead, managers at various levels throughout the OIRM organization created their own directories of projects based on their individual perceptions of what constitutes an automation project. As a result, OIRM managers did not have a common baseline of automation projects by which they could focus their collective efforts.

To illustrate the difficulties that can arise from the lack of a uniform project directory, we describe our own experience during this audit. Initially, INS managers provided us with a newly developed Desk Reference that identified 45 automation projects upon which we could base our audit testing. At the commencement of audit field work, however, INS provided us with another list identifying only 41 projects. When we pointed out the discrepancy between this list and the Desk Reference, we were provided with yet another list that identified 43 projects.

Subsequently, the Assistant Commissioner, Data Systems Division, told us that the list of 43 projects we were basing our audit tests on was inaccurate because most of the items listed were actually sub-projects, support systems, functions, or milestones for other projects. However, she was unable to provide us with what she considered to be an accurate project directory. In an effort to obtain a reliable directory of projects, we went to the OIRM Associate Commissioner, who told us that the 22 projects tracked in the monthly progress reviews would be the most reliable directory available. He cautioned, however, that this directory would be reliable only to a point, but he could not clearly define at what point the directory would become unreliable. At the same time, OIRM provided us with a FY 1997 budget line item status list that identified a total of 57 projects.

In a further effort to obtain a reliable directory, we went to the INS Executive Associate Commissioner for Management, who expressed concern that a uniform, agreed-upon directory of projects was unavailable. In response to that concern, OIRM provided us with another list of 96 projects. Nevertheless, this list was also subject to interpretation by OIRM personnel. For instance, when we showed the list to the Acting Director, Systems Policy and Planning Branch, she advised that 42 of the projects listed are actually support systems or financial tracking accounts for automation projects.

Subsequent to our audit, INS provided us with yet another list titled "FY 1998 Projects List" containing 101 automation projects.

2. Data Not Readily Available

In our judgment, effective management decisions can only occur if accurate, reliable, and up-to-date information is included in the decision-making process. To this end, project data must be easily accessible to program team members and senior managers. The OIRM must have hard numbers and facts on what was spent on its automation programs and what INS has achieved. Project information must be collected, maintained, and readily available, and an organizational track record should be maintained. Project results and lessons learned must be tracked and aggregated in order to further refine and improve decision-making.

During our follow-up review, we consistently found that project information fundamental to effective project management and decision-making was not readily available. As previously discussed, there was no uniform list of automation projects; there was no single, complete source for project status reporting; there was no readily available, single source for obtaining project deliverables, or waivers to life-cycle requirements; and there was no documented track record of management decisions. Further, there was no readily available, documented comparison of actual to estimated costs, or explanations of what funds would be spent on, or reasons for exceeding project schedules, or explanations of why planned project tasks were not achieved. Instead of easily accessible data with hard and fast figures, the repeated INS theme was that the real answers to our questions were just one more document, discussion, listing, or meeting away.

3. Ineffective Management Control Processes

To maintain effective project control, OIRM senior managers must develop, document, and implement control processes to regularly monitor the progress of ongoing projects against their cost, scheduled completions, anticipated performance and delivered benefits. How often and to what extent individual projects should be monitored is based on project risk, cost, and complexity. To this end, project review schedules accentuate management accountability by creating pre-arranged checkpoints for projects and forcing corrective actions when necessary.
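One way such review schedules could encode risk and dollar thresholds is sketched below. The threshold values, review levels, and frequencies are hypothetical, offered only to illustrate how predetermined checkpoints force problematic projects to higher review levels.

    def review_level(estimated_cost_millions: float, risk: str) -> str:
        """Return the management level and frequency at which a project is reviewed.
        Thresholds are illustrative assumptions, not INS policy."""
        if risk == "high" or estimated_cost_millions >= 50:
            return "Executive review board, monthly"
        if estimated_cost_millions >= 10:
            return "OIRM Associate Commissioner review, quarterly"
        return "Division-level review, semiannually"

    print(review_level(32.0, "high"))   # a $32 million, high-risk project -> top level
    print(review_level(4.5, "low"))     # a small, low-risk project -> division level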

Our discussions with OIRM senior managers disclosed that fundamental project review processes were not developed, documented, or implemented. Project review schedules were not established at pre-arranged checkpoints, nor were risk and dollar thresholds predetermined to identify problematic projects that should be elevated to the highest review levels within INS. Further, processes were not established that force senior managers to take decisive action to address problems, nor were management decisions documented to assemble a track record of results. As a result, there was minimal management accountability and little documented assurance that INS' automation projects would be completed under cost and on schedule and would meet performance and functional requirements.

F. Recent INS Developments to Initiate Corrective Actions

Subsequent to completion of our field work, INS stated that it is preparing to conduct, with contractor support, a detailed review of the processes used to plan, manage, and track and report on its automation programs. Additionally, INS stated that with the award of its Service Technology Alliance Resources (STARS) contracts, it has put in place a process, with an appropriate commitment of resources, to provide for complete project documentation. This documentation includes changes resulting in adjusted project milestones. The documentation will also include one or more layers of project-level cost tracking, formal deliverable acceptance procedures, and a Task, Review, Analysis, and Coordination (TRAC) process that will require regular meetings between INS and contractor personnel. It will also provide a detailed record of changes and decision making throughout each task.

Additionally, INS stated that with the assistance of its Systems Management and Integration (SM/I) contractor, it is also putting in place numerous management controls. These controls will improve accountability and documentation for the expenditure of funds for all IRM projects. According to INS, its new contracts are completely changing the way automation projects are managed. The STARS Statement of Work and the STARS Concept of Operations reflect the range of controls that are being required of STARS contractors.

With respect to complying with its system development life-cycle process, INS stated that, with assistance from its SM/I contractor, it hopes to more effectively manage and document its automation projects throughout the development life-cycle. By ensuring that all life-cycle documents are produced and archived appropriately, and by making more effective use of life-cycle processes, INS expects to keep its development and maintenance activities under better control, and ensure meaningful user participation.

With respect to outcome-based performance measures, INS stated that its STARS contracts include specific requirements for the development of performance measures for all tasks awarded to both the SM/I and performance contractors. These requirements are included in the STARS SM/I Performance Measurement Program Plan.

G. Recommendations

We recommend that the Commissioner, INS:

  1. Establish a uniform directory of INS automation projects for distribution throughout the OIRM organization.
  2. Prepare annual project cost estimates against which actual project costs can be monitored and controlled.
  3. Ensure that planned to actual activities and costs are adequately tracked and reflected in periodic reports on the INS automation projects. Causes for cost overruns, project delays, and unaccomplished tasks should be clearly explained and documented, along with management corrective actions.
  4. Ensure that monthly progress reviews provide clear, timely, and complete reporting for all of INS' automation programs.
  5. Require project managers to comply with the INS system development life-cycle process.
  6. Establish comprehensive performance measures for all projects in support of the automation programs and ensure the measures are used to assess the status of projects.
  7. Ensure that project information fundamental to effective project management and decision-making is easily accessible to program teams and senior managers.
  8. Ensure that OIRM senior managers develop, document, and implement control processes to regularly monitor the progress of ongoing projects against planned project costs, schedules, performance, and benefits.
  9. Establish an independent oversight group to ensure implementation of these recommendations.

2 As discussed on pages 10 and 11 of this report, there was considerable confusion within INS as to how many automation projects actually exist.

3 We selected these projects because of their importance to the INS mission and because of the projected high life-cycle costs.

4 System development costs for ICAD II were not readily available.