Audit of the Department of Justice Information Technology Studies, Plans, and Evaluations
Audit Report 07-39
Office of the Inspector General
This report is the final in a series of three reports prepared by the Department of Justice (Department) Office of the Inspector General (OIG) in response to a congressional request included in the Department’s appropriation for fiscal year (FY) 2006. Specifically, Congress instructed the OIG to present to the Committees on Appropriations: (1) an inventory of all major Department information technology (IT) systems and planned initiatives, and (2) a report that details all research, plans, studies, and evaluations that the Department has produced, or is in the process of producing, concerning IT systems, needs, plans, and initiatives. Congress requested that the OIG include an analysis identifying the depth and scope of problems the Department has experienced in the formulation of its IT plans.
The OIG’s first report, issued in March 2006, presented an unverified inventory of the Department’s major IT investments based on information reported to the Office of Management and Budget (OMB) for budget purposes. The inventory contained 46 major investments, each with projected costs at or exceeding $15 million for FYs 2005 through 2007.
The second report, issued in June 2007, presented the refined inventory of major systems according to criteria developed by the OIG, reducing the number of major systems to 38. The second report also examined issues related to verifying cost information about the 38 systems.
This third and final report addresses the request for the OIG to prepare a report that details the research, plans, studies, and evaluations related to the Department’s information technology initiatives. This report also includes an analysis of problems related to IT planning that have been identified in previous OIG reports.
Our work involved the Department’s Office of the Chief Information Officer and eight of the Department’s components or offices. We generally focused our audit on the 38 major systems and initiatives that were identified in the refined OIG inventory. The table below shows the number of systems for each Department component represented in the revised inventory.
|Component|Number of Systems|
|---|---|
|Bureau of Alcohol, Tobacco, Firearms and Explosives (ATF)|1|
|Bureau of Prisons (BOP)|1|
|Drug Enforcement Administration (DEA)|6|
|Executive Office for Immigration Review (EOIR)|1|
|Federal Bureau of Investigation (FBI)|21|
|Justice Management Division (JMD)|6|
|Office of the Deputy Attorney General (ODAG)|1|
|Office of Justice Programs (OJP)|1|
|Total|38|

Source: Department of Justice, Office of the Inspector General
The types of systems, stages of development, and scopes of the projects vary widely. The systems include infrastructure acquisitions and application development projects, some in the early phases of planning and others that have been operational for several years.
OIG’s Audit Approach
Our audit objectives were to: (1) identify all research, plans, studies, and evaluations that the Department has produced, or is in the process of producing, concerning IT systems, needs, plans, and initiatives; and (2) analyze the depth and scope of the problems the Department has experienced in the formulation of its IT plans.
We identified relevant federal, Department, and component-specific requirements and standards for IT research, studies, plans, and evaluations, and merged the various standards into a generic set of documents. We requested and obtained documents from the components to develop the inventory, and assessed compliance with the document standards for the major systems in the inventory. For this audit report, we focused specifically on studies and research that justified the selection of investments in the OIG’s revised inventory of major IT systems and projects, plans that were developed after the investments were authorized, and evaluations that were performed after systems were implemented.
To evaluate problems the Department has experienced in its IT planning, we analyzed the evaluations obtained for information about problems the Department has experienced in formulating IT plans. We reviewed relevant audit and other independent reports, extending the scope of our audit work to some systems and projects that were not included in the inventory of major systems. We also asked the components to inform us of IT projects that had been terminated or had experienced problems.
The Deputy Assistant Attorney General for Information Resources Management (DAAG/IRM), who reports to the Assistant Attorney General for Administration, serves as the Department’s Chief Information Officer (CIO). The CIO’s responsibilities include establishing and implementing Department-wide IT policies and standards, developing the Department’s IT Strategic Plan, and reviewing and evaluating the performance of the Department’s IT programs and projects. In his role as the DAAG/IRM, the CIO leads the Information Resources Management (IRM) office of the Justice Management Division (JMD).
JMD developed and operates many systems that serve more than one component in the Department. The Department’s other components are responsible for providing information to the CIO, demonstrating that resources are being well spent and managed, and using the methodology in the Department’s standards for information systems. Each of the components included in the revised inventory has its own CIO, except for the Office of the Deputy Attorney General.
Numerous federal, Department, and component guidelines establish criteria for IT research, studies, plans, and evaluations. The guidelines come from both IT and budget authorities, and can apply to the Department as a whole or to individual components, such as the DEA or FBI. The various standards should complement one another. However, the IT compliance environment is complex and involves strategic planning, IT development methodologies, IT investment management, enterprise architecture, procurement, and budgeting.2 Additionally, many standards exist as guidelines rather than requirements, and allow flexibility for variation.
IT projects can be expected to go through a process of identifying a business need and alternative solutions for meeting the need, selecting the best alternative, planning to acquire or build the solution, defining specific requirements, and designing, building, testing, implementing, and evaluating the implemented solution. The Department’s Systems Development Life Cycle Guidance Document (SDLC) describes 10 life-cycle phases with associated tasks and deliverable products, including specific studies, plans, and evaluations. For different types of acquisitions and smaller-scope projects, the life-cycle work pattern can be tailored to reduce the workload from a full sequential work pattern. Tailoring the work pattern may include dropping requirements for specific tasks, studies, plans, and evaluations. Different sets of deliverables are identified in other standards, such as the Department’s Information Technology Investment Management Guide (ITIM Guide) and the FBI’s Life Cycle Management Directive (LCMD).3
Both the SDLC and ITIM tasks and deliverables generally follow the progression of IT projects chronologically. Under both, studies and research, such as alternatives analyses, feasibility studies, risk analyses, and market research for possible solutions, are performed early in the life of a system as the basis for selecting the best alternative and preparing the business case for the project. Major plans of all types, such as project management plans and quality assurance plans, are developed after the selected approach has been authorized. Post-implementation reviews, in-process review reports, and user satisfaction reviews are types of evaluations that occur after an IT system has been implemented or a project has been terminated. We used this chronological approach to identify and organize the studies, research, plans, and evaluations that are addressed in this audit.
This chronological approach is qualified by the evolutionary nature of the entire life-cycle process. As projects evolve to become more defined over time, plans should also become more defined. The life cycle of identifying business needs, selecting best alternatives, determining which IT investments should be added to and continued in the Department’s portfolio, acquiring and building solutions, and evaluating the results is intended to be iterative and ongoing. Both the SDLC and ITIM also require various types of ongoing evaluations to occur regularly as decision points are reached during the course of IT projects.
Department IT Studies, Plans, and Evaluations
Two comprehensive IT plans for the Department are required by Office of Management and Budget (OMB) standards: the Department’s IT Capital Plan and IT Strategic Plan. The IT Capital Plan, Agency IT Investment Portfolio, described in the second of the OIG’s three IT reports, represents the Department’s inventory of major IT investments. For this audit, we reviewed the Department’s IT Strategic Plan, which is described in Finding 1 of this report. Components are also allowed to develop their own IT strategic plans, as long as they are consistent with the Department’s plan.4 Five of the components we reviewed had developed their own IT strategic plans. The IT strategic plans are listed in Appendix III of this report. All other documents described in Finding 1, “Studies, Plans, and Evaluations,” were prepared in response to standards associated with each system or initiative.
Studies required by the various standards for IT activities and documents associated with each IT system or project are generally prepared early in the life cycle of an IT project to identify and evaluate possible alternative solutions to meet a business need. The studies include market research, alternative analyses, feasibility studies, cost-benefit analyses (or benefit-cost analyses), risk analyses, and privacy impact assessments.
The plans specified by the Department’s SDLC for each IT system or project include many types that are developed after an alternative solution has been selected, such as project management, systems engineering management, configuration management, quality assurance, validation and verification, training, and security plans.
For evaluations, we requested reports of evaluations specified in the SDLC, such as post-implementation review reports, in-process review reports, and user satisfaction review reports. Post-implementation reviews are conducted after a system has been in production for a period of time and are used to evaluate the effectiveness of the system development. The review should determine whether the system does what it was designed to do, supports users as required, and was successful in terms of functionality, performance, and cost benefit. It should also assess the effectiveness of the development activities that produced the systems. The review results should be used to strengthen the systems as well as system development procedures. In-process reviews are performed during operations and maintenance to assess system performance and user satisfaction, and should occur repeatedly after a system has been implemented to ensure the system continues to meet needs and perform effectively.
Components submitted more than 800 items that we accepted as responsive to our requests. Of these, 494 were entire documents that we categorized as studies, plans, and evaluations and included in our list of documents. The other items submitted by components were artifacts or other products of the system development and acquisition process, such as briefing slides, spreadsheets showing schedules and work breakdown structures, and various progress reports. The studies, plans, and evaluations are listed in Appendix V to this report.
While many of the documents specified in various guidelines were produced, significant gaps existed between the studies, plans, and evaluations described in the guidelines and what was prepared by the components. Only seven post-implementation evaluations were obtained, of which four did not reflect lessons learned in terms of project planning and management.
We found the highest levels of compliance in the areas of business case documents, which become part of the Department’s annual budget process and are required to obtain funding for each system or project, and security plans, which are required for projects to obtain authorization to operate. The components provided at least one business case document for 36 of the 38 systems in the inventory. The two exceptions, the FBI’s Investigative Data Warehouse (IDW) and Secure Compartmented Information Operational Network (SCION), are included in an “umbrella” business case that represents the Department’s consolidated enterprise infrastructure (CEI). The business case document represents the single document type for which we found 100 percent compliance.
System security plans also had a high level of compliance. We obtained system security plans for 32 of the 38 projects. The six other projects were either too early in the life cycle for preparation of this document, or a draft security plan was undergoing review. Components also demonstrated a high level of compliance with privacy impact assessments (PIA), and we found acceptable explanations for the projects that did not submit a PIA. Components provided project management plans for 29 of the 38 projects, and explained all but one of those exceptions.
However, we found compliance in the areas of systems engineering management, configuration management, quality assurance, validation and verification, and training plans was significantly lower. The components cited several different reasons for not providing documents relating to these issues that we requested. The reasons included: (1) the requirement was not applicable to the investment; (2) a waiver to the requirement had been granted; (3) planning for the system pre-dated FY 2000 and the documentation was not available; (4) the system was purchased commercially off-the-shelf eliminating the need for certain processes; and (5) the investment had not reached the applicable point in the life cycle.
Department oversight is designed to focus on the capital planning and investment control (CPIC) process concerned with selecting and prioritizing IT investments. According to JMD officials and DOJ Order 2880.1B, Department oversight is not designed to enforce policies and procedures on documentation.5 JMD officials told us they do not perform independent reviews of the other components’ IT projects, nor do they receive major studies, plans, and evaluations from the components to review. Department-level oversight of major IT projects is performed through presentations to the Department’s Investment Review Board, the CIO’s Dashboard report, and the OMB Exhibit 300s, all of which are described in the second report in this series. This allows some tracking of actual performance against scheduled milestones and costs, but does not involve JMD officials in the details of IT documentation for individual projects.
Based on the limited number of certain types of plans and evaluations produced on these major systems and projects, we recommend that the CIO evaluate why project teams do not prepare certain plans and evaluations, reassess the utility of those documents, and consider revising the standards for producing IT studies, plans, and evaluations for individual IT projects.
Many standards define the types of studies, plans, and evaluations that should be performed for individual projects. The standards allow significant flexibility through waivers of document requirements and tailoring of the processes. For example, both the Department’s SDLC and the FBI’s LCMD encourage tailoring the documentation standards to the size and complexity of the project. Although the SDLC’s tailoring guidelines specify many studies, plans, and evaluations for all types of projects, we found that many Department projects have not generated these “required” documents. It is possible that the standards are not appropriate to all types of projects or acquisitions and should be revised. The Department should exercise increased oversight of the tailoring being done, and consider revising the guidelines for tailoring the work pattern for specific types of projects.
IT Planning Problems
To identify problems the Department has experienced in planning for IT systems and projects, we reviewed previous OIG audits and other reports. We also reviewed the evaluations we obtained from the components to help identify problems the Department has experienced in planning for IT systems.
We asked components for information on IT projects that had failed or been terminated. Other than one portion of the FBI’s Trilogy project and the FBI’s Laboratory Information Management System (LIMS) project, the components told us they were not aware of failed or terminated projects. During work on the second report in this series, the OIG found that a predecessor of JMD’s current Justice Consolidated Office Network (JCON) project had been terminated sometime before FY 2002. JMD, however, was not able to provide any information about the failure. The fact that no evaluation was performed to assess reasons for the failure suggests a serious gap in standards for evaluations. Terminated projects should be evaluated to determine the causes of the problems.
We also found that the Department had produced few evaluations of project management or success for IT projects in post-implementation reviews. According to the DOJ SDLC, one purpose of post-implementation reviews is to assess the effectiveness of the life-cycle development activities that produced the system. This includes analyzing if proper limits were established in the feasibility study and if the limits were maintained during implementation, addressing the reasons for variances between planned and realized benefits, addressing the reasons for differences between estimated and actual costs, and evaluating whether training was adequate, appropriate, and timely. The review results are intended to be used to strengthen the system development procedures, as well as the system itself.
The DOJ ITIM Guide calls for continuous monitoring of investments to assess progress against established cost, schedule, and performance metrics in order to mitigate risks and costs on an ongoing basis. The DOJ ITIM Guide also indicates that the activities of the evaluation phase include applying lessons learned from post-implementation reviews and periodic operational analyses for ITIM process improvement. Lessons learned for the ITIM process should be incorporated into the select and control phases for future IT investments.
The OIG has issued audit and inspection reports about IT systems and project management that have focused on various IT concerns. These include the management and progress of individual IT projects, IT management in general, the performance of individual systems following implementation, system security, and system controls in financial management systems. Appendix VII lists prior OIG audits and inspections on IT issues that we reviewed for this analysis.
Among the problems that have been described in previous audit reports related to IT planning were weaknesses in investment and program management practices, business process re-engineering (BPR), cooperation between agencies, and contract management. BPR is defined as the redesign of the organization, culture, and business processes using technology as an enabler to achieve significant improvements in cost, time, service, and quality.
For example, various contracting and program management weaknesses contributed to the failure of the FBI’s Virtual Case File (VCF) project. The FBI did not effectively oversee the contract and failed to establish firm milestones to be achieved before the project could move to the next phase. In the FBI’s LIMS project, the OIG found that firmly managed schedule, cost, technical, and performance benchmarks for the contract would have raised warning signs earlier in the project and perhaps led to resolution of the problems encountered.6
The DOJ System Development Life Cycle Guidance Document indicates that BPR should be the underpinning of any new system development or initiative as part of strategic planning for information systems, and that agencies should consider BPR before requesting funding for a new project or system development effort. However, reviews have identified weaknesses in business process re-engineering in the planning of the Department’s IT projects. One study of the FBI’s terminated VCF project found that senior managers were not involved in efforts to re-engineer business processes or in rethinking the FBI’s use of IT, and that while the users working on the re-engineering were experienced agents, none had experience with complex IT development projects or business process re-engineering.
Requirements planning is another area that has been cited as weak in specific audit reports. For example, the LIMS project was terminated in large part due to problems with the security requirements of the system, which were not fully defined early in the project. The LIMS Request for Proposal (RFP) had required security to be part of the system, but the FBI strengthened its security requirements after the contract award following high-profile espionage-related security breaches in the FBI. The audit found that the FBI had failed to document security requirements adequately and, to the extent the security requirements evolved, did not clarify those changes through contract modifications.
This audit sought to identify research, plans, studies, and evaluations that the Department has produced or is in the process of producing concerning IT systems, needs, plans, and initiatives. In addition, we analyzed the depth and scope of the problems the Department has experienced in the formulation of its IT plans.
Components submitted 494 documents that we categorized as studies, plans, and evaluations, related to federal, Department, and component-specific requirements and standards. Many of the documents specified in various criteria were produced, but significant gaps existed between the studies, plans, and evaluations described in criteria and what was prepared.
We found the highest levels of compliance in the areas of business case documents, which become part of the Department’s annual budget process and are required to obtain funding for each system or project, and security plans, which are required for projects to obtain authorization to operate. The components provided at least one business case document for 36 of the 38 systems in the inventory. The two exceptions, the FBI’s Investigative Data Warehouse (IDW) and Secure Compartmented Information Operational Network (SCION), are included in an “umbrella” business case that represents the Department’s consolidated enterprise infrastructure (CEI).
System security plans also had a high level of compliance. We obtained security plans for 32 of the 38 projects. The six other projects were either too early in the life cycle for preparation of this document, or a draft security plan was undergoing review. Components also demonstrated a high level of compliance with privacy impact assessments (PIA), and we found acceptable explanations for the projects that did not submit a PIA. Components also provided project management plans for 29 of the 38 projects, and explained all but one of those exceptions.
However, we found compliance in the areas of systems engineering management, configuration management, quality assurance, validation and verification, and training plans was significantly lower. In addition, components provided only seven post-implementation review reports.
Prior OIG reports have identified planning problems on individual systems and projects that include weaknesses in business process re-engineering, requirements planning, cooperation between agencies, and IT program and contract management. These weaknesses have contributed to:
project re-starts, cost increases, and delays in the FBI’s implementation of a case management system,
the termination of the FBI’s LIMS project,
delays in implementing an interoperable fingerprint identification system that can be used by both the Department and federal immigration authorities, and
data integrity problems in the Terrorist Screening Center (TSC) database.
We originally planned to use evaluations we obtained from components to identify problems the Department has experienced in planning for IT systems. This was not possible because the Department has produced so few evaluations of project management for either successful or failed IT projects, with the exception of two terminated projects in the FBI.

In this report, we make five recommendations to the Department, including that the Department evaluate why project teams do not prepare certain plans and evaluations, reassess the utility of those documents, and consider revising the standards for producing IT studies, plans, and evaluations for individual IT projects. We also recommend that the Department consider revising the guidelines for tailoring the life-cycle work pattern for specific types of projects. Additional recommendations focus on improving the evaluation of IT project management in the Department and improving business process re-engineering, contract management, and oversight. We believe the Department should ensure that evaluations focusing on lessons learned about planning and project management are performed on both implemented systems and terminated projects.
In the previously issued OIG report on Identification and Review of the Department’s Major Information Technology Systems Inventory, which provides information on the cost of the Department’s major IT systems, we included seven systems for the DEA and none for the ODAG. The seven systems included the Organized Crime Drug Enforcement Task Force (OCDETF) Fusion Center System (OFC) because the DEA’s unobligated funds developed the OFC. However, in this report we include the OFC as part of the ODAG because the system actually resides in that office.
Enterprise architecture (EA) is a blueprint that explains and guides how an organization’s IT and information management elements work together to accomplish the mission of the organization. An EA addresses business activities and processes, data sets and information flows, applications and software, and technology.
ITIM processes help identify needed IT projects, select new projects, and track and oversee project costs and schedules. The LCMD is the FBI’s systems development life cycle guidance defining IT project management procedures and documentation requirements.
DOJ Order 2880.1B, Information Resources Management Program, allows, but does not require, components to develop their own IT strategic plans.
The CIO does have specific responsibilities to enforce security standards.
The FBI’s Laboratory Information Management System (LIMS) project contract was terminated after the FBI determined the system would not be able to meet security requirements. See the discussion in Finding 2.