Audit of the Department of Justice Information Technology Studies, Plans, and Evaluations

Audit Report 07-39
August 2007
Office of the Inspector General


Findings and Recommendations

Finding 1:  Studies, Plans, and Evaluations

Inventory of Studies, Plans, and Evaluations

To identify specific IT research, studies, plans, and evaluations, we interviewed Department officials and reviewed the guidelines described in the Introduction to this report. We used the guidelines listed in Figure 5 as the basis for requesting specific studies, plans, and evaluations of IT needs, opportunities, projects, and systems. Each of the guidelines in Figure 5 is described in the Introduction to this report.


Guidelines for IT Studies, Plans, and Evaluations

Figure 5
Guideline and Date | Applies to
DOJ Systems Development Life Cycle Guidance Document, revised 2003 | Department of Justice
Guide to the DOJ Information Technology Investment Management (ITIM) Process, August 2001 | Department of Justice
DOJ Order 2880.1B, Information Resources Management Program, September 2005 | Department of Justice
OMB Circular A-11, Preparation and Submission of Budget Estimates, June 2005 | All Federal agencies
DEA Systems Development Life Cycle Guidance Document, March 2000 | Drug Enforcement Administration
FBI Life Cycle Management Directive, revised August 2005 | Federal Bureau of Investigation
Sources: Department of Justice components

We used the DOJ SDLC as the primary criterion to identify the studies, plans, and evaluations that should be prepared when developing and implementing IT projects. The standards use various names for documents and organize the information differently. For this report, we combined the various specific standards into a generic set of studies, plans, and evaluations that could be applied to all of the IT systems and projects in our inventory. Because we found little research documented outside of the OMB exhibit 300, we included “research documents” under the category of “studies.”

We requested specific documents directly from each of the components because the CIO’s office did not maintain major documents produced for component-specific systems. OCIO officials told us that Department oversight is designed to focus on the Capital Planning and Investment Control process for selecting and prioritizing IT investments. It is not designed to enforce policies and procedures on IT project documentation.17 Department-level oversight of individual IT projects is performed through presentations to the Departmental Investment Review Board, the CIO’s Dashboard report, and through the OMB exhibit 300s, all of which are described in the second report in this series. This oversight includes tracking of actual performance against scheduled milestones and costs, but does not involve JMD officials in the details of IT documentation for individual projects.

In addition to requesting specific documents, we asked the components to provide any additional documents they had prepared that would qualify as IT studies, research, plans, or evaluations. We requested a slightly different list of documents from the FBI than from the other components because some of the FBI’s LCMD standards varied from those used by the other components. These variations are described later in the report in the discussion of each document type. We found that the standards for preparing studies, plans, and evaluations as part of the IT development process come from a variety of sources that overlap, duplicate one another, and may prove cumbersome.

Figure 6 lists the generic set of documents we requested, which combines the specific requirements from each guideline into criteria we could apply to all of the IT systems and projects in our inventory. All of the documents listed below are applicable to each IT system or project, with the exception of an IT strategic plan, which is required for the Department but optional for components.


Studies, Plans, and Evaluations Requested

Figure 6
  • Business case studies
  • Market research
  • Alternatives analyses
  • Feasibility studies
  • Cost benefit analyses
  • Privacy impact assessments
  • IT strategic plans
  • Risk management plans
  • Acquisition plans
  • Project management plans
  • System security plans
  • Systems engineering management plans
  • Configuration management plans
  • Quality assurance plans
  • Validation and verification plans
  • Test plans
  • Conversion plans
  • Implementation plans
  • Training plans
  • Contingency & continuity of operations plans
  • Disposition plans
  • Test reports
  • Ongoing reviews of project status and earned value management
  • Post-implementation review reports
  • Any other IT-related research, plans, studies, and evaluations the component performed or sponsored
Source: OIG compilation of standards

Department components submitted more than 800 documents and other evidence that we accepted as responsive in some way to our requests. Of these responses, 494 were complete documents representing studies, plans, and evaluations. The responses also included other products or artifacts of the system acquisition and development process. Artifacts included items such as briefing slides, spreadsheets showing schedules and work breakdown structures, portions of the OMB exhibit 300, and various forms of progress reports. We included other artifacts in this report to the degree that they contributed to compliance with the various standards for documentation.

A detailed listing of the studies, plans, and evaluations we obtained for each project is located in Appendix VI of this report, along with a short summary about the project. Appendix V lists all documents and other artifacts we determined contributed to compliance with the various standards. The numbers include some duplicate counting of single documents because components sometimes submitted one document to fulfill more than one category.

The components cited several reasons for not providing all of the documents we requested. Specifically, components said: (1) the requirement was not applicable to the investment; (2) a waiver to the requirement had been granted; (3) planning for the system pre-dated FY 2000 and the documentation was not available; (4) the system was purchased as a commercial off-the-shelf product, eliminating the need for certain processes; and (5) the investment had not reached the applicable point in the life cycle.

Figure 7 shows the number of documents we received that we determined to be responsive to our document request for each system or project.


DOJ IT Studies, Plans & Evaluations Received

Figure 7
Component   System or Project   Studies, Plans, & Evaluations
ATF NIBIN 17
BOP ITS II 11
DEA Concorde 15
DEA E Com 18
DEA EIS 20
DEA Firebird 14
DEA M204 8
DEA Merlin 14
EOIR eWorld 9
FBI BRIDG 4
FBI CARTSAN 11
FBI CODIS 7
FBI DCS 15
FBI DCU 6
FBI EDMS 16
FBI FTTTF 6
FBI IAFIS 24
FBI IATI 18
FBI IDW 5
FBI LEO 9
FBI NCIC 19
FBI N-DEx 6
FBI NGI 11
FBI NICS 16
FBI R-DEx 5
FBI SCION 4
FBI Sentinel 11
FBI SMIS 21
FBI TRP 2
FBI TSC 8
JMD CITP 27
JMD IWN 23
JMD JCON 27
JMD LCMS 6
JMD PKI 15
JMD UFMS 18
ODAG OFC 16
OJP JGMS 12
  TOTAL 494
Source: Documents submitted by DOJ components in response to the OIG’s request

Two comprehensive IT plans for the Department are required by OMB standards: the Department’s IT Capital Plan and IT Strategic Plan. The IT Capital Plan, Agency IT Investment Portfolio (exhibit 53), was described in the second report in this series of three audits, as it represents the Department’s inventory of major IT investments. The Department’s IT Strategic Plan and the component plans are described below. All other documents discussed under the section “Studies, Plans, and Evaluations” are governed by standards that apply to each individual system or initiative.

The Department of Justice IT Strategic Plan

OMB Circular A-130, Management of Federal Information Resources, requires that federal agencies maintain strategic plans for information resources management. According to OMB, the plans should: (1) support the agency’s Strategic Plan and (2) provide a description of how information resources management activities help accomplish agency missions and ensure that IT decisions are integrated with organization planning, budget, procurement, financial management, human resources management, and program decisions. DOJ Order 2880.1B, Information Resources Management Program, September 27, 2005, assigns responsibility to the CIO for developing, maintaining, and implementing the Department’s IT Strategic Plan, and requires that it be aligned directly with the Department’s Strategic Plan.

The Department of Justice Information Technology Strategic Plan for 2006 – 2011, June 2006, is designed to align IT strategic goals with the four strategic goals in the Department’s Strategic Plan:

To help the Department accomplish its goals, the IT Strategic Plan sets out five specific IT goals:

The IT Strategic Plan provides objectives for each goal and strategies for each objective. The IT goals, objectives, and strategies are intended to guide the Department’s technology capabilities toward specific outcomes. The Plan also introduces performance strategies as a means of measuring the Department’s progress against its objectives. These describe, at a high level, the expected performance, while specific metrics are developed for each investment. Further, at least one performance measurement is defined for each objective.

We reviewed the Department’s IT Strategic Plan for compliance with the requirements stated in OMB Circular A-130. We found that the IT Strategic Plan supported the Department’s Strategic Plan and that it contained the required description of how information resources management activities help accomplish agency missions and ensure that IT decisions are integrated with organization planning, budget, procurement, financial management, human resources management, and program decisions.

Component IT Strategic Plans

DOJ Order 2880.1B, Information Resources Management Program, allows, but does not require, components to develop their own IT strategic plans. It also requires component-specific IT strategic plans to reflect and be aligned with the strategies in the Department’s IT Strategic Plan.

Five of the eight components included in this audit have developed component-specific IT Strategic Plans (ATF, BOP, DEA, EOIR, and FBI). Those IT Strategic Plans are listed in Appendix III. The Department’s IT Strategic Plan was prepared by JMD.

We reviewed the IT Strategic Plans for the five components to evaluate compliance with the requirement that they be aligned with the Department’s IT Strategic Plan. We found that the component IT Strategic Plans are generally consistent with the Department’s Plan.

While a strategic plan is required at the Department level and components have the flexibility to develop their own strategic plans, most standards that exist related to IT studies, plans, and evaluations are applicable to individual IT systems and projects rather than to the Department or its components. The following sections on studies, plans, and evaluations focus on the standards that apply to individual systems and projects.

IT System and Project Documents

This section presents a summary of what we obtained from components by type of document, along with a discussion of the specific standards for studies, plans, and evaluations.18 Our approach for discussing documents in this section is generally chronological, following documents as they are produced during the development of an IT project.

We applied each document type discussed below as a test of compliance for studies, plans, or evaluations. We also assigned unique numbers to individual documents and artifacts. We prepared a matrix identifying the individual documents and artifacts we determined were responsive to our requests for studies, plans, and evaluations. Appendix IV contains the matrix of document types and systems, with identifying numbers representing individual documents that met each standard. Appendix V lists individual documents in numerical order to match with items in the matrix.

Determining compliance with the standards for studies, plans, and evaluations was complicated by variations in criteria, the long duration of many projects, and the fact that criteria changed over time. It was further complicated because the components allow waivers or tailoring of the standards for each project, depending on the nature of the project. We agree that flexibility and tailoring are reasonable. Because we could not perform individual audits of each of the 38 systems and initiatives in the inventory, the following discussion addresses compliance in terms of whether the components provided documents consistent with the generic standards we used. It was not our intent to suggest that any individual project was out of compliance at any given time, since almost no document is absolutely required. Instead, our intent was to examine how consistently the components produced certain documents specified by the various criteria.

Business case studies, system security plans, and PIAs are all required by criteria other than the Department’s SDLC or the FBI’s LCMD. Components must obtain funding for IT projects through the OMB exhibit 300, which summarizes the business case; must provide system security plans to obtain authorization from Departmental IT security authorities to begin operating a system; and must complete PIAs to comply with privacy laws.

We found the highest levels of compliance in the areas of business case documents, which become part of the Department’s annual budget process and are required to obtain funding for each system or project, and system security plans, which are required for projects to obtain the Department’s authorization to operate. The components provided at least one business case document for 36 of the 38 systems in the inventory. The two exceptions, the FBI’s Investigative Data Warehouse (IDW) and Sensitive Compartmented Information Operational Network (SCION), are included in an “umbrella” business case that represents the Department’s consolidated enterprise infrastructure (CEI). The business case document represents the single document type for which we found 100 percent compliance.

System security plans also had a high level of compliance and we obtained security plans for 32 of the 38 projects. The six other projects were either too early in the life cycle for preparation of this document, or a draft security plan was undergoing review. Components also demonstrated a high level of compliance with PIAs, and we found acceptable explanations for the projects that did not submit a PIA. Components also provided project management plans for 29 of the 38 projects, and explained all but one of those exceptions.

However, we found that compliance in the areas of systems engineering management, configuration management, quality assurance, validation and verification, and training plans was significantly lower.

The discussion in this section includes the numbers of whole documents we obtained that represented studies, plans, and evaluations. In the compliance matrix in Appendix IV, we also included other artifacts that were submitted in lieu of, or in addition to, entire documents.

Studies

The studies required by the various standards for each IT system or project are generally performed early in the project’s life cycle to identify and evaluate possible alternative solutions to meet a business need.19 The studies include market research, alternatives analyses, feasibility studies, cost-benefit analyses (or benefit-cost analyses), risk analyses, and PIAs.20

While the Department and DEA SDLCs specify separate documents for these studies, the FBI LCMD groups all except the PIAs into a business case document that is a virtual image of the business case section of the OMB exhibit 300 required to be submitted as part of the Department’s budget. For reporting purposes, we organized the studies into groups called market/other research, business case studies, and PIAs.

As we conducted our audit, we became aware of a study that did not fit into the categories below that is related to a case management/common solution architecture for the Department. The 2004 study, sponsored by the CIO and performed by a contractor, is being used as the basis for JMD’s LCMS project.21

Market and Other Research

The only type of research mentioned in various criteria for IT documentation is market research. The DOJ ITIM Guide specifies market research through reference to the OMB exhibit 300, Capital Asset Plan and Business Case. Item 1.A. of section I.E., Alternatives Analysis, of the exhibit 300 instructs agencies to discuss the market research that was conducted to identify innovative solutions for the investment. OMB’s Capital Programming Guide indicates that federal agencies should conduct market surveillance and research to ensure that as many alternative solutions as possible are identified for consideration once an agency need has been identified. It lists announcements, requests for information, or requests for proposals to solicit information on alternative concepts from a broad base of qualified firms. It also states that emphasis should be placed on solutions that are currently available and do not require significant development in order to minimize risk.

While market research is the only type of research specifically identified in the exhibit 300, we asked components to identify and provide any other research that had been performed in connection with their planned IT projects. Components told us there was virtually no additional IT-related research being conducted separate from the market research that is performed as part of building a business case for a system.

We requested market research from all components except the FBI because the FBI’s LCMD does not specify market research independent of the business case.

Components provided 16 documents reflecting market research related to 11 of the 17 non-FBI projects. The market research documents we received included market research reports for DEA’s Firebird and JMD’s IWN and LCMS projects; requests for comment or information for the BOP’s ITS-II and JMD’s JCON; a summary report of vendor responses for JMD’s UFMS; two comparative analyses of other federal systems for the ODAG’s OFC; a report on public key infrastructure possibilities for the DEA’s E-Commerce project; and a report on digital audio recording alternatives for the EOIR’s eWorld project. We obtained seven other artifacts related to market research for three projects that had also submitted documents we accepted as studies.

The 16 studies included in this discussion were separate from responses to the market research section of the Capital Asset Plans and Business Cases (OMB exhibits 300). Information included in the Capital Asset Plan and Business Case generally indicates that some market research was performed to help identify potential solutions. Six non-FBI systems were not represented by market research studies apart from the OMB exhibits 300. These six were NIBIN, Concorde, E-Commerce, M204, Merlin, and JGMS. Like the FBI projects, these six projects submitted OMB exhibits 300.

Business Case Studies

When managers decide that a system concept is worth developing further, work is performed to identify and evaluate alternative solutions. This item reflects studies performed to support the selection of a project or system and includes the following types of analyses, which are frequently combined in one or two documents:

The Department SDLC and FBI LCMD standards vary considerably in terms of where certain types of information for these analyses should be found, but the basic information required is similar between standards. The Department and DEA SDLCs specify preparing the following:

The DOJ ITIM Guide specifies a business case analysis that reflects the requirements for the Capital Asset Plan and Business Case (OMB exhibits 300) to summarize the results of:

The FBI LCMD specifies an initial and a final business case that is virtually identical to the requirements for the OMB exhibit 300.

This audit generally reports on documents the components provided in response to our requests for specific documents. One exception to this is for business case studies. Some components submitted the OMB exhibit 300, Capital Asset Plan and Business Case, in response to this request, but many did not. Therefore, we obtained a number of additional exhibits 300 from other sources and have credited them to this test regardless of how the component responded to the document request.

Overall, we obtained 46 documents we categorized as business case studies, including at least one for 36 of the 38 systems or projects. The 46 business case studies include multiple documents for 9 projects. Several components submitted more than one OMB exhibit 300, representing multiple budget years, but we counted multiple exhibits 300 for each project as one document. Additional studies we obtained included alternatives analyses and cost-benefit analyses.

We did not receive a business case study for the FBI’s Investigative Data Warehouse (IDW) or Sensitive Compartmented Information Operational Network (SCION) projects because they are included in the OMB exhibit 300 for the Department’s consolidated enterprise infrastructure.

Additionally, we obtained more than 70 other artifacts related to business cases, including feasibility statements, mission needs statements, concept of operations documents, and cost benefit analysis spreadsheets.

Overall, we found the highest level of compliance with standards in the area of business case studies. The budget requirement for the OMB exhibit 300 undoubtedly contributes to the high level of compliance, as the case study is needed to obtain funding as part of the budget process.

Privacy Impact Assessments

PIAs are required by DOJ Order 2880.1B to ensure the Department reviews the potential impacts on individuals’ privacy that may result from the development and use of computer-based information systems that collect or store personal data about individuals. All components are required to conduct a PIA for any new information system that contains sensitive information about individuals, uses new techniques to manipulate existing data about individuals in a way that makes such data readily retrievable, or collects and maintains personal information about individuals that has not previously been collected and maintained by the component. JMD is responsible for enforcing compliance with this policy through the Department’s ITIM process.

The DOJ ITIM Guide instructs components to address privacy issues in developing the business case, in preparing the Capital Asset Plan and Business Case, and when preparing a disposal plan. The DOJ and DEA SDLCs require the PIA to be performed as part of the requirements analysis phase of a system’s life cycle.

The DOJ SDLC defines a PIA as a written evaluation of the impact that the implementation of the proposed system would have on privacy. Guidance for preparing a PIA is provided on the Department’s intranet, and consists of a list of questions to be answered about data in the system and the impact of the system on privacy. The assessment begins with a privacy threshold analysis to determine whether there is a need for a full PIA for each system.

Compliance with the PIA requirements appears consistent. We obtained 33 PIAs and privacy threshold analyses for 23 of the 38 systems and projects in the revised inventory, with some components submitting separate PIAs for different functions or modules of a system. PIAs were not required for every project. The threshold analyses for NIBIN and PKI determined there was no need for a full PIA for those systems, and we obtained an initial or full PIA for the other 21 of the 23 systems and projects.

We did not obtain PIAs or threshold analyses for 15 systems or projects. DOJ Order 3011.1A, Compliance with the Privacy Requirements of the Privacy Act, the E-Government Act, and FISMA, March 6, 2007, states that PIAs identifying how information in identifiable form is collected, stored, protected, shared, and managed in an IT system or online information collection are required when developing or procuring new technology or making substantial modifications to existing technology. This would exempt older systems that have not undergone significant modification of the kind described. Although the scope of this audit did not include evaluating information about modifications to all of the older systems in the inventory, this order would appear to exempt 7 of the remaining 15 systems: the DEA’s M204 corporate systems and the FBI’s DCU, IAFIS, LEO, NCIC, NICS, and R-DEx.

The DEA responded that a PIA was too broad for the infrastructure project Firebird and not applicable to Merlin, which is also an infrastructure project. The FBI told us that the PIAs for the FTTTF and TSC existed, but did not provide them to us. We did not obtain PIAs or explanations for the FBI’s IDW or TRP. The TRP is, however, at the beginning of its life cycle and is not yet at the phase of the FBI’s LCMD that requires a PIA. JMD told us that the PIA for IWN was not completed yet, and OJP responded to this item for the JGMS with its certification and accreditation plan of actions and milestones.

Plans

The plans specified by the DOJ and DEA SDLCs for each IT system or project include many types of plans that are developed after an alternative solution has been selected. These plans include:

The FBI LCMD also requires many of the same plans, but uses different names for some. Each of the differences is described below.

Risk Management Plans

The SDLCs specify risk management plans to be prepared during the system concept development phase, along with the feasibility and cost benefit studies. The risk management plan documents the results of assessing and planning to manage programmatic and technical risks of the system or project. The plan should identify and assess risks, and detail the strategies that will be employed to mitigate the risks.

The DOJ ITIM Guide describes assessing risk as part of analyzing alternatives, and reporting such risk assessment in the Capital Asset Plan and Business Case (OMB exhibit 300). When completing the exhibit 300, agencies are instructed to assess various risks, including those associated with schedule, initial costs, life-cycle costs, technical obsolescence, risk of monopoly, capability of the agency to manage the investment, overall risk of investment failure, security, privacy, and project resources.

Components provided 32 risk management plans for 25 of the 38 systems and projects. A number of components submitted the OMB exhibit 300 or other artifacts as their risk management plans. While the exhibit 300 contains information on risk management, it also requests the date of the risk management plan, suggesting that an independent plan should exist. We included artifacts, such as information from the OMB exhibit 300, in the compliance matrix, but did not count these as a risk management plan.

In addition to the OMB exhibits 300, other artifacts included risk registers (spreadsheets listing risks and mitigation strategies), risk analyses, and risk management sections of other documents, such as project plans. Either risk management plans or other artifacts represented 33 of the 38 projects. Five projects did not provide any specific response to this request, but Firebird, IAFIS, NICS, and TRP all submitted OMB exhibits 300 that included a risk management section. SCION, the fifth project, is covered by the Department’s consolidated enterprise infrastructure OMB exhibit 300.

Acquisition Plans

The SDLCs specify preparation of an acquisition plan during the planning phase of a system life cycle. This plan should document how all government resources and contractor support services will be acquired during the life of the project. Acquisition plans are specified in the DOJ ITIM Guide and are also included in the final business case under current FBI standards. We did not request acquisition plans from the FBI because FBI officials told us the acquisition plans are contained in the business case, and, as discussed in the section on business case studies, the FBI complied with that standard. However, we did obtain two documents related to acquisition planning for the FBI’s Sentinel project.

Other components provided acquisition plans or some relevant alternate documentation for 13 of the 17 non-FBI systems and projects in the revised inventory. Alternate documentation included justification for other than full and open competition and the acquisition section of Capital Asset Plans and Business Cases, which summarizes the acquisition strategy. We obtained OMB exhibits 300 for all of the other non-FBI projects that did not provide separate acquisition plans. While the other components did not suggest that the OMB exhibit 300 fulfilled the requirement for an acquisition plan, we accepted them as such in order to ensure similar treatment to the FBI projects. However, we do not identify this as an area of high compliance because the OMB exhibit 300 clearly expects that components will develop a separate acquisition plan.

Project Management Plans

The SDLCs indicate that project management plans should be prepared for all projects. The plans are intended to document project scope, tasks, schedule, allocated resources, and interrelationships with other projects. The plans also provide details on the involved functional units, required job tasks, cost and schedule performance measurement, and milestone and review scheduling. Revisions to the project management plan should occur at the end of each phase and as information becomes available. The project management plan should reflect the entire scope of what is to be accomplished. Project management plans are also specified in the DOJ ITIM Guide.

Components provided 44 project management plans for 29 systems and projects, and 42 other artifacts representing 28 projects, together representing a total of 31 of the 38 projects. We included artifacts in the compliance matrix in Appendix IV. Common artifacts submitted in relation to this plan were schedules of tasks and work breakdown structures.

We did not obtain project plans or relevant artifacts for seven projects, four of which predated the FBI’s implementation of its LCMD (IAFIS, LEO, NICS, and R-DEx). We did not receive an explanation for the missing project management plans for the FBI’s SCION and TRP. In addition, the ATF waived compliance with the SDLC for its NIBIN system “due to the nature of the contract and special contractual constraints whereby the Contractor provides for 100% of the necessary customer support and maintenance support required to install, configure, implement and sustain all IBIS systems (hardware and software).”

System Security Plans

The various business and law enforcement functions within the Department depend on the confidentiality, integrity, and availability of systems and data. The DOJ SDLC specifies that system security plans should contain information about the system environment, information sharing, sensitivity of information processed, management controls, security controls, operational controls, contingency planning, security training, audit trails, and access controls.

The Department requires that all IT systems pass a security Certification and Accreditation process intended to ensure the adequacy of computer system security. Security plans and successful security test results are needed to obtain the Department’s authorization to operate. This requirement likely explains why system security plans and related security tests are among the most reliably prepared documents in the IT development process.

Components provided 40 system security plans and 33 other relevant artifacts for 32 systems. The artifacts in the compliance matrix in Appendix IV include security sections of project management plans, authorizations to operate, and other products of the certification and accreditation process. Of the six projects not represented, the FBI and JMD told us the plans for NGI, Sentinel, and LCMS did not yet exist, which was reasonable given the status of those projects at the time of our field work. We were also informed that the draft security plan for the FTTTF was under review at the time of our field work. We did not obtain a plan or an explanation from the FBI regarding the BRIDG or TRP projects. Overall, compliance with this system security standard was extremely high.

Systems Engineering Management Plans

According to the Department’s SDLC, the systems engineering management plan (SEMP) should be developed during the planning phase of IT project development. The SEMP is intended to document the strategy for executing the technical management aspects of the project, and should include information about responsibilities for the technical effort, technical processes, and procedures to be applied. It should address control strategies for data management, technical performance measurement, interface management, and formal and informal technical reviews. The FBI’s LCMD also specifies SEMPs.

In response to our request, components provided only 11 SEMPs and 6 relevant artifacts for 13 projects. Components did not submit items we accepted as SEMPs for 25 of the 38 IT projects. In addition to the NIBIN contract waiver, components told us the SEMPs had not yet been developed or were not applicable for their projects. Others submitted project management plans or concept of operations documents to meet this requirement, but we did not accept the brief descriptions included in these documents for this test.

Configuration Management Plans

According to the DOJ SDLC, configuration management plans document uniform practice for managing system software, hardware, and documentation changes throughout a development project. The FBI LCMD also specifies configuration management plans.

Components provided 28 configuration management plans and 5 related artifacts for 26 projects. The 12 projects not submitting configuration management plans were NIBIN, ITS-II, BRIDG, CODIS, DCU, FTTTF, IDW, N-DEx, R-DEx, TRP, TSC, and OFC. In addition to the NIBIN contract waiver, component explanations for not submitting this item included that the documents had not yet been developed or the standard was not applicable.23

Quality Assurance Plans

The DOJ SDLC indicates the purpose of quality assurance plans is to ensure that delivered products satisfy contractual agreements, meet or exceed quality standards, and comply with approved processes. The plans should include an overview of the processes to ensure that processes and products associated with hardware, software, and documentation are monitored, sampled, and audited to ensure compliance with methodology, policy, and standards.

Components provided 17 quality assurance plans for 16 projects.24 We included 12 other relevant artifacts for 5 projects in the compliance matrix, representing a total of 20 projects. No quality assurance plans or related artifacts were obtained for 18 of the 38 projects: NIBIN, ITS-II, BRIDG, CODIS, DCS, DCU, EDMS, FTTTF, IDW, LEO, NCIC, N-DEx, R-DEx, SCION, TRP, TSC, CITP, and JGMS. In addition to the NIBIN contract waiver, component explanations for not submitting this item included that the documents had not yet been developed, were not applicable, or, having been developed several years earlier, were no longer available.

Validation and Verification Plans

Validation and verification plans describe the testing strategies that will be used throughout a project’s life-cycle phases. Such plans should include descriptions of contractor, government, and appropriate independent assessments required by the project. They should also reflect the major reviews that will be performed throughout the project. However, the SDLC does not require that any validation and verification be performed independently.

The FBI LCMD also requires this plan and defines verification and validation as a disciplined approach to assessing software products throughout the software development life cycle to ensure that quality is built into the software and that the software satisfies business functional requirements. Verification and validation employs review, analysis, and testing techniques to determine whether a software product and its intermediate deliverables comply with business functional requirements and quality attributes. The LCMD specifically defines verification as the process of determining whether products in a given phase of the development process fulfill the requirements established during the previous phase, and validation as the process of evaluating software at the end of the development process to ensure compliance with software requirements.

Components provided 8 validation and verification plans and 14 other related artifacts for 10 projects. We accepted test plans in response to this document request for three projects. In our judgment, however, verification and validation plans should cover more than software testing; requirements and design products should also be subject to verification and validation. The 10 projects that responded to this item were: DEA’s E-Commerce, EIS, Merlin, and M204; the FBI’s IAFIS and TSC; JMD’s CITP, IWN, and JCON; and OJP’s JGMS. The other 28 projects either did not provide a validation and verification plan or submitted minimal test documentation that we did not accept. In addition to the NIBIN contract waiver, component explanations for not submitting this item included that the documents had not yet been developed, were not applicable, or, having been developed several years earlier, were no longer available.

Test Plans

Both the SDLC and FBI LCMD specify test master plans that should document the scope, content, methodology, sequence, management of, and responsibilities for test activities. The testing should include integration, system, user acceptance, and security testing.

Components provided 51 test plans and 19 other related artifacts for 30 projects. These represent plans for different types of testing and for testing of various modules or functions of the same system, such as security, acceptance, functional, maintainability, report generation, and integration tests. Of the eight projects not represented, three had not reached the appropriate stage of the life cycle for this document: the CODIS-Next Generation project, NGI, and LCMS.25 Of the remaining five projects, only ITS-II responded that the item was not applicable; there were no specific responses for the other four projects (BRIDG, IDW, SCION, and TRP).

Conversion Plans

The SDLC calls for conversion plans to be prepared during the design phase of the life cycle to document the results of design work on conversion and transition strategies if information needs to be converted or migrated to the new system. The plans should describe the strategies involved in converting data from the existing to the new environment. Because the FBI’s LCMD requires transition plans to include data conversion issues, we requested transition plans for the FBI projects.

Components provided 13 conversion and transition plans for 10 projects. Most of these were FBI transition plans, although JMD submitted conversion plans for the Classified Information Technology Program (CITP) and Unified Financial Management System (UFMS) projects, the BOP submitted a plan for ITS-II, and JMD submitted two related artifacts for JCON. In addition to the NIBIN contract waiver, component explanations for not submitting this item included that the documents had not yet been developed, were not applicable, or, having been developed several years earlier, were no longer available.

Implementation Plans

According to the SDLC, implementation plans are to be prepared during the design phase to describe how the system will be deployed, installed, and transitioned into an operational status. The FBI LCMD refers to its comparable plan as an installation plan.

Components provided 29 implementation, deployment, or installation plans and 21 other related artifacts for 24 projects. Component explanations for the 14 other projects not represented in this item included that the documents had not yet been developed, were not applicable, or, having been developed several years earlier, were no longer available.

Training Plans

The SDLC also calls for training plans to be prepared during the design phase. The training plan should outline the objectives, needs, strategy, and curriculum to be addressed when training users on the new or enhanced information system. The training plan should present the activities needed to support the development of training materials, coordination of training schedules, reservation of personnel and facilities, and other training-related tasks.

Components provided 16 training plans and 5 other relevant artifacts for 19 projects. Component explanations for the 19 other projects not represented in this item included that the documents had not yet been developed, were not applicable, or, having been developed several years earlier, were no longer available.

Contingency and Continuity of Operations Plans

The DOJ SDLC specifies contingency planning as a function of the development phase of a system’s life cycle. The SDLC cites OMB A-130 as requiring the preparation of plans for general support systems and major applications to ensure continuity of operations. The purpose is to provide for the continuation of critical mission and business functions in the event of disruptions. The plans are known by various names, such as disaster recovery, continuity of operations, or contingency plans.

We obtained 23 contingency plans or continuity of operations plans and 1 related artifact for 19 projects. Component explanations for the 19 other projects not represented in this item included that the documents had not yet been developed or were not applicable.

Disposition Plans

Disposition plans are intended to end the operation of a system in a planned, orderly manner and to ensure that system components and data are properly archived or incorporated into other systems. The plan should be developed during the disposition phase, according to the SDLC, which begins when a decision is made to terminate or replace a system.

Components provided one disposition plan for JMD’s PKI and one other related artifact for the FBI’s DCU. The PKI document was prepared early in the PKI life cycle. It was our understanding that the DEA’s M204 corporate systems and the BOP’s ITS-II were both nearing the end of their life cycles, but other systems in the revised inventory were not yet at that stage.

Evaluations

During our field work for this audit, we requested IT project test reports, ongoing reviews of project status, and earned value management (EVM) reports to obtain information to describe IT planning problems within the Department. We obtained 42 test reports and 25 other relevant artifacts for 24 projects. We also obtained 86 documents and other related artifacts for 25 projects that we categorized as ongoing performance evaluations. These items included Dashboard reports to the OCIO, EVM spreadsheets, project reviews, results of gate reviews for FBI projects, project status reports, briefings for component and Departmental managers, and lessons learned statements. We found that most of these materials presented status information needed for project management and decision-making, but were not necessarily directed at describing planning problems. These items are included in the compliance matrix and list of unique documents in appendices IV and V.
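For context, the EVM reports referenced above typically derive standard cost and schedule measures from three quantities: planned value (PV), earned value (EV), and actual cost (AC). The following sketch illustrates those generic calculations only; it is not drawn from, and does not reproduce, any Department-specific reporting format.

```python
# Generic earned value management (EVM) calculations of the kind
# summarized in project EVM reports. Illustrative only.

def evm_metrics(planned_value, earned_value, actual_cost):
    """Return standard EVM variances and performance indices.

    planned_value (PV): budgeted cost of work scheduled to date.
    earned_value  (EV): budgeted cost of work actually performed.
    actual_cost   (AC): actual cost of work performed to date.
    """
    return {
        "cost_variance": earned_value - actual_cost,        # CV > 0: under cost
        "schedule_variance": earned_value - planned_value,  # SV > 0: ahead of schedule
        "cost_performance_index": earned_value / actual_cost,
        "schedule_performance_index": earned_value / planned_value,
    }

# Example: a project planned to complete $1.0M of work to date,
# completed $0.8M worth, and spent $1.0M doing so.
metrics = evm_metrics(1_000_000, 800_000, 1_000_000)
# A CPI and SPI of 0.8 indicate the project is both over cost
# and behind schedule.
```

Indices below 1.0 on either measure are the kind of early warning signs that the schedule, cost, and performance tracking discussed in this report is intended to surface.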

To obtain information about the effectiveness of system development and acquisition efforts, we limited our assessment of evaluations in this audit report to full reports on problems experienced during projects or reports about systems and projects following implementation. We requested post-implementation review reports, which include in-process review reports and user satisfaction review reports.

Post-Implementation Review Reports

According to the SDLC, post-implementation reviews are conducted after a system has been in production for a period of time and are used to evaluate the effectiveness of the system development. The review should determine whether the system does what it was designed to do, supports users as required, and was successful in terms of functionality, performance, and cost benefit. It should also assess the effectiveness of the development activities that produced the system. The review results should be used to strengthen the systems as well as the component’s system development procedures.

In-process reviews are performed during operations and maintenance to assess system performance and user satisfaction, and should occur repeatedly after a system has been implemented to ensure the system continues to meet needs and perform effectively.

The FBI LCMD does not require a post-implementation review as such. However, it does specify annual project-level operational reviews that are conducted by the operations and maintenance organization to ensure that the fielded system is continuing to support its intended mission and can be continuously supported, operated, and maintained in the future in a cost-effective manner. The FBI LCMD also calls for acceptance reviews at the time of implementation.

Components provided seven post-implementation reports and six other relevant artifacts for ten projects; these are discussed in Finding 2 of this report. Component explanations for the 28 other projects not represented in this item included that the documents had not yet been developed, were not applicable, or, having been developed several years earlier, were no longer available.

Conclusion

We found the highest levels of compliance with studies, plans, and evaluations in the areas of business case documents, which become part of the Department’s annual budget process and are required to obtain funding for each system or project, and security plans, which are required for projects to obtain authorization to operate. The components provided at least one business case document for 36 of the 38 systems in the inventory. The two exceptions, the FBI’s Investigative Data Warehouse (IDW) and Secure Compartmented Information Operational Network (SCION), are included in an “umbrella” business case that represents the Department’s consolidated enterprise infrastructure (CEI). The business case document is the single document type for which we found 100 percent compliance.

System security plans also had a high level of compliance and we obtained security plans for 32 of the 38 projects. The six other projects were either too early in the life cycle for preparation of this document, or a draft security plan was undergoing review. Components also demonstrated a high level of compliance with PIAs, and we found acceptable explanations for the projects that did not submit a PIA. Components also provided project management plans for 29 of the 38 projects, and explained all but one of those exceptions.

However, we found compliance in the areas of systems engineering management, configuration management, quality assurance, validation and verification, and training plans was significantly lower.

Departmental oversight is designed to focus on the Capital Planning and Investment Control process for selecting and prioritizing IT investments. It is not designed to enforce policies and procedures on IT project documentation.26 Department-level oversight of individual IT projects is performed through presentations to the Departmental Investment Review Board, the CIO’s Dashboard report, and through the OMB exhibit 300s, all of which are described in the second report in this series. This oversight includes tracking of actual performance against scheduled milestones and costs, but does not involve JMD officials in the details of IT documentation for individual projects.

Based on the limited number of plans and evaluations produced on these major systems and projects, the CIO should evaluate why project teams do not prepare certain plans and evaluations, reassess the utility of those documents, and consider revising the standards for producing IT studies, plans, and evaluations for individual IT projects.

Recommendations

We recommend that the Department’s CIO:

  1. Evaluate why project teams do not prepare certain plans and evaluations, reassess the utility of those documents, and consider revising the standards for producing IT studies, plans, and evaluations for individual IT projects.

  2. Consider revising the guidelines for tailoring the work pattern for specific types of projects.


Finding 2:  IT Planning Problems

To identify problems the Department has experienced in planning for IT systems and projects, we reviewed previous OIG audit and inspection reports. We used OIG performance audits, financial statement audits, information technology security audits, and inspections to help identify the scope of problems the Department has experienced in IT planning. The focus of the audits and reviews varied and included general IT management, the management and progress of individual IT projects, the performance of individual systems following implementation, system security, and system controls. The OIG reports we reviewed are listed in Appendix VII. We also reviewed special reports prepared for the FBI on the terminated VCF project.

The overall objective for the IT standards described in the Introduction to this report is to improve the acquisition, use, and disposal of information technology by the federal government so as to improve the productivity, efficiency, and effectiveness of federal programs. Prior OIG reports have identified IT planning problems that resulted in terminated efforts, implementation delays, problems with data in implemented systems, and cost overruns.27 The OIG reports described causes for the terminations, delays, and other problems that include weaknesses in contract management, project scheduling, BPR, requirements definition, and cooperation between federal agencies.

During this audit we looked for IT projects that had failed or been terminated, such as the FBI’s VCF and LIMS projects discussed below. During work on the second report in this series, the OIG found that a prior effort on JMD’s Justice Consolidated Office Network (JCON) project had been terminated before the current project began in FY 2001. JMD, however, was not able to provide an evaluation of the failure. In our opinion, the failure to evaluate why the contract failed suggests a serious gap in the evaluation of project management practices. We believe that troubled and terminated projects should be evaluated to determine the causes of their problems.

Business Process Re-engineering and Requirements Weaknesses

The DOJ SDLC indicates that BPR should be the underpinning of any new system development or initiative as part of strategic planning for information systems, and that agencies should consider BPR before requesting funding for a new project or system development effort. BPR is defined as the redesign of the organization, culture, and business processes using technology as an enabler to achieve significant improvements in cost, time, service, and quality. The results of successful BPR are increased productivity and quality improvements.

The FBI’s effort to develop a case management system to replace its obsolete Automated Case Support system has been subject to project restarts or continuations with new titles twice since its initiation.28 The first effort, undertaken in mid-2001 as the User Applications Component of the Trilogy project, was originally scheduled to be implemented in 2004. This effort was never implemented because the vision and functional requirements for the system changed significantly during the project. After the attacks of September 11, 2001, and other events affecting the FBI, the vision for the system changed from one that would simply consolidate existing applications to one that would implement a new overall workflow process for FBI agents, analysts, and support personnel.

The effort subsequently became the Virtual Case File (VCF) project. The VCF was intended to make criminal and terrorist investigation information readily accessible throughout the FBI. However, the FBI did not accept an initial delivery from the contractor in December 2003 because the system was not fully functional and did not meet FBI requirements. The FBI told auditors that, given the problems with the first delivery, subsequent deliveries were not pursued. The OIG report on the VCF project stated that one of the most significant problems with managing the schedule, cost, and technical aspects of Trilogy was the lack of a firm understanding of the design requirements by both the FBI and contractors. During the initial years of the project, the FBI had no firm design baseline or roadmap for Trilogy. According to one FBI official, Trilogy’s scope grew by about 80 percent from the initiation of the project. The FBI terminated the VCF portion of Trilogy in March 2005, after spending $170 million, because of the lack of progress on its development and concerns that the development environment would make the system difficult to enhance and maintain. As discussed in two OIG audit reports, the effort has been re-started as the $425 million Sentinel project, which is scheduled for completion in December 2009.29

A contracted study of the FBI’s terminated Virtual Case File project found that the original plans for the case management portion of the Trilogy project were not based on a new vision of how the FBI could use IT to transform the way it performs its mission. Specifically, the unpublished report indicated that senior managers were not involved in efforts to re-engineer business processes or in rethinking the FBI’s use of IT, and that while users working on the re-engineering were experienced agents, none had experience with complex IT development projects or business process re-engineering.

Another terminated project at the FBI was an initiative to implement a new Laboratory Information Management System (LIMS) to replace its Evidence Control System, which was originally created in 1978.30 The LIMS contract was awarded in September 2003, was initially supposed to be implemented within 90 days of contract activation, and was terminated in January 2006 due to concerns over security requirements. According to an OIG audit, the project failed because of problems meeting the FBI’s security requirements and because of delays in implementing a web-browser interface.

The OIG determined that specific security requirements for the system were defined late in the project, hindering the contractor’s ability to comply. The LIMS Request for Proposals (RFP) had required security to be part of the system, but the FBI strengthened its security requirements after the contract award in response to high-profile espionage-related security breaches in the FBI. The audit found that the FBI had failed to document security requirements adequately and, to the extent the security requirements evolved, did not clarify those changes through contract modifications.

Cooperation Between Agencies

OIG audits and reviews have also identified difficulties when the Department attempts to work with other agencies to develop and implement successful IT systems. For example, lack of cooperation has cost time in the effort to coordinate fingerprint sharing between the Department and the Department of Homeland Security (DHS). Similar problems threaten the success of the Secure Flight Program and the Integrated Wireless Network (IWN).

The OIG audit of the Terrorist Screening Center’s (TSC) efforts to support the Department of Homeland Security’s (DHS) Secure Flight Program found that the TSC had been hindered and delayed in its efforts to prepare for implementation by the DHS-led Transportation Security Administration’s failure to make, communicate, and comply with key program and policy decisions in a timely manner.31 In addition to perceived problems in planning at DHS, cooperation between the TSC and DHS has been weak.

The OIG has performed a series of reviews of the FBI’s progress toward achieving interoperable fingerprint identification systems with federal immigration authorities.32 Since 1999 JMD has maintained oversight of the integration of the FBI’s fingerprint identification system, Integrated Automated Fingerprint Identification System (IAFIS), and the Department of Homeland Security’s Automated Biometric Identification system, IDENT. The 2001 USA Patriot Act and the 2002 Border Security Act both set requirements for a data system that would allow sharing of identification information in federal law enforcement databases with immigration authorities to determine whether to allow aliens to enter the United States.

Differences between the FBI and the DHS over the number (2 or 10) and type of fingerprints (flat or rolled) to be collected held up progress in this area. DHS deployed an additional system in 2004, US-VISIT, which uses IDENT to collect fingerprints, and is also used by Department of State employees at visa-issuing consulates. The principal barriers to achieving interoperability identified in an OIG December 2004 report were the different fingerprint collection requirements of the two agencies, and disagreement on the details of how to make information readily accessible to federal, state, and local law enforcement agencies. The most recent OIG report on the fingerprint integration issue indicated that the first barrier was resolved by DHS’ May 2005 decision to implement a 10-print standard. Currently, efforts are underway to make IAFIS, IDENT, and US-VISIT fully interoperable by December 2009.

The OIG recently released an audit report on the Integrated Wireless Network (IWN) project that is intended to enhance the ability of federal law enforcement agencies in the Departments of Justice, Homeland Security, and Treasury to communicate with each other.33 IWN would also allow interoperability with state and local law enforcement partners and meet mandates to use federal radio frequency spectrum more efficiently. The OIG’s audit found that the project, which may cost $5 billion, is at high risk of failing to deploy an integrated wireless network for use by the three federal departments. The reasons include a fractured IWN partnership, lack of an effective governing structure for the project, and disparate departmental funding mechanisms that allow the departments to pursue separate wireless communications solutions apart from IWN.

Contract Management Weaknesses

The OIG conducted an audit of the FBI’s Trilogy project to assess the FBI’s progress in meeting cost, schedule, technical, and performance targets for the three components of Trilogy.34 The OIG found that the VCF portion of the Trilogy project significantly exceeded the original schedule and budget. In addition, the FBI received an additional $78 million to accelerate the infrastructure and communications portions of the Trilogy project. Those segments were completed by April 2004, only one month before the original target date of May 2004. The audit found that while the Trilogy project had succeeded in improving the FBI’s IT infrastructure and communications capabilities, the new case management system was incomplete and would not meet the FBI’s needs. The OIG recommended the FBI monitor its Enterprise Architecture and apply ITIM processes to improve the FBI’s ability to identify, select, and manage future IT projects. Since then, the FBI has implemented a formal project management and oversight methodology, its Life Cycle Management Directive (LCMD), to address these weaknesses and the LCMD is being used in the current Sentinel project.35

The OIG examined the LIMS project and found that firmly managed schedule, cost, technical, and performance benchmarks would have raised warning signs earlier in the project. The LIMS contract was awarded 14 months before the FBI implemented its LCMD, a critical initiative that provided the FBI with a structured IT investment management process. The LCMD also involves project oversight at the enterprise level. In the LIMS audit, the OIG made recommendations to consider whether an existing commercial off-the-shelf system would meet the FBI’s needs, ensure that any future laboratory information system follows the FBI’s LCMD processes and is overseen by an experienced IT project manager, and establish controls to ensure that expenses are not incurred prematurely in the development of a successor project.

During its annual financial statement audits, the OIG identified inadequate oversight of contract staff as a weakness, specifically at OJP.36 The OIG found that OJP contractors do not consistently adhere to Department policies and procedures for managing system changes and do not consistently provide OJP management with necessary technical and logistical information for production systems. As a result, OJP management is unaware of system operational information and of system modifications implemented by the contractors. The OIG concluded that the OJP CIO needed to improve oversight and monitoring of contractor activities to reduce the risk of negative effects on OJP operations and financial data.

IT Program Management

The OIG audit of JMD’s Joint Automated Booking System (JABS) found that booking stations installed at Bureau of Prisons (BOP) facilities were brought online in April 2004, 2 years after the equipment was installed during the summer of 2002.37 According to JMD officials, the software that was originally installed with the equipment had major problems that were not discovered until after all 240 JABS workstations had been installed. The 2-year delay in implementing JABS at the BOP was caused by inadequate oversight of the contractor’s work.

Since then, in audit reports issued in 2004 and 2005, the OIG found that the Department has begun to improve its oversight and guidance of the components’ EA and ITIM processes, based on Department-developed frameworks.38 In its audit of the Status of Enterprise Architecture and Information Technology Investment Management in the Department of Justice, the OIG made recommendations for improving the Department’s IT management, including completing the Department-wide Enterprise Architecture, providing guidance to components for the development and maintenance of EAs, ensuring that components requiring ITIM processes develop them, and establishing a clear schedule for completing the ITIM framework and achieving a mature ITIM process.

In another audit, the OIG found the DEA had made significant progress in managing its EA and the ITIM processes.39 Although the DEA had not yet developed a target EA or developed a transition plan to accomplish its target, it had established a foundation by developing an overview of its existing IT structure. The DEA also assigned roles, committed resources, and established a plan to complete its target architecture. When the EA is complete, the DEA will be able to better manage current and future IT infrastructure and applications.

The OIG’s first in a series of audits examining Sentinel evaluated its development and implementation by reviewing the management processes and controls the FBI applied to the pre-acquisition phase of Sentinel.40 The OIG found that the FBI established ITIM processes through its Life Cycle Management Directive (LCMD) and was working to fully define its enterprise architecture. If followed, the FBI’s new IT management processes, reviews, and controls, coupled with external oversight by the OIG, contractors, congressional committees, and others, should help the FBI identify and minimize failures to achieve cost, schedule, performance, and technical benchmarks for the Sentinel project.

The OIG review of the TSC identified numerous problems with the data in the consolidated database used to screen persons against terrorist-related watch lists, most of which resulted from the urgency with which the database was implemented.41 The data problems included incomplete, missing, and inaccurate information in records, as well as duplicate records containing inconsistent information. These data integrity problems create the possibility that screeners may fail to identify known terrorists during screening. The OIG found that the problems were caused by a lack of strategic planning, planning weakened by the pressure to implement the system quickly, and weaknesses in user training. The OIG is currently performing a follow-up review of the accuracy of the TSC watch list.

Post-Implementation Evaluations

We originally planned to use evaluations we obtained from components to identify problems the Department has experienced in planning for its IT systems. However, this proved impossible because, with the exception of two terminated FBI projects, the Department has produced few meaningful evaluations of project management for either successful or failed IT projects.

According to the DOJ SDLC, one purpose of post-implementation reviews is to assess the effectiveness of the life-cycle development activities that produced the system. This includes analyzing whether proper limits were established in the feasibility study and maintained during implementation, addressing the reasons for variances between planned and realized benefits, addressing the reasons for differences between estimated and actual costs, and evaluating whether training was adequate, appropriate, and timely. The review results are intended to be used to strengthen both the system development procedures and the system itself.

The DOJ ITIM Guide calls for continuous monitoring of investments to assess progress against established cost, schedule, and performance metrics in order to mitigate risks and contain costs on an ongoing basis. The ITIM Guide also indicates that the activities of the evaluation phase include applying lessons learned from post-implementation reviews and periodic operational analyses to improve the ITIM process. These lessons learned should be incorporated into the select and control phases of future IT investments.

We reviewed the seven post-implementation review reports we obtained, four of which did not contain information on lessons learned in project management. The reports included two classified project closeout reports on two phases of one project. According to one of the reports, one phase was accomplished on schedule and within budget and included no lessons learned or discussion of any problems. The other report contained two lessons learned that were marked as unclassified. The lessons were:

JMD’s JCON project has produced two reports of lessons learned on the implementation of JCON in two components. The report on JCON implementation in the Civil Division described the need for better definition of project milestones and performance indicators to improve communications and develop a shared perspective on project performance. It also identified needs to: (1) devote greater attention and resources to quality review of deliverables and other work products, (2) conduct closer and more detailed review of requirements and design phase documentation, and (3) improve adherence to change control procedures. The report on JCON implementation in the Civil Rights Division identified opportunities for improvement in the areas of communication and thoroughness of design. Comments in the report noted that requirements gathering needed to be as thorough as possible to avoid problems in design and implementation.

In addition, an assessment against project performance metrics was performed for one portion of the DEA’s E-Commerce project. The evaluation provided performance data but no lessons-learned information about project management.

In light of the limited number and scope of evaluations of project management, the Department should ensure that post-implementation evaluations and post-termination evaluations of IT projects are performed so lessons learned can be incorporated into the Department’s standards and used to improve project management on future projects.

Conclusion

Prior OIG reports have identified planning problems on individual systems and projects that include weaknesses in business process re-engineering, requirements planning, cooperation between agencies, and IT program and contract management. These weaknesses have contributed to:

We originally planned to use evaluations we obtained from components to identify problems the Department has experienced in planning for IT systems. This was not possible because, with the exception of two FBI projects, the Department has produced almost no meaningful evaluations of project management for either successful or failed IT projects. Post-implementation evaluations and audits of individual projects identified weaknesses in contract management and excessive reliance on contractors.

Recommendations

We recommend that the CIO:

  1. Ensure that post-implementation and post-termination evaluations are conducted that focus on lessons learned for project planning and management.

  2. Ensure that staff receive training to obtain skills needed to adequately direct and oversee contractor efforts.

  3. Implement targeted reviews to improve the use of business process re-engineering and requirements analysis early in concept development.



Footnotes
  1. The CIO does have specific responsibilities to enforce security standards.

  2. This is not intended to be a comprehensive discussion of all phases or activities associated with IT projects and systems, but focuses on the tasks and documents associated with research, studies, plans, and evaluations.

  3. We grouped various documents into the category of “studies” based on the idea that a study would be a product of attempts to acquire knowledge or understanding of a subject.

  4. Privacy impact assessments (PIA) are performed later in the life cycle, after an alternative solution has been selected. The Department’s SDLC places the PIA as a deliverable of the requirements analysis phase.

  5. The MITRE Corporation, Common Solution Architecture for Case Management (the Current State), Technical Report, April 2004.

  6. Tangible benefits are expressed in dollars or units, such as dollars saved from streamlining transactions and saved time. Intangible benefits are normally related to mission improvements that may be difficult to quantify.

  7. It was beyond the scope of this audit to determine what was appropriate for each project for every type of study, plan, or evaluation that may be prepared for individual projects.

  8. This number includes one quality management plan counted five times because it is being used for five DEA projects (Item #83).

  9. It was beyond the scope of this audit to ensure that we obtained test plans for every appropriate module, phase, or function of each project. We are reporting what we obtained in response to the request for studies, plans, and evaluations.

  10. The CIO does have specific responsibilities to enforce security standards.

  11. Some of the systems and initiatives included in this analysis were not included in the revised inventory but were the subject of OIG reports. All of the systems and initiatives in the revised inventory used for this audit either were implemented or are currently in development.

  12. Department of Justice, Office of the Inspector General, The Federal Bureau of Investigation’s Management of the Trilogy Information Technology Modernization Project, Audit Report 05-07, February 2005.

  13. Department of Justice, Office of the Inspector General, The Federal Bureau of Investigation’s Pre-Acquisition Planning For and Controls Over the Sentinel Case Management System, Audit Report 06-14, March 2006.

    and

    Department of Justice, Office of the Inspector General, Sentinel Audit II: Status of the Federal Bureau of Investigation’s Case Management System, Audit Report 07-03, December 2006.

  14. Department of Justice, Office of the Inspector General, The Federal Bureau of Investigation’s Implementation of the Laboratory Information Management System, Audit Report 06-33, June 2006.

  15. Department of Justice, Office of the Inspector General, Review of the Terrorist Screening Center’s Efforts to Support the Secure Flight Program, Audit Report 05-34, August 2005. (Redacted)

  16. Department of Justice, Office of the Inspector General, Follow-up Review of the FBI’s Progress Toward Biometric Interoperability Between IAFIS and IDENT, Inspections Report I-2006-007, July 2006, is the most recent report in the series of six reports.

  17. Department of Justice, Office of the Inspector General, Progress Report on Development of the Integrated Wireless Network in the Department of Justice, Audit Report 07-25, March 2007.

  18. Department of Justice, Office of the Inspector General, The Federal Bureau of Investigation’s Management of the Trilogy Information Technology Modernization Project, Audit Report 05-07, February 2005.

  19. The FBI’s LCMD methodology is fully documented in Department of Justice, Office of the Inspector General, The Federal Bureau of Investigation’s Pre-Acquisition Planning for and Controls Over the Sentinel Case Management System, Audit Report 06-14, March 2006.

  20. Department of Justice, Office of the Inspector General, Office of Justice Programs Annual Financial Statement Fiscal Year 2006, Audit Report 07-21, March 2007.

  21. Department of Justice, Office of the Inspector General, The Joint Automated Booking System, Audit Report 05-22, May 2005.

  22. Department of Justice, Office of the Inspector General, The Status of Enterprise Architecture and Information Technology Investment Management in the Department of Justice, Audit Report 06-02, November 2005.

  23. Department of Justice, Office of the Inspector General, The Drug Enforcement Administration’s Management of Enterprise Architecture and Information Technology Investments, Audit Report 04-36, September 2004.

  24. Department of Justice, Office of the Inspector General, The Federal Bureau of Investigation’s Pre-Acquisition Planning For and Controls Over the Sentinel Case Management System, Audit Report 06-14, March 2006.

  25. Department of Justice, Office of the Inspector General, Review of the Terrorist Screening Center, Audit Report 05-27, June 2005. (Limited Official Use and Redacted)
