
Review of the Critical Incident Response Plans of the United States Attorneys’ Offices

Report Number I-2004-001
December 2003


RESULTS OF THE REVIEW

Most USAOs Have Not Prepared and Exercised Comprehensive Plans to Guide Their Response to Critical Incidents

Most USAOs failed to prepare comprehensive critical incident response Plans. Our analysis showed that the Plans submitted by the USAOs provide inadequate guidance for responding to a critical incident. The 1999 CMC Manual identified 48 actions that should be taken when responding to a critical incident. The actions address essential elements of critical incident response, including coordinating interviews to avoid multiple agency interviews of the same person, providing for a unified evidence room and communicating chain of custody procedures, establishing a crime scene protocol, preserving the crime scene, and setting up overlapping relief shifts to avoid fatigue.

We analyzed the 76 Plans available at CTS and EOUSA and found that most Plans substantially failed to include instructions to ensure that USAO staff responding to a critical incident accomplish the 48 fundamental actions. Only 12 of the 76 Plans on file addressed at least half of the 48 actions, and just 4 Plans addressed all 48 actions (Figure 2, below). Many of the omitted actions represent vital elements of an effective critical incident response. For example, of the 76 Plans:

Figure 2.

In some cases, the Plans did not represent even a minimal attempt to develop critical incident guidance. Eleven USAOs did nothing more than insert their District's name into a "Sample Plan" distributed by EOUSA and attach contact lists from their office and several other agencies.22 In three other cases, the USAOs submitted documents other than a critical incident response plan. These documents included an Occupant Emergency Plan, an Emergency Relocation Plan, and a "Plan" comprised of handouts from a January 1999 FBI-sponsored workshop on Weapons of Mass Destruction.23

Developing Plans that address all 48 actions is essential to ensure a comprehensive response to a critical incident and to avoid repeating missteps that were identified in after-action reports on earlier critical incidents. For example, the need to establish a crime scene protocol and better preserve the crime scene was identified in the Oklahoma City after-action report. The need to plan for overlapping relief shifts, to avoid fatigue and the poor decision-making that may result from it, was identified in the Ruby Ridge after-action report. Failing to develop Plans that ensure these and other actions are accomplished increases the risk that USAOs will respond incompletely or ineffectively to critical incidents. While the absence of a Plan does not preclude a USAO from responding to a critical incident, having a Plan that guides responders through all 48 fundamental actions ensures that the USAO is better prepared to respond quickly and appropriately to a critical incident.

During our review, the Deputy Director, Security Programs Staff, acknowledged some confusion concerning the overall security planning effort within USAOs. The confusion was the result of different directives that require USAOs to draft and maintain six plans for separate but related purposes, many of which overlap in key areas. To bolster preparedness and eliminate confusion, in July 2003 EOUSA recommended that USAOs draft a core plan with individualized annexes targeting specific purposes, such as crisis response, continuity of operations, occupant emergencies, and emergency relocation. A Security Working Group (SWG) comprised of U.S. Attorneys and EOUSA senior staff is responsible for promulgating the appropriate guidelines. These actions indicate EOUSA's effort to address the confusion created by overlapping plans.

Planning to respond to critical incidents was not a priority for USAOs. Based on the quality of the Plans, as well as our discussions with CMCs, we concluded that the CMC Program was not a priority for the USAOs. In our interviews with 26 CMCs at the USAOs, we were consistently told that more attention was given to competing priorities and that the CMCs' workload was not adjusted to allow time for Plan development and CMC Program implementation. Several CMCs noted that there is no applicable category on their work tracking system to account for the time they spend on CMC duties. Therefore, time spent on CMC Program-related activities is not reported as time spent directly on work that contributes to overall office performance.

Most USAOs have never conducted critical incident response exercises. Over 60 percent (49 of 81) of the USAOs responding to our survey reported that they conducted no critical incident response exercises since 1996. Another 20 percent (16 USAOs) conducted one exercise during that time. Only 17 percent (14 USAOs) conducted more than one exercise in the last 7 years (Figure 3).

Figure 3.

The expectation that USAOs would exercise their Plans was clearly enunciated in 1997 by the former Attorney General. In speaking to all CMCs, she stated that USAOs should participate in regional critical incident response exercises with their local FBI field offices because: "The first tense hours after a bomb has exploded should not be spent on trying for the first time to build a working relationship with your key law enforcement agency. It is too little, too late."24

However, in promulgating the CMC Program, CTS and EOUSA did not establish any specific requirements for USAOs to conduct exercises to test their Plans and practice responding to critical incidents. Consequently, as detailed above, only a few USAOs have regularly conducted critical incident response exercises.25

 

ODP: Exercises Are An Essential Part
of Critical Incident Response

Experience and data show that exercises are a practical and efficient way to prepare for crises. They test critical response capabilities, identify procedural difficulties, and provide a plan for corrective actions to improve crisis and consequence management response capabilities without the penalties that might be incurred in a real crisis. Exercises also provide a unique learning opportunity to synchronize and integrate cross-functional and intergovernmental crisis and consequence management response.

We found that exercising plans is standard practice for emergency response programs. For example, Federal Preparedness Circulars (FPC) 65 and 66, which direct all Federal agencies to develop continuity of operations plans to maintain agency operations in the event of catastrophes, require that the plans be exercised at least annually.26

Similarly, within the Department, the FBI (which also prepares critical incident response plans) requires its field offices to conduct annual crisis response exercises. We contacted the Department's Office of Domestic Preparedness (ODP), which helps state and local agencies prepare to respond to critical incidents, and found that between May 2000 and March 2003, that office sponsored over 150 crisis response exercises.27 The Acting Director of ODP, and the Director, Exercise and Evaluation Division, ODP, told us that preparedness depends on exercising critical incident response plans, as well as updating and revising the Plans to reflect lessons learned. An ODP official told us that, "Having a Plan and not exercising or revising it is the same as not having a Plan."

Importantly, although few USAOs conducted exercises, most CMCs that did conduct exercises reported that they were helpful in establishing sound operational procedures to respond to a critical incident. We interviewed five of the six CMCs from USAOs most directly impacted by the events of September 11, 2001, who confirmed the need for conducting regular Plan exercises and updating Plans. The CMCs all told us that, based on their experience, well-exercised Plans save lives, property, and other assets.

 

CMC Training Recommendations

In a survey of the CMCs responsible for implementing the CMC Program (81 of 94 responding) and in interviews with 26 CMCs, we received numerous comments recommending improvements to CMC Program training. The most frequent CMC training recommendations were:

  • Organize districts by size and situation for discussion. Most CMCs recommended against a "one size fits all" training model. Where appropriate, lectures and discussion materials should consider the inherent differences in personnel and other resources available to small, medium, large, and extra-large USAOs.

  • Address the relationship of the CMC Program to the ATTFs. Several CMCs stated that ATTF coordinators' and CMCs' duties overlap, particularly in coordinating with state and local agencies. One interviewee suggested that CMC and ATTF coordinator training be designed so the groups can discuss areas of joint or overlapping responsibilities.

  • Maximize small group discussion. CMCs frequently stated that training should utilize a more interactive format featuring pragmatic advice and information sharing among USAOs, rather than being "a gathering of talking heads" as one CMC described the prior CMC Training Conferences.

  • Conduct training on a regular basis. CMCs stated that training in critical incident response should be conducted on a regular schedule (annual or bi-annual) and the training should be mandatory.

While we found that most USAOs do not regularly conduct critical incident exercises, some USAOs did participate in exercises led by the FBI's Crisis Management Unit (CMU). The Supervisory Special Agent (SSA) for the FBI CMU informed us that, in the 33 months from January 1999 to September 2001, USAOs participated in 20 of 23 FBI-sponsored exercises where USAO participation would have been appropriate.28 The exercises took place throughout the country and involved USAOs of all sizes. Scenarios ranged from a full-scale mock airliner hijacking in Anchorage, Alaska, to a weapons of mass destruction tabletop exercise in Pomona, New York. While the USAOs participated in the FBI CMU exercises when they had the opportunity, we noted that, in the 33 months covered by the SSA's records, the FBI CMU conducted exercises in less than 25 percent of the USAO districts. Therefore, most USAOs had no opportunity to participate in an FBI-sponsored exercise.

CTS also reported that USAOs participated in many exercises since 1997, including cyberterrorism exercises, a "full-field" weapons of mass destruction exercise, exercises in preparation for the 2002 Winter Olympics, and TOPOFF 2000 and TOPOFF 2002, which were large-scale exercises simulating coordinated terrorist attacks in multiple jurisdictions. Our survey regarding participation in exercises (see Appendix C) was specifically designed to capture data on USAO participation in all of the above exercises.

USAOs report they lack training and resources to conduct exercises. During our interviews of 26 CMCs, we asked why USAOs did not conduct more critical incident response exercises. They responded that the primary reasons for not conducting exercises were that they lacked information on how to conduct exercises (14 of 26) and that small districts lacked the resources to conduct an exercise. However, we found that some CMCs took creative steps to identify and use local resources to conduct exercises. For example, one CMC in a medium-sized USAO in the Midwestern United States told us that she is developing a tabletop exercise, complete with video, with the assistance of a professor at a top research university. The CMC told us that she serves on a curriculum advisory committee for a graduate program in homeland security that the same professor is developing. Such efforts enhanced the USAO's response capabilities by enabling it to draw on previously untapped resources.

CTS and EOUSA Failed to Fulfill Their Administrative and Support Responsibilities for the CMC Program.

We found that CTS and EOUSA did not effectively support the CMC Program because they did not provide effective training, did not provide adequate guidance, did not accurately track and maintain the submitted Plans, did not review the submitted Plans, and did not evaluate the USAOs' implementation of the CMC Program.

CTS and EOUSA did not provide effective training. Since the inception of the CMC Program in May 1996, CTS sponsored only two CMC Training Conferences and one two-hour videoconference. The first training conference took place in Arlington, Virginia, from June 17 through 20, 1997. The second conference took place in Columbia, South Carolina, from October 19 through 22, 1999. No additional CMC-specific training was provided until March 2003, when CTS sponsored a two-hour videoconference for CMCs. The Deputy Chief, CTS, confirmed that CTS neither developed nor sponsored any other training for CMCs.

Limited CMC Training. When we questioned the lack of CMC-specific training over the previous four years, CTS told us that national CMC training had been planned for Fall 2001 or Spring 2002. According to CTS, this training was initially deferred after the events of September 11, 2001, to accommodate other required training, and then deferred further because many of those who would have been the trainers or trainees were involved in the nationwide investigation of the terrorist attacks. In August 2003, in response to a draft of this report, CTS told us that additional preparedness and response training was scheduled for March 2004.

We assessed the training agendas of the 1997 and 1999 CMC training conferences and viewed a videotape of the 2003 videoconference. We found that CMCs received little specific instruction on how to develop Plans and conduct critical incident exercises. According to the 1997 CMC Training Conference agenda, during the three-day conference the CMCs received three hours of instruction on developing crisis response plans and spent three hours in a group assignment on planning exercises. Similarly, during the 1999 CMC Training Conference CMCs participated in a two and one-half hour session covering "Development and Testing of a District Plan and Intra-district Coordination of Planning Efforts." The most attention devoted to either topic, in this case conducting exercises, occurred at the 1999 conference. Participants spent four hours in a general session discussing two possible terrorist attack scenarios, after which they met in small groups to discuss one of the scenarios for 90 minutes. The session concluded with a 45-minute review for all participants.

In addition to reviewing the 1997 and 1999 Conference agendas, we discussed training during our interviews with CMCs across the country. All but one of the 26 CMCs we interviewed indicated that the prior training was inadequate and that they needed additional training. Further, the CMCs stated that the training should be revised to include changes that have occurred since the last CMC Training Conference in 1999. The changes include the post-September 11, 2001, reorganization of the Department to focus on counterterrorism; the passage of the USA PATRIOT Act and other terrorism-related legislation; the reorganization of the Criminal Division; the issuance of the National Strategy for Homeland Security; the formation of the Department of Homeland Security; and the creation of the ATTFs within USAOs. While these topics were addressed in the ATTF training conducted since September 2001, we found that few CMCs have attended that training.

Regarding the lack of CMC Program training, CTS confirmed that it did not conduct more CMC-specific training after 1999. CTS also stated that it has no line authority over the USAOs and, thus, can provide guidance but not dictate what the USAOs do. CTS told us that with the Department's increased focus on prevention, it is working to see that fewer incidents occur and that there is less need for response activity. CTS stated that it is addressing preparedness through activities such as increased planning and cooperative action between the FBI Strategic Information Operations Center and CTS, the establishment of a national process tracking system, and a CTS website being piloted to 18 USAOs.

Other Training Fails to Fully Address Critical Incident Response Planning. During our review, the CTS Chief told us that ATTF training focused on both prevention and crisis response. He further stated that the training conducted for ATTF Coordinators covered much of the information needed by CMCs. CTS cited several examples of training that they believed met the needs of the CMCs, including:

Because the CTS Chief stated that ATTF training addressed CMC needs, we reviewed the training materials from the two national ATTF training conferences and the six regional training conferences. We found that the ATTF training focused on intelligence gathering and information sharing to prevent terrorist attacks. The training addressed neither preparing to respond to an attack or other critical incident nor developing and exercising a critical incident response plan.29 While we found the first ATTF conference included a session on crisis response, we also found that crisis response information was not covered at the following six regional training conferences or at the second National Conference. Further, 10 of the 26 CMCs we interviewed also hold the ATTF Coordinator position for their USAO. In the last 2 years, these 10 CMCs attended the 2 ATTF national conferences and a regional training conference. Without exception, the CMCs told us they believe that the ATTF training was not a substitute for additional CMC-specific training.

Although CTS provided information that showed USAO staff members have attended numerous training events related to the ATTF initiative, our review found that this training did not replace or diminish the need for CMC training. Our review of the training agendas and curricula found that most of the training focused on the primary ATTF goals of identifying and preventing terrorist attacks, not on responding when attacks occur. In addition, while some of the training did address preparing to respond to attacks, our review of the attendee lists found that few CMCs attended that training. For example, no CMCs attended the January 2003 U.S. Attorney Anti-Terrorism Conference, and only 52 of the CMCs attended one of the six national security conferences conducted between May and September 2003. Further, ATTF training that did address response capabilities focused on responding to the threat of terrorism, not on responding to other critical incidents. The inadequacy of the ATTF training as a substitute for CMC training was confirmed in our interviews with 26 CMCs, as most (24 of the 26) identified the lack of training as the major hurdle they faced in improving the readiness of their offices to respond to a critical incident.

In addition to the ATTF training, CTS stated that many USAOs had been involved in "real-life" events such as responding to the September 11, 2001, terrorist attacks, and subsequent terrorism investigations, and suggested that those responses served as training. The actions of USAOs in responding to critical incidents could result in improvements to preparedness for later events if they were followed by after-action reviews, identification of weaknesses, and improvements to the process. However, we found that was not occurring. In May 2003, we contacted the 81 USAOs that responded to our initial survey to determine if they had made any substantive changes to their Plans.30 The responses we received from 53 USAOs indicated that only 8 had ever updated their Plans. While responding to "real-life" events does provide experience, the failure to fully exploit that experience by identifying shortcomings and improving response Plans leaves the USAOs at risk of repeating mistakes during future incidents.

Based on our review of the CMC training materials, our evaluation of the Plans submitted, our interviews with CMCs, and our determination that the ATTF training did not provide a substitute for the CMC training, we concluded that the training provided to CMCs has not sufficiently prepared them to develop and exercise critical incident response plans. The inadequate CMC training contributed to the poor quality of the Plans submitted by the USAOs.31 The uniform poor quality of the Plans and CMC feedback strongly suggest that the CMCs need additional training to provide them with the guidance that will enable them to prepare complete crisis response Plans, as well as to implement effective exercises to test the Plans.

CTS and EOUSA provided minimal guidance to the CMC Program. From 1996 until May 2003, CTS and EOUSA guidance to CMCs consisted of providing CMCs with the CMC Manual at the 1997 and 1999 conferences, and a "Sample Plan" sent to them in October 1999. This paucity of guidance was confirmed by the responses of all CMCs we interviewed. Significantly, most CMCs appointed since 1999 said that they were either unaware of the CMC Manual or unaware that it was available through USABook Online, the internal Department of Justice website for USAOs. Also, as with training, we found that the CMC Manual has not been updated since October 1999, and therefore does not reflect the critical changes in departmental and national policy since September 2001.

Further, we found that CTS and EOUSA did not work together to develop appropriate guidance for the CMC Program. For example, without notifying CTS, in October 1999 EOUSA distributed a five-page Sample Crisis Response Plan (Sample Plan) to CMCs. The Assistant Director for the EOUSA Security Programs Staff (SPS) told us that EOUSA distributed the Sample Plan after noting a serious inconsistency in format of the initial plans submitted by USAOs. According to EOUSA, the Sample Plan was never intended to be a comprehensive template, but was intended as a resource for CMCs to use in preparing district-specific Plans.

We reviewed the Sample Plan and confirmed that it is primarily a format guide. It does not provide complete guidance for USAOs. As a format guide, the Sample Plan was not designed to be scalable to meet the varying size, location, and vulnerabilities of all USAOs. Further, the Sample Plan gives examples, but does not mention many of the 48 actions recommended in the CMC Manual, such as coordinating with the FBI, ensuring the availability of specialized resources, and cooperating with state and local agencies.

The Deputy Chief, CTS, told us that, after CTS learned of the EOUSA Sample Plan, it did not support its distribution. According to the Deputy Chief, each USAO has unique requirements and CTS was concerned that some USAOs would merely adopt the Sample Plan without modification.32 However, she stated, at that time CTS did not have the resources to develop a sample plan that would address all the varying needs of the USAOs. When we asked her if CTS had contacted either EOUSA or any USAO to communicate this concern, she told us that it had not.

In May 2003, near the end of our review, CTS issued a "Guide to Developing a Crisis Response Plan." CTS requested that the CMCs review and revise their Plans using the Guide as a baseline.33 The USAOs were instructed to submit their revised Plans to their Regional ATTF Coordinators and the ATTF Coordinator at EOUSA. As of August 2003, USAOs reported that they were in the process of revising their Plans.

CMCs cite need for additional guidance. The CMCs we interviewed identified several areas of needed guidance. For example, half of the CMCs (including CMCs that were also ATTF Coordinators) cited the lack of a forum to improve communication of CMC Program information. CMCs told us that they would benefit from a web-based system that would allow them to share information such as:

The CMCs also requested guidance on the relationship between the CMC Program and the Department's counterterrorism mission, additional training reflecting the changes in law and policy regarding critical incident response since 1999, individualized feedback on submitted Plans, and information on conducting exercises tailored to the size of the district (Figure 4).

During our exit conference with CTS in which we discussed the findings of this review, CTS told us that it is developing a website intended to address these issues, among others. As of September 12, 2003, the website was being pilot-tested at 18 USAOs. According to CTS, full access is planned for all USAOs by the end of October 2003.

Figure 4.

CTS and EOUSA failed to accurately track and maintain the Plans submitted by USAOs, resulting in lost Plans. We found that both CTS's and EOUSA's tracking and maintenance of submitted Plans were disorganized and inadequate. Neither organization was able to accurately identify which USAOs had submitted a Plan, nor were they able to ensure that the Plans on file were current.

We found that the problems with CTS's and EOUSA's management of submitted Plans began at receipt. Neither CTS nor EOUSA date-stamped the Plans upon receipt. Our review of the 76 Plans submitted found almost 40 percent had no publication or submission date. As a result, it was not possible to determine from CTS's and EOUSA's records if those Plans were the current versions in use at the USAOs. When we asked the SPS Assistant Director about the lack of date stamping, he confirmed that they had no mechanism for tracking Plans, other than a checklist containing a listing of the USAOs and corresponding boxes that were checked to indicate an office had submitted a Plan.

We found that CTS and EOUSA have no system for ensuring that they both have the same Plans in their inventory. The Deputy Chief, CTS, confirmed continuing disparity in the inventories. According to the Deputy Chief, after the second CMC conference, CTS started to inventory and review the Plans and found that it did not have the number of Plans that EOUSA said that it had. She indicated that CTS has since tried to obtain the missing Plans, but has been unsuccessful. We asked EOUSA why it did not provide the Plans to CTS, and EOUSA indicated that it was not aware of any outstanding CTS requests.

As a result, different offices reported different counts of submitted Plans. According to EOUSA, 88 Plans had been submitted since 1996. However, a list provided by CTS indicated that 81 USAOs submitted Plans. Moreover, in the FY 2001 Performance Report, the Department reported that 88 of the 94 USAO Districts had submitted Plans by the end of FY 2001.35 To establish an accurate count, we conducted a physical inventory of all of the Plans available at CTS and EOUSA and determined that only 76 Plans were on file.36

The discrepancy between the reported number of submitted Plans and the number we found on file apparently occurred because six USAOs attempted to submit their Plans but the Plans were lost, and six other USAOs were counted as having submitted Plans in error. Specifically, in response to our request for further information on the Plans that had been submitted, the SPS Assistant Director provided 88 USAO responses to an October 2001 e-mail in which the USAOs were requested to review their Plans, ensure that the Plans were current and complete, and confirm completion of the review by e-mail.37 Among the 88 responses were 12 e-mails from USAOs that our review found had no Plans on file. Six of the e-mails indicated that an electronic copy of the USAOs' Plans had been included as an attachment, and the other six e-mails indicated that the USAO had reviewed the Plans as requested, but did not indicate that a copy was attached. When we asked the SPS Assistant Director if he or his staff had printed the six attached Plans, he told us that the electronic copies, including all attachments, had been deleted. Nonetheless, based on the receipt of 88 e-mail responses, the SPS Assistant Director reported that 88 USAOs had submitted their Plans.

CTS did not review submitted Plans in a timely manner or provide feedback to USAOs. Although most USAOs submitted their Plans to CTS and EOUSA as required, CTS did not review the submitted Plans in a timely or adequate manner, and it failed to act when its review showed that the Plans were severely deficient in content and quality.38 CTS did not review the Plans as it received them, and some Plans remained on file for as long as five years before CTS began its review. CTS never provided feedback to each USAO on its individual Plan and, as a result, USAOs continued to rely on Plans that substantially failed to address the fundamental actions necessary to respond effectively to a critical incident. Our interviews showed that the CMCs wanted feedback on the Plans. All but one of the 26 CMCs we interviewed indicated that they were unsure of the quality of their Plans and strongly desired feedback regarding Plan quality and content. After additional training, feedback on the Plans was the most frequently requested support identified by CMCs.

 

CMC Feedback -
Need for CTS review of Plans

"Feedback would be helpful, any kind of feedback… observations, insights… I would love to see some feedback - model plans, best practices, any type of information to make the plans more effective… We are not doing this for bureaucratic reasons."

- CMC from a large-size USAO in the southern United States

The CTS Deputy Chief told us that the reason CTS did not complete the reviews or provide feedback to the USAOs was that CTS did not have the resources to conduct individualized Plan reviews. Therefore, CTS opted instead to develop its own model plan.39 Beginning in early to mid-2001, nearly five years after CTS began receiving Plans, four CTS attorneys began reviewing the Plans on file in order to draft a model plan to guide USAOs in revising their Plans. Each attorney reviewed approximately 10 Plans in conjunction with their work on drafting the model plan. Approximately 5 to 10 Plans were identified as having "best practices" or provisions worthy of inclusion in a revised model plan that would address content, not just format. However, CTS's initial review also revealed serious shortcomings in the submitted Plans. Nonetheless, there was an additional two-year delay before CTS issued its Guide to Developing a Crisis Response Plan in May 2003.40 In August 2003, USAOs reported that they were in the process of revising their Plans.

EOUSA neglected to examine CMC Program implementation during evaluations of USAO operations. We found that EOUSA only recently included a minimal examination of the USAOs' implementation of the CMC Program in the triennial operations reviews conducted on each USAO. During the triennial operations reviews, EOUSA's Evaluation and Review Staff (EARS) evaluates "the performance of the Offices of the United States Attorneys, making appropriate reports and taking corrective action where necessary."41 When we initially interviewed the EARS Assistant Director, he told us that the CMC Program was not part of EOUSA's triennial operations reviews of USAOs. In a subsequent interview, he informed us that, in October 2002, two questions regarding the CMC Program were added to a Security Evaluator's Checklist completed by evaluators during the reviews. The questions added to the checklist were:

We asked the EARS Assistant Director why the CMC Program was not reviewed in more depth during the triennial operations reviews. He stated that EARS currently lacks the resources to evaluate the CMC Program in greater detail. When we posed the same question to the SPS Assistant Director, he asserted that the Plans are prosecutorial plans, not security plans. He stated that the Plans are not within the purview of the SPS to review, but are more appropriate to be reviewed by CTS. Further, he pointed out that the individuals reviewing the security operations are generally security personnel, who may not have extensive legal training, and therefore would not be appropriate to evaluate a prosecutorial plan. We asked the SPS Assistant Director if he had requested CTS's assistance in formulating appropriate evaluation questions for the CMC Program. He stated that he had not, and assumed that if CTS wanted the CMC Program evaluated, it would contact EARS directly.

We reviewed the reports from 18 EARS reviews conducted since those questions were added. We found the questions were checked off without any additional information provided. Moreover, the limited information contained in the reports was inconsistent with what we found when we reviewed the Plans available at CTS and EOUSA. Four of the 18 triennial operations reviews were conducted at USAOs that we found had no Plans on file with CTS or EOUSA, but the reports indicated that the USAOs had submitted Plans. Conversely, the report on one USAO that we confirmed had submitted a Plan indicated that it had not.

The Department Overstated the CMC Program Implementation in Its Annual Performance Reports

We found significant discrepancies between the reported performance of the CMC Program in the Department's Annual Performance Reports and the actual performance of the USAOs, CTS, and EOUSA in implementing the CMC Program.42 While the performance measure was the number of USAOs with Plans, the supporting narrative indicated that all of the Plans (88) had been submitted and reviewed by CTS. The narrative also stated that the Plans met certain minimum content standards and provided a crosswalk with FBI and local and regional crisis response plans. However, we found that the number, the process, and the content of the Plans were all reported incorrectly. As a result, the intent of the performance measure - to ensure that the Department was fully prepared to respond to critical incidents - was not clearly met.

Figure 5. [Table not reproduced in this text version.]

In its FY 2000 Performance Plan, as part of its strategic objective to "Improve Response Capabilities to Terrorists' Acts," the Department established a goal of having Plans in place at 90 of the 94 USAOs by the end of FY 2002.43 In FY 2001, JMD reported that 88 USAOs had completed their Plans, based on the USAOs' responses to an e-mail survey conducted by EOUSA (Figure 5). The Department declared the performance measure "met" and eliminated it from future Annual Performance Reports.44

In addition to reporting the number of Plans submitted, several Performance Reports also contained a narrative describing the content of the Plans and the support that CTS and EOUSA had provided to the CMC Program.45 That narrative stated that 1) CTS had reviewed the Plans submitted by the USAOs, 2) the Plans provided specific information to guide the response to a terrorist attack, and 3) the Department was providing continuing support to the CMC Program.

Our review did not corroborate the reported level of performance or the claims of continued CMC Program guidance and administration. For example, the FY 2000 Performance Plan stated:



These plans articulate the steps each office would take in the event of a terrorist act or other critical incident in their jurisdiction. Critical aspects of each plan include a listing of essential points of contact with state and local authorities, including first responders and other emergency personnel; identification of potential infrastructure targets, in both the public and private sector; and coordination with the local FBI field office and other law enforcement entities.46
Data Validation and Verification: The plans are evaluated to determine if they meet the criteria of a complete plan. This criteria [sic] includes, but is not limited to, whether resource support elements such as other government agencies (FEMA, National Guard, etc.) are identified.47
Strategies and Initiatives to Achieve the FY 2002 Goal: Our strategy is to build maximum feasible capability in the counterterrorism program, allowing the Department to identify and address terrorist threats…It means that all elements of crisis and consequence management at the federal, state, and local levels throughout the country will have developed and implemented integrated terrorism response plans [emphasis added].



Footnotes

  22. The Sample Plan did not contain specific guidance on how to respond to critical incidents, but was a format guide intended to help the USAOs in developing their own Plans. The Sample Plan is discussed further on page 21.

  23. Occupant Emergency Plans provide for either the rapid evacuation of a building or sheltering in place within the building, depending on the nature of the incident that triggered the plan. Emergency Relocation Plans provide for the continuation of all essential organizational activities in secondary locations because the primary location has become unusable. These plans are required for USAOs, but they address activities in a context other than crisis response, as defined in the CMC Program.

  24. The Attorney General's speech at the first CMC National Training Conference, June 17, 1997.

  25. CTS did encourage USAOs to participate in preparedness exercises conducted by the FBI and by other federal, state, and local agencies in their region. At both national conferences, CTS distributed a list of exercises organized geographically to facilitate USAO involvement in crisis response and preparedness training. This list contained numerous exercises sponsored by ODP.

  26. FPCs 65 and 66 were issued by the Federal Emergency Management Agency on July 26, 1999, and April 30, 2001, respectively.

  27. The Office of Domestic Preparedness, which assists state and local public safety personnel in acquiring training and equipment to manage the response to weapons of mass destruction attacks, moved from the Department of Justice, Office of Justice Programs, to the Department of Homeland Security on March 1, 2003.

  28. The CMU conducts a wide range of exercises, some of which involve supporting local law enforcement agencies. Because some of these exercises do not involve a violation of federal law, USAO involvement is not always appropriate.

  29. The single reference to the CMC Program that we found was a list of CMC telephone numbers dated January 22, 2002. The only region to address the need for preparing for critical incidents was the Northeastern Region.

  30. We defined "substantive" as changes in policy, scope, or procedures, as opposed to "administrative only" changes, such as updating telephone contact lists.

  31. As discussed earlier in this report, our review of the 76 Plans available at CTS and EOUSA found that 62 do not address most of the 48 actions deemed essential to a critical incident response by CTS, and our survey of CMCs found that since the Program's inception in 1996, 60 percent of USAOs have never conducted an exercise.

  32. Our review of the 76 Plans on file with CTS and EOUSA, as well as our interviews with CMCs, substantiated CTS's concern. Our review of the Plans showed that at least 11 USAOs simply put their district's name on the plan, added a phone list, and submitted it back to EOUSA.

  33. Attachments to this Guide included a re-release of several outdated documents, some from as far back as 1994. Included was an unrevised copy of Chapter 2 of the CMC Manual, "Practical Tips." CTS did not revise the "Critical Incident Checklist for the Initial 48-Hours" to reflect legislative and policy changes that have taken place since September 11, 2001.

  34. While we acknowledge the CMCs' comments that the lack of a forum makes it more difficult for them to share information, we noted that it has not prevented all CMCs from sharing information. In fact, more than 25 percent of the CMCs we interviewed told us that they used personal contacts to obtain information from other USAOs to assist in writing their Plans.

  35. Department of Justice, FY 2001 Performance Report & DOJ FY 2002 Revised Final, FY 2003 Performance Plan, page 223. The Justice Management Division collected the data used in the report.

  36. In August 2003, in response to a draft of this report, CTS reiterated that it had 81 Plans on file and provided a list of the Plans. We reviewed the list and found it omitted the Northern Mariana Islands federal judicial district, but did list four Plans that were not among the Plans initially made available to us. When we asked to review the four Plans, CTS could not find two and had the USAOs provide copies by facsimile. The Principal Deputy Chief, CTS, speculated that the four Plans may have been out of the files during our review because CTS staff may have been working with them, but she could not be sure because the individual responsible for maintaining the files was on detail in another city.

  37. EOUSA Memorandum to All USAOs, "Review of Crisis Response and Disaster Recovery Plans," October 15, 2001.

  38. After the completion of our fieldwork, CTS provided the inspection team with the name of a former staff attorney who said he reviewed all of the Plans that were submitted as of the end of September 1999. When interviewed, he told us that he did not recall the exact number of Plans reviewed, nor did he write up individual Plan reviews, but his overall assessment was that the Plans were not detailed and were generally of poor quality. He also told us that he informed the CTS Deputy Chief of his findings.

  39. As discussed on page 23 of this report, in May 2003, CTS sent all CMCs a "Guide to Developing a Crisis Response Plan."

  40. As noted earlier in this report, after the completion of our fieldwork, CTS provided the inspection team with the name of a former staff attorney who reviewed all of the Plans that were submitted as of the end of September 1999. When interviewed, he told us that he did not recall the exact number of Plans reviewed, nor did he write up individual Plan reviews, but his overall assessment was that the Plans were not detailed and were generally of poor quality. He also told us that he informed the CTS Deputy Chief of his findings.

  41. EOUSA website, http://www.usdoj.gov/usao/eousa/mission-and-functions, April 9, 2003.

  42. Each fiscal year, the Department develops a Performance Plan that describes how it will achieve the objectives of its overall Strategic Plan. The following fiscal year, the Department issues a Performance Report that details its progress at achieving those objectives.

  43. Department of Justice, FY 2000 Performance Report & FY 2001 Performance Plan, April 2001, page 29.

  44. Department of Justice, FY 2001 Summary Performance Report, page 223, Appendix A - Discontinued Measures Performance Report.

  45. Department of Justice, FY 2001 Summary Performance Report, February 2000; Department of Justice, FY 2000 Performance Report & FY 2001 Performance Plan, April 2001; Department of Justice, FY 2002 Performance Report/FY 2003 Revised Final/FY 2004 Performance Plan, February 2003.

  46. Strategic Objective 1.4 Terrorism, Deter and detect terrorist incidents by developing maximum intelligence and investigative capability. FY 2000 Performance Report and FY 2001 Performance Plan, April 2001 (pages 34 and 35).

  47. Department of Justice, FY 2001 Performance Report/FY 2002 Revised Final, FY 2003 Performance Plan, Section 1.4B, Improve Response Capabilities to Terrorists' Acts, April 2001.