Follow-up Review of the Critical Incident Response Plans of the United States Attorneys’ Offices

Evaluation and Inspections Report I-2007-001
January 2007
Office of the Inspector General


Results of the Review

EOUSA and the CTS developed a revised “model plan” that encompassed prior OIG recommendations.

In the spring of 2004, EOUSA and the CTS brought together a panel that included four CMCs experienced in crisis response planning to assess the changes made to the model plan during the OIG’s 2003 review and to work with EOUSA and the CTS to further revise the model plan.39 The five panelists met several times to complete a revised draft of the model plan.

The final version of the revised model plan directed USAOs to address all of the essential functions that should be contained in a CIRP. In addition, the OIG reviewed all of the comments provided by the expert reviewers and found that, in those instances where a USAO’s CIRP was not acceptable, the reviewers gave additional direction to ensure compliance with the model plan’s standards. These actions were responsive to the OIG’s 2003 finding that EOUSA and the CTS had not provided sufficient direction to USAOs to develop adequate plans. In the OIG’s 2003 review, we found that only 12 of the 76 CIRPs on file at EOUSA and the CTS addressed more than half of the 48 essential functions that a USAO may be required to perform during a response to a critical incident. Only 4 of the 76 plans addressed all 48 functions.

Our side-by-side comparison and analysis of the original and revised model plans showed significant improvement. The 19-page revised version is both comprehensive and detailed in its guidance regarding the required content for a USAO CIRP. It also addresses the OIG’s prior recommendation to ensure that the CIRPs cover all critical areas by encompassing all 48 essential functions that a USAO may be required to perform during a response to a critical incident.

EOUSA and the CTS provided improved training and guidance to CMCs.

In addition to revising the draft model plan, the expert panel worked with EOUSA and the CTS to prepare the materials for a training conference held in March 2004 at the National Advocacy Center in Columbia, South Carolina. The materials included a sample tabletop exercise for use at the conference.

At the conference, EOUSA and the CTS distributed the draft of the revised model plan for review and comment by the CMCs in attendance. The CMCs also participated in the sample exercise and engaged in small-group discussions. Following the conference, the expert panel used the CMCs’ feedback to finalize the model plan and tabletop exercise. Of the 61 CMCs that attended this conference, 49 remained their district’s CMC as of August 2006.

In 2005, the CTS requested and received approval to create a National Crisis Management Coordinator (National Coordinator) position to assist USAOs with their critical incident preparedness, including developing the training for the 2005 CMC conference, which took place in October. Part of the agenda at the conference was to share the lessons learned by those USAOs whose districts were struck by hurricanes in 2004 and 2005. Seventy-five CMCs attended the 2005 training conference, 51 of whom remained their district’s CMC as of August 2006.

Respondents to our survey gave positive feedback on both the 2004 and 2005 CMC training conferences. The CMCs stated that the training was helpful in understanding the role that USAOs would play in a real crisis. This positive assessment contrasted with the CMCs' responses during our 2003 review, which indicated that training was insufficient. Our 2003 report documented that since the inception of the CMC program in May 1996, EOUSA and the CTS had sponsored only two training conferences specifically for CMCs (in 1997 and 1999). From 1999 to the March 2004 conference, the only training session held for CMCs had been a 2-hour videoconference conducted in March 2003. During that review, the CTS stated that the Department's anti-terrorism focus following September 11, 2001, had precluded additional training sessions for CMCs.

The materials covered during the 2004 and 2005 CMC conferences were helpful in understanding the role a USAO would play in a crisis, according to survey respondents. In contrast, our 2003 review found that the CMCs’ previous training was narrowly targeted at anti-terrorism issues. Of the 26 CMCs we interviewed during the 2003 review, 24 identified the lack of training as the major hurdle they faced in improving their offices’ readiness to respond to a critical incident. Our 2003 report recommended that EOUSA and the CTS provide regular training for CMCs on how to prepare effective and comprehensive CIRPs, as well as develop and conduct appropriate critical incident response exercises.

Also, in response to an OIG recommendation to complete the development of a web site containing information on critical incident response, including lessons learned, exercise scenarios, and best practices, EOUSA and the CTS each expanded the content of its web sites (both Internet and intranet). Additionally, during the course of our current review, the CTS further enhanced the amount of information available on its intranet site. As of October 2006, the web sites provided additional information and resources to assist the CMCs in their efforts to conduct critical incident preparedness activities. The CTS intranet site now includes electronic copies of the USAOs' CIRPs, after-action reports, and the initial evaluations from the expert panel members. It also includes the 2004 and 2005 CMC training conference materials, as well as sample exercises and updated policy guidance.

EOUSA and the CTS directed the USAOs to revise their CIRPs based on the “model plan,” and had the CIRPs reviewed by an expert panel.

On May 10, 2004, EOUSA and the CTS distributed the final version of the revised model plan to all CMCs, with instructions to revise their existing CIRPs using the new format and content guidelines in the model plan and submit them to EOUSA by May 28, 2004. These revisions addressed the OIG’s prior recommendation that all USAOs revise their CIRPs to address action items the CTS had identified.

Upon receipt of the revised CIRP from each district, EOUSA forwarded it to one of the four members of the expert panel to review for compliance with the model plan’s requirements.40 Prior to beginning their review of the CIRPs, the experts on the panel developed a checklist to ensure that their reviews were consistent and that all revised CIRPs met the model plan’s requirements. The checklist allowed reviewers to provide additional comments on how a district should revise its CIRP in order to be “acceptable.” This process addressed the OIG’s recommendation that the CTS review all USAOs’ plans to ensure that the plans cover all critical areas.

In June 2004, the expert panel began the reviews. The panel judged 68 of the revised CIRPs to be acceptable without further modification, while another 14 were acceptable with changes. The expert panel found only seven of the CIRPs unacceptable.41 This represents a marked improvement from our 2003 review, which found that 72 of the 76 CIRPs the OIG analyzed lacked fundamental elements of an effective CIRP, deficiencies that would have rendered those 72 CIRPs unacceptable under the requirements of the current model plan.

As required by EOUSA and the CTS, all 93 USAOs revised their CIRPs. Eighty-four USAOs submitted their CIRP to EOUSA before the end of July 2004; four submitted their CIRPs between August and October 2004, and one CIRP was submitted in October 2005. Sixty-eight of the plans required no changes or additions. Further, 68 CMCs stated in their survey responses that after their CIRP was reviewed, they made additional changes to their CIRPs based on their subsequent experiences with exercises and critical incidents.

EOUSA and the CTS instructed USAOs to test their revised CIRPs by conducting exercises and to complete after-action reports on the results of the exercises.

In July 2004, using the feedback from the CMC participants at the March conference, EOUSA and the CTS completed the revisions to the sample tabletop exercise. They also prepared an after-action report template to assist CMCs in memorializing lessons learned from the exercises and identifying any resulting changes needed to their districts' CIRPs. The sample exercise and the after-action report template were sent to the CMCs with instructions to conduct an exercise (either using the sample or an exercise of their own choosing) within 30 days of receiving feedback from the expert reviewer on their CIRP. They were also instructed to forward an after-action report to EOUSA upon completion of their exercise.

All 93 USAOs have conducted at least one exercise and completed an after-action report since revising their CIRP. While not every USAO completed their exercise within the prescribed 30-day time frame, we did find that every USAO completed an exercise and an after-action report by November 2006. Based on the CMCs’ responses to our survey, follow-up correspondence, and document review, the OIG determined that 78 districts conducted a CIRP exercise in 2004. The remaining 15 districts conducted their first exercise in either 2005 or 2006.

In order to verify their completion and to ascertain the various lessons learned, the OIG reviewed after-action reports documenting exercises for 91 of the 93 USAOs. Two CMCs stated they had completed after-action reports for their 2004 exercises but were unable to locate copies of them and EOUSA did not have copies on file. One of these districts subsequently completed a second exercise and an after-action report for that exercise but the district has not yet forwarded a copy. The other district has not completed a second exercise.

The majority of USAOs completed multiple CIRP exercises. Fifty-three USAOs completed exercises in at least 2 of the 3 calendar years (2004-2006), and 10 additional USAOs told us they planned to conduct an exercise before the end of the 2006 calendar year. Of the 78 USAOs that conducted an exercise in 2004, 49 (63 percent) had completed a second exercise by November 2006, and 8 planned to conduct their second exercise by the end of 2006. Within the group of 15 that conducted their first exercise after 2004, 4 USAOs had completed exercises in both 2005 and 2006, and 2 others planned to conduct their second exercise by the end of 2006. This is a marked improvement from the level of performance observed during our 2003 review. In 2003, only 30 of the 81 CMCs who replied to the OIG’s survey stated that their USAOs had conducted a CIRP exercise in the 7 years since the inception of the CMC program in 1996.

EOUSA incorporated questions into its triennial reviews of the USAOs to assess the USAOs’ ability to respond to a critical incident.

In May 2004, EOUSA staff added questions to the self-assessment checklist used in the triennial operations reviews of USAOs. The checklist questions, answered by USAO staff, addressed whether the district designated a CMC, had an approved CIRP, conducted the required tabletop or field exercise, and completed the subsequent after-action report. These questions addressed the OIG’s recommendation that EOUSA revise the operations review process to include a full evaluation of the USAOs’ CIRP-related activities.42 Responses to the questions served as an indicator of how likely the USAOs would be to respond successfully to a critical incident.

These triennial reviews are conducted by EOUSA’s Evaluation and Review Staff (EARS). During the review, the EARS team examines a district’s overall operations and management. If particular policies and procedures are not in place, the review team recommends that they be implemented.43

EOUSA monitored USAOs’ completion of CIRP activities through approximately June 2005.

In response to our 2003 recommendation to accurately track the status of USAO submissions, in late 2004 EOUSA began monitoring the USAOs' submissions of revised CIRPs, completion of their CIRP exercises, and forwarding of their after-action reports. The process tracked the completion of the tasks, but did not account for any subsequent revisions to districts' CIRPs or additional exercises that USAOs should have conducted in 2005. EOUSA discontinued the effort in June 2005, just before the staff member responsible for the monitoring transferred to the CTS. According to EOUSA senior management, it is unclear whether EOUSA has an additional oversight role regarding the program outside of the EARS reviews. Senior management felt the EARS reviews would provide the requisite monitoring of the program.

Steps taken by USAOs have had a positive impact on their preparedness.

As part of this review, the OIG interviewed CMCs from the seven districts that indicated in their survey responses that they had activated their CIRPs in response to a natural disaster. All seven districts had their revised CIRPs approved prior to the hurricanes that hit the districts. Six of the seven had conducted a CIRP exercise prior to the storms.44 Moreover, all seven districts conducted a CIRP exercise in the calendar year after the storms.

During these interviews, the CMCs described their respective districts’ planning efforts before the storms and their recovery efforts after the storms. They commented very positively on the benefits derived from the crisis response preparation activities they had previously completed. A more detailed description of the positive impact of crisis response planning on these seven districts appears in Appendix I.

The USAOs have regressed in their critical incident preparation activities.

The USAOs have not conducted CIRP exercises on an annual basis. Our current review found that USAOs are not fulfilling the annual exercise requirement stipulated in each district's CIRP. As of November 2006, only 16 USAOs were in compliance with the requirement, having already completed an exercise in each of the 3 years (2004 to 2006). Another two districts were potentially in compliance because they had completed exercises in 2004 and 2005, and had an exercise scheduled prior to the close of 2006.

While each of the 93 USAOs has conducted at least one exercise since revising its CIRP, we found that only 53 (57 percent) had conducted two or more exercises during the 2004 to 2006 time period.45 An additional 10 districts stated that they would attempt to complete their second exercise prior to the close of the calendar year. Thirty USAOs had only completed their initial CIRP exercise and had no plans to conduct a second exercise during the remainder of 2006.46

During this review, the OIG used a broad interpretation of what constituted a CIRP exercise. We considered it a CIRP exercise if the USAO participated in exercises with other federal, state, and local agencies that focused on potential critical incidents, even if the USAO's participants were not implementing or following their CIRP as part of the exercise. In interviews, CMCs, including the expert panel members, told the review team that one of the most important aspects of critical incident response is to know other agencies' personnel and to build relationships with them. Therefore, the OIG concluded that it was appropriate to accept such activities as meeting the model plan's annual exercise requirement, especially since these activities encouraged USAOs to attend exercises out in the field and not simply conduct a tabletop simulation.

Pursuant to Section 8.2 of the revised model plan, which was adopted by each USAO through its own CIRP, the USAOs are to conduct a tabletop exercise annually. To be in compliance with the model plan and their CIRP, USAOs should have completed at least three CIRP exercises by the end of 2006 (one each calendar year beginning in 2004).47

We found that USAOs have regressed in their critical incident preparations. Although 78 districts conducted a CIRP exercise in 2004, 21 of these 78 have not completed another exercise nor do they plan to conduct one in the remaining portion of 2006. USAO exercise activity dropped 50 percent in 2005, when only 39 USAOs conducted an exercise.48 Forty-five USAOs had conducted an exercise in the first 11 months of 2006. An additional 12 USAOs stated that they planned to conduct an exercise by the end of the calendar year.49 See Table 2.

Table 2: Number of Districts that Completed CIRP Exercises,
Calendar Years 2004-2006

Calendar Year        Number of Districts
2004                 78
2005                 39
2006                 45
2006 (pending)       12

Source: CMCs’ responses to OIG Survey.

We also found that 12 USAOs did not conduct an exercise in 2004 despite the review of their CIRPs by an expert panelist in 2004 (the latest being reviewed on August 5, 2004). These USAOs failed to adhere to EOUSA’s directive that they conduct a CIRP exercise within 30 days of receiving their CIRP review from the expert panel.50 Eleven of these 12 USAOs subsequently conducted their first exercise in 2005 (the latest taking place in November). The twelfth USAO conducted its first exercise in January 2006.

In survey responses and in follow-up correspondence, some CMCs indicated that exercises were not conducted because of difficulty in scheduling exercises around the prosecutorial responsibilities of the CMC and other AUSAs on the critical incident response team (CIRT). In our review, we found that at least three, and as many as seven, AUSAs in each USAO have important responsibilities when the USAO's CIRP is activated; for example, "the Criminal Division Chief will provide advice on legal issues arising at the Command Post and will direct CIRT members to execute particular assignments."51 According to the First Assistant United States Attorney from one of the districts that responded to an actual critical incident, all of these CIRT members should participate in exercises to increase their exposure to issues associated with critical incident response.

Other survey respondents indicated they conducted only one exercise because they were unable to coordinate exercises with other federal agencies in their districts. While multi-agency participation is not a requirement, these CMCs felt that a CIRP exercise needed the involvement of other federal law enforcement agencies operating within their district. Because they were unable to arrange the participation of other agencies, the USAOs decided not to conduct their own exercises.

In addition, six districts that were affected by the hurricanes in 2005 and activated their CIRPs did not conduct exercises in 2005. If these districts had scheduled exercises in September 2005 or shortly thereafter, the exercises would have become unnecessary given their real-life response efforts.

USAOs have not continued to complete after-action reports. After USAOs conducted their first CIRP exercises and completed the corresponding after-action reports, the rate at which they completed after-action reports for subsequent exercises decreased significantly. All but two districts completed an after-action report for their first exercise, but only 24 of the 53 USAOs that conducted subsequent CIRP exercises completed after-action reports. Further, just seven USAOs completed an exercise and the corresponding after-action report in each of the three calendar years, 2004 through 2006. See Table 3.

Table 3: Number of Districts that Completed After-action Reports
After Conducting Exercises, Calendar Years 2004-2006

Calendar Year    Districts Completing Exercises    After-action Reports Completed    Percentage Completing Reports
2004             78                                76                                97%
2005             39                                23                                59%a
2006             45                                26                                58%b

Source: CMCs’ responses to OIG Survey.

a If the 12 districts that completed their first exercise in 2005 are removed, the after-action report completion rate for those conducting a second exercise falls to 12 out of 27 (44%).

b If the 3 districts that completed their first exercise in 2006 are removed, the after-action report completion rate for those conducting second (or third) exercises falls to 23 out of 42 (55%).

Under the CIRPs they adopted pursuant to the model plan, USAOs are to complete after-action reports after each CIRP exercise or critical incident. The reports are designed to memorialize lessons learned and to identify any changes to a district's CIRP needed as a result of exercises or critical incidents.

One USAO that participated in several exercises in preparation for a National Special Security Event did not complete an after-action report (or written documentation of any kind) for those exercises.52 The USAO stated that the exercises were discussed by senior staff, but any comments or lessons learned were never committed to writing. However, when there is no documentation of previous exercises, newly appointed CMCs have no records to consult on what activities the district has engaged in and what lessons should have been learned from those events or exercises.

In contrast to the lack of after-action reports generated by the district referenced above, the Northern District of Alabama produced a report following its response to an investigation of church fires led by the Bureau of Alcohol, Tobacco, Firearms and Explosives (ATF). The report provides lessons learned that could benefit any district:

As ATF was the lead agency, we should have established a closer working relationship with the ATF case agent sooner than we did. We should have asked ATF earlier to provide us with a computer that had internet and email access. While we were able to maintain contact through the agent’s email service, this was inconvenient and time consuming due to the agent’s other priorities.

Cell phone communication was very difficult [in the area] for all agencies. Also, [as] there was no phone line available for dialup, it was necessary to communicate over the ATF email system with the USAO. Also, we were not able to access Westlaw, although we could have used ATF desktops had one been available.

USAOs are not utilizing the information-sharing capabilities provided by the CTS. In response to our survey, 52 CMCs stated that they had never received after-action reports, lessons learned information, or copies of revised CIRPs from other USAOs, EOUSA, or the CTS. In our 2003 review, we recommended that EOUSA, in conjunction with the CTS, complete the development of an intranet site containing information on critical incident response, including lessons learned, exercise scenarios, and best practices. In response, EOUSA and the CTS developed a web site to provide USAOs with access to such information, including other USAOs’ CIRPs and after-action reports.

Despite the availability of the information, our survey found that 42 CMCs (45 percent) had never visited the web site, indicating that a large proportion of CMCs are not utilizing the lessons learned and other information available to assist them in their critical incident response preparation. This lack of use is troubling given that, of the 51 CMCs who had visited the site, 49 found it useful in locating critical incident planning information.53

Also in response to our survey, 30 CMCs stated that information they had received from EOUSA, the CTS, or from other USAOs was relevant and helpful for critical incident planning and was incorporated into their CIRPs.54 Moreover, after our current review began, the CTS further enhanced its intranet site, which now includes electronic copies of the USAOs' CIRPs, after-action reports, and the initial evaluations from the expert panel members. It also includes the 2004 and 2005 CMC training conference materials, as well as sample exercises and other updated policy guidance. To maximize the benefit of the site, USAOs need to complete the required exercises and after-action reports so that their lessons learned can be of use to other districts.

EOUSA and the CTS are not providing the direction and support needed to ensure that the USAOs continually prepare for critical incidents.

EOUSA no longer assists the CTS with the USAOs’ critical incident response preparation. EOUSA has ceded all involvement in the USAOs’ critical incident preparation activities to the CTS despite there having been no change in the responsibilities of either organization regarding the CMC Program. When the CTS created a National Coordinator position to assist with the CMC Program in April 2005, and the EOUSA staff member responsible for implementing the recommendations in the OIG’s 2003 report transferred to the CTS in September 2005, EOUSA ceased monitoring the USAOs’ CIRP‑related activities. Currently, EOUSA has no staff assigned to formally assist the USAOs with CIRP-related activities.

When the CMC Program was implemented in October 1996, EOUSA was assigned to monitor timely CIRP submissions and updates.55 To support the CMC Program, the Attorney General instructed the CTS, in conjunction with EOUSA, to develop and ensure training for the CMCs. The Attorney General stressed that "training and advanced planning are imperative" given the intense time pressures and public attention during a critical incident.

When the model plan was revised in 2004, both EOUSA and the CTS envisioned that EOUSA would still be engaged in an active role, including providing guidance to (and monitoring) the USAOs’ CIRP activities. The model plan states that USAOs should forward their revised CIRPs and after-action reports to EOUSA. In 2004 and early 2005, after EOUSA and the CTS directed the USAOs to revise their CIRPs and conduct the first of what were to be annual exercises, staff responsible for the CIRP program at EOUSA tracked the USAOs’ completion of the exercises and after-action reports.56 However, these tracking efforts ceased around June 2005. At that time, all but five USAOs had completed their first exercise.57 Based on the documents used to track the USAOs’ performance of CIRP activities, there was no attempt by EOUSA to monitor the completion of a second exercise by the USAOs or the accompanying after-action reports. In interviews with EOUSA and CTS personnel, we learned that neither component has actively monitored the completion of the CIRP exercises or after-action reports since approximately June 2005.

Although the CTS created a National Coordinator position, the CTS is not directly or indirectly involved in overseeing the performance of the USAOs. CTS officials told us that they believe such activities fall under the purview of EOUSA. According to the CTS, its role is to provide advice on exercises, policy updates, and emerging issues regarding critical response preparedness, while EOUSA handles the administrative functions (e.g., monitoring when the USAOs complete exercises). During the course of this review, however, the OIG found that EOUSA was no longer performing these administrative functions for the CMC Program. For example, EOUSA did not have copies of all districts’ after-action reports.

EOUSA removed the performance measures used to assess the USAOs’ CIRP activities. In October 2005, EOUSA removed three questions pertaining to CIRP activities that had been added to the EARS self-assessment checklist used in EOUSA's triennial review process of each USAO.58 According to the EARS Director, the questions were dropped in 2005 "to streamline the checklist." This action was taken without consulting the CTS.59 After the questions were removed, EOUSA and the CTS had no method of gauging the USAOs' performance of CIRP-related activities because EOUSA had already stopped its monitoring efforts in June 2005.

During this review, the OIG informed the CTS of the questions’ removal, and the CTS subsequently initiated efforts with EOUSA to reinstate the information about CIRP-related activities into the triennial review. On September 21, 2006, the EARS Director provided the OIG with new questions that USAOs would be asked about their CIRPs and CMCs as part of the evaluation process, and the questions addressed each of the items that had been removed.

Competing responsibilities have diminished the National Coordinator's effectiveness with the CMC Program. In April 2005, the CTS created a CMC National Coordinator position with the intent that the individual would be primarily responsible for identifying and providing resources to the CMCs in the districts.60 These resources were to include information on upcoming exercises in which USAOs could participate and best practices identified in after-action reports. However, due to the National Coordinator's expertise in other critical response areas, he has been assigned to other Department work groups and task forces, which has reduced his support to the CMC Program. We found through our survey that 62 CMCs had never received after-action reports (from exercises or critical incidents), other lessons learned information, or copies of revised CIRPs from EOUSA or the CTS.61 While the information was made available on the CTS's intranet, the National Coordinator told the OIG he intended to communicate with CMCs directly.

The CTS originally envisioned that the National Coordinator, who had been one of the four expert panelists who evaluated the CIRPs, would serve as the primary link to the CMCs by disseminating policy, training, and exercise information. The National Coordinator would also be involved in creating training materials and information for the CMC Program web site. These tasks were contained in our 2003 recommendation that the CTS provide updated guidance reflecting changes in legislation, policy, and critical incident response practice.

Since arriving at the CTS, the National Coordinator has also been assigned by the Department to other critical incident planning efforts, primarily serving as the Department’s point person on the multi‑agency avian flu planning committee. Most of these efforts are not directly related to his responsibilities for managing and guiding the CMC Program. For example, the National Coordinator became one of the principal drafters of the Homeland Security Council’s National Strategy for Pandemic Influenza, work that demanded a significant amount of his time. According to the CTS, these additional responsibilities have prevented the National Coordinator from fully completing many of the CMC Program tasks, such as keeping the CMCs aware of exercises that were being conducted by other local, state, and federal law enforcement agencies in their districts so that the USAOs could participate.

EOUSA, meanwhile, has not provided the support needed by the National Coordinator. The National Coordinator's effectiveness has been diminished by EOUSA's decision not to monitor the USAOs' CIRP activities. EOUSA's abandonment of its monitoring activities significantly limits the National Coordinator's access to information on the USAOs' current critical incident preparations.

EOUSA and the CTS have failed to ensure that newly appointed CMCs have received training. In the 9 months between the last CMC training conference in October 2005 and July 2006, 16 new CMCs were appointed and, as of October 2006, had yet to receive any formal training. Eight of the 16 newly appointed CMCs indicated they had no prior critical incident experience. Further, of all the current CMCs, 52 indicated in their survey responses that they had had no critical incident experience prior to becoming a CMC. Three of the new CMCs were designated as “acting” in the place of CMCs who had been called to National Guard duty in Iraq, but the other new appointees were permanent. Additionally, in response to our survey, several of these new CMCs stated they did not have the training materials from the 2004 and 2005 CMC conferences.62

In 2003, the OIG recommended that training and guidance be provided to the USAOs on CIRP activities. EOUSA and the CTS responded by conducting the 2004 and 2005 conferences and developing new guidance materials (the model plan and sample tabletop exercises). While a conference was not held in 2006, CTS officials stated they are scheduling one for 2007. Forty-six of the 93 CMCs stated that the conferences should be held on an annual basis, while an additional 34 believed the conferences should be held on a biannual basis. However, the CMCs also stated that due to scheduling and resource demands on AUSAs, it may not be practical to hold an annual conference.

Moreover, even if an annual conference were possible, some CMCs could be appointed shortly after the conference and go without training for up to 12 months in the absence of readily accessible orientation materials. We believe that there is a need to address the orientation of CMCs appointed to the position between training conferences, especially given the high turnover in the CMC position (discussed further on page 32 of this report). While the CTS has significantly increased the materials available on its intranet web site, the OIG did not find orientation materials designed for newly appointed CMCs. Further, the OIG’s review found that the training and other CIRP-related materials currently available on the intranet lack basic guidance for newly appointed CMCs who enter the position without prior critical incident response experience. Basic orientation materials would enable CMCs to more quickly understand the expectations and responsibilities of the position and to apply the advanced guidance on the web site. The number of newly appointed CMCs also underscores the need to ensure that each district complete after-action reports to provide historical information for successive CMCs at each USAO.

Another reason new CMCs do not receive timely training is that the USAOs have not alerted EOUSA and the CTS when they appoint new or acting CMCs.63 We found that several districts had not notified the National Coordinator at the CTS of their CMC’s appointment, thus delaying the CMCs’ obtaining the necessary background information to perform their role. During follow-up correspondence to our survey, four newly appointed CMCs asked the OIG how to acquire, and where to forward upon completion, materials on CIRP revisions, exercises, and training, even though the CIRP-related information is available on the EOUSA intranet.64 Further, we found that several districts with CMC vacancies had not designated a replacement CMC prior to assigning someone to complete the OIG’s survey. The model plan does not currently impose a requirement that USAOs promptly notify EOUSA or the CTS when there is a change, vacancy, or “acting” appointment to the CMC position.65

EOUSA and the CTS never ensured that all 93 CIRPs were acceptable. We found that the seven CIRPs that the expert reviewers did not deem fully acceptable were not reviewed again to ensure that deficiencies were addressed. Similarly, the 14 CIRPs found upon initial review to be "acceptable with changes" were not subsequently reviewed to ensure that deficiencies were addressed. Our 2003 recommendation stated that all USAO CIRPs should be reviewed, including revisions. CTS officials told the OIG that they did not believe a second review was necessary because they trusted the USAOs would make the required revisions. Because there was never a second review of the 21 CIRPs, EOUSA and the CTS could not demonstrate at the time of our review that these CIRPs were acceptable.

An OIG review of current versions of the seven CIRPs previously deemed "unacceptable" found that the areas of concern had not all been addressed in the 24 months since the CIRPs had been found deficient. As part of this review, the OIG requested the most recent versions of the CIRPs from six of the seven districts that had submitted unacceptable CIRPs.66 Of these six districts, three had not yet corrected the deficiencies that were the primary reasons their CIRPs had been deemed unacceptable. In all three cases, the districts had not revised their CIRPs to provide contingency plans in the event that one of the district's branch offices became unavailable for operations. The OIG conducted telephone interviews with the CMCs in these three districts. The CMCs acknowledged the need to make the suggested revisions and stated that the changes would be made promptly. After the interviews, one of the three districts forwarded to the OIG a revised CIRP that included the changes prescribed by the expert reviewer.

Budget shortages and rescissions have limited the ability of AUSAs to complete non-prosecutorial functions.

Budget shortages and rescissions over the past 4 years (FY 2003 through FY 2006) have reduced the funding available to the USAOs. According to EOUSA, this has reduced the number of AUSAs, while the USAOs' workload has continued to increase. Consequently, according to CMCs, the ability of AUSAs to complete non-prosecutorial functions, such as CMC duties, has been restricted. Because of the collateral nature of the CMC position, reducing the amount of time CMCs dedicate to CIRP-related activities can have a significant negative impact on a USAO's ability to prepare for critical incidents. Further, according to CMCs, AUSAs are evaluated on the number of prosecutions and not on CMC activities. Thus, AUSAs have less incentive to focus on CIRP-related activities.

Turnover among AUSAs serving as CMCs adversely affects USAOs’ critical incident response preparedness.

The turnover for CMCs since the October 2005 CMC conference has been much higher than that for AUSAs as a whole. Annualized, the CMCs’ rate of turnover was 23 percent – nearly four times that for AUSAs in 2005. Coupled with the lack of immediate access to training, high turnover disrupts the continuity of CMC activities.67 The turnover rate also emphasizes the need for USAOs to keep EOUSA and the CTS aware of changes in the USAO’s CMC position.
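The annualized figure can be reconstructed from the appointment counts cited earlier; the following is a minimal sketch, assuming the 16 new CMCs appointed during the 9 months between the October 2005 conference and July 2006 are measured against the 93 CMC positions:

\[
\text{annualized CMC turnover} \approx \frac{16}{93} \times \frac{12}{9} \approx 0.23 = 23\%
\]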



Footnotes
  39. The panel consisted of five attorneys, four of whom were CMCs. The CMCs were from Oklahoma, Northern District; Utah; Virginia, Eastern District; and Wisconsin, Eastern District. The fifth attorney, detailed to the CTS, served as the key drafter of the model plan.

  40. According to CTS officials, only the four CMCs on the panel reviewed CIRPs.

  41. Four USAOs reported they never received the comments from an expert panelist. The OIG did not find these four evaluations in the EOUSA files; thus, there were only 89 evaluations. Upon request, the OIG received CIRPs from these four USAOs, and our analysis showed the CIRPs were in compliance with the new model plan.

  42. The additions to the EARS review also addressed the OIG’s recommendation that the Deputy Attorney General ensure that performance measures be developed to assess the readiness of USAOs to respond to critical incidents.

  43. Interview with EARS Director, May 24, 2006.

  44. The one district that had not conducted an exercise prior to activating its CIRP because of a hurricane was the Northern District of Florida. However, Hurricane Ivan came ashore in September 2004, and the Northern District of Florida had completed its revised CIRP only three months earlier.

  45. The 53 districts that conducted at least 2 exercises included the 16 that are in compliance (and the additional 2 that are potentially in compliance) with the annual requirement referenced above.

  46. Exercise data are current through November 2006.

  47. Most USAOs did not conduct their initial CIRP exercise until August 2004. If one uses that as the starting point for the 12-month cycle instead of January 1, 2004, the results of our analysis would change only slightly. Forty-nine districts would have completed their required second exercise by August 2006 (two exercises within the 24-month span), while the remaining 44 would not have.

  48. The 39 USAOs include 12 USAOs that completed their first exercise and 27 USAOs that completed a second exercise in 2005. Three USAOs did not complete their first CIRP exercise until 2006.

  49. The 12 USAOs that planned to complete an exercise prior to the end of the calendar year include the 2 USAOs referenced previously that would come into compliance with the annual requirement plus an additional 10 USAOs that would be completing their second exercise.

  50. One of these 12 USAOs was faced with an actual critical incident in fall 2004; thus, its completion of its first CIRP exercise in March 2005 is understandable.

  51. Critical Incident Response Plan, Southern District of Alabama, Section 4.3.3.

  52. Examples of National Special Security Events include the Olympics, national political conventions, Super Bowls, and presidential inaugurations.

  53. Data based on responses to OIG survey question 18. CMCs may have visited the web site before the additional information was recently added; thus, they responded that they had visited the web site and found it useful but still had not received any after-action reports, lessons learned information, or copies of revised CIRPs.

  54. Based on responses to OIG survey question 56.

  55. Critical Incident Response Plan, Decision Memorandum from Principal Associate Attorney General to the Attorney General, May 23, 1996 (signed May 24, 1996).

  56. The staff used a table to track when each USAO’s revised CIRP was received; which expert panelist completed the review; and whether the district had completed the CIRP exercise and forwarded its after-action report to EOUSA.

  57. The last USAO to complete its first CIRP exercise under its revised CIRP did so on June 7, 2006.

  58. These checklist questions were: (1) Has the USAO designated a Crisis Management Coordinator? (2) Has the District conducted either a tabletop exercise of their Critical Incident Response Plan or exercised their plan in a full field exercise in the District or Region? When? (3) Did the USAO provide an After-action Report of the exercise to EOUSA and the CTS?

  59. EARS Director, September 21, 2006, e-mail.

  60. The CTS filled the National Coordinator position with a First Assistant United States Attorney on temporary detail from a USAO.

  61. The figure of 62 CMCs, compared with the 52 CMCs referenced on page 24, counts only EOUSA and the CTS as sources of the information and does not include information received from other USAOs.

  62. The materials from both conferences were recently uploaded to the CTS intranet.

  63. The lack of specific personnel assigned to the CIRP program at EOUSA raises the issue of who at EOUSA the USAOs would contact.

  64. While the National Coordinator is listed on the web site, instructions on where (or to whom) to send information to EOUSA were not readily apparent.

  65. The model plan does impose a 6-month revision requirement, and revisions are to be forwarded to EOUSA and the CTS.

  66. The seventh CIRP was not reviewed because the CMC had already indicated that the district was planning to make significant changes to it in the immediate future.

  67. The OIG could not research the historical CMC turnover rate because the information was not available through the USAOs, EOUSA, or the CTS.


