The United States Marshals Service Judicial Security Process

Evaluation and Inspections Report I-2007-010
September 2007
Office of the Inspector General

Results of the Review

Assessing Reported Threats

In our March 2004 report, the OIG determined that USMS headquarters did not meet its timeliness standard of assessing threats within 24 hours after receipt from the districts for 73 percent of the threat assessments it conducted in FYs 2000 through 2003. To ensure the most serious threats were assessed on a timely basis, after 2003 the USMS began designating reported threats as either “expedite” or “standard” and, in August 2004, the USMS established longer timeliness standards of 3 business days for expedite cases and 7 business days for standard cases. We also reported that the threat assessments were of questionable validity.

In this section, we examine the USMS’s effort to improve its performance under its revised timeliness standards in FY 2005 and FY 2006; an FY 2007 effort by the USMS to monitor timeliness and resolve a large backlog of cases that accrued in FY 2005 and FY 2006; and the USMS’s plans to revise the threat assessment process in FY 2008 and improve the quality of its assessments.

Timeliness of USMS assessments of reported threats decreased during FY 2005 and FY 2006.

In FY 2005 and FY 2006, the USMS failed to improve the timeliness of its threat assessments. We reviewed a sample of 568 of the 2,018 threats reported to USMS headquarters in FY 2005 and FY 2006. We found that although the USMS extended its timeliness standard from 24 hours for all cases to 3 days for expedited cases or 7 days for standard cases, the OPI failed to meet those standards in about two-thirds of all cases in our sample.25 Moreover, the USMS did not complete threat assessments on more than half of all threats reported in FY 2005 and FY 2006, which led to a backlog of 1,190 “pending” assessments as of October 1, 2006.

For each year, we examined how many of the threat assessments met the timeliness standard for the applicable category. We also examined the average time it took the USMS to assess reported threats. We found that the OPI took longer to process both expedited and standard cases in FY 2006 than it did in FY 2005.

Chart 2 illustrates the OPI's performance in assessing the reported threats in our sample in FY 2005, FY 2006, and the first half of FY 2007.

Chart 2 data (number of sampled cases):

FY 2005: 141 completed timely, 79 completed but not timely, 66 not completed.
FY 2006: 56 completed timely, 26 completed but not timely, 200 not completed.
FY 2007 (first half): 217 completed timely, 15 completed but not timely, 0 not completed.

Source: OIG

FY 2005. We selected a random sample of 286 threats reported to the OPI in FY 2005 and found that the USMS had completed assessments for 220 of the cases (77 percent). As of November 3, 2006, when the USMS provided data on its operations to the OIG, the USMS had still not conducted threat assessments on 66 of the cases (23 percent), which remained in a “pending” status at the time of our analysis.26

We next analyzed whether assessments were completed within applicable timeliness standards based on the case category. Of the 286 cases, 14 were categorized as expedited and 195 were categorized as standard. Eleven threats that had been processed and all 66 of the pending cases were not categorized as either expedited or standard. We found that 141 of the threats were assessed within the applicable 3-day or 7-day timeframe.27 Another 79 threat assessments were completed but not within the applicable 3-day or 7-day timeframe.28 The 66 cases that were pending as of November 2006 were at least 13 months old and so failed to meet either timeliness standard.

FY 2006. Our random sample of 282 cases reported to the OPI in FY 2006 found that the USMS had completed assessments for only 82 of the cases (29 percent). The remaining 200 threats (71 percent) were still pending as of November 3, 2006. Of the 82 threats that had been assessed, 9 were categorized as expedited and 73 were categorized as standard. We found that 56 of the threats were assessed within the applicable 3-day or 7-day timeliness standard.29 Another 26 threats had been assessed, but the assessment was completed later than the applicable 3-day or 7-day standard. The 200 cases that were pending as of November 2006 were at least 1 month old and so failed to meet either timeliness standard.

OPI managers attributed the increase in processing times to the growing number of reported threats and the OPI's inability to hire additional qualified analytical staff, specifically Intelligence Research Specialists.

The USMS made efforts to improve processing timeliness in FY 2007.

In early FY 2007, the USMS initiated several actions to improve its ability to monitor threat assessments and to resolve the backlog of 1,190 pending cases. An OPI manager told us that, beginning in FY 2007, the USMS dedicated additional staff, including investigators, to perform threat assessments. According to OPI management, the OPI had two analysts conducting threat assessments in FY 2005 and 2006. A third analyst was hired in late FY 2006. Also, the OPI began to identify and process pending threat assessments that were still needed by the districts and implemented procedures to begin monitoring the processing of threat assessments. Because the actions taken by the USMS were pertinent to addressing the problems we identified in our review of FY 2005 and FY 2006 threat assessments, we expanded the scope of our review to include the first half of FY 2007.

Chart 3 illustrates how many days it took, on average, for the OPI to assess each threat received in FYs 2005, 2006, and the first half of 2007, as reflected by our random samples.

Chart 3: Average Number of Days for OPI to Assess Cases in
FYs 2005, 2006, and First Half of 2007

Chart 3 data (average number of days for the OPI to process cases):

FY 2005: expedite cases 0.5 days; standard cases 14 days.
FY 2006: expedite cases 2 days; standard cases 19 days.
FY 2007 (first half): expedite cases 1 day; standard cases 6 days.

Source: OIG analysis of a random sample of cases provided by the USMS

Processing of threat assessments improved. We found that the actions taken by the USMS enabled it to assess reported threats more quickly in FY 2007. According to USMS data, the districts reported 590 threats during the first two quarters of FY 2007. We randomly selected 232 cases for review and found that the USMS had conducted assessments for all 232 of the cases.30 Further, our analysis showed that 93 percent of the threat assessments were completed within applicable timeliness standards. Of the 232 threats, 3 were categorized as expedited and all were assessed within 3 days. The remaining 229 were categorized as standard, and 214 were assessed within 7 days. For our FY 2007 sample of 232 cases, it took an average of 6 days to conduct a threat assessment.

The USMS eliminated its backlog of pending threat assessments. On October 1, 2006, the OPI identified 1,190 threats that had not been assessed. The OPI contacted the districts that reported the threats to determine the status of the investigations related to each of the 1,190 pending assessments. District personnel reviewed the cases and informed the OPI whether each case had been closed or whether the district still considered the case active and therefore still required a threat analysis from the OPI. The OPI determined that analyses were still required for 538 of the 1,190 cases and, by March 2007, had completed the analyses and disseminated the results to the districts. For the remaining 652 threats, the OPI determined that the districts had already closed their investigations. These cases were then “administratively closed” by the Assistant Director of the JSD.31 This effort was completed in May 2007, at which time the USMS no longer had any pending threat analyses.

The USMS began monitoring threat assessment timeliness and quality in FY 2007. In response to the OIG’s March 2004 finding that 73 percent of threat assessments conducted during FY 2000 through FY 2003 did not meet timeliness standards, the USMS stated:

The USMS will be revising its policy on time frames for the ASU [Analytic Support Unit] to complete assessments. The new policy will establish criteria that categorize requests according to urgency. Once the policy is implemented, adherence to the time frames will be made a factor in the annual performance evaluations of the ASU staff. The USMS estimates that the new policy will be implemented by the end of August 2004. The USMS will also review the workload of the ASU and will request additional resources during the FY 2006 budget process if necessary.32 [emphasis added]

Despite that plan, the USMS did not monitor the timeliness of threat assessments during FY 2005 and FY 2006. The USMS also did not modify WIN/JDIS to enable it to better manage threat assessment processing. As currently configured, WIN/JDIS cannot be used to automatically calculate elapsed time or determine whether the elapsed time meets established standards because it does not contain dedicated data fields for this information.33 Because of WIN/JDIS’s limitations, calculating the time taken to complete an assessment and determining whether the assessment met USMS standards must be done manually.34 Further, data needed to determine timeliness is often missing from WIN/JDIS. For example, in our sample of 568 threats reported in FY 2005 and FY 2006, 274 (48 percent) were not identified as either expedited or standard cases.
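Because WIN/JDIS has no fields for elapsed time or case category, the timeliness determination described above must be reconstructed by hand from each case's report and assessment dates. The sketch below illustrates that manual calculation in Python; the dates, field names, and function names are illustrative assumptions, not part of WIN/JDIS.

```python
from datetime import date, timedelta

# Timeliness standards from the report: 3 business days for "expedite"
# cases, 7 business days for "standard" cases.
STANDARDS = {"expedite": 3, "standard": 7}

def business_days_between(start: date, end: date) -> int:
    """Count business days (Mon-Fri) elapsed from start to end,
    exclusive of the start date."""
    days = 0
    current = start
    while current < end:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 ... Friday=4
            days += 1
    return days

def meets_standard(reported: date, assessed: date, category: str) -> bool:
    """Return True if an assessment met the applicable timeliness
    standard; uncategorized cases (48 percent of the OIG sample)
    cannot be evaluated and are treated as not meeting a standard."""
    limit = STANDARDS.get(category)
    if limit is None:
        return False
    return business_days_between(reported, assessed) <= limit

# Hypothetical example: a standard case reported Monday, October 2, 2006,
# and assessed Wednesday, October 11, 2006, elapses exactly 7 business
# days and just meets the 7-day standard.
print(meets_standard(date(2006, 10, 2), date(2006, 10, 11), "standard"))
```

This is the kind of per-case arithmetic that dedicated WIN/JDIS data fields could automate but that OPI staff instead had to perform manually from the case file.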

In late 2006, OPI management implemented procedures to manually monitor the timeliness and quality of threat assessments. The Inspector or analyst responsible for the analysis now enters data from each case into a spreadsheet for management review. At the conclusion of the research segment of the threat assessment process, the responsible staff member initials a checklist maintained in the case file to document that all steps required to complete the research have been accomplished. OPI managers then review the case file to determine the timeliness of the research, review all information from the district, and direct the investigator or analyst to obtain any additional information from the district deemed necessary for a comprehensive assessment. After OPI management determines that the research is adequate, the OPI transmits the assessment scores to the originating district for use in its protective investigation. OPI managers told us that they believe the improvement in threat assessment timeliness in the first half of FY 2007 is directly attributable to this oversight mechanism.

The USMS plans to revise the threat assessment process in FY 2008.

During our review, USMS managers told us that threat assessments produced under the current process were of limited utility to support protective investigations in the districts because they do not provide sufficient information about the threatener’s behavior. Further, responses to our Judicial Security Inspector survey confirmed that threat assessments infrequently provide information that affects how protective investigations are conducted. Because of the deficiencies in the current process, USMS management told us that they plan to change the threat assessment process in FY 2008 so that it provides better information and continuing support from the OPI to District Threat Investigators for the duration of protective investigations. In the following paragraphs, we discuss the perceptions expressed to the OIG regarding the usefulness of the current threat assessment process and the USMS’s plans for changing the process.

We noted an apparent contradiction between our survey results and actual OPI performance that we believe indicates that Judicial Security Inspectors did not highly value the OPI's threat assessments. When we asked the Judicial Security Inspectors whether they received threat assessments from the OPI in time to assist them in conducting protective investigations, a large majority (80 percent) stated that they did.35 However, as we discussed previously, the OPI failed to complete threat assessments on more than half (1,190 of 2,018) of the threats reported to it during FY 2005 and FY 2006. We believe the dissonance between the Judicial Security Inspectors' stated belief that they received timely threat assessment results and the fact that they did not receive assessments for more than half of the threats they reported to the OPI indicates that Judicial Security Inspectors placed only limited value on threat assessments in their protective investigations.

The OPI is planning to implement a new threat assessment process. In a memorandum dated March 30, 2007, the Assistant Director of the JSD informed the OIG that the USMS will “move away from MOSAIC” and comparative analysis to “concentrate on the behavior of subjects who make threats and inappropriate communications.” USMS managers further explained that the OPI is starting to employ a more collaborative method of working with the districts on protective investigations, threat assessments, and case management. As envisioned by USMS managers, the analytical steps carried out on reported threats will be revised, and the extent and duration of the OPI’s involvement in protective investigations will increase.

Under the revised threat assessment process, districts will report all suspicious activities, inappropriate communications, and threats to the Threat Management Center. Once the initial records checks and recommendations for a protective investigation are provided to the district, the Threat Management Center staff will turn the case coordination over to the Investigations Branch circuit team responsible for protective investigations in its assigned circuits.

When a case is turned over to the Investigations Branch, it will be assigned to a team consisting of an analyst and an Inspector for evaluation. Through the use of the protective investigation case information supplied by the District Threat Investigator and further research and analytical work, the Investigations Branch team will develop a work product to send back to the district for consideration and use in its investigation. As described by OPI managers, the process of gathering information and providing feedback will continue until the OPI and the district determine the case can be closed.36

While each protective investigation is unique, OPI managers told us that they see the new process as an opportunity to further standardize, over time, the protective investigation process in each of the 94 districts. For example, the OPI will request that each District Threat Investigator provide certain core information to answer specific analytical questions so that higher-quality assessments can be produced. Because the OPI will ask the District Threat Investigators to obtain and provide consistent and complete investigative information, officials expect some uniform investigative work will be performed in each case. Further, the continuing dialogue between the OPI and the districts during the course of protective investigations could result in more consistently useful threat assessment products for the District Threat Investigators. The new process is also expected to achieve more consistent reporting of judicial security information. Under the existing threat assessment process, districts simultaneously notify the OPI and create a record in WIN/JDIS only after they determine that an event meets the criteria for an inappropriate communication. Under the new process, the districts will be expected to report any event or issue involving judicial security (including all suspicious activities, inappropriate communications, or threats) to the Threat Management Center as soon as possible.

The new threat assessment process OPI managers described to us could improve the ability of the USMS to assess and respond effectively to threats against the judiciary. However, the OPI has not yet developed formal plans with defined milestones, tasks, and outcomes. OPI managers told us they have decided to eliminate the 3-day and 7-day timeliness standards. Instead, the Threat Management Center staff will provide the results of their analyses to the districts verbally or by fax and then follow up with a written response within 1 business day.

In August 2007, JSD managers told the OIG that they have drafted a new process for the Threat Management Center and an updated protective investigation policy, and that they planned to distribute the drafts to the districts. Because these drafts were not available for our review, we cannot fully evaluate the USMS’s planning and implementation of the new process or assess the potential for the new process to improve the USMS’s ability to respond to threats against the judiciary. The OPI needs to fully define the new process, provide direction to the districts, and provide training to district and headquarters staff involved in the judicial protection mission.37

Identifying Potential Threats

To identify and address the risk posed by individuals or groups who may not make overt threats to the judiciary in advance of an attack, in March 2004 the OIG recommended that the USMS create a centralized capability to collect, analyze, and share intelligence on potential threats. In 2005, separate reviews conducted by an Attorney General Working Group and a USMS committee examined the USMS judicial security mission and also recommended improvements to the USMS’s protective intelligence capabilities. (See the text box on the next page for details.)

Our current review found the USMS is making slow progress in implementing a protective intelligence function to identify potential threats. Three years after the OPI was established, it still lacks the staff needed to gather and analyze information to effectively develop protective intelligence. During the past 3 years, the USMS has made some improvements to its capacity for collecting information, including secure equipment and a new facility for working with classified information. However, the OPI still does not systematically collect and analyze information from its districts; from other federal, state, and local law enforcement agencies; or from the courts to produce protective intelligence about potential threats to the judiciary. Although the Assistant Director of the JSD identified a wide range of capabilities scheduled to be implemented in the Intelligence Branch over the next 2 years, the OPI lacks plans for achieving these capabilities.

Two 2005 Reports Identify the Need to Improve
the USMS’s Protective Intelligence Capability

In 2005, the Department and the USMS conducted separate reviews of the USMS judicial security mission and made recommendations for improving the protective intelligence function to identify potential threats.

Attorney General Judicial Security Working Group. In a June 2005 report, this Working Group called for the USMS to develop “a first-rate system of intelligence gathering and threat assessment which the Marshals Service currently lacks” and recommended improvements in information sharing with the judiciary and other law enforcement agencies. On September 15, 2005, the Attorney General informed the Chairman of the Executive Committee of the Judicial Conference of the United States that he had directed that the USMS implement the Working Group’s key recommendations. Specifically, he stated that he directed the USMS to increase the OPI’s staff and resources “to enhance the USMS’s ability to collect, analyze and store and retrieve intelligence and information, and to share that intelligence and information promptly and effectively within the USMS and with our Federal, state and local partners.”

USMS Judicial Threat and Analytical Assessment Commission. In the fall of 2005, the Acting Director of the USMS established the Commission and directed it to:

In December 2005, the Commission made 22 recommendations, including that the USMS provide additional staffing for the OPI; that the OPI work closely with federal, state, and local law enforcement, including full-time USMS representation with Top Secret clearances on JTTFs; and that the USMS provide the OPI with the equipment necessary to receive and transmit classified information expeditiously.

Developing a protective intelligence function is essential to meeting the security risks identified by judges and a Department study. To determine how federal judges viewed the risks associated with potential threats, we surveyed them about the types of threats that pose the greatest risk. In response, 527 of the 696 respondents (76 percent) reported that the unknown general danger associated with being a federal judge posed the greatest risk. In contrast, only 134 of the 696 (19 percent) reported that the known threat posed the greatest risk. The results of our survey are consistent with a 5-year study by the Department of Justice and the U.S. Secret Service of 83 individuals who attacked or approached to attack a prominent public figure. This study documented that less than 10 percent had communicated a direct threat to their targets or a law enforcement agency. In the following sections, we discuss the efforts of the USMS to develop its capability to identify potential threats to the judiciary.

Three years after the USMS established the Office of Protective Intelligence, it is still not fully staffed.

In response to an OIG recommendation, on May 14, 2004, the USMS reported that it would establish the OPI on June 1, 2004, in the JSD. The OPI was directed to collect, analyze, and disseminate all intelligence relating to the safety of USMS protectees, employees, facilities, and missions. The OPI staff consisted of a Chief, three Criminal Investigators, and one Intelligence Analyst. In addition, the USMS stated that “a number of analysts from the Analytical Support Unit” would be reassigned to the office shortly thereafter. The USMS reported that the OPI’s priorities were to (1) immediately develop a plan to transfer all threat analysis responsibilities from the Analytical Support Unit to the OPI, (2) prepare and propose an organizational and staffing plan, and (3) assist in preparing an FY 2006 budget submission that supported the creation and continuity of the OPI.

On April 26, 2005, the USMS stated to Congress that it had established the OPI “to analyze and disseminate protective intelligence,” but added the caveat that “the availability of resources will determine the rate of progress with regard to staffing the office.”38

On May 13, 2005, the OIG met with the Chief of the OPI to discuss the staffing and implementation of the office. We found that the assigned staffing level remained at the five positions that were transferred to create the office in June 2004. In July 2005, the USMS transferred responsibility for assessing reported threats from the Analytical Support Unit in the Investigative Services Division to the OPI. From May 2005 through July 2007, the USMS increased the OPI's staffing to 21, with 2 applicants under consideration. The additional resources were primarily assigned to the OPI's Investigations Branch, where they were directed at assessing reported threats, including the large backlog of pending assessments that accumulated in FY 2005 and FY 2006. The OPI only recently began to dedicate staff to the collection, analysis, and dissemination of intelligence related to potential threats. According to JSD managers, when the Threat Management Center becomes operational and can receive classified information, more Intelligence Research Specialists will be assigned to the Intelligence Branch. The OPI will assign dual responsibilities to other Intelligence Research Specialists to monitor classified intelligence and work protective investigations.

Within the OPI, the Intelligence Branch is responsible for collecting and analyzing information to develop protective intelligence on potential threats. As of July 2007, it was staffed by a branch chief, a JTTF program coordinator, and four Inspectors who serve as liaisons to other federal law enforcement agencies. However, no Intelligence Research Specialists were assigned to the Intelligence Branch.39 Since the establishment of the OPI in June 2004, the USMS has increased from three to five the number of Inspectors assigned as full-time liaisons to other federal law enforcement agencies to collect information on potential threats to the judiciary.40 In March 2007, the Assistant Director of the JSD told the OIG that the USMS plans to increase the number of liaisons assigned to other agencies further by FY 2009 if the JSD receives additional resources. JSD managers are developing reporting requirements for the liaisons, but they told us that the requirements will not be formalized or distributed until after the Threat Management Center is operational.

In addition, the USMS has not assigned full-time representatives to all JTTFs to improve access to information and intelligence related to judicial security. Assigning full-time representatives to all 56 FBI field office JTTFs was recommended in our March 2004 report and, in 2005, by both the Attorney General's Working Group and the USMS's Judicial Threat and Analytical Assessment Commission.41 The FBI has since increased the number of JTTFs to 101. Yet the USMS actually reduced the number of full-time JTTF representatives from 25 to 17 after the issuance of our March 2004 report and reduced the number of part-time JTTF representatives from 25 to 23. During this period, USMS districts also began assigning liaisons to JTTFs. Unlike full- or part-time representatives, these liaisons do not work on a JTTF and do not have direct access to FBI databases. As of July 2007, USMS districts had assigned 39 liaisons to JTTFs. The USMS JTTF program coordinator in the OPI's Intelligence Branch monitors the program and receives and disseminates information, but has no operational authority over the representatives and liaisons the districts have assigned to the JTTFs. These representatives and liaisons report to the district management that assigned them.

Judicial Security Information From JTTFs

The USMS improved its capacity for working with classified information.

To operate an effective protective intelligence function, USMS staff must have appropriate security clearances and the equipment and facilities required to store and work with classified information.42 We determined that since our 2004 report, the USMS has increased the number of staff with Top Secret clearances:

The USMS also improved the facilities and equipment it has to work with classified information. In August 2003, 51 of the USMS's 94 districts had secure telephones for communicating classified information. By April 2005, the USMS reported to the OIG that all 94 districts had secure telephones. Also, as of July 2007, the USMS was nearing completion of the construction and accreditation of a Threat Management Center housed in a sensitive compartmented information facility. The Threat Management Center will provide the OPI with the capacity to electronically receive, access, analyze, and disseminate Top Secret information related to judicial threats with other agencies.

The OPI has not developed the capability to systematically collect and analyze information to identify potential threats to the judiciary.

While the USMS has made some improvements to its capacity for collecting information, we found that the OPI’s Intelligence Branch has not yet implemented a protective intelligence function to systematically collect and analyze information from the districts, from other federal, state, and local law enforcement agencies, or from courts to identify potential threats.43 Specifically, the USMS has not defined the protective intelligence products it needs and has not developed a strategy for obtaining and analyzing information to produce and disseminate protective intelligence products. For example, we found that:

In April 2007, the Assistant Director of the JSD acknowledged to the OIG that the OPI was not focusing on potential threats because it still did not have sufficient staff resources. Although the OPI has not developed a protective intelligence capability to identify potential threats, we found that it does use some external information sources to identify potential risks to the judiciary. For example:

Protective Intelligence: How State and Local Databases
Can Assist in Identifying Potential Threats

State and local law enforcement and court databases are a potential source of information to develop protective intelligence on threats to the federal judiciary. Although the USMS is not systematically collecting and analyzing data from these sources, 52 percent of the USMS district Judicial Security Inspectors we surveyed said they routinely obtain state and local data as part of their protective investigations. We interviewed six Judicial Security Inspectors to obtain additional details on the databases they used and how they used the information in their investigations. The inspectors said that the information in these databases relates to bookings, misdemeanors, incident reports, court cases, and state prison records. The inspectors said they learned about the databases either through personal outreach or as a result of participating in a task force.

The Judicial Security Inspectors reported using the information in the databases in several ways, such as background for interviews with individuals, to identify possible motives for inappropriate communications, to determine if a person had a history of misdemeanors, or to provide leads to other cases that might assist in the current investigation. For example:

Although the above examples involve cases in which threats were made, they demonstrate the types of information in state and local databases that a protective intelligence function could use to identify when individuals known to have threatened state or local officials become federal defendants or litigants.

In addition, in a March 30, 2007, memorandum, the Assistant Director of the JSD identified other initiatives related to improving the protective intelligence function that the USMS plans to accomplish by FY 2010. The memorandum appears in Appendix I. Regarding protective intelligence, the Assistant Director stated that the USMS plans to:

Also, in June 2006 JSD managers told us that the USMS intended to initiate the development of a National Center for Judicial Security (Center) in FY 2007 to serve as a repository for information pertaining to the security of courthouses and the protection of judicial officials. The National Support Division of the Center will be responsible for information sharing initiatives such as the Virginia pilot project discussed above. JSD managers have not identified a target date for completion of the Center.

While the OIG believes that the extensive initiatives would be valuable, we also note that the OPI has not developed formal plans to achieve these goals. Formalizing and fully defining the new process will be required for the OPI to obtain resources to carry out its plans, as well as to provide direction to the districts and provide training to district and headquarters staff involved in the protective intelligence mission.

Implementing Enhanced Security Measures

Home Alarms for Federal Judges

In May 2005, Congress appropriated $11.9 million to the USMS to provide home intrusion detection systems requested by federal judges and to pay for security measures used by the USMS to investigate and counter threats to judges when they are away from a courthouse.47 With these funds, the USMS created the Home Intrusion Alarm Program to improve the residential security of federal judges. In our survey, federal judges' responses about their security at home highlighted the need for a home alarm program. When we asked federal judges how secure they felt in different settings, only 114 of the 696 who responded (16 percent) felt very secure at home (see Appendix II). In this section, we describe our examination of the USMS's implementation of the Home Intrusion Alarm Program, including the USMS's identification and installation of the initial group of alarms pursuant to the legislation; the USMS's management of the program; and the USMS's oversight of alarm monitoring and responses to alarm activations.

Identification and installation of initial alarm systems. On June 14, 2005, the AOUSC sent a survey to federal judges to identify those who wanted an alarm system installed in their homes. As of July 8, 2005, 1,363 judges had responded, of whom 1,176 stated that they wanted home alarm systems installed. Judges could also request inclusion in the home alarm program after the survey. In December 2005, the USMS directed each district to determine the number of judges in its district who wanted home alarm systems. By November 2006, a total of 1,616 judges had requested alarm systems.

Contracting for the alarm systems and program initiation. According to USMS documents, a solicitation for the home alarm contract was issued in November 2005, and in December 2005 the USMS awarded the contract to install the home alarms. However, the original contract did not include monitoring services or maintenance, leading to objections from the judiciary. Members of the Judicial Conference Committee on Judicial Security contended that the supplemental funding should pay for the monitoring and maintenance because both were part of the USMS’s statutory responsibility for judicial security. In late 2005, the USMS agreed and informed the AOUSC and Judicial Conference Committee members that it would pay for central station monitoring.48 The contract was amended to reflect this change on February 9, 2006. For each system installed in a judge’s home, the USMS pays its contractor a monthly fee for monitoring the system. In January 2007, the USMS added a maintenance component to its contract and began paying a monthly fee for maintenance of each system.

In February 2006, the USMS and its contractor conducted pilot installations of alarms in the homes of three judges in the Washington, D.C., metropolitan area, and the contractor finalized its “pre-installation plan” format as a result of the pilot test. Meanwhile, the USMS and the AOUSC drafted policies and procedures to govern the administration of the home alarm program. On March 27, 2006, the USMS and the AOUSC issued a joint memorandum that launched the home alarm program nationally.

After the program was initiated, the cost of proposed alarm systems submitted in April 2006 was almost 100 percent higher than had been projected based on the three pilot installations. The higher costs resulted from additional features included in the proposed systems by contractor sales representatives who were not familiar with the scope and parameters of the USMS program. For example, contractor representatives proposed higher-cost systems for some residences, proposed improper uses of contracted system components (such as placing motion sensors in bathrooms), and included other features that the USMS deemed redundant. In April 2006, the USMS and its contractor reviewed and revised 92 pre-installation plans. To control costs, the USMS issued new guidance on the types of equipment authorized for USMS-funded systems, directed Judicial Security Inspectors to take a more active role in the installation process by questioning the components suggested in the pre-installation plans, and directed additional reviews of pre-installation plans.

Also during the first months of installations, an issue arose concerning contract termination fees for judges who were replacing personal systems with USMS-provided systems. Initially, to participate in the USMS program, judges were required to terminate their existing contracts, which sometimes left them responsible for paying early termination penalties. In addition, the contractor believed that under the terms of the December 2005 contract, it was obligated to install new home intrusion systems even if a judge was already one of its customers. These issues were resolved when the contractor agreed to terminate, without penalty, its existing contracts with judges who had its system. However, this arrangement could not extend to contracts the judges had with other security vendors. The contractor was able to use many of the existing home alarm components when installing the new systems.

Alarm installation progress. The USMS and its contractor follow a three-step process for installing alarms. First, the Judicial Security Inspector in the district and a representative from the contractor arrange with the judge to conduct a home inspection to determine the system requirements and select the appropriate alarm features for the judge’s residence. Next, based on the inspection, the contractor and the Judicial Security Inspector develop a proposed system configuration for acceptance by the judge. The judge provides the contractor with emergency contact information, and the Judicial Security Inspector presents the proposal to USMS headquarters for review and approval. Finally, after the system configuration is approved for installation by an official at the home alarm program office at USMS headquarters, the Judicial Security Inspector and the contractor contact the judge to arrange a time to install the system. After the system is installed, the judge is trained on how to use the alarm system.

Between March 2006 and July 2007, 1,531 alarms were installed in judges’ residences. Chart 4 shows the progress of the alarm installations by month. Installations were prioritized within each district, with judges who had no alarm system scheduled first, followed by those who had pre-existing alarm systems.

Chart 4: Number of Home Alarm Installations Completed

Feb 2006: 0; Mar 2006: 3; Apr 2006: 20; May 2006: 177; Jun 2006: 361; Jul 2006: 585; Aug 2006: 745; Sept 2006: 795; Oct 2006: 1,047; Nov 2006: 1,085; Dec 2006: 1,255; Jan 2007: 1,268; Feb 2007: 1,413; Mar 2007: 1,467; Jul 2007: 1,531 (cumulative totals).

Source: USMS

The USMS Residential Program manager told us that he noticed that some judges’ alarm systems had not been installed and sent a message to all districts asking them to query the judges about whether they still wanted the systems. The manager identified several reasons that alarm installations had not been completed. According to the manager, some of the judges had indicated to the USMS that they no longer wanted the home intrusion detection system, and some were no longer federal judges. In other cases, judges who requested systems either did not respond to requests from the USMS to arrange a home inspection to determine the system requirements or, after the system requirements were determined, did not work with the USMS and the contractor to establish a time for installation. According to the manager, several of the judges told the USMS that they had not yet decided whether they wanted the system installed. For these judges, the USMS is holding the requests open.

As of July 2007, the USMS reported that it had 67 outstanding requests for alarm systems. Of the 67 requests, approximately 30 were from judges who remained undecided and had yet to complete the home inspection or installation, while installation was proceeding on the other 37.

Monitoring and response to alarms. The USMS is not directly notified of alarm events and receives limited reports of alarm occurrences at judges’ homes. When an alarm is received, the contractor first calls individuals identified by the homeowner on their Emergency Contact List. If contact cannot be made, the contractor calls local law enforcement for emergency response. Although the contractor provided the USMS with monthly activity reports from March 2006 through early 2007, these reports did not include data on the number of reported alarm events.49

As of July 28, 2007, the USMS was unable to respond to the OIG’s request that it identify the number of alarm events that had occurred at judges’ residences, including the number of alarms that were accidental or did not require an emergency response and the number of instances in which the contractor notified local police to make an emergency response. In response to our request during this review, the USMS told us that it did not have an arrangement with the contractor to be notified of alarm events at the residences of judges covered by the USMS program.

Initially, the USMS was included in the list of designated numbers for several of the judges. When the contractor received an alarm notice and was not able to contact anyone on the Emergency Contact List, the contractor called the USMS Communications Center in Washington, D.C., to report that an alarm had been received. According to the USMS Residential Program manager, this presented a problem because the USMS Communications Center is unable to provide immediate physical responses to alarms in the residences of judges across the country. The manager said that the USMS made a determination that it should not be included on the Emergency Contact List used by the contractor. In the event of an alarm, the contractor was directed to contact law enforcement to ensure prompt emergency response for those residences.

Although the contractor is no longer providing the USMS with reports summarizing alarm events, the USMS has implemented a policy that its districts are to notify local law enforcement that they should be contacted in the event of a “bona fide” alarm event. Specifically, the Residential Program manager told us that he sent a notice to all districts advising them to send letters to each local law enforcement agency that provides coverage of an area in which a federal judge resides. In the letter, the districts were to ask that any local law enforcement agency responding to a call from an alarm at a judge’s residence inform the USMS district of the nature of the alarm after it responded. However, the USMS was not able to provide information on how many letters districts have sent to local law enforcement agencies because it did not require the districts to provide copies to headquarters.

We have several concerns regarding the USMS’s approach for learning of alarm events at judges’ residences. First, even if a local law enforcement agency identifies that an emergency response is being dispatched to a judge’s residence and subsequently notifies the USMS, receiving such “after-the-fact” notifications delays the USMS’s awareness of potential security events at judges’ residences. Second, the current process places responsibility for notifying the USMS on a local police department rather than on the company responsible for monitoring and providing alarm services. Third, notifications from local police departments may not be reliable, given the difficulty in keeping the information on judges’ residences current as they move or retire and given the variation in emergency dispatch systems from jurisdiction to jurisdiction. We agree that the local law enforcement agencies should be immediately notified by the contractor of all unresolved alarms so that they can respond quickly. However, we believe that the contractor should also notify the USMS immediately after notifying the local law enforcement agency.

An opportunity to modify the notification procedures currently exists because the USMS is renegotiating its alarm contract. We believe that the USMS should include as a term of its new contract that the alarm contractor will, after making the required emergency notification to local law enforcement agencies, also notify the USMS, either at the local district office level or at headquarters, that it has referred an alarm at a judge’s residence to a local law enforcement agency.

Judges’ satisfaction with alarm systems. In a survey the OIG conducted in November 2006, 62 percent (281 of 454) of judges who responded stated that they were very satisfied with the home alarm system they received through the USMS, 26 percent (120) were somewhat satisfied, 5 percent (22) were somewhat dissatisfied, and 1 percent (5) were very dissatisfied (see Chart 5 below).

Chart 5: Judges’ Satisfaction With Home Alarms

62% very satisfied, 26% somewhat satisfied, 6% neither satisfied nor dissatisfied, 5% somewhat dissatisfied, 1% very dissatisfied

Source: OIG Judicial Survey

When we asked the judges in our survey to provide narrative comments or suggestions on the home alarm systems, some of the positive comments were:

In contrast, some judges had negative comments about the home alarm program. Many of these respondents stated that they had incurred additional costs for features that were not covered under the USMS contract, that the contractor had not been responsive to service calls, or that they had unanswered questions about the system. Below are some of the other critical comments provided by judges.

The USMS Technical Operations Group

The USMS is enhancing its Technical Operations Group’s (TOG) support of the judicial security mission. The TOG is organizationally located in the Investigative Services Division and is composed of an electronic branch, an air surveillance branch, a tactical support branch, and an analysis and intelligence group. In response to requests from district offices, the TOG uses sophisticated technologies to provide investigative and intelligence support, primarily for the USMS fugitive apprehension mission.50 The districts request judicial security assistance from the TOG through the JSD’s Office of Protective Operations. The judicial security assistance requested by district offices can include providing technical equipment.

In response to concerns of the Judicial Conference of the United States and Congress, in September 2005 the USMS Director convened a Judicial Security Technology Committee (Committee) composed of USMS and AOUSC staff to review the agency’s technology assets and the ability of the JSD to fully respond to the security needs of the judiciary. As part of its review, the Committee also considered whether a single entity within the USMS could support the missions of both the Investigative Services Division and the Judicial Security Division. In January 2006, the Committee reported that JSD headquarters personnel did not provide sufficient support to the districts in accomplishing the judicial security mission because:

The Committee recommended that the JSD transfer its judicial security technology resources to the TOG and that the USMS expand the TOG’s mission to more adequately address the judicial security mission.

USMS efforts to enhance the capability of the TOG are ongoing. To address the Committee’s recommendations, the USMS has provided some additional resources to the TOG and requested additional resources in its FY 2008 budget request. In September 2006, the JSD transferred three personnel to the TOG, including a telecommunications specialist who will manage the Court Security Officer radio program and two criminal investigators.51

In its FY 2008 budget submission, the USMS requested funding for six positions (five Deputy Marshals and one analyst) to assist in the enhanced TOG support of the judicial security mission. The USMS also requested $890,000 for TOG equipment and technology.

The USMS has not implemented policies and procedures to guide requests for TOG support. The Office of Protective Operations has not yet developed a policy for referring district requests to the TOG. We asked the JSD Assistant Director in April 2007 about the USMS procedures for districts to request TOG assistance and the criteria for providing assistance to districts. He responded that not everyone in the field needs to know what the TOG is capable of as long as headquarters knows, because headquarters makes the decisions about approving the requests. Although the USMS has identified that the TOG has limited resources to support judicial security, the JSD has not yet developed criteria for prioritizing and referring district requests to the TOG.

Further, the TOG has drafted, but has not implemented, a policy describing when and how its resources will be deployed. We asked the TOG Deputy Chief in January 2007 about the review process for district requests for TOG assistance because he had said he expected such requests to increase as the TOG’s capabilities become better known. The TOG had already identified the need for written requirements and was drafting a document for ISD and JSD review and comment. In May 2007, the TOG provided the OIG a draft of the document.

From September 2006 through June 2007, the TOG received eight requests for judicial security-related assistance from the districts.52 The TOG fully supported six of the requests and denied two requests because they did not fall within the TOG’s area of responsibility. The following are two examples of TOG support to the USMS’s judicial security mission:

The TOG has provided training on its support capabilities to district personnel. In July and August 2006, and again in July 2007, the USMS highlighted the TOG’s capabilities during a Protective Investigation Training Program at the Federal Law Enforcement Training Center. During the training, USMS district personnel were informed about the TOG’s capability to provide protective intelligence gathering and analysis, information sharing with state and local law enforcement, and tactical support for judicial security missions. However, the training did not make all Judicial Security Inspectors aware that the TOG has increased its support to the judicial security mission. In response to our November 2006 telephone survey of 82 Judicial Security Inspectors, 26 (32 percent) told us that they were not aware of the initiative to expand the use of the TOG for protecting the judiciary.53 JSD managers told the OIG that the JSD plans to hold training seminars for Judicial Security Inspectors in October and November 2007.

The USMS Rapid Deployment Team Program

The JSD recently began creating a Rapid Deployment Team program to respond to significant judicial security incidents around the country, such as an assault on a judge or a disruption of a U.S. courthouse’s operation. A Rapid Deployment Team would travel to the location to assist the local USMS district in managing the incident. In a July 2006 interview for an AOUSC staff publication, the USMS Director stated that it was his “goal to change how the Marshals Service protects the Judiciary, from being less reactive to more proactive in our approach. We need to be ready, as much as we possibly can, to respond.” The Director said that one key factor in accomplishing this would be to establish rapid deployment teams:

We want to be fast in getting personnel where they need to be. For example, when someone harms or makes a viable threat to harm a judge or his or her family members, we want to put trained teams in that area as fast as possible to do a couple things: to immediately protect the judge or the family members or whoever needs protection, and also to relieve our field offices of managing both the crisis and their regular day-to-day duties. Rapid deployment teams, as we see them to be, will be a group of several deputies or court security inspectors who will, when the “fire alarm” rings, be on the ground quickly. They will be on call for a set period of time – perhaps 30 days at a time. We’ll have a back-up team ready as well, so if there’s a secondary incident or there’s a need for additional people, we’ll have that team available. These teams will be fully trained, equipped, ready to be mobilized. So again, the timeliness of our response is very, very critical.54

In March 2007, the Deputy Assistant Director for Judicial Operations told the OIG that the JSD had directed a working group to draft the operating methodology and plans for the Rapid Deployment Team by the end of May 2007. In April 2007, the Assistant Director of the JSD told the OIG that a senior JSD manager could immediately deploy to assess the need for a Rapid Deployment Team that would be composed of JSD managers and circuit court inspectors from around the country. If the JSD manager determined that a team was necessary, the manager would work with the district to define the expertise needed on the team and select the appropriate USMS staff to serve on it. As of July 2007, the Rapid Deployment Team program was still in development and no deployments had occurred. The Deputy Assistant Director told the OIG that the operating methodology and plans for the Rapid Deployment Teams were not expected to be completed until September 2007.

  25. The USMS does not count weekends and holidays when determining timeliness of the threat assessment.

  26. The USMS uses the term “pending” to describe those cases for which the OPI has completed no comparative analysis or MOSAIC assessments.

  27. The 141 assessments completed within established timeliness standards in FY 2005 included 13 of the 14 expedited cases that were assessed within 3 days (average: 0.5 day) and 122 of the 195 standard cases that were assessed within 7 days (average: 14 days). We also considered six of the uncategorized cases that were assessed in under 7 days to have met the timeliness standard.

  28. This includes 1 threat categorized as expedited, 73 threats categorized as standard, and 5 threats that were not categorized but that were not assessed within 7 days.

  29. The 56 assessments completed within established timeliness standards in FY 2006 included 7 of 9 expedite cases that were assessed within 3 days (average: 2 days), and 49 of 73 standard cases that were assessed within 7 days (average: 19 days).

  30. We originally selected 233 cases and found 1 case for which no assessment was conducted. However, we determined that case to have been a reporting error and excluded it from our sample.

  31. The administrative closure was annotated in the case file in WIN/JDIS and a memorandum from the Assistant Director was placed in the actual case file.

  32. Prior to the establishment of the OPI in June 2004, analysts assigned to the Analytical Support Unit within the Investigation Services Division were responsible for conducting threat investigations at USMS headquarters.

  33. The information to calculate timeliness may be present, but the calculation cannot be automated even when the information is included. For example, the date a threat assessment is forwarded to the OPI is embedded within the case file number and must be extracted. The case category and assessment completion date, if entered, are contained (along with other information) in a text-based remarks field.

  34. The OIG manually calculated the number of elapsed days for each threat to determine the number of days it took to complete each threat assessment.

  35. Only 6 percent stated that they did not receive results in time to be useful to the protective investigation. The remaining 14 percent had no opinion.

  36. The final determination to close a case will remain the district’s responsibility.

  37. Although the new process has not been fully defined, the USMS told us that it has already conducted training on the behavioral aspects of this new approach. In July and August 2006, the USMS conducted 4 separate 1-week Protective Investigations Training courses focusing on behavioral methodologies of investigation for 190 Deputy Marshals at the Federal Law Enforcement Training Center. These seminars were provided by experts from the USMS, the Bureau of Alcohol, Tobacco, Firearms, and Explosives, the FBI, United States Attorneys’ Offices, the U.S. Secret Service, and the Diplomatic Security Service. The USMS conducted 2 additional Protective Investigations Training courses for 96 Deputy Marshals in July 2007. JSD managers stated that they are working with USMS staff assigned to the Training Center to conduct four more courses in FY 2008.

  38. Statement of the United States Marshals Service before the Subcommittee on Crime, Terrorism, and Homeland Security, Committee on the Judiciary, House of Representatives, concerning H.R. 1751, The Secure Access to Justice and Court Protection Act of 2005, April 26, 2005.

  39. OPI managers stated that the Intelligence Research Specialists in the Investigations Branch were available to assist the Intelligence Branch as needed to conduct research and disseminate information.

  40. The three liaisons that existed when the OPI was started were assigned to the Federal Bureau of Prisons’ Sacramento Intelligence Unit, the Central Intelligence Agency, and the FBI’s National Joint Terrorism Task Force. As of September 2006, the USMS had cancelled the liaison position at the Central Intelligence Agency, but maintained full-time liaisons to the Department of Homeland Security and the FBI’s National Joint Terrorism Task Force and the Washington Field Office. The USMS also assigned a Senior Inspector to serve as a liaison to the Supreme Court Police, the U.S. Capitol Police, and the Metropolitan Police Department.

  41. In March 2004, the OIG recommended that the USMS assign full-time representatives to all 56 FBI field office JTTFs. The Attorney General’s Judicial Security Working Group report made a similar recommendation that the Director of the USMS should strive to staff all of the JTTFs. In December 2005, the Judicial Threat and Analytical Assessment Commission report also recommended that the USMS assign full-time representatives to each of the 56 FBI field office JTTFs.

  42. In our March 2004 report, we recommended that the USMS require that all Chief Deputy U.S. Marshals and USMS JTTF representatives have Top Secret clearances and that each district have secure communications equipment.

  43. OPI analysts in the Investigations Branch have generated some information products that are shared with the districts, including information bulletins, alert notices, and foreign travel briefs. The OIG reviewed 14 products provided by the USMS to determine whether they contained analytical information such as why the information provided was relevant to the judicial security mission, how the information could or should be used, and how the recipient should respond. We found that out of the 14 products, 9 had an analytical section, and 4 of the analytical sections contained analytical information.

  44. In our survey of 82 Judicial Security Inspectors, 42 (51 percent) stated that, in addition to reporting threats, they sent the OPI other types of judicial security information, such as reports on indictments or arrests, courthouse incidents, information on domestic terrorist groups, and suspicious activities.

  45. The PACER system provides real-time public access to case and docket information from Federal Appellate, District and Bankruptcy courts, including a listing of all litigants and judiciary involved in the case, case related information such as the nature of the suit, and the status of the case.

  46. The NCIC provides police officers and federal agents with criminal history and open warrant information. Although the criteria are not final, the USMS might add 100 to 200 individuals who have threatened the judiciary and who have a violent history, are known to have taken any overt or covert action to carry out an assault or assassination, or have recently purchased a weapon. The USMS would remove the names when the individuals no longer pose a threat.

  47. The Emergency Supplemental Appropriation Act for Defense, the Global War on Terror, and Tsunami Relief of 2005 (P.L. 109-13) provided funds for ongoing military and intelligence operations in Iraq and Afghanistan and other selected international activities, including tsunami relief and reconstruction.

  48. Central station monitoring occurs at the contractor’s facility in Aurora, Colorado.

  49. According to the USMS Residential Program manager, the monthly reports submitted by the contractor contained data on the number of installations completed and issues encountered, but not on alarm events.

  50. The Investigative Services Division oversees the enforcement of court orders, fugitive investigations, execution of federal warrants, and operation and maintenance of WIN/JDIS. It also provides electronic surveillance. In FY 2006, the TOG conducted over 7,100 surveillance operations for over 2,900 fugitive cases, an increase of 6 percent from FY 2005.

  51. The Court Security Officer radio program is funded by the Judicial Branch’s AOUSC.

  52. The TOG began tracking district requests for its assistance on judicial security cases in September 2006.

  53. We noted that 24 of the 26 Judicial Security Inspectors who stated they were unaware of the enhanced TOG capabilities did not attend the training at the Federal Law Enforcement Training Center.

  54. “Interview: A Dialogue with USMS Director John F. Clark,” Third Branch, Vol. 38, Number 7, July 2006.
