The Federal Witness Security Program
Criminal Division (redacted version)
Report No. 02-05
Office of the Inspector General
The OEO has made some improvement in overcoming a long history of problems associated with reporting the results of protected witness testimony. We found that Assistant United States Attorneys (AUSAs) are increasingly providing the OEO with the results of protected witness testimony. Also, the OEO had spent $533,433 as of March 2001 to upgrade its antiquated information tracking system. In our judgment, the upgrade should facilitate the accurate collection and reporting of data. Further, we found that the most recent data compiled by the OEO, totaling more than $411 million in seizures, forfeitures, fines, and restitutions resulting from protected witness testimony, was accurately compiled.
However, problems remained with the data compiled by the OEO. We found that the reported numbers of defendants against whom protected witnesses testified or cooperated, along with the resulting numbers of indictments and convictions, were significantly overstated. This occurred because of flaws in the process used to compile the data and the formula used to calculate the number of indictments. We also noted that the CRM did not comply with the requirements of the GPRA to develop outcome-based performance measures related to the results of protected witness testimony. As a result, the OEO may not be able to demonstrate to Congress and other decision makers accurate statistical benefits resulting from protected witness testimony.
The United States Attorneys' Manual, Part 9-21, Witness Security, dated September 1997, states that the OEO is responsible for collecting and maintaining the results of protected witnesses' testimony. The Manual identifies the specific information that must be collected and maintained and directs prosecutors to provide this information to the OEO as soon as it becomes available.
The OEO uses the information provided by prosecutors to demonstrate, through statistics and anecdotal case information, the Program's contribution to the successful prosecution of cases. In addition, Congress uses the statistical information to set the Program's funding levels and to determine its continued viability.
Evaluation of the Admission Process
The OEO analyzes information from various sources to determine if a witness is suitable for admittance into the Program. The OEO receives the information from the sponsoring AUSA, the investigating agency (e.g., Federal Bureau of Investigation (FBI), Drug Enforcement Administration (DEA), or U.S. Customs Service), the BOP, USMS, litigation sections within the CRM, and the National Crime Information Center (NCIC). The information received from the various sources includes: (1) an application for the witness, (2) a threat assessment, (3) a risk assessment, (4) program alternatives, (5) psychological evaluations, (6) the results of the preliminary interview, (7) polygraph reports, (8) the litigation section's recommendations, and (9) criminal history reports. An explanation of the documents follows:
[Sensitive Information Deleted]
The OEO approves or denies an applicant's admission into the Program. To evaluate the OEO's admission procedures, we judgmentally selected 80 of the 719 applications it received from FY 1997 to FY 1999. We reviewed the required documents listed above, along with the OEO's summary, for each of the 80 applicants included in our sample. We determined which documents were required for each applicant depending upon the type of witness. As stated earlier, two types of witnesses are admitted into the Program - relocated and prisoner witnesses. Additionally, the OEO establishes subcategories to better track and monitor witnesses and family members. Our sample of 80 applicants represented 9 witness subcategories. The table below presents our assessment of whether all requirements were met and whether the OEO's conclusion to authorize or deny Program services for the 80 applicants in our sample was properly supported.
Requirements to Admit Witnesses in the Program
| Witness Subcategory | No. of Applicants | Description | Required Documents |
| Family Member | 5 | A family member of a witness who is relocated separately from the primary witness. | A, TA, RA, PA, CHR, PE, PI, SR |
| Prisoner Release | 16 | A prisoner witness who will be released from prison after serving time and is being considered for relocation. | A, TA, RA, PA, CHR, PE, PI |
| Prisoner Release with Family | 2 | A prisoner witness who is being considered for relocation along with family members. | A, TA, RA, PA, CHR, PE, PI |
| Prisoner Witness | 31 | A witness who will be incarcerated while in the Program. | A, TA, PA, SR, PO |
| Prisoner Witness with Family | 2 | A witness who will be incarcerated while in the Program and whose family members will be relocated. | A, TA, RA, PA, CHR, PE, PI, SR, PO |
| Relocated Witness | 5 | A witness who will be relocated to a new area while in the Program. | A, TA, RA, PA, CHR, PE, PI, SR |
| Reinstatement | 10 | A former witness who is reinstated into the Program. | A, TA (updated) |
| Relocated Prisoner Witness | 4 | A relocated witness who has been reauthorized as a prisoner witness. | A, TA (updated), PO (if witness is placed in PCU) |
| Relocated Witness with Family | 5 | A witness who will be relocated to a new area along with family members. | A, TA, RA, PA, CHR, PE, PI, SR |
Key: A = application; TA = threat assessment; RA = risk assessment; PA = program alternatives; CHR = criminal history report; PE = psychological evaluation; PI = preliminary interview results; SR = litigation section's recommendation; PO = polygraph report.
We found that all 80 applicants from our sample were properly authorized (69 applicants) or denied (11 applicants) Program services. Applicants were denied admission into the Program either because the investigative agency determined that no threat existed against the witnesses or the witnesses continually failed to abide by the Program guidelines. In our judgment, the OEO authorized witnesses into the Program in accordance with the Act.
Evaluation of the Termination Process
The Department requires each protected witness to sign an agreement. Prisoner witnesses must sign an OEO Witness Security Program Prisoner Witness Agreement. The agreement addresses the program guidelines while the witness is incarcerated, such as the effects of prison disciplinary violations, as well as program relocation services after release from custody. Relocated witnesses must sign a USMS Memorandum of Understanding (MOU). The MOU addresses program guidelines while the witness is relocated, such as security assistance, new identity assistance, social security agreement, relocation assistance, household goods movement, employment, medical assistance, mail assistance, protection of prisoner witnesses' families, and probation and parole. The Act requires that protected witnesses and their family members be terminated from the Program if they "substantially" breach program guidelines or provide false information related to the case for which the witness was provided protection. Both agreements state that witnesses can be terminated from the Program if they:
To determine if witnesses were properly terminated from the Program, we reviewed all witness termination records from FY 1997 to FY 1999, the most recent information available at the time of the audit fieldwork. During these three years, 315 of 16,000 witnesses were terminated from the Program. We reviewed the reasons why the OEO and USMS terminated the 315 witnesses from the Program. The following table summarizes the reasons the witnesses were terminated.
Reasons for Terminating Witnesses from the Program
| Basis for Termination | Number of Witnesses |
| Died (i.e., natural causes or suicide) | 5 |
| Did not comply with guidelines (e.g., failed to attend alcohol counseling) | 71 |
| Returned to the Danger Area | 25 |
| Breached Security (e.g., refused relocation) | 83 |
| Whereabouts were Unknown | 8 |
| Other (e.g., released to FBI custody) | 2 |
Source: The Office of Enforcement Operations
We found that all 315 witnesses were terminated from the Program in accordance with the agreements and the Act. Additionally, we found that witnesses were made aware of the reasons for the terminations. In our judgment, based on the information we reviewed, witnesses were properly terminated from the Program.
Historical Weaknesses in Collecting and Reporting Data
A September 1993 Office of the Inspector General (OIG) audit report noted that the OEO did not have complete indictment and conviction data for cases in which witnesses had testified. The OEO officials attributed this condition primarily to a lack of staff, inadequate follow-up practices by OEO analysts, and poor feedback from prosecutors. Consequently, OEO managers were unable to evaluate the success of the Program.
Despite efforts by the CRM to improve data collection during the early 1990's, problems persisted. In a June 1996 Senate Judiciary Committee hearing on the effectiveness of the Program, a former CRM official testified that very few statistics were available on the Program because it was difficult to obtain the required information from prosecutors. The official assured the Committee, however, that the CRM would "come up with some meaningful statistics demonstrating what the contribution of these protected witnesses have been to law enforcement generally, and in particular with respect to violent crime."
In early 1997, the OEO created the Support Services Unit (Unit), which was responsible for collecting, analyzing, and reporting on the results of protected witness testimony. The Unit's duties included:
The Unit collected indictment and conviction data for witnesses authorized in FY 1996 and FY 1997. In May 1997, OEO managers informed the head of the CRM that the United States Attorneys were not providing the Unit with the information necessary to compile effective indictment and conviction statistics. In March 1998, OEO managers instructed the Unit to stop collecting indictment and conviction data because of ongoing problems within the Unit and the poor quality of data being collected and reported. Eventually, the Unit was disbanded.
Accuracy of Data Compiled by the OEO
In April 1999, the OEO's Intelligence and Investigative Section was assigned responsibility for collecting, analyzing, and maintaining the results of protected witness testimony. At the time of our audit, the Section had compiled data received from the sponsoring AUSAs for 106 of the 128 witnesses admitted into the Program during FY 1998 and was still awaiting information from the AUSAs for the remaining 22 witnesses. [Sensitive Information Deleted] We reviewed the FY 1998 data because it was the most current information available. The table below summarizes the results of protected witness testimony for the data categories compiled by the OEO.
FY 1998 Results of Witness Testimony
| Data Category | Total Number |
Source: The Office of Enforcement Operations
We reviewed the results of protected witness testimony for the data categories identified in the above table. We found that the OEO accurately compiled data pertaining to seizures, forfeitures, fines, and restitutions resulting from testimony provided by witnesses admitted into the Program; however, the reported number of defendants, indictments, and convictions was significantly overstated, as detailed below. This occurred because of flaws in: (1) the process used by the OEO to compile data pertaining to the number of defendants, indictments, and convictions; and (2) the formula used by the OEO to calculate the number of indictments.
Procedures Used to Compile Data
Our review of the FY 1998 data indicated that in compiling the data the OEO overstated the number of defendants by 296 (18 percent), the number of indictments by 251 (15 percent), and the number of convictions by 228 (25 percent). This occurred because the OEO double-counted the number of defendants, indictments, and convictions in those instances when more than one witness provided testimony against the defendants in the same cases.
For example, in one instance five witnesses provided testimony against 39 defendants, resulting in 37 indictments and 35 convictions. However, the data compiled by the OEO showed 5 witnesses providing testimony against 195 defendants (5 witnesses multiplied by 39 defendants), resulting in 185 indictments (5 witnesses multiplied by 37 indictments) and 175 convictions (5 witnesses multiplied by 35 convictions).
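The arithmetic of this double-counting can be sketched as follows. This is a hypothetical illustration only, using the per-case figures from the example above; the variable names and data structure are ours, not the OEO's:

```python
# One case in which five protected witnesses testified.
# The case itself produced 39 defendants, 37 indictments, 35 convictions.
case = {"defendants": 39, "indictments": 37, "convictions": 35}
witnesses_in_case = 5

# Flawed compilation: the full case totals are counted once per witness,
# so the same defendants, indictments, and convictions are summed five times.
flawed = {k: v * witnesses_in_case for k, v in case.items()}
print(flawed)   # {'defendants': 195, 'indictments': 185, 'convictions': 175}

# Accurate compilation: each case's outcomes are counted once,
# regardless of how many protected witnesses testified in it.
accurate = dict(case)
print(accurate)  # {'defendants': 39, 'indictments': 37, 'convictions': 35}
```

Summing per-witness totals across all 106 witnesses in this manner is what produced the overstatements described above.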
An OEO manager said the OEO compiled the data in this manner because it was unable to determine which protected witness's testimony was more significant. Therefore, for each protected witness, it reported the total number of defendants the witness testified against, as well as the resulting total number of indictments and convictions for each defendant. Consequently, when the OEO summarized this data for all 106 witnesses, the total numbers of defendants, indictments, and convictions were significantly overstated.
We acknowledge that the OEO may not always be able to determine which protected witness's testimony is more significant. Nevertheless, the procedures used to compile the data should always result in the accurate reporting of the results of protected witness testimony. In March 2001, the OEO upgraded its information-tracking system, which should facilitate the OEO's ability to accurately collect and report data. This is discussed in more detail on page 17 of this report. However, even with the new system, the OEO will continue to report distorted data unless it changes its compilation procedures.
Formula Used to Compute Indictments
The OEO developed and used an internal report to track the results of protected witness testimony for the 128 witnesses admitted into the Program during FY 1998. The report summarized the results of protected witness testimony and was based on witness profile sheets containing data received from sponsoring AUSAs. The profile sheets included information such as the witness's name, type of crime, defendant's name, and disposition. The OEO used the internal report as an interim measure while its information system was being upgraded. The system was upgraded in March 2001, and the OEO intends to upload data from the internal tracking report into the new system.
The OEO summarized the information recorded on the individual profile sheets in the internal report. We reconciled the data contained in the profile sheets with the internal report and found that the number of indictments was overstated by 495 (31 percent) for 68 of the witnesses on the internal report. Rather than recording the indictment data directly from the profile sheets, the OEO used a flawed formula to calculate the number of indictments for the 128 witnesses within the internal report. For each witness, the formula subtracted dismissals from defendants to arrive at the number of indictments. The formula was flawed in that it assumed that individuals who were not dismissed were automatically indicted, when that was not the case. The formula would calculate an indictment for an individual who was either awaiting trial or for whom no information was provided by the AUSAs. For the remaining 60 witnesses, the number of indictments calculated by the formula was supported by the individual profile sheets.
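The flaw in the formula can be sketched with hypothetical profile-sheet data (the disposition values below are illustrative assumptions, not the OEO's actual records). Subtracting dismissals from defendants treats every non-dismissed individual as indicted, including those awaiting trial or with no reported disposition:

```python
# Hypothetical dispositions from one witness's profile sheets.
dispositions = ["indicted", "indicted", "dismissed",
                "awaiting trial", "no information"]

defendants = len(dispositions)                # 5 defendants
dismissals = dispositions.count("dismissed")  # 1 dismissal

# Flawed formula used in the internal report:
# anyone not dismissed is assumed to have been indicted.
flawed_indictments = defendants - dismissals
print(flawed_indictments)  # 4 -- wrongly counts the awaiting-trial and
                           # no-information individuals as indicted

# Accurate count, recorded directly from the profile-sheet dispositions.
actual_indictments = dispositions.count("indicted")
print(actual_indictments)  # 2
```

Recording indictments directly from the dispositions on the profile sheets, rather than deriving them by subtraction, avoids this overstatement.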
As stated earlier, the OEO used the internal report as an interim measure to track the results of protected witness testimony. Therefore, the OEO needs to ensure that the report is accurate because the data will eventually be transferred to the OEO's upgraded information system and used in determining Program effectiveness.
Cooperation of the AUSAs in Providing Data
Upon authorizing a witness into the Program, the OEO sends a memorandum to the sponsoring AUSA detailing the guidelines for the authorization. The guidelines can require that:
The memorandum also informs the sponsoring AUSAs of their obligation to provide the OEO with the results of protected witness testimony as soon as it becomes available. Historically, the AUSAs did not always provide results of witness testimony to the OEO in a timely manner. The OIG initially reported this condition in a 1993 audit report. According to OEO management, the problem persisted and in May 1997, the OEO informed the head of the CRM that the USAOs were not providing the OEO with the information necessary to compile effective indictment and conviction statistics. The OEO managers said this occurred because the AUSAs assigned to protected witness cases either left the Department or were otherwise unavailable. The OEO requested that the head of the CRM send a memorandum to all USAOs to help focus their attention on the problem and to remind them of the importance of the information to the Department and to Congress. In response, the head of the CRM sent a memorandum stressing the importance of the outcome data and the AUSAs' obligation to provide the data to the OEO as soon as it becomes available.
In April 1999, the OEO requested the results of protected witness testimony from sponsoring AUSAs for the 128 witnesses authorized in FY 1998. As of May 2001, the AUSAs had provided results data to the OEO for 106 witnesses (83 percent). The data was provided in various formats including completed individual profile sheets, reports generated from the USAOs' case tracking systems, and memoranda listing the data. In our judgment, since our 1993 audit more AUSAs are providing the OEO with the results of protected witness testimony. However, to ensure responses are received from all sponsoring AUSAs as soon as the data becomes available, the OEO should implement a policy of sending periodic follow-up letters to the sponsoring AUSAs.
Upgrade to the OEO Information Tracking System
During this audit, the OEO was using an antiquated information system to manage the Program. According to the OEO, the system consisted of minimally interactive database files and was never designed to handle the OEO's voluminous tracking requirements because of the following conditions:
These limitations hampered the OEO analysts' efforts to efficiently and accurately capture, analyze, and report the results of protected witness testimony as well as track the daily flow of work.
[Sensitive Information Deleted]
Compliance with the Government Performance and Results Act (GPRA)
The GPRA requires federal agencies to develop Annual Performance Plans that include measurable performance goals defining what an agency plans to accomplish during each fiscal year. The Annual Performance Plan should include results-oriented output and outcome performance goals. For most performance goals, performance indicators should be developed. In some instances, the performance goal may be self-measuring and no separate indicators are needed. According to the GPRA report of the Senate Committee on Government Affairs:
In areas where meaningful objective measurement is difficult, an alternative form of measurement may be authorized by OMB. Preferably, the alternative will be in the form of two, somewhat subjective performance definitions: one of a minimally or marginally effective program, and one of a fully successful program. Recognizing that in some cases an agency may be unable to define a goal using these two descriptive statements, OMB may in those instances authorize the agency to define and use another alternative form. All forms must be in terms that would permit an independent determination of whether the program's eventual performance corresponded to the performance statement.
In compliance with the GPRA, the Department requested that each component prepare a performance plan for FY 2000 that would accompany the Department's FY 2000 Summary Performance Plan. The CRM prepared a component performance plan for FY 2000 that was incorporated within its budget submission. The plan included performance output indicators showing the number of applications received for the Program and the number of witnesses authorized in FY 2000.
In February 1999, the Department issued a policy letter stating that numerical targets would not be established for performance indicators relating to law enforcement activities such as arrests, indictments, convictions, and seizures. The Department created this policy out of concern that such targets could be seen as "bounty-hunting." However, the policy does allow components to collect and report on these law enforcement activities on a prior-year basis. For example, in the Department's FY 2000 Performance Report, which compares actual performance to annual goals, the Department reported retrospective data for these law enforcement activities and did not include targets for levels of performance. However, the CRM performance plan did not include retrospective data, nor did it include an alternative form of measurement for the outcome of cases in which protected witnesses had testified.
Measuring performance for law enforcement is challenging and complex. In addition, numerical counts of law enforcement activities such as arrests, indictments, convictions, and seizures may not be the best measure of the Program's performance, but they are the best measure available at this time. The CRM's FY 2000 performance plan contained performance indicators only for the number of applications received and the number of witnesses authorized in the Program. The Department reported that it is continually working on improving performance measures for law enforcement to ensure that the measures are realistic and meaningful. We urge the CRM to continue to explore ways to measure the performance of the Witness Security Program and to consider the utility of reporting, on a historical basis, statistics that depict the Program's achievements relative to prosecutions and law enforcement objectives.
We recommend that the Assistant Attorney General, Criminal Division: