Audit of Department of Justice's Key Indicators

Audit Report 08-18
March 2008
Office of the Inspector General


Findings and Recommendations

I.       STRATEGIC GOAL I: PREVENT TERRORISM AND PROMOTE THE NATION’S SECURITY

The September 11 terrorist attacks led to the development of this strategic goal. According to the FY 2003‑2008 Strategic Plan for the U.S. Department of Justice, “The Department of Justice’s approach to protecting the U.S. from terrorism is three‑pronged, focusing on the prevention of terrorist acts; the investigation and prosecution of those who have committed, or intend to commit, terrorist acts in the United States; and combating espionage against the United States by strengthening counterintelligence capabilities.”

We discuss our review of the key indicator related to this strategic goal below.

Terrorist Acts Committed by Foreign Nationals Against U.S. Interests within U.S. Borders – FBI

This key indicator is measured by the FBI and assesses the number of terrorist acts committed by foreign nationals against U.S. interests within U.S. borders. The FBI is responsible for coordinating counterterrorism investigations. For the purposes of this indicator, “The FBI defines a terrorist act as an attack against a single target. A terrorist incident may consist of multiple terrorist acts.”

Data Collection and Storage

We did not identify any issues with the FBI’s data collection and storage processes for this key indicator. The number of terrorist acts committed by foreign nationals against U.S. interests is compiled by the FBI Counterterrorism Division. When a terrorist act is committed, the Counterterrorism Division provides an electronic communication to FBI management. Because these terrorist acts are infrequent occurrences, a database system is not necessary. The electronic communications are maintained as the source documents used to report the number of occurrences in the PAR and are stored on a secure FBI computer with access controls.

Data Validation and Verification

We did not identify any issues with the FBI’s data validation and verification processes for this key indicator. The electronic communications regarding terrorist acts and incidents are reviewed by multiple levels of FBI management.

Key Indicator Data Comparison

Because the last reported terrorist act was committed in FY 2002, we reviewed the electronic communication supporting that occurrence. We also discussed with the FBI the zero terrorist acts reported for FY 2006. Based on our discussion, we did not find any discrepancies with the performance data reported for this key indicator for FY 2006.

Disclosure of Data Limitations

The FY 2006 PAR disclosed the following data limitations, “The decision to count or discount an incident as a terrorist act is subject to change based upon the latest available intelligence information and the opinion of program managers. In addition, acts of terrorism, by their nature, are impossible to reduce to uniform, reliable measures. A single defined act of terrorism could range from a small‑scale explosion that causes property damage to the use of a weapon of mass destruction that causes thousands of deaths and has a profound effect on national morale.” During our audit, we did not identify any data limitations beyond those already disclosed in the FY 2006 PAR for this key indicator.

II.     STRATEGIC GOAL II: ENFORCE FEDERAL LAWS AND REPRESENT THE RIGHTS AND INTERESTS OF THE AMERICAN PEOPLE

Strategic Goal II encompasses several broad issues, including reducing the threat, incidence, and prevalence of violent crime and criminals, including crimes against children; reducing the threat, trafficking, use, and related violence of illegal drugs; combating white collar crime, economic crime, and cyber crime; and targeting threats to the U.S. Constitution and individuals’ civil rights.

We describe below each of the seven key indicators we reviewed related to this strategic goal.

Number of Organized Criminal Enterprises Dismantled - FBI

This key indicator is measured by the FBI and determines the number of organized criminal enterprises dismantled. Criminal enterprise investigations “target the entire entity responsible for the crime problem.” Dismantlement is defined in the PAR as “destroying the targeted organization's leadership, financial base, and supply network such that the organization is incapable of operating and reconstituting itself.” The FBI uses its Integrated Statistical Reporting and Analysis Application (ISRAA) to collect data for this indicator.

Data Collection and Storage

We did not identify any issues with the FBI’s data collection and storage processes for this key indicator. The field agents complete an FBI Accomplishment Report, Form FD-515, to report a statistical accomplishment, which includes dismantlements. The FD‑515 is approved by an FBI supervisor, and administrative personnel enter the information into the ISRAA. According to Section 3 of the FBI’s Manual of Administrative and Operational Procedures, revised July 2007, “The accomplishments described in the FD-515 should be reported and uploaded in the ISRAA within 30 days from the date of occurrence.” The ISRAA has controls that restrict and limit access to specific data in the system. During our audit, we visited the FBI’s Denver Field Office and observed how data is entered into the ISRAA, as well as the system controls that help ensure valid and accurate data.

Data Validation and Verification

We did not identify any issues with the FBI’s data validation and verification processes for this key indicator. After the dismantlement is entered into the ISRAA, weekly queries are run by FBI headquarters personnel to identify new dismantlement statistics. The dismantlements are reviewed and either approved or denied. FBI headquarters’ approval is documented within the ISRAA and a report can be generated to verify that all of the information recorded in the ISRAA is accurate. The information in the ISRAA is also validated by the annual Resource Management Information System Audits that are conducted by FBI field offices. In addition, the FBI Inspection Division conducts field office inspections on a rotating 3-year basis, which review compliance with policies and procedures and the accuracy of case files.

Key Indicator Data Comparison

In order to verify the number of organized criminal enterprises dismantled reported for FY 2006 in the 2008 Budget and Performance Summary, we reviewed the FBI’s ISRAA Report.9 Based on our review, we did not identify any discrepancies with the performance data reported for this key indicator for FY 2006.

Disclosure of Data Limitations

The FY 2006 PAR disclosed the following data limitations, “FBI field personnel are required to enter accomplishment data within 30 days of the accomplishment or a change in the status of an accomplishment, such as those resulting from appeals. Data from this report are compiled less than 30 days after the end of the fiscal year, and thus may not fully represent the accomplishments during the reporting period. FY 2005 data subject to this limitation were revised during FY 2006.” During our audit, we did not identify any additional data limitations for this key indicator.

Number of Child Pornography Websites or Web Hosts Shut Down ‑ FBI

This key indicator is measured by the FBI’s Cyber Crimes Program and assesses the number of child pornography websites and web hosts shut down. In the FY 2006 PAR, the FBI explained that the mission of the Innocent Images National Initiative (IINI), a component of the FBI’s Cyber Crimes Program, is to “identify, investigate, and prosecute sexual predators who use the Internet and other online services to sexually exploit children; identify and rescue child victims; and establish a law enforcement presence on the Internet as a deterrent to subjects who seek to exploit children.” In performing its mission, the IINI issues subpoenas to web hosting companies and Internet service providers to obtain subscriber records regarding the requested website, IP address, screen name, e-mail address, and customer information in association with an ongoing investigation.10

FBI officials explained that the FBI has no direct technical role in shutting down the websites. The FBI serves the subpoenas on the web hosting companies and Internet service providers to obtain subscriber information in order to investigate who created the website and who is responsible for it. The subpoena states that the request is “in support of an ongoing sexual exploitation of children investigation.” However, the service of a subpoena does not require the termination of a website. According to the FBI, as a matter of routine, the web hosting companies and Internet service providers will shut down the website upon receiving the subpoena because they do not want this material on their servers.

We believe this key indicator is not fully accurate. It measures the number of websites shut down, while the FBI captures this data by counting the number of subpoenas served. According to FBI officials, the FBI does not have data to comprehensively count the total number of websites shut down through its interventions with Internet service providers. As previously mentioned, serving a subpoena does not require or necessarily result in the termination of a website. Therefore, this key indicator does not accurately reflect the work and activities of the FBI. We recommend that the FBI revise this key indicator to accurately measure its role and activities.

Data Collection and Storage

As mentioned previously, the FBI’s work does not address this key indicator because its role is to issue subpoenas, which does not necessarily result in a website shutdown. However, in relation to the FBI’s role of issuing subpoenas in investigations of ongoing sexual exploitation of children, we did not identify any issues with the FBI’s data collection or storage processes for this key indicator. Quarterly, FBI personnel e-mail a request to each of the 56 FBI field offices, the IINI Unit, and the off-site IINI Unit. The request is for the following information: (1) number of e‑group take downs, (2) number of peer-to-peer subjects convicted, (3) number of file servers shut down, and (4) number of miscellaneous websites and online organizations that meet the definition of a website or web host. The FBI field offices, the IINI Unit, and the off‑site IINI Unit retrieve this information from their case files and determine the number of subpoenas served for each of these categories.

IINI personnel compile the information from the 58 respondents in a spreadsheet that documents each field office’s and unit’s responses to the data call. The quarterly spreadsheets are used to update a cumulative spreadsheet for the fiscal year, which provides the support for the information reported in the annual PAR. According to IINI personnel, all of the e-mail responses are electronically stored and the hard copies of the spreadsheets are stored in a secure area. Additionally, IINI personnel stated that a single data handler manages the data after it is received from the points of contact, which assists in the safekeeping and consistency of the data.

Data Validation and Verification

Again, the FBI’s work does not address this key indicator because its role is to issue subpoenas, which does not necessarily result in a website shutdown. However, in relation to the FBI’s role of issuing subpoenas in investigations of ongoing sexual exploitation of children, we did not identify any issues with the FBI’s data validation and verification processes for this key indicator. The designated point of contact for each field office provides the quarterly information to IINI personnel. The data is reviewed by IINI personnel, who perform the data handling and are familiar with the offices’ and units’ responses. According to FBI personnel, the data from the responses is copied from the e-mails onto the spreadsheet to avoid keying errors. The IINI requests that case numbers be provided with the responses, which allows IINI personnel to verify that cases are categorized correctly and meet the category’s criteria, and to identify duplicates.

Key Indicator Data Comparison

In order to verify the number of child pornography websites or web hosts shut down that were reported for FY 2006 in the FY 2006 PAR, we reviewed the FBI IINI consolidated spreadsheets, which track the number of subpoenas served. Based on our review and in relation to the FBI’s role of issuing subpoenas in investigations of ongoing sexual exploitation of children, we did not find any discrepancies with the performance data reported for this key indicator for FY 2006.

Disclosure of Data Limitations

The FY 2006 PAR disclosed the following data limitations, “Data for this report are compiled less than 30 days after the end of the fiscal year, and thus may not fully represent the accomplishments during the reporting period. Information based upon reporting of locates and convictions are necessary for compilation of some of these statistics.” As mentioned previously, the FBI’s work does not address this key indicator because its role is to issue subpoenas, which does not necessarily result in the termination of a suspect website. In relation to the FBI’s role of issuing subpoenas in investigations of ongoing sexual exploitation of children, we did not identify any additional data limitations.

Recommendation

We recommend that the FBI:

  1. Revise the key indicator “Number of Child Pornography Websites or Web Hosts Shut Down” to accurately measure the FBI’s role and activities.

Consolidated Priority Organization Target-Linked Drug Trafficking Organizations Disrupted and Dismantled - FBI, DEA, OCDETF

This key indicator determines the number of disruptions and dismantlements of significant drug trafficking organizations that are linked to a Consolidated Priority Organization Target (CPOT). The Drug Enforcement Administration (DEA), FBI, and the Organized Crime Drug Enforcement Task Forces (OCDETF) work collectively to measure this indicator. OCDETF is a multi-agency drug enforcement program operated by the Department with participation by ATF, DEA, FBI, Criminal Division, Tax Division, Immigration and Customs Enforcement, Internal Revenue Service, USMS, U.S. Coast Guard, and the 93 U.S. Attorneys’ offices. The mission of OCDETF is to coordinate federal, state, and local law enforcement agencies in order to identify, disrupt, and dismantle the most serious drug trafficking and money laundering organizations.

OCDETF oversees the development of the Attorney General’s annual CPOT List, which is a multi-agency list of the international “command and control” elements of the most significant drug trafficking and money laundering organizations responsible for the nation’s drug supply. OCDETF investigations can be sponsored by any of the OCDETF member agencies. As of the conclusion of our fieldwork, only the DEA and FBI had reported on this measure.

The goal of OCDETF, in relation to the CPOT List, is to attack organizations with connections to a CPOT target, thus disrupting the drug market with the goal of reducing the nation’s drug supply. In the FY 2006 PAR, disruptions were defined as “impeding the normal and effective operation of the targeted organization, as indicated by changes in the organizational leadership and/or changes in methods of operation....” Dismantlements were defined as “destroying the organization's leadership, financial base, and supply network such that the organization is incapable of operating and/or reconstituting itself.” This key indicator focuses on disrupted and dismantled organizations linked to organizations on the CPOT List.

The OCDETF Program Guidelines set the standards for OCDETF cases and also provide detailed criteria for CPOT‑linked targets. OCDETF cases may be CPOT-linked cases. The primary selection criteria linking an organization to a CPOT include:

Data Collection and Storage

As stated previously, OCDETF, the DEA, and FBI are the only three federal entities currently reporting on this key indicator. During our audit, we did not identify any issues with the data collection and storage processes for this key indicator.

The DEA’s cases are entered into the Priority Target Activity Resource and Reporting System (PTARRS) by DEA Special Agents, who can designate CPOT‑links. If the case meets OCDETF guidelines, it is marked as a potential OCDETF case in PTARRS. The case is also marked as a potential CPOT‑linked case if it meets the criteria. All OCDETF cases must be approved by a group supervisor, Assistant Special Agent in Charge, and Special Agent in Charge before being reviewed by DEA headquarters.

Once the DEA district approves the case, DEA headquarters prints the PTARRS reports showing the new potential CPOT‑linked organizations and the justification for the link. Next, program analysts review the case, highlight relevant information, and provide a print-out to the appropriate staff coordinator. Once the staff coordinator concurs and signs off on the case, the documentation is provided to the DEA Office of Enforcement. After approval by the Office of Enforcement Section Chief, an Office of Enforcement staff coordinator must also review and approve the link. Finally, the head of the Office of Enforcement must approve the CPOT-link.

PTARRS has various security controls and access restrictions. In addition, each DEA division can only access its own division’s cases. During our audit, we visited the DEA’s Denver Field Office and observed how cases are entered and tracked in PTARRS, along with the system controls that help ensure valid and accurate data.

The FBI NET is the central database system where the FBI Automated Case System and the ISRAA are located. FBI field agents complete the FBI Accomplishment Report, Form FD‑515, to report statistical accomplishments, which include disrupted and dismantled CPOT‑linked organizations. The FD‑515 is reviewed and initialed for approval by an FBI supervisor. According to the FBI’s Manual of Administrative and Operational Procedures, revised July 2007, “The accomplishments described in the FD‑515 should be reported and loaded in the ISRAA within 30 days from the date of occurrence.”

The FD‑515 information is entered into the Automated Case System, where it is assigned a serial number. The FD‑515 and serial number are then entered into the ISRAA, where the information is stored. On a weekly basis, a program manager at FBI headquarters performs a query on new disruption and dismantlement statistics, and either approves or denies them. If the information is incomplete or inadequate, the program manager prepares a Denial Memorandum noting the reason for the denial. FBI program analysts also run verification reports from the ISRAA to verify that all of the information is correct. During our audit, we visited the FBI’s Denver Field Office and observed how accomplishments are entered into the ISRAA and observed the system controls that help ensure valid and accurate data. Additionally, we observed the FBI headquarters’ approval process.

After the DEA and FBI approval processes are complete, the potential OCDETF cases are provided to the OCDETF District Coordination Group. This group is composed of representatives from each OCDETF member agency. The OCDETF District Coordination Group is chaired by an OCDETF lead task force attorney under the supervision of the U.S. Attorney. The OCDETF District Coordination Group determines whether an investigation meets the criteria for OCDETF designation. If so, the investigation is forwarded to the OCDETF Regional Coordination Group for final approval. If the OCDETF Regional Coordination Group does not believe the case meets the criteria, the sponsoring law enforcement agency is given the opportunity to provide additional information and supporting documentation.

The OCDETF Regional Coordination Group forwards approved cases to the OCDETF Executive Office. If a case has a potential CPOT‑link, it can be entered into the OCDETF Management Information System (MIS) at the district, regional, or national level. The OCDETF Associate Director reviews all new potential CPOT links to confirm that there is a valid CPOT link. If the link is valid, the Associate Director approves the link and it is entered into the MIS. If the link is deficient, a request is made to the sponsoring agency to provide additional supporting documentation and information.

According to OCDETF personnel, the MIS is a certified and accredited computer system used to track OCDETF cases, including CPOT-links, by OCDETF personnel who have access to the DOJ intranet, including the U.S. Attorneys’ offices. The MIS has various security controls and access levels.

Data Validation and Verification

We did not identify any issues with the data validation and verification processes for this key indicator. The DEA’s primary validation process is the workflow status fields required in PTARRS, which document the approval process from the field office DEA Group Supervisor through DEA headquarters’ review. PTARRS does not allow the approval process to be bypassed. Additionally, all DEA divisions are required to update their cases at least every 90 days to ensure that relevant cases are not omitted. Once a DEA Special Agent updates the case’s progress, the case moves through the approval chain at the division and DEA headquarters. Therefore, the DEA Special Agent, group supervisor, Assistant Special Agent in Charge, and Special Agent in Charge should be reviewing the progress of the case at least every 90 days.

A program manager at FBI headquarters performs weekly data validation and verification of the potential CPOT-links and the disruptions or dismantlements. The information in the ISRAA is validated monthly when quality control reports are run by the field offices to look for duplicate FD‑515 entries in the ISRAA and again annually when the field offices conduct Resource Management Information System Audits. In addition, the FBI Inspection Division conducts field office inspections on a 3‑year rotating basis, which review compliance with policies and procedures and the accuracy of case files. Finally, CPOT-links are validated by the OCDETF Associate Director.

OCDETF personnel are able to view the status of a CPOT-link through the MIS CPOT Link Validation Status Tracking screen, which is used for tracking the validation of CPOT-linked organizations. In addition, the OCDETF Executive Office performs data reviews throughout the year. Copies of the Investigation Initiation Reports, Interim Reports, and Final Reports received by the OCDETF Executive Office are reviewed against the data in the MIS for accuracy. OCDETF personnel are also able to run various reports by region, CPOT‑linked organization, status, and agency through the MIS to verify the information on the reports and in the system.

Key Indicator Data Comparison

In order to verify the number of CPOT-linked drug trafficking organizations disrupted and dismantled in FY 2006 reported in the 2008 Budget and Performance Summary, we reviewed the OCDETF Executive Office spreadsheets, DEA summary spreadsheets, and FBI ISRAA reports.11 Based on our review, we did not find any discrepancies with the performance data reported for this key indicator for FY 2006.

Disclosure of Data Limitations

The FY 2006 PAR disclosed the following data limitations, “Investigations of CPOT-level organizations and related networks are complex and time-consuming, and the impact of disrupting/dismantling such a network may not be immediately apparent. Accordingly, data on this measure may lag behind actual enforcement activity by the investigating agency. It is also possible that a particular CPOT-linked organization may be disrupted in one FY and subsequently dismantled in a later year. For example, a significant number of organizations disrupted during the current FY remain under investigation, as law enforcement seeks to permanently destroy their ability to operate.” During our audit, we did not identify any data limitations beyond those already disclosed in the FY 2006 PAR for this key indicator.

Number of Top‑Ten Internet Fraud Targets Neutralized ‑ FBI

This key indicator determined the number of top-ten Internet fraud targets neutralized.12 However, in the 2008 Budget and Performance Summary the FBI revised the indicator to measure high‑impact Internet fraud target neutralizations and titled it “Number of High-Impact Internet Fraud Targets Neutralized.” The FBI measures this key indicator. According to the FY 2006 PAR, “The FBI and the National White Collar Crime Center [NW3C] partnered in May 2000 to support the Internet Crime Complaint Center (IC3).” The IC3 defines high‑impact Internet fraud targets as targets meeting one of the following criteria:

The FBI and IC3 investigate and coordinate with state, local, federal, and international law enforcement agencies to “neutralize” or ensure that these high-impact targets are unable to perpetuate their frauds any further.

Data Collection and Storage

According to FBI personnel, data collection for this key indicator begins when a complaint is entered into the IC3 website's standard complaint form by a victim of an Internet crime.13 After a complaint is submitted, the information is sent to the IC3 Production Database 1.

NW3C analysts review the complaints, check their validity, and make referrals to state and local member law enforcement agencies if a victim’s or subject’s location is provided. Additionally, the NW3C may make referrals to appropriate agencies when complaints include child pornography or terrorist information. After the NW3C’s validation, complaints are run through the Automatch system, which searches for relationships among complaints stored in the Production Database 1 and cases in Automatch.14 A case is built in Automatch when information in complaints can be linked, and FBI analysts work to establish additional relationships.15 When multiple cases are determined to be linked, the cases can be merged into a single case.

The FBI component of IC3 refers cases to state or local law enforcement agencies, FBI field offices, and federal agencies when cases in Automatch exceed $10,000 in losses, or when numerous complaints, usually 10 or more, are reported. When a referral is made to a state or local law enforcement agency, a letter is sent to the agency along with the case file. When a referral is made to an FBI field office, an electronic communication is provided to the field office along with the case file. The IC3 maintains copies of the electronic communications, the letters, and the case files. Additionally, the FBI component of IC3 will refer presidential threats to the Secret Service.
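To make the two mechanisms above concrete, the following is a minimal sketch of how complaints might be linked into a case and screened against the referral thresholds described in this section. It is illustrative only: the Complaint fields, the single-field matching, and the function names are assumptions, and the report does not describe Automatch’s internal matching logic.

    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class Complaint:
        complaint_id: str
        subject_email: str      # hypothetical identifier; the IC3 also collects IP address, screen name, etc.
        reported_loss: float

    def build_cases(complaints):
        # Group complaints that share an identifier; Automatch's real logic is not
        # described in the report, so a single shared field stands in here.
        groups = defaultdict(list)
        for c in complaints:
            groups[c.subject_email].append(c)
        # Only linked complaints (two or more) form a case; an unmatched complaint
        # never becomes a case, so it cannot generate a referral.
        return [linked for linked in groups.values() if len(linked) >= 2]

    def meets_referral_criteria(case, loss_threshold=10_000.00, complaint_threshold=10):
        # Referral criteria described in the report: losses exceeding $10,000,
        # or numerous complaints (usually 10 or more).
        total_loss = sum(c.reported_loss for c in case)
        return total_loss > loss_threshold or len(case) >= complaint_threshold

    if __name__ == "__main__":
        complaints = [
            Complaint("C1", "seller@example.com", 6500.00),
            Complaint("C2", "seller@example.com", 4200.00),
            Complaint("C3", "other@example.com", 150.00),   # unmatched; no case built
        ]
        for case in build_cases(complaints):
            ids = [c.complaint_id for c in case]
            print(ids, "refer:", meets_referral_criteria(case))

Under this sketch, the two linked complaints totaling $10,700 would be referred, while the single unmatched complaint would not, mirroring the point made below that an isolated false complaint cannot produce a case or a referral.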

Referrals to FBI field offices and state and local law enforcement agencies are tracked in the Internet Crime Complaint Center Report Log. Additionally, cases that have moved into the investigation process are tracked in the Statistic Log. As a proactive approach to identifying neutralization information, the IC3 searches the Internet and liaises with the FBI field offices. The IC3 also depends on FBI field offices and state and local law enforcement agencies to provide follow‑up information on whether a case was initiated and on the outcome of their investigations, and requests that they do so. However, there is no policy requiring FBI field offices or state and local law enforcement agencies to provide this follow‑up information.

FBI personnel assigned to the IC3 maintain a list of the ongoing, high‑impact investigations that is reviewed by the Unit Chief. While conducting fieldwork at the IC3, we observed how a victim can enter a complaint on the IC3 website and how the complaints are tracked through the IC3 databases and logs.

Since there is no requirement for FBI field offices and state and local law enforcement to provide information on neutralizations, the IC3 may not receive complete feedback and cannot assure the accuracy of the neutralization information for this key indicator. FBI personnel explained that the IC3 is a referring entity, not an investigative entity. As noted above, the IC3 refers cases to FBI field offices and state and local law enforcement agencies. However, these offices and agencies have the discretion to decide whether to open a case and pursue an investigation based on these referrals. IC3 requests that the FBI field offices and state and local law enforcement agencies provide follow-up information regarding the referrals. We found that no formal requirements exist regarding use of the referred case, and there is no formal follow-up process for FBI field offices or state and local law enforcement agencies to provide information on neutralizations or whether an investigation was opened. Therefore, the IC3 is unable to provide assurance on the accuracy of the number of neutralizations since it does not receive follow‑up information on all referrals. We recommend that the FBI develop and implement procedures to ensure that complete and accurate information is obtained to report on this key indicator or, in the alternative, revise this key indicator.

Data Validation and Verification

We did not identify any issues with the FBI’s data validation and verification processes for this key indicator. NW3C analysts initially review the complaints to identify high-priority items and eliminate “spam.”16 The complaints are then examined by FBI analysts who build cases in Automatch. Additionally, the processes in place can prevent false complaints from becoming a case or referral. According to FBI personnel, in order for a complaint to become part of a case, information must be linked. A false complaint cannot be matched with another complaint to build a case. Therefore, because a case cannot be built, a referral cannot be made.

As mentioned previously, because the IC3 is a referring entity, it relies on FBI field offices and state and local law enforcement agencies to provide the neutralization information. The information provided by FBI field offices is verified by the IC3 through the FBI’s Automated Case System or Sentinel. The information provided by state and local law enforcement agencies is verified by researching articles on the Internet that contain information similar to that received by the IC3.

Key Indicator Data Comparison

In order to verify the number of high-impact Internet fraud targets neutralized in FY 2006 reported in the 2008 Budget and Performance Summary, we reviewed the FBI list summarizing each high-impact case and neutralization.17 Based on our review, we did not find any discrepancies with the performance data reported for this key indicator for FY 2006. However, our conclusion is qualified to the extent that we could not be assured that all neutralizations were reported to the IC3, based on the previously mentioned information.

Disclosure of Data Limitations

The FBI did not identify any data limitations for this key indicator in the FY 2006 PAR. Additionally, during our audit we did not identify any data limitations for this key indicator.

Recommendation

We recommend that the FBI:

  1. Develop and implement procedures to ensure that complete and accurate information is obtained to report on the key indicator “Number of High-Impact Internet Fraud Targets Neutralized” or, in the alternative, revise this key indicator.

Number of Criminal Enterprises Engaging in White-Collar Crimes Dismantled – FBI

This key indicator is measured by the FBI and determines the number of dismantled criminal enterprises engaging in white-collar crime. The FBI’s White-Collar Crime Program investigates crimes that include “health care fraud, financial institution fraud, government fraud (e.g., housing, defense procurement, and other areas), insurance fraud, securities and commodities fraud, telemarketing fraud, bankruptcy fraud, environmental crimes, and money laundering.” Dismantlement is defined in the FY 2006 PAR as “destroying the organization's leadership, financial base, and supply network such that the organization is incapable of operating and/or reconstituting itself.” The FBI collects and stores the data for this indicator in its ISRAA.

Data Collection and Storage

We did not identify any issues with the FBI’s data collection and storage processes for this key indicator. The field agents complete the FBI Accomplishment Report, Form FD‑515, to report a statistical accomplishment, which includes dismantlements. The completed FD‑515 is approved by an FBI supervisor and the information is then entered and stored in the ISRAA. According to Section 3 of the FBI’s Manual of Administrative and Operational Procedures, revised July 2007, “The accomplishments described in the FD‑515 should be reported and uploaded in the ISRAA within 30 days from the date of occurrence.” During our audit, we visited the FBI’s Denver Field Office and observed how accomplishments are entered into the ISRAA, as well as system controls that help ensure valid and accurate data.

Data Validation and Verification

We did not identify any issues with the FBI’s data validation and verification processes for this key indicator. FBI field office supervisors review the FD-515s, and weekly queries are run by FBI headquarters to identify new white‑collar crime dismantlement statistics, which are reviewed and either approved or denied. Additionally, the ISRAA has controls that limit access to specific data and to the system. Annually, the FBI field offices conduct Resource Management Information System Audits to validate the information in the ISRAA. In addition, the FBI Inspection Division conducts field office inspections on a rotating 3‑year basis to review compliance with policies and procedures and the accuracy of case files. Finally, the white‑collar crime numbers are spot‑checked by FBI headquarters personnel before the information is provided for the PAR.

Key Indicator Data Comparison

In order to verify the number of dismantled criminal enterprises engaging in white‑collar crime in FY 2006 reported in the 2008 Budget and Performance Summary, we reviewed the FBI’s ISRAA Report.18 Based on our review, we did not identify any discrepancies with the performance data reported for this key indicator for FY 2006.

Disclosure of Data Limitations

The FY 2006 PAR disclosed data limitations for this indicator, which state “FBI field personnel are required to enter accomplishment data within 30 days of the accomplishment or a change in the status of an accomplishment, such as those resulting from appeals. Data for this report are compiled less than 30 days after the end of the fiscal year, and thus may not fully represent the accomplishments during the reporting period. FY 2005 data subject to this limitation were revised during FY 2006.” During our audit, we did not identify any data limitations beyond those already disclosed in the FY 2006 PAR for this key indicator.

Percent of Cases Favorably Resolved – EOUSA and the Litigating Divisions

This key indicator assesses the percent of cases favorably resolved, which, as described in the FY 2006 PAR, includes “those cases that resulted in court judgments favorable to the government, as well as settlements.” The indicator is measured by the Executive Office for U.S. Attorneys (EOUSA) and six litigating divisions. According to the FY 2006 PAR, litigating efforts fall into either criminal litigation or civil litigation, and are measured separately under this indicator.

Data Collection and Storage

We found that the methods and procedures for data collection and storage are similar among EOUSA and the litigating divisions. Each component uses a different case management system to track work from the origination of a case or matter, through the component’s defined phases of a case, to the final disposition and closure of a case.19 Data entry personnel are responsible for collecting court documents, such as a summons, complaint, or other forms provided by an attorney, and entering the information into the component’s case management system. To capture case or defendant outcomes, data entry personnel use various forms containing the case or defendant disposition type, such as a guilty plea, conviction, settlement, dismissal, or other favorable or unfavorable result.

The determination of whether an outcome is listed as favorable or unfavorable is at the discretion of each component’s management. An example is the Civil Division’s treatment of Harbor Maintenance Tax cases in which a ruling in the lead case resulted in 9,047 favorable decisions in FY 2006. The Civil Division chose to count this as one favorable decision to more accurately represent the Civil Division’s total percentage of cases favorably resolved.

EOUSA and the litigating divisions, with the exception of the Criminal Division, generate fiscal year favorable and unfavorable statistics through system reports sorted by outcome type or by a unique disposition code. The results from these reports are provided to JMD, which consolidates the cases and calculates DOJ’s total percentage of cases favorably resolved. The Criminal Division does not use its case management system to accumulate this key indicator statistic. Instead, 1 month after the end of each quarter, managers from each of the Criminal Division’s 12 litigating sections provide a listing of that quarter’s outcome data to the division’s Resource, Planning and Evaluation Staff, who aggregate the data into a spreadsheet and provide the results to JMD.
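For context, the department-wide figure described above is a consolidation of the component counts; the sketch below shows the simple arithmetic such a consolidation implies. The function and the example figures are hypothetical, and the report does not specify JMD’s exact calculation.

    def department_percent_favorable(component_counts):
        # component_counts: iterable of (favorable_cases, total_cases) pairs, one per component.
        favorable = sum(f for f, _ in component_counts)
        total = sum(t for _, t in component_counts)
        return 100.0 * favorable / total if total else 0.0

    # Hypothetical example: one component reports 450 of 500 cases favorable and
    # another reports 90 of 100, giving a consolidated 90.0 percent.
    print(round(department_percent_favorable([(450, 500), (90, 100)]), 1))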

We found that EOUSA and the litigating divisions are using two different dates to report on the percentage of cases favorably resolved – the disposition date and the system date. The disposition date is the date that the disposition or decision actually occurred. Alternatively, the system date is the date the disposition is entered into the case management system. Table 2 illustrates the date used by EOUSA and each litigating division.

TABLE 2:   DATE EACH COMPONENT USES TO REPORT ON THE PAR INFORMATION

COMPONENT NAME                               DATE USED FOR REPORTING
Civil Division                               Disposition
Civil Rights Division                        Disposition
Criminal Division                            Disposition
Antitrust Division                           Disposition
Environment and Natural Resources Division   Disposition
Tax Division                                 System
Executive Office for U.S. Attorneys          System
Source:   Management at the Civil Division, Civil Rights Division, Criminal Division, Antitrust Division, Environment and Natural Resources Division, Tax Division, and Executive Office for U.S. Attorneys

During our audit, we attempted to obtain data runs from EOUSA and the litigating divisions using the disposition date and the system date to determine the variance, if any, in the information reported in the FY 2006 PAR. However, we were unable to obtain data runs from all of the components because several of the case management systems did not capture the system date. Therefore, we were unable to determine the reporting variance.

In our opinion, using two different dates for the data runs provides inconsistent results. Prior to release of the FY 2007 PAR, we discussed this issue with EOUSA and the litigating divisions and provided a preliminary recommendation that EOUSA and the litigating divisions implement a common method of generating the number of cases favorably resolved and total number of cases litigated that is provided to JMD, or disclose the difference in approaches in the PAR. Component management agreed and took action to disclose in the FY 2007 PAR the different dates that are used to collect data for this indicator.

As part of our audit, we assessed implementation of the preliminary recommendation in the FY 2007 PAR. We determined that EOUSA and the litigating divisions opted to disclose that “the court’s disposition date is used for reporting purposes for the ATR [Antitrust Division], CIV [Civil Division], CRM [Criminal Division], CRT [Civil Rights Division], and ENRD [Environment and Natural Resource Division], however, EOUSA and TAX [Tax Division] use the date that it is entered into their current case management system.” In our judgment, EOUSA and the litigating divisions sufficiently disclosed the differences in approaches. Therefore, we are not including a formal recommendation on this issue.

Additionally, the data definition section of the PAR states, “The data set includes non-appellate litigation cases closed during the fiscal year.” However, we found that the Civil Rights Division included appellate cases in the information provided to JMD. Table 3 illustrates what should have been reported by the Civil Rights Division to JMD for the PAR, while Table 4 illustrates what was reported to JMD for the PAR.

TABLE 3:   CIVIL RIGHTS DIVISION FY 2006 CASE RESULTS EXCLUDING APPELLATE CASES

CASE TYPE        FAVORABLE CASES   TOTAL CASES   PERCENT FAVORABLY RESOLVED
Civil Cases      99                101           98%
Criminal Cases   59                64            92%
Source:   Civil Rights Division Favorable Case Breakdown Reports

TABLE 4:   CIVIL RIGHTS DIVISION FY 2006 CASE RESULTS REPORTED TO JMD FOR THE PAR

CASE TYPE                    FAVORABLE CASES   TOTAL CASES   PERCENT FAVORABLY RESOLVED
Civil & Appellate Cases      194               206           94%
Criminal & Appellate Cases   74                79            94%
Source:   Justice Management Division and Civil Rights Division Favorable Case Breakdown Reports

As shown in Tables 3 and 4, the percentage of civil cases favorably resolved was understated by 4 percentage points and the percentage of criminal cases favorably resolved was overstated by 2 percentage points. According to Civil Rights Division officials, appellate cases were included because excluding them would not reflect all of the work performed by the division’s attorneys. We recommend that the Civil Rights Division exclude appellate cases from the quarterly and fiscal year information provided to JMD to comply with the statements in the PAR and to avoid over- or understating the percentage of cases favorably resolved.
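For reference, the percentages in Tables 3 and 4 follow from dividing favorable cases by total cases and rounding to the nearest percent, which is where the 4- and 2-percentage-point differences noted above come from:

\[
\frac{99}{101} \approx 98\%, \qquad \frac{194}{206} \approx 94\%, \qquad 98\% - 94\% = 4 \text{ percentage points (civil, understated)}
\]
\[
\frac{59}{64} \approx 92\%, \qquad \frac{74}{79} \approx 94\%, \qquad 94\% - 92\% = 2 \text{ percentage points (criminal, overstated)}
\]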

We did not identify any issues with EOUSA’s or the litigating divisions’ data storage processes for this key indicator. The source documents and data are maintained in accordance with record retention schedules and the Offices, Boards and Divisions Order 2710.6, Recordkeeping for Litigation Case Files. Both establish procedures for the management of litigation records. We were informed that closed case files are maintained at the respective component or local storage site until being transferred to a Federal Records Center for long-term storage.

Data Validation and Verification

We found that some duplicate cases are included in the data for this key indicator. It is common for cases to be jointly litigated or transferred among the litigating divisions and the U.S. Attorneys.20 EOUSA reports all cases for this key indicator regardless of whether a U.S. Attorney’s Office or a litigating division was the lead office on the case. As a result, in some instances the U.S. Attorneys’ Offices and the litigating divisions are including the same cases in the outcome numbers provided to JMD and reported in the PAR. EOUSA and the litigating divisions’ management explained that the duplicate counting is a result of the different systems, which are unable to share data. Additionally, the litigating divisions were concerned that not including the joint cases in the numbers provided to JMD would not reflect all of their work. In response to this concern, we emphasized that this key indicator is intended to reflect data at the department level, not at the component level. Prior to the release of the FY 2007 PAR, we discussed this issue with EOUSA and the litigating divisions and provided a preliminary recommendation that EOUSA and the litigating divisions develop and implement a method to collect and provide data to JMD that ensures jointly litigated cases are not duplicated or, in the alternative, disclose in the PAR that duplicate cases are included in the data. Component management agreed and took action to disclose in the FY 2007 PAR that duplicate cases are included in the data.

As part of our audit, we assessed EOUSA’s and the litigating divisions’ compliance with our preliminary recommendation. We found that EOUSA and the litigating divisions disclosed in the FY 2007 PAR that cases worked by more than one component are duplicated in the totals for this key indicator. Additionally, EOUSA and the litigating divisions explained that this will remain an issue until the litigating case management system is implemented. Because EOUSA and the litigating divisions disclosed that duplicate cases are included in the data in the PAR and this issue will be addressed by the litigating case management system, we are not including a formal recommendation on this issue.

Although we found that duplicate cases were reported, we determined that EOUSA and each of the litigating divisions had procedures and internal controls in place to seek assurance that the data collected within each component’s system is accurate. The most commonly cited validation and verification methods were docket reviews, attorney interviews, exception reports, management certification of key indicator data, and case management internal controls. Docket reviews and attorney interviews consist of section managers or case management specialists discussing the status of the cases with the attorneys, resolving any problems with missing or inconsistent information, and ensuring that accurate and timely information is maintained. Additionally, exception reports are produced when a specific condition or exception occurs. For example, one exception report that we obtained from the Civil Rights Division listed “Cases and Matters with Null Outcomes.” Some of the components also use a management certification process in which section managers certify the accuracy of cases and matters entered into the case management system.

We also observed how cases and matters are entered into EOUSA’s and each of the litigating divisions’ case management systems and observed system controls that help ensure valid and accurate data. These controls include the use of drop‑down list boxes to mitigate manual‑typing errors, mandatory data entry fields, and report generating capabilities.

Key Indicator Data Comparison

In order to verify the percentage of cases favorably resolved reported in the FY 2006 PAR, we reviewed the support from EOUSA and the litigating divisions for FY 2006 criminal and civil cases. Based on our review, we did not find any discrepancies with the performance data reported for this key indicator in FY 2006.

Disclosure of Data Limitations

The FY 2006 PAR disclosed the following data limitations for EOUSA and the litigating divisions: “Data quality suffers from the lack of a single DOJ case management system and a standardized methodology for capturing case related data. Due to the inherent variation in data collection and management among the litigating divisions, cases may refer to cases or individuals. In addition, due to reporting lags, case closures for any given year may be under‑ or over-reported. To remedy these issues, the Department is currently developing a Litigating Case Management System to standardize methodologies between the components and capture and store data in a single database.” During our audit, we did not identify any data limitations for this key indicator beyond those already included in the data limitations section of the FY 2006 PAR and the issues discussed above.

Recommendation

We recommend that the Civil Rights Division:

  1. Exclude appellate cases from the quarterly and fiscal year information provided to JMD to comply with the statements in the PAR and to avoid over- or understating the percentage of cases favorably resolved for the key indicator “Percent of Cases Favorably Resolved.”

Percent of Assets/Funds Returned to Creditors for Chapter 7 and Chapter 13 – USTP

The U.S. Trustee Program’s (USTP) key indicator measures the percent of assets and funds returned to creditors for proceedings under both Chapters 7 and 13 of Title 11 of the U.S. Code, known as the Bankruptcy Code.

The USTP consists of the Executive Office for U.S. Trustees (EOUST) and 21 regional U.S. Trustees.21 The U.S. Trustees are responsible for establishing, maintaining, and supervising panels of private trustees. These private trustees serve as fiduciaries to various parties with an interest in a case.

During our audit, we found that, contrary to what the FY 2006 PAR implied, U.S. Trustees are not present in all 50 states. Specifically, in Alabama and North Carolina, bankruptcy cases are still administered by the courts. We discussed this issue with EOUST officials prior to the release of the FY 2007 PAR, and EOUST concurred. In the FY 2007 PAR, EOUST changed the language to disclose that the two states are not included in the data for this key indicator.

Data Collection and Storage

During our audit we did not identify any issues with the USTP’s Chapter 7 data collection and storage processes for this key indicator. According to the EOUST, in FY 2006, approximately 1,200 trustees administering Chapter 7 cases closed more than 59,000 asset cases, generating nearly $2.6 billion in funds. Chapter 7 private trustees use case administration software to track the status of their cases and are required to periodically submit forms and reports to the bankruptcy courts and the U.S. Trustees Offices. Specifically, the Distribution Report for Closed Asset Cases, Form 4, provides statistical data concerning the distributions made in the case. Twice a year, personnel in the 95 field offices accumulate and consolidate Form 4s into a single spreadsheet and send it to the EOUST’s Office of Research and Planning. At the Office of Research and Planning, a management analyst formats and loads the 95 files into a database and provides the Chapter 7 statistics to JMD for inclusion in the PAR.

Additionally, we did not identify any issues with the USTP’s Chapter 13 data collection and storage processes for this key indicator. According to the EOUST, in FY 2006, 188 trustees administering Chapter 13 cases collected more than $5.5 billion. The collection of key indicator data by Chapter 13 private trustees is similar to that of the Chapter 7 private trustees. Chapter 13 private trustees use case administration software to track receipts and disbursements, and are required to prepare an Annual Report at the end of the fiscal year detailing disbursements. Annual Reports must be electronically submitted by all 188 private trustees to their respective regional office by November 15 and forwarded to the EOUST by December 1. Upon receipt, the EOUST consolidates the data and loads it into a database where data can be modified and queried. Once the audited Annual Reports are received from independent Certified Public Accountants, database inaccuracies are corrected and the final data is exported to a spreadsheet and provided to JMD for inclusion in the PAR.

Data Validation and Verification

We did not identify any issues with the USTP’s Chapter 7 data validation and verification processes for this key indicator. EOUST uses Biennial Performance Reviews to document private trustee performance, based on the objective of making meaningful distributions to creditors. One of the evaluation criteria is the timeliness, accuracy, and completeness of the trustee’s Final Account, which contains the Form 4.22 The EOUST also utilizes a combination of independent audits and field examinations of private trustees conducted on a 4-year rotating basis. The audits are performed by independent Certified Public Accountants and the field examinations are performed by U.S. Trustee personnel. Both reviews focus on the appropriateness and effectiveness of each trustee’s internal controls and case administration, and conclude with a report on the findings.

During our audit, we observed EOUST’s processes for collecting Chapter 7 data spreadsheets, converting the spreadsheets into text files, and loading the text files into a database. We also observed system controls used to identify data inconsistencies or errors, and the use of a checklist to ensure that spreadsheets have been received from each field office and that text files have been loaded into the database.

We did not identify any issues with the USTP’s Chapter 13 data validation and verification processes for this key indicator. The EOUST requires annual audits of each trustee’s Annual Report by independent Certified Public Accountants. The purpose of these audits is to obtain reasonable assurance that the Annual Report is free of material misstatement. As previously stated, upon receipt of the audited Annual Reports EOUST staff compares the audited numbers to the trustee’s unaudited Annual Report data that was loaded into the database. If the audited Annual Report identifies discrepancies, the database information is replaced with the audited information. Additionally, auditors perform a yearly Prescribed Procedures engagement to test compliance with the USTP policies. These procedures include evaluations of internal controls, case tracking, cash receipts, and disbursements. During our audit, we observed EOUST’s processes for collecting Chapter 13 data spreadsheets and loading them into a database. We also observed system controls used to identify data inconsistencies or errors, the use of audited Annual Reports to make changes to data, and the use of a report that lists the fiscal year audit findings and is used to resolve common issues.

Key Indicator Data Comparison

We reviewed the USTP’s Percent Analysis of Chapter 7 Statistics in order to verify the accuracy of the FY 2005 percentage of assets and funds returned to creditors for Chapter 7 filings reported in the FY 2006 PAR.23 Based on our review, we did not find any discrepancies with the performance data reported for this key indicator for FY 2005.

Additionally, we reviewed the USTP’s supporting record of Chapter 13 Standing Trustee FY 2005 Audited Annual Reports in order to verify the accuracy of the FY 2005 percent of assets and funds returned to creditors for Chapter 13 filings reported in the FY 2006 PAR.23 Based on our review, we did not find any discrepancies with the performance data reported for this key indicator for FY 2005.

Disclosure of Data Limitations

With regard to both Chapters 7 and 13, the USTP disclosed in the data limitations section of the FY 2006 PAR that it is unable to project out‑year performance because there is no reliable method for calculating future bankruptcy case disbursements. The USTP also disclosed that the most recent fiscal year data is reported in the following year’s PAR due to a data lag caused by using audited data. During our audit, we did not identify any additional data limitations for this key indicator.

III.    STRATEGIC GOAL III: ASSIST STATE, LOCAL, AND TRIBAL EFFORTS TO PREVENT OR REDUCE CRIME AND VIOLENCE

According to the FY 2003‑2008 Strategic Plan, DOJ assists state, local, and tribal governments by providing “an extensive, varied portfolio of criminal and juvenile justice grant programs, training, and technical assistance.” Additionally, DOJ conducts research, collects statistics, and evaluates new programs and technologies in order to further understand crime, violence, and justice.

We discuss below each of the four key indicators we reviewed related to this strategic goal.

Reduction of Homicides Per Site Funded Under the Weed and Seed Program – OJP

This key indicator assesses the number of homicides per Weed and Seed site and calculates the reduction of homicides per site. The Community Capacity Development Office (CCDO) within the Office of Justice Programs (OJP) measures this indicator. The Weed and Seed Program is a grant program. The mission of the OJP CCDO “is to work with local communities to design strategies for deterring crime, promoting economic growth, and enhancing quality of life.” According to the OJP, the Weed and Seed strategy “involves a two‑pronged approach [to crime control and prevention]: law enforcement agencies and prosecutors cooperate in ‘weeding out’ violent crime and drug abuse [from a designated area]; and ‘seeding’ brings human services to the area, encompassing prevention, intervention, treatment, and neighborhood revitalization.”

Data Collection and Storage

The data for this key indicator is submitted by the grantees in their annual GPRA Reports. Grantees can submit GPRA Reports through OJP’s Grant Management System (GMS) or provide them directly to the CCDO. In March of each year, the CCDO provides instruction and due dates to grantees for completing the GPRA Report. The completed reports are due in May and report on the previous calendar year. For example, the FY 2004 GPRA Report includes data covering calendar year 2003. The 3‑month lag helps ensure that most information has been captured in the grantees’ computer systems.

The Weed and Seed Program Guide and Application Kit lists the performance measures and performance data that each grantee is required to report, including this key indicator, which requires grantees to provide the number of homicides per site.

If a GPRA Report is provided directly to the CCDO, a CCDO program manager reviews the report for completeness, ensuring that a majority of the questions are answered, and enters the report into GMS. Incomplete reports are returned to the grantee for additional information. GPRA Reports that have been approved by a CCDO program manager are provided to the Justice Research and Statistics Association (JRSA). The JRSA and CCDO track the grantees’ GPRA Report submissions to identify any missing reports.

JRSA personnel review the GPRA Reports to check for accuracy and missing information. JRSA personnel may follow up with grantees to request clarification on the information provided. JRSA personnel then key the information into a JRSA database.

We found that the data reported for this key indicator was unclear because it misidentified the time periods covered. Specifically, while reviewing the summary reports provided by the JRSA, we confirmed the number of homicides per Weed and Seed site that were reported for FYs 2004 and 2005. However, we found that the scope of the data did not cover FYs 2004 and 2005. Instead, the summary reports used data from the FYs 2004 and 2005 GPRA Reports, which cover data from calendar years 2003 and 2004. Therefore, the data presented in the FY 2006 PAR as FYs 2004 and 2005 is data covering calendar years 2003 and 2004. We recommend that the CCDO present the accurate scope of the performance data in the PAR by listing the correct calendar year that the data covers. In discussing this issue with CCDO personnel, they agreed with our finding.

We did not identify any issues with the CCDO’s and JRSA’s data storage processes for this key indicator. The GPRA Reports are stored in multiple forms. The CCDO maintains paper versions of the GPRA Reports, which are also stored in GMS. JRSA personnel convert GPRA Reports into PDF files that are stored on a secure server, which is backed up weekly. Additionally, the JRSA provides the PDF files to the CCDO on compact discs. Finally, the JRSA maintains the paper versions of GPRA Reports and posts all of the data from the reports on the Weed and Seed Data Center’s website.24

Data Validation and Verification

A CCDO program manager reviews the reports for completeness before providing them to the JRSA. The data validation and verification processes performed by the JRSA include: checking the data for completeness and consistency; checking for outliers, which are values that fall well outside the rest of the data set; conducting follow-up with grantees; comparing the GPRA Report to the previous year’s reports for the same grantee; comparing the data to published information, including the FBI’s crime statistics; and determining whether the local law enforcement records were subject to any reviews. Further validation and verification is performed by OJP’s Budget Planning and Performance Division in the Office of the Chief Financial Officer. Additionally, the Management Discussion and Analysis (MD&A) write-up is reviewed by personnel within OJP’s Budget Planning and Performance Division in the Office of the Chief Financial Officer, the Audit and Review Division in the Office of Audit, Assessment, and Management, and the Chief Financial Officer.
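
One of the JRSA checks noted above is an outlier screen. The following is a minimal sketch of what such a screen could look like; the data, threshold, and function name are illustrative assumptions and do not represent the JRSA’s actual procedure.

```python
# Hypothetical outlier screen of the kind described above; not the JRSA's actual procedure.
def flag_outliers(homicide_counts, z_threshold=2.5):
    """Return values that fall more than z_threshold standard deviations from the mean."""
    n = len(homicide_counts)
    mean = sum(homicide_counts) / n
    variance = sum((x - mean) ** 2 for x in homicide_counts) / n
    std_dev = variance ** 0.5
    if std_dev == 0:
        return []
    return [x for x in homicide_counts if abs(x - mean) / std_dev > z_threshold]

# A site reporting 40 homicides stands out against sites reporting 0 to 6.
print(flag_outliers([2, 0, 5, 3, 1, 40, 4, 2, 6, 1]))  # -> [40]
```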

Key Indicator Data Comparison

The CCDO reported the number of homicides per Weed and Seed site for FYs 2004 and 2005 data in the FY 2006 PAR. In order to verify these numbers, we reviewed the summary reports provided by the JRSA and did not identify any discrepancies with the performance data reported. However, as previously mentioned, we found that CCDO misidentified the time periods that the data covered in the FY 2006 PAR.

Additionally, using the same summary reports we did not identify any discrepancies with the 17.8 percent reduction of homicides per site funded under the Weed and Seed Program reported for FY 2005 in the FY 2006 PAR. However, we identified an issue with the methodology used to calculate the reduction of homicides per site funded under the Weed and Seed Program. We found that the data sets used to report on the number of homicides per Weed and Seed site included data from all reporting sites irrespective of whether reporting occurred in previous years. This methodology prevents the data sets from being comparable because different grantees were included in each data set. In discussing this issue with CCDO personnel, they agreed that for the reduction portion of the indicator, they should have used data sets that were limited to sites with data for both years. We recommend that for the reduction portion of this key indicator, the CCDO either use data sets that are limited to sites with data for both years or, in the alternative, remove the reduction portion from this key indicator and only report on the number of homicides per Weed and Seed site.
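
The following sketch illustrates the calculation our recommendation describes: before computing the reduction, both years are limited to sites that reported in both years. The site names and homicide counts are hypothetical.

```python
# Sketch of a matched-sites reduction calculation; site names and counts are hypothetical.
def reduction_per_site(year1_counts, year2_counts):
    """Percent reduction in average homicides per site, using only sites present in both years."""
    common_sites = set(year1_counts) & set(year2_counts)
    if not common_sites:
        raise ValueError("No sites reported in both years")
    avg1 = sum(year1_counts[s] for s in common_sites) / len(common_sites)
    avg2 = sum(year2_counts[s] for s in common_sites) / len(common_sites)
    return 100.0 * (avg1 - avg2) / avg1

cy2003 = {"Site A": 12, "Site B": 4, "Site C": 9}                 # Site D did not report
cy2004 = {"Site A": 10, "Site B": 3, "Site C": 7, "Site D": 20}   # Site D is new and is excluded
print(round(reduction_per_site(cy2003, cy2004), 1))  # -> 20.0
```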

Disclosure of Data Limitations

We found that in the FY 2006 PAR, the following data limitation was disclosed: “Data for this measure are reported by CCDO grantees on a calendar year cycle.” However, the performance data reported in the PAR is presented as fiscal year data, and as previously mentioned in the Data Collection and Storage section of this report, the CCDO misidentified the time periods that the number of homicides per Weed and Seed site covered.

Furthermore, we discussed and identified additional data limitations for this key indicator with CCDO and JRSA personnel. Our primary concern was that the Weed and Seed grants have 5‑year designations. Therefore, grantees may be at a different phase in their program depending upon the number of years they have received grant funds. We also determined that the 5‑year designation creates an additional data limitation, since grantees do not begin the 5‑year designation at the same time. Therefore, the scope of the data changes each year as new grantees are added and other grantees reach the end of their 5‑year designation. Finally, CCDO and JRSA personnel informed us that not all Weed and Seed sites are comparable. Some of the differences include population demographics and population size. Therefore, the data for this key indicator is difficult to compare across years and among sites. We recommended that the CCDO disclose the year‑to‑year and site comparative data limitations within the data limitations section of the PAR or, in the alternative, revise the key indicator to eliminate these data limitations. Prior to the release of the FY 2007 PAR, we discussed this preliminary recommendation with the CCDO. CCDO personnel agreed and took action to disclose the year-to-year and site comparative data limitations in the FY 2007 PAR.

As part of our audit work, we reviewed the FY 2007 PAR and found that CCDO personnel added the following statement: “There are slight variances in the group of local sites reporting each year due to some sites’ Official Recognition status expiring and adding newly funded sites. For this reason, the OJP requests multiple years of crime data in every CCDO required annual GPRA report, so that we can do multi-year analyses for the same group of sites and jurisdictions. This means that the average number of homicides reported for a given calendar year will be different for every year’s GPRA dataset.” In our opinion the CCDO sufficiently disclosed the year-to-year and site comparative data limitations within the data limitation section of the FY 2007 PAR. Therefore, we are not including a formal recommendation on this issue.

Recommendations

We recommend that OJP:

  1. Coordinate with the CCDO to present the accurate scope of the performance data in the PAR by listing the correct calendar year that the data covers for the key indicator “Reduction of Homicides Per Site Funded Under the Weed and Seed Program.”

  2. Coordinate with the CCDO to either use data sets that are limited to sites with data for both years or, in the alternative, remove the reduction portion from this key indicator and only report on the number of homicides per Weed and Seed site for the key indicator “Reduction of Homicides Per Site Funded Under the Weed and Seed Program.”

Percent Reduction in DNA Backlog – OJP

This key indicator measures the reduction of DNA samples awaiting analysis resulting from activities funded under the Convicted Offender and the Forensic Casework DNA backlog reduction grant programs. The Convicted Offender DNA Backlog Reduction program offers assistance to existing crime laboratories that conduct DNA analysis to reduce their backlog of convicted offender DNA samples. The Forensic Casework DNA Backlog Reduction Program offers assistance to existing crime laboratories that conduct DNA analysis to analyze backlogged forensic DNA casework samples from forcible rape, murder, and non-negligent manslaughter. The OJP National Institute of Justice (NIJ) measures this indicator.

Data Collection and Storage

We did not identify any issues with the NIJ’s data collection and storage processes for the convicted offender DNA backlog data for this key indicator. The convicted offender DNA backlog data consists of DNA samples that state laboratories are unable to analyze with available resources. Annually, the NIJ submits a data call to state laboratories requesting their final projected backlog numbers as of September 30 of the previous year and the funds needed to complete and reduce the case backlog. The NIJ compiles this information in a spreadsheet consisting of all state laboratories needing funding and the amount requested. The information received from the state laboratories and the spreadsheets are archived on OJP’s network, which is backed up daily.

Additionally, we did not identify any issues with the NIJ’s data collection and storage processes for the casework DNA backlog data for this key indicator. The NIJ does not receive projections for casework DNA backlog numbers. Instead, the NIJ sends a solicitation to grantees reporting the amount of funding available and requests that grant applications be submitted through GMS. The applications are to include: (1) the number of cases the laboratory possesses, (2) the number of cases in storage that have not been submitted, and (3) the amount of funding requested. The NIJ compiles the data received in GMS into spreadsheets, calculates the backlog percentage, and awards the grants. According to NIJ personnel, they receive approximately 100 casework DNA backlog grant applications each year and try to fund all of them. The grant applications are stored in GMS, and the data is backed up incrementally throughout the week. The grant applications and the NIJ spreadsheets are currently stored indefinitely on OJP’s network.
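
The report does not spell out the exact formula the NIJ uses for the backlog percentage. As a point of reference, the sketch below shows the conventional percent-reduction computation, the baseline minus the remaining backlog divided by the baseline; the sample counts are hypothetical.

```python
# Conventional percent-reduction calculation; the formula is an assumption,
# and the numbers are hypothetical.
def percent_reduction(baseline_samples, remaining_samples):
    """Percent of the backlog eliminated relative to the baseline backlog."""
    if baseline_samples <= 0:
        raise ValueError("Baseline backlog must be positive")
    return 100.0 * (baseline_samples - remaining_samples) / baseline_samples

print(round(percent_reduction(50_000, 38_000), 1))  # -> 24.0
```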

Data Validation and Verification

We did not identify any issues with the NIJ’s data validation and verification processes for the convicted offender or casework DNA backlog data for this key indicator. These processes are ongoing and include multiple levels of review including semi‑annual progress reports and Grant Progress Assessment site visit reports. Semi‑annual progress reports are submitted by grantees through GMS and provide the NIJ with information regarding progress achievements in relation to project milestones. According to NIJ personnel, state laboratories that receive NIJ funding become part of the Grant Progress Assessment program and are reviewed by the NIJ on a 2‑year rotating basis. These reviews help ensure that grant funds are used properly.

Further validation and verification is performed by OJP’s Budget Planning and Performance Division in the Office of the Chief Financial Officer. Additionally, the MD&A write-up is reviewed by personnel within OJP’s Budget Planning and Performance Division in the Office of the Chief Financial Officer, the Audit and Review Division in the Office of Audit, Assessment, and Management, and the Chief Financial Officer.

Key Indicator Data Comparison

In order to verify the percent reduction in the convicted offender and casework DNA backlogs reported for FY 2006 in the FY 2006 PAR, we reviewed the NIJ’s spreadsheets showing the number of convicted offender and casework DNA samples funded for analysis and the NIJ spreadsheets summarizing the states’ convicted offender backlog estimations and the casework backlog estimations. Based on our review, we did not find any discrepancies with the performance data reported for this key indicator for FY 2006.

Disclosure of Data Limitations

The FY 2006 PAR reported no known data limitations for this key indicator, and during our audit we did not identify any data limitations for this key indicator.

Number of Participants in the Residential Substance Abuse Treatment Program – OJP

This key indicator assesses the number of participants served by the Residential Substance Abuse Treatment (RSAT) Program and is measured by the OJP Bureau of Justice Assistance (BJA). The RSAT grants are formula based, meaning the grants are awarded to the 50 states and 6 territories on a noncompetitive basis.25 The RSAT program provides inmates with individual and group treatment activities that focus on substance abuse problems in a residential treatment facility set apart from the general correctional population. The substance abuse treatment programs are generally conducted within the walls of the prisons. However, up to 10 percent of grant funds may be used for aftercare facilities.

Data Collection and Storage

Annually, the BJA issues a grant announcement requesting that grant applications be submitted in GMS by a designated deadline. The grant announcement requires grantees to submit an annual RSAT Report that collects numerical, cumulative, and narrative information on the treatment program and participants in the program during the grantee’s previous fiscal year. The state administering agency or the state department of corrections is responsible for gathering the data and completing the annual RSAT Report. However, the state may designate other personnel to complete this task. The annual RSAT Reports are submitted in GMS as a requirement of the grant. BJA personnel use the information on the annual RSAT Reports to compile a consolidated spreadsheet for reporting the information for this key indicator.

We found that the grantees’ fiscal years differ. The scope of the grantees’ fiscal year may be the federal government’s fiscal year, the calendar year, or the state’s fiscal year. As a result, the data reported by the grantees and compiled for this key indicator represents various time periods and is neither exclusively fiscal year nor calendar year data. We recommend that the BJA develop and implement procedures for collecting and reporting data for a single consecutive 12‑month period or disclose this as a data limitation within the data limitations section of the PAR.

We did not identify any issues with the BJA’s data storage processes for this key indicator. The annual RSAT Reports are stored in GMS, and OJP’s official hardcopy file is stored by the Office of the Chief Financial Officer. Additionally, BJA program managers may maintain copies of the annual RSAT Reports for their files.

Data Validation and Verification

Data validation begins when BJA program managers receive the annual RSAT Reports. The program managers review the reports for completeness to determine whether the reports are acceptable. If an RSAT Report is determined unacceptable, a BJA program manager notifies the grantee and requests additional information. The information contained in the reports is copied directly from GMS into a spreadsheet by OJP’s helpdesk and BJA personnel rather than being manually retyped, which prevents transposed numbers and other manual typing errors.

According to BJA personnel, all states and territories submitted reports in 2005. BJA personnel explained that if a grantee has not submitted its report, a BJA program manager contacts the grantee to follow up on the missing report and determine the reason it has not been submitted. According to the OJP Grant Manager’s Manual dated September 2005, when annual reports are not received within 30 days of the due date, the grantee’s account is considered delinquent and the grant funds are automatically frozen by GMS until the annual RSAT Report is received.

Validation and verification is performed by OJP’s Budget Planning and Performance Division in the Office of the Chief Financial Officer through interviews with each division and the completion of the Data Verification Form. This form is used for each key indicator and performance measure, regardless of whether it is included in the PAR. The form collects information on the data collection and validation processes, data limitations, and the key indicator’s target and actual performance. It also requires confirmation of the reported actual performance and a signature by a program office point of contact. The information on the Data Verification Form is used to revise the discussion section in the PAR and confirm the accuracy of the information presented in the PAR for that key indicator. Additionally, the MD&A write-up is reviewed by various personnel within OJP’s Budget Planning and Performance Division in the Office of the Chief Financial Officer, the Audit and Review Division in the Office of Audit, Assessment, and Management, and the Chief Financial Officer.

Key Indicator Data Comparison

In order to verify the number of participants in the RSAT program in 2005 reported in the FY 2006 PAR, we reviewed the BJA’s spreadsheet compiling all of the information from the grantees’ annual RSAT Reports.26 The results are shown in Table 5.

TABLE 5:   NUMBER OF PARTICIPANTS IN THE RESIDENTIAL SUBSTANCE ABUSE TREATMENT PROGRAM

SOURCE                        RESULTS
FY 2006 PAR Results            35,350
OIG Audited Results            31,740
Source:   FY 2006 DOJ PAR and the BJA’s spreadsheet

Based on our discussion with the BJA, the BJA identified a discrepancy with the 35,350 RSAT program participants reported for 2005 in the FY 2006 PAR. Using the spreadsheet it provided, the BJA pointed out that it should have reported a total of 31,740 RSAT program participants for 2005. Therefore, the number of RSAT program participants was overstated by 3,610 participants, or 10.21 percent. We discussed this issue with BJA management, who attributed this error to challenges with the reporting features within GMS.
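
The overstatement figures cited above can be reproduced directly from the two totals reported in the PAR and the BJA spreadsheet:

```python
# Arithmetic behind the overstatement discussed above; totals are taken from this report.
reported = 35_350   # FY 2006 PAR
audited = 31_740    # BJA spreadsheet
overstatement = reported - audited
print(overstatement)                               # -> 3610
print(round(100.0 * overstatement / reported, 2))  # -> 10.21 (percent of the reported figure)
```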

According to DOJ’s Financial Statement Requirements and Preparation Guide, “If actual performance data has changed from what was previously reported in either document [PAR and the Department's Annual Budget Summary], components must provide a full explanation in their MD&A. The explanation must include sufficient detail for reviewers/auditors to follow.” We recommended that the BJA implement procedures to ensure that RSAT data is accurately reported. Further, we recommended that the BJA comply with the DOJ Financial Statement Requirements and Preparation Guide, restate the number of RSAT participants for 2005, and provide a full explanation in its MD&A, which is compiled into the PAR. Prior to the release of the FY 2007 PAR, we discussed our preliminary recommendations with the BJA, which concurred and took action to disclose the overstatement for 2005 in the FY 2007 PAR.

As part of our audit work, we reviewed the FY 2007 PAR and found that BJA disclosed that “In Spring of 2007, the 2005 performance was re‑verified by the Bureau of Justice Assistance (BJA). BJA determined that the actual count was 31,740 rather than 35,350 reported in the 2006. The variance in the number previously reported is a result of the OJP’s continuing efforts to enhance data collection and data verification processes.... As a result, previously submitted numbers were updated and resubmitted to reflect more accurate numbers and additional reports received from some states.” In addition, the BJA reported the correct number of RSAT participants for 2005 in the FY 2007 PAR. In our opinion, the BJA adequately restated the number of RSAT participants for 2005, and provided an explanation in the PAR in accordance with our preliminary recommendation. However, we continue to recommend that the BJA develop and implement procedures to ensure that RSAT data is accurately reported in the future.

Disclosure of Data Limitations

In the FY 2006 PAR, the BJA disclosed the following data limitation: “Statutorily mandated calendar year reporting requirement.” However, the data is presented as “FY 2005 Actual” data in the PAR. Therefore, the data limitation and the scope of the data presented in the PAR are contradictory. Further, we found that the data is reported according to each grantee’s fiscal year, which represents various time periods that are neither exclusively fiscal year nor calendar year data. We recommended that the BJA present the accurate scope of the performance data in the PAR. We discussed this preliminary recommendation with the BJA prior to the release of the FY 2007 PAR. BJA personnel concurred and presented the accurate scope of the performance data in the FY 2007 PAR.

As part of our audit, we assessed BJA’s compliance with our preliminary recommendation and found that the BJA presented the accurate scope of the performance data in the FY 2007 PAR by disclosing that the performance data is collected according to the grantee’s fiscal year, which may not be the same for all grantees. Therefore, we are not including a formal recommendation on this issue.

Recommendations

We recommend that OJP:

  1. Coordinate with the BJA to develop and implement procedures for collecting and reporting data for a single consecutive 12‑month period or disclose this as a data limitation within the data limitations section of the PAR for the key indicator “Number of Participants in the Residential Substance Abuse Treatment Program.”

  2. Coordinate with the BJA to develop and implement procedures to ensure that RSAT data is accurately reported in the future for the key indicator “Number of Participants in the Residential Substance Abuse Treatment Program.”

Increase in the Graduation Rate of Drug Court Program Participants – OJP

This key indicator tracks the graduation rate of Drug Court Program participants and calculates the increase in the graduation rate. It is measured by OJP’s BJA. The Drug Court Discretionary Grant Program is a competitive solicitation that accepts applications from potential grantees for grant funds. Grantees are selected through a peer review of each application. The grant funds are to be used by the grantee to establish or enhance its Drug Court Program. The goal of the BJA’s Drug Court Program is to provide financial assistance to states, state and local courts, units of government, and tribal governments in order to improve or establish drug treatment courts.

The Drug Court Program began as a way to reduce crime and substance abuse among non‑violent offenders. A defendant may enter a Drug Court Program by making or accepting a guilty plea and agreeing to complete the program. After sentencing, the drug court assigns the defendant a treatment provider with educational resources and programs lasting 12 to 16 months. Once the Drug Court Program is successfully completed, the defendant’s case is often dismissed.

Data Collection and Storage

The data collection process begins by collecting the semi-annual progress reports submitted by the grantees. The reporting periods for the two progress reports are January 1 through June 30 and July 1 through December 31.27 The progress reports require the following information for the reporting period: (1) the number of participants in the grantee’s Drug Court Program, (2) the number of participants that graduated from the Drug Court Program, and (3) the services that the Drug Court Program provided. The progress reports are due within 30 days of the end of the reporting period and are submitted through GMS. In addition, grantees are required to submit a final progress report once the grant ends.

BJA program managers review progress reports to ensure all of the information has been provided by the grantee. The data is then extracted from GMS by OJP Information Technology personnel and provided to BJA staff in a spreadsheet. The BJA policy advisor reviews the spreadsheet and looks for any missing information. If the policy advisor requires any additional information, that person notifies the program manager to contact the grantee. The program specialist reviews historical data and deviations from the mean to determine whether the aggregated data is comparable to historical data.

If a grantee does not provide a progress report, a BJA program manager contacts the grantee to follow up on the missing report and determine why it has not been submitted. When progress reports are not received within 30 days of the due date, the grantee’s account is considered delinquent and its funds are automatically frozen by GMS until the progress report is submitted.

We found that the 31.9 percent graduation rate of drug court participants reported for FY 2006 in the FY 2006 PAR does not encompass the entire fiscal year. Instead, it only represents January through June of FY 2006. Therefore, the drug court graduation percentage reported in the FY 2006 PAR may be inaccurate because it represents 6 months instead of a 12‑month period. Further, this may affect the 13.8 percent increase in the graduation rate reported in the FY 2006 PAR. We recommended that the BJA revise its reporting procedures to ensure that it collects complete data from all grantees for a consecutive 12‑month period.

Further, we found that the data reported in the FY 2006 PAR is presented as fiscal year data when, based on the data collection methodologies described by the BJA, it actually represents state fiscal year data. We recommended that the BJA present the accurate scope of the performance data in the PAR.

Prior to the release of the FY 2007 PAR, we discussed both of these preliminary recommendations with BJA personnel. BJA agreed with both recommendations and took action to collect data for a consecutive 12‑month period, disclose the scope of the 12‑month period, and present the accurate scope of the performance data in the FY 2007 PAR. As part of our audit, we reviewed the FY 2007 PAR and found that the BJA disclosed that “End of year performance data for the Drug Court Program is provided by semi‑annual progress reports via the GMS in August. Beginning with data reported for 2007, data collected and reported will cover a single consecutive 12-month period from July 1, 2006 through December 31, 2006 and January 1, 2007 through June 30, 2007.” In our judgment, the BJA now adequately collects data for a consecutive 12-month period, disclosed the scope of that 12-month period, and presented the accurate scope of the performance data in the FY 2007 PAR. Therefore we are not including any formal recommendations on these issues.

We did not identify any issues with the BJA’s data storage processes for this key indicator. The data for this key indicator is collected on progress reports that are stored in GMS, and OJP’s official hardcopy file is stored by the Office of the Chief Financial Officer. In addition, BJA program managers may maintain a copy of the progress reports for their files.

Data Validation and Verification

We did not identify any issues with the BJA’s data validation and verification processes for this key indicator. Data validation and verification begins with the program office continually: (1) reviewing progress reports, (2) conducting desk reviews, and (3) conducting site visits. BJA program managers are responsible for managing the grants awarded and reviewing the progress reports to ensure all of the needed information has been provided by the grantee. According to BJA personnel, program managers monitor between 250 and 300 grants covering 25 to 30 programs primarily through desk reviews and site visits.

According to the September 2005 OJP Grant Manager's Manual “A desk review or desk monitoring consists of reviewing grant files to ensure they are complete, accurate, and up-to-date so as to assess grantee performance and compliance.” The August 2005 BJA Monitoring Guide explains that “The desk review will assist BJA staff in determining which grantees need the most assistance requiring a monitoring visit.” We obtained a list of the desk reviews conducted in FY 2006 and found that 166 desk reviews were conducted on 126 grants. Therefore, the BJA complied with OJP’s requirement to conduct desk reviews periodically.

We also determined that in FY 2006 the BJA conducted 12 site visits on 12 grants to review financial reports and ensure grantees are keeping current with grant awards and file maintenance. Specifically, BJA staff “visit the program facility and meet with staff to ensure that the program adheres to established guidelines.”28 Additionally, the site visits determine the grantee’s graduation criteria and the number and percentage of Drug Court Program participants who graduated from the program. The September 2005 OJP Grant Manager's Manual says that “The number of times a grant manager conducts an onsite visit is determined by each bureau or program office and based upon programmatic need or requests by the grantee.” Therefore, the BJA complied with OJP’s requirement to conduct site visits at the discretion of each bureau.

Validation and verification is performed by OJP’s Budget Planning and Performance Division in the Office of the Chief Financial Officer through interviews with each division and the completion of the Data Verification Form. This form is used for each key indicator and performance measure, regardless of whether it is included in the PAR. The form collects information on the data collection and validation processes, data limitations, and the key indicator’s target and actual performance. Additionally, the form requires confirmation of the reported actual performance and a signature by a program office point of contact. The information on the Data Verification Form is used to revise the discussion section in the PAR and confirm the accuracy of the information presented in the PAR for that key indicator. Finally, the MD&A write-up is reviewed by personnel within OJP’s Budget Planning and Performance Division in the Office of the Chief Financial Officer, the Audit and Review Division in the Office of Audit, Assessment, and Management, and the Chief Financial Officer.

Key Indicator Data Comparison

In order to verify the increase in the graduation rate of Drug Court Program participants reported for FY 2006 in the FY 2006 PAR, we reviewed the BJA’s supporting document that compiled all of the information from the grantees’ progress reports. Based on our review, we did not find any discrepancies with the performance data reported for this key indicator for FY 2006. However, as stated previously, this percentage does not encompass a 12‑month period.

Additionally, we found that the bar graph for this key indicator in the FY 2006 PAR is titled “Increase in the Graduation Rate of Drug Court Program Participants,” while the bar graph displays the graduation percentage, not the increase in the graduation percentage. Specifically, the bar graph illustrates the graduation rate of 31.9 percent instead of the 13.8 percent increase in the graduation rate as stated in the title. JMD uses the DOJ components’ MD&A to draft the PAR and is responsible for generating the bar graphs in the PAR. Therefore, we recommended that JMD revise the title of the bar graph in the PAR in order to clarify the information illustrated. We discussed this preliminary recommendation with JMD prior to the release of the FY 2007 PAR. JMD concurred with our preliminary recommendation and took action to adjust the title of the bar graph in the FY 2007 PAR.

As part of our audit, we reviewed the FY 2007 PAR and found that JMD changed the title to “Graduation Rate of Program Participants in the Drug Courts Program.” In our opinion, JMD adequately revised the title of the bar graph in the FY 2007 PAR. Therefore, we are not including a formal recommendation on this issue.

Disclosure of Data Limitations

In the FY 2006 PAR, the BJA did not identify any data limitations. Based on our review, we did not identify any additional data limitations for this key indicator.

IV.    STRATEGIC GOAL IV: ENSURE THE FAIR AND EFFICIENT OPERATION OF THE FEDERAL JUSTICE SYSTEM

According to the DOJ FY 2003‑2008 Strategic Plan, “The Department plays a key role in the administration of the federal justice system.” DOJ’s responsibilities include protecting judges, witnesses, and federal proceeding participants; ensuring the appearance of criminal defendants for judicial proceedings and confinement; apprehending fugitives; providing safe, secure, and humane confinement for detained persons; maintaining and operating the federal prison system; providing services and programs to assist inmates to successfully re‑enter society; and adjudicating all immigration cases promptly and impartially in accordance with due process.

We discuss below each of the nine key indicators we reviewed related to this strategic goal.

Number of Judicial Proceedings Interrupted Due to Inadequate Security – USMS

This key indicator determines the number of judicial proceedings interrupted due to inadequate security and is measured by the United States Marshals Service (USMS). Interruption of a judicial proceeding is defined by the USMS as either removal of a judge from a courtroom or a suspended proceeding while the USMS requests additional deputies to guarantee the safety of the judge, witnesses, and other participants.

According to the FY 2006 PAR, “The USMS maintains the integrity of the judicial security process by: (1) ensuring that each federal judicial facility is secure – physically safe and free from any intrusion intended to subvert court proceedings; (2) guaranteeing that all federal, magistrate, and bankruptcy judges, prosecutors, witnesses, jurors, and other participants have the ability to conduct uninterrupted proceedings; (3) maintaining the custody, protection, and safety of prisoners brought to court for any type of judicial proceeding; and (4) limiting opportunities for criminals to tamper with evidence or use intimidation, extortion, or bribery to corrupt judicial proceedings.”

Data Collection and Storage

We did not identify any issues with the USMS's data collection and storage processes for this key indicator. Data for this key indicator is collected on the Use of Force Report, Form USM‑133, which details major occurrences. A Form USM‑133 is completed by USMS Deputies in the 94 USMS districts and reviewed by district directors or program managers before it is provided to the Office of Internal Affairs. USMS policy requires completion of the Form USM‑133 after a major occurrence.

Data Validation and Verification

We did not identify any issues with the USMS’s data validation and verification processes for this key indicator. The district directors or program managers review the completed USM‑133 Forms. The USMS Planning and Evaluation Group receives both the completed USM‑133 Forms and weekly e-mails to verify each incident and determine whether it meets the key indicator definition. The USMS Planning and Evaluation Group contacts the USMS districts and divisions to verify the incident prior to preparing the quarterly status reports for JMD. The quarterly status reports are used to cross-reference the annual number of interruptions provided for the PAR.

Key Indicator Data Comparison

According to the USMS, no judicial proceedings were interrupted due to inadequate security during FY 2006. Based on our review, we did not find any discrepancies with the performance data reported for this key indicator for FY 2006.

Disclosure of Data Limitations

The FY 2006 PAR disclosed that “This measure was not tracked or reported until FY 2003.” During our audit, we did not identify any data limitations for this key indicator beyond the limitation already disclosed.

Federal Fugitives Cleared or Apprehended – USMS

This key indicator identifies the number and percent of federal fugitives cleared or apprehended and is measured by the USMS. In the FY 2006 PAR the USMS states that “Fugitives cleared consists of those cases that the USMS has successfully completed all aspects of closure and has removed from the active and outstanding records. This definition holds true in cases where we do or do not have primary apprehension responsibility.” Cleared fugitives include those who have been apprehended or are deceased, those whose cases have been dismissed, and those for whom the USMS search has ended for any other reason.

According to a statement in the FY 2006 PAR, “The USMS has primary jurisdiction to conduct and investigate fugitive matters involving escaped federal prisoners, probation, parole, bond default violators, warrants generated by DEA investigations, and certain other related felony cases.”

Data Collection and Storage

We did not identify any issues with the USMS’s data collection and storage processes for this key indicator. Data for this key indicator is collected and maintained in the USMS Warrant Information Network (WIN), which tracks federal warrants. A Deputy Marshal or Investigative Research Specialist at each of the 94 USMS districts enters warrant information into WIN, and the information is reviewed by a Warrant Supervisor. The USMS Planning and Evaluations Group calculates the number and percent of fugitives cleared using information from WIN, which is checked by the USMS Investigative Services Division.

Data Validation and Verification

We did not identify any issues with the USMS's data validation and verification processes for this key indicator. These processes are conducted by USMS Investigative Research Specialists using the Public Access to Court Electronic Records system to run reports and obtain case and docket information from federal courts to verify the warrant information. They also revise case information in WIN from USMS Warrant Update Forms. In addition, Investigative Research Specialists receive USMS Wanted Person Record Validation Memorandums detailing the records each district is responsible for validating. Investigative Research Specialists confirm that warrants in WIN are active, verify the WIN data against signed paper records, and update the information in WIN. Additionally, USMS personnel using WIN check the data entered into the system and cross‑reference it to their cases. Finally, USMS auditors conduct internal audits of WIN and case files. During our audit, we observed how warrants are entered into WIN and the system controls used to help ensure valid and accurate data.

Key Indicator Data Comparison

In order to verify the percent and number of federal fugitives cleared or apprehended in FY 2006 and reported in the FY 2006 PAR, we reviewed the USMS’s WIN reports. Based on our review, we did not find any discrepancies with the performance data reported for this key indicator for FY 2006.

Disclosure of Data Limitations

The FY 2006 PAR disclosed that “These elements of data are accessible to all 94 judicial districts and are updated as new information is collected. There may be a lag in the reporting of data.” During our audit, we did not identify any additional data limitations beyond the limitation already disclosed for this key indicator.

Per-Day Jail Costs – OFDT

This key indicator measures the per-day jail cost, which is the weighted average of the “actual price paid (over a 12-month period) by the USMS to house federal prisoners in non‑federal detention facilities.”29 It is measured by the Office of the Federal Detention Trustee (OFDT) using information from the USMS.

According to the OFDT in the FY 2006 PAR, “DOJ acquires detention bed space to house pretrial detainees through reimbursable Intergovernmental Agreements (IGAs) with State and local governments and contracts with private vendors.” The OFDT uses information from the USMS’s Prisoner Tracking System (PTS) to calculate the per-day jail cost. PTS is decentralized, with each of the 94 USMS districts maintaining a separate PTS database.
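
Although the OFDT’s exact computation is not described in detail here, a weighted average of this kind is typically formed by weighting each facility’s contract rate by the jail days used at that facility, as in the sketch below. The rates and jail-day counts are hypothetical.

```python
# Sketch of a weighted-average per-day jail cost; the weighting scheme is our reading of
# "weighted average of the actual price paid," and all figures are hypothetical.
def weighted_per_day_cost(facilities):
    """facilities: list of (daily_rate, jail_days) tuples."""
    total_cost = sum(rate * days for rate, days in facilities)
    total_days = sum(days for _, days in facilities)
    return total_cost / total_days

usage = [(58.00, 12_000), (71.50, 4_500), (64.25, 8_200)]
print(round(weighted_per_day_cost(usage), 2))  # -> 62.53
```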

Data Collection and Storage

We did not identify any issues with the USMS’s or OFDT’s data collection and storage processes for this key indicator. USMS districts communicate with jails to locate available space for prisoners. The districts track and enter all prisoner movement information into the PTS databases. Currently, the aggregate data from the previous day is added to the centralized database at USMS headquarters. The jail rates in PTS are set and entered by USMS headquarters based upon established contracts and intergovernmental agreements. Monthly, each of the USMS districts calculates the jail rates from the counties and establishes an obligation in the Financial Management System.30 The obligations are closed out each month and sent to USMS headquarters, and the Financial Management System is reconciled with the USMS’s Standardized Tracking, Accounting, and Reporting System.

Nightly, the OFDT receives data from PTS regarding the number of prisoners held at each facility as of the close of business. This information is used to complete a monthly report to calculate an average jail-day rate for the month. Monthly reports are consolidated to generate a quarterly report, which is used to generate a summary report. The summary report provides the quarterly and fiscal year jail-day rates reported in the PAR. All of these reports are stored on the OFDT’s computer system and can also be retrieved from PTS if needed. They are also linked with automatic data feeds to reduce data errors. The USMS and OFDT computer systems have user and security access controls that limit access and edits to the systems. During our audit, we observed prisoner information being entered into PTS and the system controls that help ensure valid and accurate data.

Data Validation and Verification

We did not identify any issues with the USMS’s or OFDT’s data validation and verification processes for this key indicator. These processes are ongoing and reviewed by multiple levels of personnel within the USMS. USMS headquarters personnel also validate the jail days and jail rates each month by running PTS reports by district. USMS runs a Jail Utilization Report monthly and provides it to the OFDT. The OFDT uses the Jail Utilization Report to verify the daily population feeds and the monthly feeds from the PTS.

Key Indicator Data Comparison

In order to verify the per-day jail costs reported for FY 2006 in the FY 2006 PAR, we reviewed the OFDT’s monthly, quarterly, and summary reports. Based on our review, we did not find any discrepancies with the performance data reported for this key indicator for FY 2006.

Disclosure of Data Limitations

For FY 2006, the OFDT disclosed that the “PTS is very time and labor intensive. Lack of a real‑time centralized system results in data that is close to six weeks old before it is available at a national level.” During our audit, we did not identify any data limitations for this key indicator beyond the limitation already disclosed in the FY 2006 PAR.

System-wide Crowding in Federal Prisons – BOP

This key indicator assesses the ratio of inmates held in BOP facilities compared to the inmate capacity at BOP facilities and reports the percent over capacity. In the FY 2006 PAR, the BOP explained that “System-wide [crowding] represents all inmates in BOP facilities and all rated capacity, including secure and non-secure (minimum security) facilities, low, medium, and high security levels, as well as administrative maximum, detention, medical, holdover, and other special housing unit categories.” The BOP measures this key indicator.

Data Collection and Storage

We did not identify any issues with the BOP’s data collection and storage processes for this key indicator. The inmate data for this key indicator is collected and stored in the BOP’s SENTRY system. Data entry in SENTRY is centralized at the BOP Designation Sentence Computation Center in Grand Prairie, Texas. The Inmate System Management Unit at each BOP facility is responsible for correcting and updating the information in SENTRY. BOP headquarters determines the rated capacity for each facility and records it in SENTRY. Population levels are analyzed daily, and a SENTRY report provides the inmate count within every BOP institution. The percentage of capacity is then calculated by dividing the inmate population count by the rated capacity. Monthly, data is copied from SENTRY and placed in SAS and the Key Indicators/Storage Support System (KI/SSS).
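
The crowding calculation described above reduces to a simple ratio; the sketch below expresses it as the percent over rated capacity, using hypothetical counts.

```python
# Percent over capacity = inmate population divided by rated capacity, expressed as the
# excess over 100 percent. The counts are hypothetical.
def percent_over_capacity(inmate_population, rated_capacity):
    if rated_capacity <= 0:
        raise ValueError("Rated capacity must be positive")
    return 100.0 * inmate_population / rated_capacity - 100.0

print(round(percent_over_capacity(190_000, 140_000), 1))  # -> 35.7
```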

Data Validation and Verification

We did not identify any issues with the BOP’s data validation and verification processes for this key indicator. The primary validation process is conducted by the BOP’s Program Review Division. Additionally, supervisors at the facilities, system managers, and the Correctional Services Division validate the data in SENTRY.

Key Indicator Data Comparison

In order to verify the system-wide crowding rate in FY 2006 reported in the FY 2006 PAR, we reviewed the BOP’s Population Report. Based on our review, we did not find any discrepancies with the performance data reported for this key indicator for FY 2006.

Disclosure of Data Limitations

The BOP did not identify any data limitations in the FY 2006 PAR. During our audit, we did not identify any data limitations for this key indicator.

Escapes from Secure BOP Facilities – BOP

This key indicator measures the number of escapes from secure BOP facilities, which include administrative institutions and low, medium, and high security institutions. The security levels of BOP facilities are classified as minimum, low, medium, or high, depending in part on the physical design of each facility. The administrative category exists for specialized populations such as pre‑trial, mental health, and sex offender inmates. The BOP measures this indicator.

Data Collection and Storage

We did not identify any issues with the BOP’s data collection or storage processes for this key indicator. The inmate data is collected and stored in the BOP SENTRY system. For monitoring purposes, inmates are counted at each BOP facility five times a day by personnel comparing a picture of each inmate with the inmate in the cell. Escapes from secure BOP facilities are rare and therefore are well known when they occur. During our audit, we observed various modules in SENTRY and the system controls that help ensure the data is valid and accurate.

The BOP Report of Incident, Form 583, is used as a first alert when an event such as an escape, riot, or assault occurs. The Form 583 lists all inmates thought to be involved in the escape; a separate Form 583 is not completed for each inmate. This form is submitted by the BOP facility where the escape occurred to the Correctional Services Division at the central office, and is entered into various database systems, including SENTRY and the BOP KI/SSS. After BOP personnel complete the preliminary interviews with the inmates listed on the Form 583, a Misconduct Form is completed for each inmate who is determined to have been involved in the escape, and this information is entered into the Chronological Disciplinary Report module of SENTRY. The escape data in KI/SSS is later compared to the Chronological Disciplinary Reports to ensure all of the information is accurate. The Information Policies and Public Affairs Office extracts the escape data directly from the BOP KI/SSS to compile the data for this key indicator. The Budget Development Branch reviews the key indicator data before submitting it to JMD for PAR reporting.

The completed Forms 583 are stored in the Investigation Division for record-keeping after being forwarded to the BOP Correctional Services Division in Washington, D.C. A hard copy of the Misconduct Form is filed in the inmate’s paper file at the BOP facility.

Data Validation and Verification

We did not identify any issues with the BOP’s data validation and verification processes for this key indicator. However, we found that few validation procedures are in place since escapes are well known when they occur. BOP facilities are responsible for correcting and updating the information in SENTRY, and SENTRY is equipped with data entry controls that limit and restrict access to data and edits to the system. Program reviews are performed to ensure that BOP policies are being adhered to, and recorded escapes are investigated to determine why and how they occurred. Additional reviews at BOP facilities include reviews of the SENTRY data by supervisors, system managers, the Inmate System Management Unit, and the Correctional Services Division. At the headquarters level, the Information Policies and Public Affairs Office compares the Chronological Disciplinary Reports to KI/SSS to ensure all of the information was copied over from SENTRY.

Key Indicator Data Comparison

In order to verify the number of escapes from secure BOP facilities in FY 2006 reported in the FY 2006 PAR, we reviewed the BOP’s KI/SSS report on escapes. Based on our review, we did not find any discrepancies with the performance data reported for this key indicator for FY 2006.

Disclosure of Data Limitations

The BOP did not identify any data limitations in the FY 2006 PAR, and during our audit we did not identify any data limitations for this key indicator.

Comparative Recidivism Rates for FPI Inmates versus Non-FPI Inmates – FPI, BOP

This key indicator compares recidivism rates for inmates who participated in Federal Prison Industries (FPI) versus inmates who did not participate, 3 years and 6 years after release from a secure facility. For this key indicator, recidivism is defined in the PAR as “a tendency to relapse into a previous mode of behavior....” The BOP defines recidivated cases as individuals who are arrested and returned to the legal system.

Data Collection and Storage

We did not identify any issues with the BOP’s or FBI’s data collection and storage processes for this key indicator. Data for this key indicator is collected in two systems, the BOP’s SENTRY and the FBI’s Interstate Identification Index (III). SENTRY contains such inmate information as personal characteristics, background, criminal history, and the programs the inmate participated in while in a BOP facility. Data entry is centralized at the BOP Designation Sentence Computation Center and the Inmate System Management Unit at each BOP facility is responsible for correcting and updating the information in SENTRY. The BOP’s SENTRY is equipped with data entry controls that can limit and restrict access to data and edits to the system. The FBI’s III contains records of state and federal arrests and is used by the BOP to depict a more accurate and complete picture of inmate recidivism.

Annually, the BOP sends a file listing all of the inmates in SENTRY to the FBI data center. The file includes the inmate’s FBI number, registration number, and name. The FBI matches and merges this information with the III data, storing the information on a tape that is provided to the BOP. The BOP loads this information onto its server and uses SAS to analyze the data.

SAS uses Cox’s Proportional Hazard Model to analyze this information because it can estimate the outcome by comparing one group to another group, using the assumption that the two groups recidivate at the same rate.31 First, the propensity score is used to ensure the two groups are comparable and then to select the appropriate comparison subjects. This is achieved by having a study group that participated in the FPI program for 6 months or longer and a reservoir consisting of inmates who would have participated in the FPI program had the opportunity presented itself. Each study group individual is matched to an individual in the reservoir who has similar characteristics, including age, race, sex, criminal history, and previous conviction. Each individual in the study group can only be matched to one individual in the reservoir, and once an individual in the reservoir is matched that person is removed from the reservoir. Propensity score matching occurs quarterly for the inmates released during the quarter because employment is seasonal and recidivism can relate to unemployment. According to BOP personnel, the propensity score allows for the determination of an unbiased effect.

Cox’s Proportional Hazard Model summarizes the individual predictions. The annual data runs cover recidivism after 3 years and recidivism after 6 years. Cox’s Proportional Hazard Model estimates the number of days after release before an inmate recidivates.
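
The following is a greatly simplified sketch of the propensity-score matching step described above, not the BOP’s actual SAS program. The covariates, the synthetic data, and the use of a logistic-regression propensity model are illustrative assumptions.

```python
# Simplified 1:1 greedy propensity-score matching; illustrative only, not the BOP's SAS code.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
covariates = rng.normal(size=(n, 4))           # stand-ins for age, criminal history, etc.
fpi_participant = rng.integers(0, 2, size=n)   # 1 = study group (FPI), 0 = reservoir

# 1. Estimate each inmate's propensity to participate in FPI from the covariates.
propensity = LogisticRegression().fit(covariates, fpi_participant).predict_proba(covariates)[:, 1]

# 2. Greedy 1:1 matching: each study-group inmate takes the closest unused reservoir inmate.
study = np.flatnonzero(fpi_participant == 1)
reservoir = list(np.flatnonzero(fpi_participant == 0))
matches = {}
for i in study:
    if not reservoir:
        break
    j = min(reservoir, key=lambda k: abs(propensity[k] - propensity[i]))
    matches[i] = j
    reservoir.remove(j)   # a matched comparison subject leaves the reservoir

print(f"Matched {len(matches)} study-group inmates to comparison subjects")
```

The matched pairs would then feed a survival model, in the BOP’s case Cox’s Proportional Hazard Model, that compares time to recidivism between the two groups.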

The data for this key indicator are stored in the BOP's SENTRY and the FBI's III. Additionally, BOP officials stated that each BOP facility maintains a paper file for each inmate who served time in that facility and this information is also stored in SENTRY.

Data Validation and Verification

We did not identify any issues with the data validation and verification processes for this key indicator. As mentioned previously, BOP facilities are responsible for correcting and updating the information in SENTRY. Because the information in SENTRY is used daily by various personnel at BOP facilities, any errors identified are corrected by BOP personnel at the facility. Additionally, according to BOP policy, program reviews are conducted every 180 calendar days and include a review of inmate files for progress in recommended programs, as well as new programs that may be recommended based on the inmate’s skills.32

According to BOP personnel, data validation may be performed on the FBI’s III information. The FBI’s III is an index system to which each state’s record management systems are linked. The III continually extracts information from each state’s record management system. Verification of each state’s information is the responsibility of that state. BOP personnel added that the states may notify one another when an error is identified in another state’s information, but it is the responsibility of the state with the error to make the correction.

The final validation process is performed by the BOP Office of Research using a snapshot of the data instead of running Cox’s Proportional Hazard Model against the live data. The snapshots are reviewed by Office of Research staff to ensure that they appear accurate. The review includes comparing the snapshots to previous snapshots to identify any anomalies. Then, a data set is generated for further analysis.

Key Indicator Data Comparison

We reviewed the SAS Output Report in order to verify the comparative recidivism rates for FPI inmates versus non-FPI inmates in FY 2006 reported in the FY 2006 PAR. Based on our review, we did not find any discrepancies with the performance data reported for this key indicator for FY 2006.

Disclosure of Data Limitations

We identified two additional data limitations beyond those disclosed in the FY 2006 PAR. The FY 2006 PAR disclosed, “Although non-citizens make up a large minority of the BOP population, they are excluded from analysis because many of them are deported following release from prison, and it is not known if they recidivate. Projected targets are based on earlier studies done on recidivism of the FPI participating inmates and their nonparticipating counterparts. The results of this ongoing research may differ due to changes in the program, improved research methods, changes in the composition of the inmate population, and changes in the quality and comprehensiveness of data, especially automated data on recidivism.”

We found that Vermont, Maine, and Washington, D.C., do not participate in the FBI’s III. Therefore, if a BOP-released inmate was arrested in either of these states or in Washington, D.C., the arrest may not be reported in the FBI’s III. Specifically, if the inmate was returned to a BOP facility, SENTRY would capture this data. However, if the inmate was returned to a state or local facility in one of these entities, the FBI’s III would not capture this data. As a result, this information would not be reported and included within the BOP's statistical model that generates the results reported in the PAR. Additionally, according to BOP personnel, for the same reason the following states may be under-represented in the reported recidivism data: New Jersey, North Carolina, Oregon, Florida, Kentucky, Hawaii, Maine, and Idaho. We recommend that the BOP disclose this information within the data limitations section of the PAR or, in the alternative, revise the key indicator to alleviate this data limitation.

Additionally, according to BOP personnel, a data lag can occur between the time an inmate is arrested and when the information is entered into the entity’s record management system or into SENTRY. This limitation may cause the results reported in the PAR to be under‑ or over-reported. We determined that this data limitation was not disclosed in the FY 2006 PAR. Therefore, we recommend that the BOP disclose the data lag within the data limitations section of the PAR.

During our Exit Conference, we discussed both of these recommendations with BOP personnel, who explained that the BOP has revised this key indicator to compare FPI inmates versus non-FPI inmates returned to the federal prison system for a new offense and no longer relies on the FBI’s III information. However, since this information was not reflected in the FY 2007 PAR, we are providing the recommendations shown below.

Recommendations

We recommend that the BOP:

  1. Disclose within the data limitations section of the PAR the states that do not participate in the FBI’s III, and that the results reported in the PAR do not include all federal and state crimes committed and arrests in these states and Washington, D.C.; in the alternative, revise the key indicator “Comparative Recidivism Rates for FPI Inmates versus Non‑FPI Inmates” to address this data limitation.

  2. Disclose within the data limitations section of the PAR the data lag between the time an inmate is arrested and when the information is entered into the state’s record management system or into SENTRY for the key indicator “Comparative Recidivism Rates for FPI Inmates versus Non-FPI Inmates.”

Rate of Assaults in Federal Prisons (Assaults per 5,000 Inmates) ‑ BOP

This key indicator is measured by the BOP and represents the number of assaults in federal prisons per 5,000 inmates, including inmate-on-inmate assaults, inmate-on-staff assaults, serious assaults, and less serious assaults.25 The assault data included in this indicator covers allegations of assault that have been adjudicated by BOP Disciplinary Hearing Officers. This indicator was changed in the FY 2007‑2012 Strategic Plan to the “Rate of Serious Assaults in Federal Prisons (per 5,000 inmates),” which beginning in the FY 2007 PAR will measure only serious inmate-on-inmate assaults, instead of inmate-on-inmate assaults, inmate-on-staff assaults, serious assaults, and less serious assaults.

Data Collection and Storage

We did not identify any issues with the BOP’s data collection or storage processes for this key indicator. Data for this key indicator is collected in the BOP’s SENTRY computer system. Data entry is centralized at the BOP Designation and Sentence Computation Center, and the Inmate System Management Unit at each BOP facility is responsible for correcting and updating the information in SENTRY. SENTRY is equipped with data entry controls that can limit and restrict access to data and edits to the system.

Data collection begins with a Report of Incident Form, Form 583, which is used for immediate reporting and as a first alert when an undesirable event occurs at a BOP facility. When any assault occurs at a BOP facility, a Form 583 is completed and provided to the appropriate BOP personnel. The form includes the names of all of the inmates involved and the inmates in the area where the assault occurred. The BOP central office enters the Form 583 information into SENTRY and the completed forms are stored in the BOP Investigation Division.

After preliminary interviews with the inmates listed on the Form 583, a Misconduct Form is completed for each inmate who is determined to have been involved in the assault and this information is entered into the Chronological Disciplinary Report module of SENTRY. A hard copy of the Misconduct Form is filed in an inmate’s paper file at the BOP facility.

Disciplinary Hearing Officers, located in each BOP region, are involved in investigating the incident reported on the Misconduct Forms. When an incident occurs, the Disciplinary Hearing Officers receive a packet of information to assist with the investigation and an analysis of the incident. The Unit Disciplinary Committee conducts video conferences briefing the Disciplinary Hearing Officers, interviews the inmates involved, and makes a decision on the incident. Additionally, the Unit Disciplinary Committee tracks the pending incident and each step is updated in the Chronological Disciplinary Report module for record-keeping. The Disciplinary Hearing Officers are responsible for recording the Unit Disciplinary Committee’s decision on the incident in SENTRY.

Data is copied monthly from SENTRY and placed in SAS. Adjudicated cases are extracted from SAS and placed into the KI/SSS, where reports can be generated on this key indicator. During our audit, we observed various modules of SENTRY, SAS, and KI/SSS. In addition, we observed the system controls that help ensure valid and accurate data.
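
As a simplified illustration of this extraction step, the following sketch, written in Python rather than in the SAS environment the BOP uses, filters a hypothetical monthly extract down to adjudicated assault cases; the field names and values are assumptions for illustration and are not actual SENTRY or KI/SSS fields.

    import pandas as pd

    # Hypothetical monthly extract of disciplinary records copied from SENTRY
    # (column names and values are illustrative, not actual SENTRY fields).
    records = pd.DataFrame({
        "incident_id":   [101, 102, 103, 104],
        "incident_type": ["assault", "assault", "contraband", "assault"],
        "status":        ["adjudicated", "pending", "adjudicated", "adjudicated"],
    })

    # Keep only adjudicated assault cases, mirroring the extraction into the
    # KI/SSS from which the key indicator reports are generated.
    adjudicated_assaults = records[
        (records["incident_type"] == "assault")
        & (records["status"] == "adjudicated")
    ]
    print(adjudicated_assaults)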

Data Validation and Verification

We did not identify any issues with the BOP’s data validation and verification processes for this key indicator. These processes are ongoing and include multiple levels of review. Because the information in SENTRY is used daily by various BOP personnel, any errors identified can be corrected by BOP personnel at the facility. Additional reviews at BOP facilities include reviews of the SENTRY data by supervisors, system managers, the Inmate System Management Unit, and the Correctional Services Division. At headquarters, the Information Policies and Public Affairs Office compares the disciplinary reports to KI/SSS to ensure all of the information was copied over from SENTRY. Finally, while adjudicating each case the Disciplinary Hearing Officers verify the information in SENTRY against the file they received.

Key Indicator Data Comparison

We reviewed the BOP's Injury Assessment for Acts of Inmate Misconduct Report in order to verify the rate of assaults per 5,000 inmates in FY 2006 reported in the FY 2006 PAR. The results are shown in Table 6.

TABLE 6:   RATE OF ASSAULTS IN FEDERAL PRISONS PER 5,000 INMATES

SOURCE                     RESULTS
FY 2006 PAR Results            119
OIG Audited Results            116
Source:   FY 2006 DOJ PAR and the BOP’s Injury Assessment for Acts of Inmate Misconduct Report

Based on our review, we found a discrepancy with the FY 2006 rate of assaults per 5,000 inmates reported in the FY 2006 PAR by the BOP. We found that the BOP overstated the rate of assaults by 2.76 percent. BOP personnel reviewed the data that was used in generating the 119 reported in the FY 2006 PAR and explained that the data set may have incorrectly included data on inmates housed in privately managed facilities, resulting in a larger data set and possibly higher rate of assaults.
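
To illustrate how an incorrectly expanded data set can shift a rate expressed per 5,000 inmates, the following is a minimal sketch with hypothetical figures; it assumes the rate is computed as adjudicated assaults divided by the inmate population and scaled to 5,000 inmates, which is the conventional reading of the indicator rather than a documented BOP formula, and none of the numbers below are actual BOP data.

    def assaults_per_5000(assaults: int, inmate_population: int) -> float:
        """Rate of assaults per 5,000 inmates (assumed formula)."""
        return assaults / inmate_population * 5000

    # Hypothetical figures for illustration only (not BOP data).
    bop_only = assaults_per_5000(assaults=4060, inmate_population=175000)
    with_private = assaults_per_5000(assaults=4480, inmate_population=188000)

    print(f"BOP-managed population only:  {bop_only:.0f} assaults per 5,000 inmates")
    print(f"Including private facilities: {with_private:.0f} assaults per 5,000 inmates")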

According to DOJ’s Financial Statement Requirements and Preparation Guide, “If actual performance data has changed from what was previously reported in either document [PAR and the Department's Annual Budget Summary], components must provide a full explanation in their MD&A. The explanation must include sufficient detail for reviewers and auditors to follow.” However, because the indicator was changed in the FY 2007‑2012 Strategic Plan to measure only serious inmate-on-inmate assaults, we do not recommend that the BOP restate the FY 2006 actual rate of assaults in federal prisons, as it will no longer relate to the new indicator. We recommend that the BOP evaluate the cause of the overstatement and implement procedures to ensure the rate of assaults in federal prisons per 5,000 inmates is accurately reported in the future.

Disclosure of Data Limitations

The FY 2006 PAR disclosed the following data limitations, “The data represent the number of assaults over a 12‑month period per 5,000 inmates. Due to the time required to adjudicate allegations of assault, there is a lag between the occurrence and reporting of guilty findings. Due to accelerated reporting requirements (within 15 days of quarter and fiscal year end) and to provide a more accurate assault rate, the BOP began using 12 months of complete/adjudicated Chronological Disciplinary Report data for each quarter and end of fiscal year reporting beginning for FY 2004.” During our audit, we did not identify any data limitations for this key indicator beyond those already disclosed in the FY 2006 PAR.

Recommendation

We recommend that the BOP:

  1. Evaluate the cause of the FY 2006 overstatement and implement procedures to ensure the rate of assaults in federal prisons per 5,000 inmates is accurately reported in the future for the key indicator “Rate of Assaults in Federal Prisons (Assaults per 5,000 Inmates).”

Inspection Results – Percent of Federal Facilities with American Correctional Association Accreditations – BOP

This key indicator, which identifies the percent of federal prison facilities with American Correctional Association (ACA) accreditations, is measured by the BOP.

We did not identify any issues with the BOP’s data collection, storage, or data validation and verification processes for this key indicator. The ACA holds panel hearings twice each year to review the ACA audit reports and to vote on whether BOP institutions should receive accreditation. According to BOP personnel, the ACA provides electronic reports for each audited institution, meeting minutes from the panel hearings, and a letter listing BOP institutions that are accredited as of the end of the fiscal year.

Additionally, the BOP maintains the Accreditation Status Report, which is BOP’s own list of ACA-accredited institutions. BOP personnel develop the Accreditation Status Report based on the results of ACA panel hearings. The BOP uses the ACA meeting minutes from the panel hearings to confirm and verify the information on its Accreditation Status Report. The Accreditation Status Report is updated twice yearly.

The information on the BOP’s Accreditation Status Report is then compared to the ACA letter that lists BOP-accredited institutions to ensure the information is complete and accurate. If the figures on the two documents match, the information is considered valid and verified. The BOP considers the data on the ACA letter reliable since it comes directly from the ACA.

Key Indicator Data Comparison

We reviewed the BOP’s Accreditation Status Report and the ACA letter in order to verify the actual percentage of accredited facilities in FY 2006 reported in the FY 2006 PAR. Based on our review, we did not identify any discrepancies with the performance data reported for this key indicator for FY 2006.

Disclosure of Data Limitations

The BOP did not identify any data limitations in the FY 2006 PAR. During our audit, we did not identify any data limitations for this key indicator.

Percent of Executive Office for Immigration Review Priority Cases Completed Within Established Time Frames – EOIR

In the FY 2006 PAR, the Executive Office for Immigration Review (EOIR) stated that its “mission is to be the best administrative tribunal possible, rendering timely, fair, and well considered decisions in the cases brought before it.... Included in this context are the timely grants of relief from removal in meritorious cases, the expeditious removal of criminal and other inadmissible aliens, and the effective utilization of limited detention resources.” The EOIR has set priorities and time frames for court cases involving aliens seeking asylum, criminal aliens, other detained aliens, and adjudicative time frames for all appellate cases filed with the Board of Immigration Appeals. This key indicator measures EOIR’s progress in meeting its priorities and time frames.

Data Collection and Storage

We did not identify any issues with the EOIR’s data collection and storage processes for this key indicator. The data is collected in the EOIR’s case tracking system, the Automated Nationwide System for Immigration Review (ANSIR), which has been used in tandem with the Case Access System for EOIR (CASE) by the Board of Immigration Appeals and select courts since 2005. Therefore, both ANSIR and CASE were used to collect and store the FY 2006 data that was reported in the FY 2006 PAR. According to the EOIR, beginning in FY 2008, CASE replaced ANSIR because CASE is a more technologically advanced and timely web‑based database system.

During FY 2006, information was entered into ANSIR or CASE by court or Board staff. Data is entered using the Judges Worksheet that documents hearing information. A supervisor reviews the data entered into the systems by comparing the information on the Judges Worksheet with the data in ANSIR and CASE. The Office of the Chief Immigration Judge is responsible for developing a Microsoft Access database to extract data from ANSIR and CASE. The Office of the Chief Immigration Judge also reviews the database for accuracy, verifies printed reports, and reviews final reports.

According to the EOIR Standard Operating Procedures, Case Completion Goal reports are generated through “a series of queries and tables... to complete the reports and those detailed procedures are separated into five sections; receipts, completions, detained completions, pending and returns.”26 These reports are used to generate the percent of priority cases completed within the EOIR-established time frames for the PAR. Both ANSIR and CASE are equipped with data entry controls to restrict access to data and edits in the systems. During our audit, we observed ANSIR at EOIR headquarters and the systems and controls that help ensure valid and accurate data.
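
As a simplified illustration of the computation these reports support, the following sketch calculates the percentage of priority cases completed within their established time frames from a hypothetical set of case records; the dates and time frames shown are assumptions, and the actual categories and query logic reside in the EOIR’s Standard Operating Procedures.

    from datetime import date

    # Hypothetical priority-case records: (date filed, date completed,
    # established time frame in days). Values are illustrative only.
    cases = [
        (date(2006, 1, 10), date(2006, 6, 1), 180),
        (date(2006, 2, 15), date(2006, 4, 20), 180),
        (date(2006, 3, 5), date(2006, 11, 1), 180),
        (date(2006, 5, 22), date(2006, 7, 30), 180),
    ]

    completed_on_time = sum(
        1 for filed, completed, limit in cases if (completed - filed).days <= limit
    )
    percent_on_time = completed_on_time / len(cases) * 100
    print(f"{percent_on_time:.1f} percent of priority cases completed within time frames")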

Data Validation and Verification

We did not identify any issues with the EOIR’s data validation and verification processes for this key indicator. These processes are ongoing as case information is added to the systems. Paralegals use checklists to compare the case files to the information in the systems and correct errors, including addresses, attorney information, and mistyped data fields. The information is also validated by attorneys identifying missing or incorrect information while working on a case. Additionally, aliens may notify the EOIR if they identify any errors on the paperwork they receive.

Currently, a program runs between ANSIR and CASE and generates a file of discrepancies between the two systems. The EOIR Operation and Maintenance staff reviews the discrepancies, in consultation with the components, and determines whether any changes need to be made. In addition, when a data field is entered incorrectly, the data entry personnel receive a message from the system to correct the field. The Court Evaluation Programs conduct a comprehensive evaluation of each immigration court’s operation and the accuracy of the entries into ANSIR and CASE. These audits are conducted on a 4-year rotating basis of the 54 courts, with 12 to 13 courts audited annually.
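
A minimal sketch of the kind of cross-system discrepancy check described above, assuming each system can be exported to records keyed by case number with a hearing-date field; the field names, file name, and values are hypothetical, and the EOIR’s actual program is not reproduced here.

    import pandas as pd

    # Hypothetical exports from the two systems (field names are illustrative).
    ansir = pd.DataFrame({
        "case_number":  ["A-001", "A-002", "A-003"],
        "hearing_date": ["2006-04-01", "2006-05-12", "2006-06-20"],
    })
    case = pd.DataFrame({
        "case_number":  ["A-001", "A-002", "A-003"],
        "hearing_date": ["2006-04-01", "2006-05-15", "2006-06-20"],
    })

    # Join on case number and keep rows where the two systems disagree.
    merged = ansir.merge(case, on="case_number", suffixes=("_ansir", "_case"))
    discrepancies = merged[merged["hearing_date_ansir"] != merged["hearing_date_case"]]

    # Write a discrepancy file for review by Operation and Maintenance staff.
    discrepancies.to_csv("ansir_case_discrepancies.csv", index=False)
    print(discrepancies)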

Key Indicator Data Comparison

In order to verify the percent of the EOIR priority cases completed within established time frames reported in the FY 2006 PAR, we reviewed the EOIR’s Office of the Chief Immigration Judge Case Completion Reports for FY 2006.27 Based on our review, we did not identify any discrepancies with the performance data reported for this key indicator for FY 2006.

Disclosure of Data Limitations

The EOIR did not disclose any data limitations in the FY 2006 PAR, and we did not identify any data limitations for this key indicator during our audit.

V.     CONCLUSION

During our audit, we determined that components had implemented various processes to review and validate their Management’s Discussion and Analysis prior to submitting the information to JMD for compiling the Performance and Accountability Report. However, our audit identified deficiencies and issues in 9 of the 21 key indicators. This suggests that the components need to improve their validation processes by examining the accuracy of MD&A narratives covering the key indicators and verifying supporting information necessary to ensure the accuracy of the key indicator performance data. Component management should also assess the current MD&A validation processes, determine whether any opportunities for improvement exist, and address those areas. Additionally, component management should communicate to staff the need for accuracy of the key indicator information presented in the MD&A for the PAR.

Further, while we recognize that JMD officials are not currently in a position to verify or adjust key indicator information provided by the components, we believe that JMD should expand its oversight role to ensure the accuracy of the key indicator performance data reported in the components’ MD&As and subsequently compiled for the PAR. To accomplish this oversight, JMD would require the supporting information to verify the accuracy of the key indicator performance data. In addition, JMD should issue a formal policy requiring components to provide the support for the performance data with each component’s annual MD&A submission.

Recommendations

We recommend that the FBI; EOUSA; the Antitrust, Civil, Civil Rights, Criminal, Environment and Natural Resources, and Tax Divisions; EOUST; OJP; and BOP:

  1. Examine the accuracy of their MD&A narratives covering the key indicators and verify supporting information necessary to ensure the accuracy of the key indicator performance data. Additionally, component management should notify staff of the significant need for accuracy of the key indicator information presented in the MD&A for the PAR.

We recommend that JMD:

  1. Prepare and issue a formal policy requiring components to provide the supporting performance data information with the annual MD&A submission. Additionally, JMD should develop and implement procedures for examining the performance information submitted by the components in their annual MD&As.



Footnotes
  1. We reviewed the number of organized criminal enterprises dismantled reported in the 2008 Budget and Performance Summary, instead of the number reported in the FY 2006 PAR because the FBI revised the number of dismantlements during FY 2006 based on complete information and reported the revision in the 2008 Budget and Performance Summary.

  2. These users may engage in child pornography through operating websites or using e-groups, file servers, and peer-to-peer networks.

  3. We reviewed the number of CPOT-linked drug trafficking organizations disrupted and dismantled reported in the 2008 Budget and Performance Summary instead of the number reported in the FY 2006 PAR because the OCDETF, DEA, and FBI, revised the number of disruptions and dismantlements during FY 2006 based on complete information and reported the revision in the 2008 Budget and Performance Summary.

  4. In the FY 2006 PAR, this key indicator was titled, “Number of Top‑Ten Internet Fraud Targets Neutralized.”

  5. The IC3 website is located at http://www.ic3.gov.

  6. A complaint is a single complaint, while a case could include multiple complaints.

  7. Links include such information as phone number, e-mail address, and street address.

  8. Spam is unsolicited e-mail, usually of a commercial nature, sent to a large number of addresses.

  9. We reviewed the number of high-impact Internet fraud targets neutralized that was reported in the 2008 Budget and Performance Summary instead of the number reported in the FY 2006 PAR because the FBI revised the number of neutralizations during FY 2006 based on complete information and reported the revision in the 2008 Budget and Performance Summary.

  10. We reviewed the number of criminal enterprises engaging in white‑collar crimes dismantled reported in the 2008 Budget and Performance Summary instead of the number reported in the FY 2006 PAR because the FBI revised the number of dismantlements during FY 2006 based on complete information and reported the revision in the 2008 Budget and Performance Summary.

  11. Case Management Systems include: Executive Office for U.S. Attorneys’ Legal Information Office Network System; Antitrust Division’s Matter Tracking System; Civil Division’s Case Management System, Civil Rights Division’s Interactive Case Management System; Criminal Division’s Automated Case Tracking System; Environment and Natural Resources Division’s Case Management System; and Tax Division’s TaxDoc.

  12. The U.S. Attorneys’ Offices consist of 93 U.S. Attorneys in 94 districts. The districts of Guam and the Northern Marianas share a U.S. Attorney.

  13. The U.S. Trustee Program is structured with an executive office in Washington, D.C., U.S. Trustees in 21 regions, and 95 field offices headed by an Assistant U.S. Trustee.

  14. The EOUST receives trustee Form 4s as part of the Final Account on each Chapter 7 case closed during the year.

  15. We audited FY 2005 data because the FY 2006 data was not presented in the FY 2006 PAR due to the USTP’s use of audited data. The data limitation in the FY 2006 PAR disclosed that “data are not available until January (Chapter 7) and April (Chapter 13) following the close of the fiscal year because of the need to audit data submitted by private trustees prior to reporting.”

  16. The Weed and Seed Data Center website is http://www.weedandseed.info.

  17. The 6 territories include the District of Columbia, American Samoa, Guam, Northern Mariana Islands, Puerto Rico, and United States Virgin Islands.

  18. BJA’s data limitation disclosed a “Statutorily mandated calendar year reporting requirement.” Therefore, we audited 2005 data that was presented in the FY 2006 PAR.

  19. The grant announcement informs the grantees of the semi-annual progress report requirements.

  20. BJA Monitoring Guide, August 2005.

  21. According to OFDT personnel, medical facilities are excluded from the per-day jail cost calculation because the cost incurred cannot feasibly be projected and therefore, a negotiated per-day jail rate for medical facilities does not exist. If these facilities were included in the key indicator, the per-day jail rate would be skewed and targeting for this key indicator would be difficult due to the range in medical costs depending on the services provided.

  22. When the USMS receives jail bills from the counties, the bills are cross‑checked and verified before an obligation is established by the USMS district in the Financial Management System.

  23. Cox’s Proportional Hazards Model is a regression model in statistics.

  24. BOP Program Statement 5322.12, regarding Inmate Classification and Program Review, November 2006.

  25. In the FY 2006 PAR, the BOP described a serious assault as “An assault that results in major bodily injury or death...” and a minor or less serious assault as “An assault that does not result in major bodily injury....”

  26. Executive Office for Immigration Review, Office of Planning and Analysis, Standard Operating Procedures, April 2007.

  27. The EOIR priority cases include Institutional Hearing Program, Asylum, and Detained cases. Our audit scope did not include Single Appeal and Panel Appeal cases because they were discontinued from the key indicator as of September 30, 2006.


