The Office of Justice Programs Convicted Offender
DNA Sample Backlog Reduction Grant Program

Report No. 02-20
May 2002
Office of the Inspector General


The objectives of our audit of the Program were:

  1. to assess the overall impact of the Program on the national offender backlog;

  2. to assess the compliance of the selected contractor laboratories with pertinent contractual requirements and the Offender QAS; and

  3. to evaluate the adequacy of OJP's administration of the Program and monitoring of grantee activities, and to determine the extent to which selected grantees had administered their grants and monitored their contractors' activities in accordance with federal and agency requirements, and with the Offender QAS.

We conducted our audit in accordance with Government Auditing Standards. We included such tests as were considered necessary to accomplish the audit objectives.

The audit generally covered the period from the award of the Program's first year of grants in August 2000 through the completion of audit fieldwork in November 2001. However, for comparison purposes, we gathered pre-award productivity statistics for selected grantee states for one year prior to the Program grant award. In addition, we limited our post-award productivity statistics to one year following each grantee's receipt of its Program grant, since the Program grants were intended to last one year.

Audit work was conducted at OJP, at the three highest-dollar contractor laboratories, and at selected Program grantees contracting with those three laboratories. Further, our work at the contractor laboratories was limited to the portion of their personnel, facilities, and documentation that involved the analysis of offender samples for Program grantees.

To assess the overall impact of the Program on the national offender backlog, we reviewed grantee productivity and CODIS upload statistics for both pre-award months (up to one year prior to the Program grant award) and post-award months (up to one year after the Program grant award); reviewed documentation of the attainment of Program goals and performance measurements; and interviewed key grantee and OJP personnel.

We audited the following three contractor laboratories and issued a separate report to OJP for each:

For each of these contractor laboratories we assessed compliance with the Offender QAS by:

We also assessed each contractor's compliance with OJP Program requirements, including the timely return of data to its client states, by interviewing selected contractor management and reviewing documentation of data shipments.

To evaluate OJP's oversight of the Program grantees, we interviewed key personnel and reviewed grantee tracking files and electronic records. This review focused on the Program's performance measurements, the timely filing of all required reports by the grantees, and the monitoring of grantee progress in completing their contracts.

We selected eight Program grantees to audit. Generally, our selection was based on the amount of grant funds each grantee provided to the three contractor laboratories we audited, with two exceptions. First, Ohio was substituted for New York after the events of September 11, 2001. Second, for efficiency purposes, Utah was selected because its laboratory was located in close proximity to Myriad Genetic Laboratories in Salt Lake City. The eight Program grantees we audited were:

For two of these grantees, North Carolina and Texas, our audit fieldwork was not conducted on-site; instead, we used previous OIG CODIS laboratory audit results and completed the remaining fieldwork through documentation and other information the grantees provided. For each of these audits, we assessed compliance with the contractor oversight provisions of the Offender QAS by reviewing documentation of on-site visits, data review, random re-analysis, and quality control sample results. To assess the grantees' compliance with OJP's solicitation requirements, we interviewed key grantee personnel regarding changes to their state's DNA collection statute and regarding factors influencing the completion of their Program grant. We also reviewed supporting documentation for state procurement practices and contractor selection (except in North Carolina and Texas), and for 5 percent of the no-suspect cases counted by each grantee toward their 1 percent match requirement. For the match requirement, we set sampling limits of a minimum of 10 cases and a maximum of 20 cases for each grantee audited.
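The sampling limits described above (5 percent of each grantee's no-suspect cases, bounded by a minimum of 10 and a maximum of 20 cases) can be sketched as a simple rule. The function name and rounding choice below are our illustration, not drawn from the report:

```python
def match_sample_size(no_suspect_cases: int) -> int:
    """Illustrative sketch (not from the report): review 5 percent of a
    grantee's no-suspect cases, clamped to a minimum of 10 and a
    maximum of 20 cases."""
    five_percent = round(no_suspect_cases * 0.05)
    return max(10, min(20, five_percent))

# A grantee with 600 no-suspect cases: 5 percent is 30, capped at 20.
print(match_sample_size(600))  # 20
# A grantee with 100 cases: 5 percent is 5, raised to the 10-case floor.
print(match_sample_size(100))  # 10
```

In practice the 5 percent figure only governs the sample size for grantees with between 200 and 400 no-suspect cases; outside that band, the fixed floor and ceiling apply.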