The Office of Justice Programs Convicted Offender
DNA Sample Backlog Reduction Grant Program
Report No. 02-20
Office of the Inspector General
We determined that the Program has been successful in funding the analysis of over 288,000 previously backlogged offender samples, and some data suggest that the national offender backlog is declining. However, because state statutes continue to expand the categories of people required to provide DNA samples, and states are challenged to respond to this increased demand, it is difficult to determine whether the national offender backlog will be eliminated. In addition, while the Program grants helped to increase the volume of complete offender profiles uploaded to NDIS, two of the eight grantee states we audited showed no increase in productivity. Finally, although our audit results supported OJP's claim of meeting two of the four FY 2000 Program performance measurements, we could not determine whether the remaining two had been met because OJP was not tracking the correct data to substantiate them.
Impact of the Program on the National Offender Backlog
As described in the Introduction of this report, the first year of the Program funded the analysis of over 288,000 backlogged offender samples in 21 states. In addition, estimates provided by state and local laboratories in early 2001 indicated that the backlog was decreasing.
However, determining the exact reduction in the national offender backlog in the first year of the Program was precluded by the fact that the backlog is constantly fluctuating, due primarily to the expansion of state DNA collection statutes. The more conviction offenses that require the collection of a sample, the larger the states' analysis workload and the greater the likelihood that states will encounter increasing backlogs. As previously mentioned, since 1988, every state has passed a DNA collection statute. In recent years, states have expanded those collection statutes, with 2001 seeing the most dramatic increase in statute expansions.
According to data provided to OJP,9 a total of 35 state legislatures introduced DNA expansion bills in 2001, up from 19 states in 2000 and 10 in 1999. Even more significant, as illustrated below, was the number of state legislatures that proposed requiring the collection of a DNA sample from all felons, one of the broadest collection standards being used in the United States.
Although not all proposed expansion legislation was enacted, bills in 22 states passed in 2001, up from 8 states in 2000 and 6 states in 1999. Further, the number of states with "all-felons" legislation has doubled, increasing from 7 states at the end of 2000, to 14 states by December 2001.
In order to gauge how these expansions might impact the national backlog, we interviewed laboratory management during our eight state grantee audits. From these interviews we obtained estimates from three grantee states for how statute expansions might affect their backlogs in 2001 or 2002. We also gathered information from a fourth grantee state that had statistical data on the impact of the statute expansion in their state in 2000. Their responses illustrate how legislative changes can impact a state's backlog of offender samples:
The variety of these responses illustrates that while most expansions equate to a considerable increase in a state's analysis burden, not every state is affected in the same way by a statute expansion. Overall, many variables determine how much impact expanded legislation has on a state's backlog, including whether legislative changes are retroactive; whether additional appropriations accompany the statute change; whether statutes apply to juveniles in addition to adults; whether statutes apply to probationers and parolees; and which agencies are tasked with the collection of the samples and the compliance level of those collections.
Therefore, given the increasing frequency of state legislative changes, all of which are likely to increase the number of samples requiring analysis, and the general consensus among the states we interviewed that such increases stand to drastically enlarge their backlogs, we question whether the backlog reductions accomplished under the Program will be sufficient to reduce and ultimately eliminate the national offender backlog.
Impact of Program Grants on State Productivity
Preliminary information gathered in our audit fieldwork raised the question of how administering the grants and the resulting contracts would affect the resources of the grantee state laboratories. Specifically, we wanted to determine whether the time taken away from the laboratories' normal in-house analysis (for selecting the contractor; shipping and receiving samples and data; reviewing the data; and completing the contractor-oversight requirements of the Quality Assurance Standards for Convicted Offender DNA Databasing Laboratories (Offender QAS), effective April 1, 1999) would counteract the benefits of the outsourcing.
Consequently, we reviewed CODIS upload documentation for each of the eight grantee states audited to determine how the Program grants affected the number of complete10 profiles that those states were able to upload to CODIS. Specifically, we compared each state's average number of complete profiles uploaded monthly during the 1-year period prior to the Program grant to the average number of complete profiles uploaded monthly during the 1-year period after the award of the Program grant (limited to one year since the original grant award period was one year). By comparing the productivity of the pre-grant award year and the post-grant award year, we intended to demonstrate the possible impact of the Program.
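The comparison described above amounts to a simple average-per-month calculation over the two 1-year windows. A minimal sketch in Python, using entirely hypothetical upload counts (not data from this audit):

```python
# Hypothetical monthly counts of complete profiles uploaded to CODIS
# (illustrative values only; not actual audit data).
pre_grant_uploads = [120, 0, 310, 95, 0, 210, 180, 0, 150, 90, 240, 105]       # 12 months before award
post_grant_uploads = [400, 520, 0, 610, 480, 390, 700, 550, 0, 620, 510, 430]  # 12 months after award

def monthly_average(uploads):
    """Average complete profiles uploaded per month over the period."""
    return sum(uploads) / len(uploads)

pre_avg = monthly_average(pre_grant_uploads)
post_avg = monthly_average(post_grant_uploads)
change = post_avg - pre_avg

print(f"Pre-grant average:  {pre_avg:.1f} profiles/month")
print(f"Post-grant average: {post_avg:.1f} profiles/month")
print(f"Change:             {change:+.1f} profiles/month")
```

A positive change under this comparison reflects the combined effect of in-house work and contractor work, which is why the audit then examined in-house records separately to attribute the cause.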
We determined that six of the eight grantee states we audited demonstrated a marked increase in total complete profiles analyzed and uploaded to NDIS in a timely manner after receiving their Program grant. These increases included both samples analyzed in-house and samples analyzed by the contractor laboratory. However, because of difficulties in efficiently addressing the Offender QAS requirements, two of the eight states we reviewed (California and Michigan) experienced no increase in productivity in the 1-year period following the Program grant award. Both of these states showed no uploads of complete offender profiles to NDIS either before the Program grant or during the Program grant period reviewed.11 In addition, a third state, Ohio, experienced significant delays in its ability to upload profiles returned to it by its contractor. The change in the average number of profiles uploaded monthly to NDIS before and after the grant is shown for each state in the graph on the following page:
Source: CODIS System Printouts for uploads at each state
The increases experienced by the states could be attributable to (1) an increase in the grantee laboratory's in-house productivity; (2) the effect of the contractor's assistance, under the Program grant, on the laboratory's productivity; or (3) a combination of both. Therefore, to determine the cause of the increased productivity, we reviewed each laboratory's records of samples analyzed in-house during the contract period. We concluded from this analysis that for five of the six states showing increased productivity (North Carolina, Ohio, Texas, Virginia, and Washington), the increase was due to the work performed by the contractor, since those states had no increase in in-house productivity during the contract period. The remaining state, Utah, was able to increase in-house productivity during the contract period, in addition to the increase contributed by the contractor. However, the majority of Utah's total increase was attributable to the contractor's work.
In looking further at the two states showing no productivity increase during our audit period, we noted that the contractor laboratories for both states had analyzed samples during our audit period, but the profiles had not been uploaded to CODIS during the audit period. Most states we audited appeared to be uploading contractor data monthly, maintaining a relative pace with incoming contractor data. However, for the two states showing no increase in productivity (California and Michigan), we noted that as many as 10 months passed between when sample data was received from the contractor and when profiles were uploaded to CODIS, with the first upload occurring after our 2-year review period. The delays between when profiles were received from the contractors and when uploads to CODIS occurred are illustrated in the following chart:
Source: CODIS System Printouts for uploads at each state
As stated previously, neither California nor Michigan uploaded any profiles to CODIS during the 2-year period we reviewed. Further, although Ohio began receiving sample data from the contractor as early as November 2000, the first sizeable upload was not processed until April 2001, after which point uploads generally kept pace with sample data received. There are no standards or criteria governing how much time states are permitted before they upload contractor data to CODIS. However, profiles that have not been uploaded to NDIS cannot have a nationwide impact in solving crimes.
A key aspect to note regarding the states' ability to upload contractor data to CODIS is that there is always lag time between when the data is received from the contractor and when the data is uploaded by the state to CODIS. This is true because, after receiving the contractor data, states must address the requirements placed upon them by the Offender QAS prior to uploading the data to CODIS. These requirements include, but are not limited to: (1) random reanalysis of samples; (2) visual inspection and evaluation of results/data; (3) inclusion of quality control samples; and (4) conducting on-site visits to the contractor facility. However, as detailed below and in Appendix IV, states vary in their ability to address the Offender QAS in an efficient manner for a variety of reasons, including limitations of staffing, funding, facilities, and computer systems.
For example, California laboratory management stated that several factors hindered their ability to efficiently address the Offender QAS requirements and upload the contractor data to CODIS in a timely manner. These factors included personnel turnover and understaffing, computer memory limitations, and compatibility problems between the laboratory's in-house sample tracking system and the contractor's organization of the data. We were provided with documentation substantiating these problems. California was able to process its first CODIS upload, of 20,714 complete offender profiles, in September 2001; the upload included samples analyzed both in-house and by the contractor.
For Michigan, we determined that a few interrelated factors hindered its ability to efficiently address the Offender QAS requirements and upload the contractor data to CODIS. These factors centered on the Michigan laboratory's decision to complete random reanalysis on 10 percent of the samples analyzed by the contractor. According to laboratory management, this decision was made because this was a new outsourcing contract and they wanted to perform as many quality checks as possible to assure themselves that the contractor's work was acceptable. Although the laboratory was allowed to make this decision under the Offender QAS and under OJP grant award guidelines, the Michigan laboratory's resources were not sufficient to reanalyze the large volume of samples at a pace that kept up with the contractor data being returned. Consequently, the laboratory fell behind in the reanalysis, as seen in the previous chart, and was unable to upload the data to CODIS until the reanalysis was complete. Ultimately, the Michigan laboratory addressed the issue by reducing its in-house percentage of reanalysis and having the contractor perform the remaining reanalysis needed to meet the 10 percent level. Michigan was able to process its first upload of 2,916 complete offender profiles to CODIS in September 2001.
The Ohio CODIS Administrator provided information about their difficulty in efficiently addressing the Offender QAS requirements, particularly the requirement for visual inspection of results and data. The CODIS Administrator stated that she is responsible for performing the 100 percent visual inspection of the contractor data, and that it currently takes her approximately 3 hours to review data from 100 samples, depending on the complexity of the samples. She added that initially it took her longer to get oriented to the organization of the contractor's data and to develop a system for efficient review. In addition, the review of the data is only part of her daily responsibilities in the laboratory. Given the volume of profiles being received from the contractor, shown in the previous chart, it would have taken the CODIS Administrator several weeks each month to review the profiles.
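The scale of this review burden can be approximated with back-of-the-envelope arithmetic. In the sketch below, the review rate comes from the CODIS Administrator's statement; the monthly contractor volume and workday length are assumed figures for illustration only:

```python
# Review rate from the CODIS Administrator's statement; the monthly
# contractor volume below is a hypothetical figure, not from the chart.
hours_per_100_samples = 3
samples_per_month = 5_000        # assumed contractor volume per month
hours_per_workday = 8            # assumed full workday of review

review_hours = samples_per_month / 100 * hours_per_100_samples   # 150 hours
review_days = review_hours / hours_per_workday                   # 18.75 workdays

print(f"{review_hours:.0f} review hours -> {review_days:.1f} workdays per month")
```

Even at this assumed volume, and with review occupying only part of each day, the 100 percent visual inspection would consume several weeks of the Administrator's time each month.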
As discussed in the following section, the delays in uploading samples to the national database by California, Michigan, and Ohio led us to question the methods by which OJP addressed two of its performance measurements.
Program Performance Measurements
In response to the Government Performance and Results Act, which requires agencies to develop strategic plans that identify their long range goals and objectives, and establish annual plans that set forth corresponding annual goals and indicators of performance, OJP developed performance measurements for the Program. These measurements were consistent with the overall strategic plan for the Department of Justice. The stated mission for the Program is to reduce and ultimately eliminate the convicted offender DNA sample backlog awaiting analysis and entry into NDIS. This mission directly supports the following Department strategic plan goal and objective:
To monitor the progress toward achieving the desired Program outcomes and results, OJP developed and tracked four performance measurements. We reviewed OJP's progress toward achieving each of the following four performance measurements for the first year of the Program.
OJP established a goal of assisting all states that applied for grants, thereby improving those states' access to external capabilities. A total of 21 states applied for approximately $14.5 million in funding to outsource the analysis of approximately 288,000 convicted offender samples. Supporting documentation revealed that OJP was able to fully fund all 21 requests, thereby meeting this performance measurement.
To address this measurement, OJP collected monthly statistics from each of the 21 states detailing the number of samples returned to the states by their contractors. These statistics revealed that over 288,000 samples had been analyzed by the contractor laboratories and returned to the grantee states as of the end of our fieldwork in November 2001. However, OJP was not tracking the number of Program-funded profiles that had actually been entered into NDIS, as required by the performance measurement, because grantee states were reporting "samples received" from the contractor rather than "profiles uploaded" to NDIS.
OJP officials stated that they had not asked the states to report the Program-funded profiles uploaded to NDIS because that data would take more time to report than the number of samples received back from the contractor. Further, they indicated that they believed that the number of samples returned to the states served as a sufficient measure of the number of profiles available for upload to NDIS, since the only delay between samples received and profiles uploaded was the time that it took the states to address the Offender QAS requirements for oversight of the contractor's data.
Because profiles that have not been uploaded to NDIS cannot have a nationwide impact in solving crimes, we agree with OJP's decision to make profiles uploaded to NDIS a performance measurement for the Program. We also believe that it is reasonable to assume that sample data received back from the contractors will eventually be uploaded to NDIS. However, based upon the data presented in the preceding section regarding delays encountered by California, Michigan, and Ohio in uploading the data received back from the contractor, we do not agree that tracking "samples received" serves as a sufficient substitute for tracking "profiles uploaded" in addressing the performance measurement for FY 2000. Therefore, we conclude that OJP could not substantiate that this performance measurement had been achieved because the appropriate data was not being collected and monitored.
OJP reported that all 21 states receiving FY 2000 Program funding had experienced an increase in the number of complete offender profiles they had contributed to the national database, since all 21 states used the funds to outsource the analysis of convicted offender samples. Based upon our audit results, we agree with OJP that all 21 states used the funds to outsource the analysis of convicted offender samples. However, based upon the data presented in the preceding section regarding delays encountered by California, Michigan, and Ohio in uploading data to NDIS, we do not agree that samples funded can be substituted for the number of profiles contributed to the national database when addressing 1-year performance measurements. Therefore, we conclude that OJP could not substantiate that this performance measurement had been achieved because the appropriate data was not being collected and monitored.
As a condition of each grant award, each state was required, at its own expense, to analyze no-suspect cases equal to 1 percent of the offender samples for which it was receiving Program funding. We determined that grantee states had reported to OJP, as part of the monthly statistics collected by OJP, that more than 2,890 no-suspect cases had been analyzed. Therefore, OJP stated that it had met its fourth performance measurement. For the states we audited, we confirmed that the states' assertions regarding completion of the no-suspect case requirement were supported by appropriate documentation, as detailed within Finding No. 3 of this report.
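The grant condition can be checked with simple arithmetic, applied here to the aggregate Program figures cited in this report (the requirement actually applied state by state):

```python
# Check the grant condition: no-suspect casework equal to 1 percent of the
# Program-funded offender samples (aggregate figures from the report).
funded_samples = 288_000
required_cases = funded_samples * 0.01      # 2,880 cases required in aggregate
reported_cases = 2_890                      # cases reported to OJP

meets_requirement = reported_cases >= required_cases
print(f"Required: {required_cases:.0f}; reported: {reported_cases}; "
      f"requirement {'met' if meets_requirement else 'not met'}")
```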
Our audit results supported OJP's claim of meeting FY 2000 Performance Measurement Nos. 1 and 4. However, OJP was not tracking the correct data to substantiate that it had met Performance Measurement Nos. 2 and 3; therefore, we could not determine from OJP records that those measurements had been achieved.
In addition to assessing whether OJP had met the performance indicators it had established, we also assessed whether there were other performance measurements that could be established that would provide decision makers within the Department and in Congress with information on whether the Program was meeting its mission to reduce and ultimately eliminate the convicted offender DNA sample backlog awaiting analysis and entry into NDIS. We identified two areas that we think OJP should consider:
We recommend that the Assistant Attorney General, Office of Justice Programs:
We assessed three Program contractor laboratories' compliance with standards governing their DNA analysis contracts with Program grantees. These three contractors received contracts from 14 of the 21 grantees, accounting for 85 percent of the first year's Program funding. We determined that the contractors generally complied with these standards, with a few exceptions related to the Offender QAS for equipment calibrations and continuing education documentation.
The first year of the Program was designed so that the states receiving grants were responsible for screening and selecting contractors that met certain criteria. In general terms, states were to select contractors that could perform DNA analysis of offender samples (1) in compliance with the Offender QAS, and (2) in a manner consistent with the requirements placed upon contractors, through the states, by the Solicitation and attached certifications. Further information on these criteria can be found in Appendix III.
The 21 states that received FY 2000 Program grants contracted with a total of 7 private contractor laboratories, as set forth in the Introduction section of this report. Of these seven contractor laboratories, we selected for audit the three contractors that accounted for the majority of the grant funding and the majority of samples to be analyzed: Myriad Genetic Laboratories, Inc., located in Salt Lake City, Utah; The Bode Technology Group, located in Springfield, Virginia; and ReliaGene Technologies, Inc., located in New Orleans, Louisiana.
The exceptions identified for each of the contractors audited are summarized below.
Myriad Genetic Laboratories, Inc.
For our audit of Myriad Genetic Laboratories, Inc. (Myriad), we considered 130 elements14 of the Offender QAS. We found that Myriad complied with the Offender QAS except for the following two areas.
Offender Standard 10.2 states that a laboratory shall identify critical equipment and shall have a documented program for the calibration of instruments and equipment. Although Myriad complied with this standard by identifying critical equipment and by having a documented calibration program, we determined that Myriad had not followed that calibration program for one of the ten critical equipment items we reviewed. The item, a balance, was not calibrated between November 1999 and June 2001, a span of 19 months, during which 3 semi-annual calibrations should have been performed.
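The arithmetic behind the missed-calibration count is straightforward: divide the length of the gap by the required calibration interval. A minimal sketch:

```python
# Count how many scheduled calibrations fell within a gap in which none
# were performed, given the required interval in months.
def missed_calibrations(gap_months: int, interval_months: int) -> int:
    """Number of scheduled calibrations that should have occurred in the gap."""
    return gap_months // interval_months

# Myriad balance: semi-annual (6-month) calibrations, 19-month gap.
print(missed_calibrations(19, 6))    # -> 3

# ReliaGene temperature verification system (discussed later in this
# finding): annual calibrations, 25-month gap.
print(missed_calibrations(25, 12))   # -> 2
```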
Myriad management stated that the missed calibrations resulted from a misunderstanding in which the technician responsible for the calibrations thought that the required calibration frequency had changed from semi-annual to "at use," when in fact the reverse had occurred. The misunderstanding came to light as a result of our request to review the calibration logs. The item was calibrated while we were on site, and laboratory personnel noted no problems with the instrument.
According to Myriad management officials, Myriad's comprehensive central tracking system monitors the performance of all aspects of the analysis process and would have detected any analysis problems caused by a faulty balance. We reviewed supporting documentation showing how the tracking system identifies analysis problems. Because the balance was found to be within accuracy limits, and because the tracking system appears capable of detecting analysis problems caused by an inaccurate balance, we concluded that the only deficiency was the failure to perform the calibration at the required intervals.
Missing Documentation of Equipment Tests
Offender Standard 10.3.1 states that new critical instruments and equipment, or critical instruments and equipment that have undergone repairs or maintenance, shall be calibrated before use. Offender Standard 10.3.2 states that written records or logs shall be maintained for maintenance service performed on instruments and equipment, and that such documentation shall be retained in accordance with federal or state law. Although Myriad personnel do maintain logs as described in Offender Standard 10.3.2, the logs did not provide sufficient documentation to demonstrate that two of the ten critical equipment items reviewed had been calibrated after their most recent repairs. Myriad management stated that the equipment had been calibrated before being returned to use, as required by Offender Standard 10.3.1, and that the absence of subsequent problems with the items served as evidence that they were fit for continued use. However, they acknowledged that the logs did not reflect the calibration work performed to approve the items for continued use.
The Bode Technology Group
For our audit of The Bode Technology Group (Bode), we considered 130 elements of the Offender QAS. We found that Bode complied with the Offender QAS except for one area of noncompliance described below.
Continuing Education Documentation
Offender Standard 5.1.3.1 states that the technical manager, CODIS manager, and analysts must stay abreast of developments within the field of DNA typing by reading current scientific literature and by attending seminars, courses, professional meetings, or documented training sessions or classes in relevant subject areas at least once per year. This requirement is listed as a substandard of Offender Standard 5.1.3, which requires a continuing education program. In reviewing Bode's policies and procedures, we noted that Bode had a continuing education program that mirrored the requirements of Offender Standard 5.1.3.1. In addition, Bode had a system of documentation that accounted for the attendance of appropriate personnel at seminars, courses, and meetings. However, the laboratory did not have a mechanism in place to document that appropriate personnel, such as the technical manager and analysts, had completed the required reading of scientific literature.
Laboratory management was able to produce a routing slip, attached to an article from a recent scientific journal, on which the technical manager had signed off. However, neither the reading he had completed in previous years nor any reading completed by analysts was documented. Laboratory management stated that all appropriate personnel are actively engaged in reading scientific literature routed through the laboratory and that the laboratory subscribes to a variety of journals that make such reading material immediately available to the staff. However, they acknowledged that there was no documentation to substantiate that the requirement was met.
ReliaGene Technologies, Inc.
For our audit of ReliaGene Technologies, Inc. (ReliaGene), we considered 130 elements of the Offender QAS. We found that ReliaGene complied with the Offender QAS except for one area of noncompliance described below.
Offender Standard 10.2 states that a laboratory shall identify critical equipment and shall have a documented program for the calibration of instruments and equipment. Although ReliaGene complied with this standard by identifying critical equipment and by having a documented calibration program, ReliaGene did not comply with its own calibration program for one of the ten critical equipment items reviewed. ReliaGene's calibration program requires that the temperature verification system be calibrated annually. Documents reviewed indicated that this system was placed into service on July 22, 1999. However, no calibrations subsequent to the initial calibration were performed until August 22, 2001, the date our audit revealed the deficiency, a span of 25 months.
ReliaGene management stated that the missed calibrations were due to an oversight and calibrated the instrument immediately while we were on site. ReliaGene personnel noted that the instrument was within acceptable ranges when compared with a National Institute of Standards and Technology thermometer, indicating there were no problems with the instrument.
We issued separate audit reports15 to OJP for each of the three contractor laboratories audited. Because OJP is providing oversight while these laboratories are responding to our audit findings through their respective grantee states, we will not provide additional recommendations to address contractor laboratory audit findings in this report.
OJP's management controls over grantee compliance with Program requirements need improvement. In May 2001, we reviewed OJP documentation for all 21 Program grantees and conducted in-depth fieldwork at 8 of those grantees. We determined that 14 of the 21 grantees either did not submit required reports or submitted reports after the deadlines. Further, 15 of the 21 grantees did not submit required quality assurance test results to OJP. Finally, 14 of the 21 grantees reviewed did not comply with Program requirements relating to timeliness. In our judgment, if Program reports are not submitted and reviewed, OJP cannot adequately track or monitor grantee progress toward achieving the Program's goals and objectives. In addition, if issues with quality assurance samples are noted but not reported to OJP, any necessary corrective action cannot be taken on a timely basis, which could have an adverse impact on the integrity of the national DNA database. However, we did note that, as a result of our findings, OJP had implemented informal follow-up procedures by the time our subsequent audit work was conducted in November 2001.
We reviewed OJP's oversight of the Program to determine if grants were made in accordance with applicable legislation, and whether OJP adequately monitored grantee progress and compliance with Program requirements. In addition, we reviewed eight selected grantees' oversight of their contractor laboratories to determine if they were monitoring them in accordance with the Offender QAS, and whether the grantees were complying with key Program requirements.
The Crime Information Technology Act (CITA) provided for grants to state governments to promote compatibility and integration of national, state, and local systems for criminal justice purposes and for the identification of sexual offenders. Further, the CITA specified allowable uses for the grant funds, including programs: (1) to establish, develop, update, or upgrade the capabilities of forensic science programs and medical examiner programs related to the administration of criminal justice; (2) leading to accreditation or certification of individuals and departments, agencies, or laboratories; and (3) relating to the identification and analysis of DNA. We found that all Program grants were made to state governments for the purpose of outsourcing the testing of convicted offender DNA samples, and thus were made in accordance with the CITA.
OJP developed and issued Program requirements in its Solicitation for CODIS STR Analysis of States' Collected Convicted Offender DNA Samples (Solicitation), dated March 2000. The Solicitation specified general grant guidelines and restrictions, as well as more specific requirements. Grantee states were required to certify that they were in compliance with certain provisions of the DNA Identification Act of 1994, including the Offender QAS relating to the oversight of contractors. In addition, grantees were required to adhere to timeliness requirements and deadlines relating to the selection of contractors, the submission of DNA samples to contractors for testing, and reporting requirements.
OJP Oversight and Grantee Reporting
Program grantees were required to submit quarterly financial status reports, semi-annual progress reports, and quality assurance test results. These reports contain information necessary for OJP to track and monitor grantee progress, such as contractor selection information, dates of sample shipment, and other details relating to contractor oversight that would allow OJP to ensure grantee compliance with Program requirements and with the Offender QAS. During our initial review in May 2001, we noted that 14 of the 21 grantees either did not submit required progress or financial status reports, or submitted the reports an average of 133 days after the required deadlines. At that time, OJP was unsure of the status of these reports. Further, while OJP had an informal mechanism for tracking grant progress, no follow-up system was in place to ensure that required reports were submitted.
Between our initial review in May 2001 and our follow-up review in November 2001, an additional progress report and two additional financial status reports were due from each grantee. In response to our initial review results, OJP had initiated informal follow-up procedures for unsubmitted or late reports by developing a report tracking spreadsheet. E-mails in grantee files showed that OJP was monitoring these reports and contacting grantees to remind them of missed deadlines. Moreover, our follow-up review revealed that all previously missing reports had been submitted. While 9 of the 21 grantees submitted reports due subsequent to our initial review between 12 and 80 days late, there was evidence of OJP follow-up in each grant file.
Program grantees were also required to submit quality assurance test results to OJP. To ensure the accuracy of profiles received from the contractor laboratories, Program requirements specified that each state submit quality assurance samples (i.e., samples with unknown values to the contractor but with known values to the state), with the first group of convicted offender samples sent to the contractor laboratory. Further, the results of these quality assurance samples were required to be reported to OJP within 30 days of receipt of the results.
During our initial review in May 2001, we noted that 15 of the 21 grantees did not submit the quality assurance test results to OJP as required. At that time, OJP officials were unsure of the status of the quality assurance results. In response to our initial review, OJP instituted an informal follow-up process with the grantee states. During our second review at OJP in November 2001, we noted that all grantee states had submitted the required quality assurance test results subsequent to our initial review. While no significant quality assurance issues were reported by these 15 states, it is important for OJP to be aware of these results so that adequate grantee oversight and timely resolution of any quality issues can occur.
In our judgment, quality assurance tests are a key control to ensure the accuracy of DNA test results. Because this was a new program and many of these private contractor laboratories were being utilized by the grantee states for the first time, the results of these quality assurance tests were crucial. If quality assurance deficiencies had been noted and not reported to OJP, timely oversight by OJP could not have been accomplished.
Program grantees were required to ensure they complied with two timeliness guidelines. The first guideline required grantees to expedite their state procurement process to ensure that a contractor laboratory was selected and the first group of convicted offender samples was provided to that laboratory within 120 days of the OJP grant award notification letter. Preliminary data gathered at OJP in May 2001 indicated that 6 of 21 Program grantees did not meet this requirement, providing their first group of offender samples to their contractor laboratory between 12 and 65 days late. Of the six grantees not meeting this requirement, we gathered further data to determine the cause at the two grantees that were among the eight grantees we audited. While both grantees followed their state procurement processes, as the Program also required, the lengthy procurement process contributed to delays at one grantee state totaling 159 days. Further, procurement delays and other laboratory challenges meant that a second grantee did not provide its first group of samples to the contractor laboratory until 185 days after the date of the OJP grant award notification letter. Changes made to the Program in its second year essentially eliminated the types of procurement delays states experienced in the first year of the Program.
A second timeliness guideline required grantees to ensure the contractor laboratory analyzed and reported back the results of the analysis of each group of convicted offender samples within 30 days of receipt. From preliminary data gathered at OJP in May 2001, we determined that the contractor laboratories for 10 of 21 Program grantees did not meet this requirement for returning the first group of samples, returning them between 2 and 56 days late. Of the ten grantees not meeting this requirement, we gathered further data to determine the cause at the three grantees that were among the eight grantees we audited.
One delay was due to an unforeseen genetic variation that occurred in a particular section of the DNA being analyzed, affecting the contractor's ability to process the samples. This issue was resolved, but caused the first group of results to be reported 84 days after shipment. The second delay was caused by differences in how initial results were being reported by one contractor and how the state wanted the results to be reported. This was also resolved, but resulted in the first group of results being reported 45 days after shipment. The third delay was the result of a manufacturer's untimely release of software for a new piece of equipment being brought into operation at the contractor laboratory, which caused the first group of results to be reported 49 days after shipment. All of these one-time delays were satisfactorily explained, and in our judgment, did not negatively impact the overall Program effectiveness.
Compliance with Offender Quality Assurance Standards
Section 17 of the Offender QAS requires any laboratory using subcontractors to establish review procedures to verify the integrity of data received from the subcontractor. This section specifies that these review procedures include: (1) random reanalysis of samples, (2) visual inspection and evaluation of results, (3) inclusion of quality control samples, and (4) on-site visits.
We reviewed data and documentation maintained by each of the eight grantees we audited to ensure that contractor review procedures had been established and followed in accordance with the Offender QAS. We noted no deficiencies relating to grantee compliance with the Offender QAS pertaining to contractor review and oversight.
No-Suspect Match Requirement
Instead of requiring a monetary local match, Program guidelines required grantees to analyze, at their own expense, no-suspect cases equal to at least 1 percent of the total number of convicted offender samples for which grant funds were awarded. This analysis was required to be conducted within the grantee's laboratory system. All eight grantees reported that they had met or exceeded their match requirement as of the conclusion of our audit fieldwork.
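The 1-percent match requirement is simple arithmetic. As an illustration only (the award figure of 20,000 samples below is hypothetical, not a number from the audit), a grantee awarded funds for 20,000 offender samples would owe the analysis of at least 200 no-suspect cases:

```python
import math

def required_no_suspect_cases(samples_awarded: int, match_rate: float = 0.01) -> int:
    """Minimum no-suspect cases a grantee must analyze: at least 1 percent
    of the funded offender samples, rounded up to a whole case."""
    return math.ceil(samples_awarded * match_rate)

def meets_match_requirement(cases_analyzed: int, samples_awarded: int) -> bool:
    """True if the grantee's no-suspect caseload satisfies the match requirement."""
    return cases_analyzed >= required_no_suspect_cases(samples_awarded)

# Hypothetical grantee awarded funds for 20,000 offender samples:
print(required_no_suspect_cases(20_000))        # 200
print(meets_match_requirement(250, 20_000))     # True
```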
Each grantee provided us with a list of cases that it had analyzed to meet the grant match requirement. Using each list, we selected, either randomly or judgmentally, a sample of 113 of the 2,414 total cases and reviewed documentation to ensure that each case met the following Program requirements: (1) the analysis occurred after October 1, 1999, and (2) the case qualified as a no-suspect case. All 113 cases that we reviewed met the Program requirements.
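The case review described above can be sketched as a simple sampling check. This is a minimal illustration, not the audit's actual procedure; the case records and record layout are hypothetical:

```python
import random
from datetime import date

# Hypothetical case record: (case_id, analysis_date, is_no_suspect)
CUTOFF = date(1999, 10, 1)  # analysis must occur after October 1, 1999

def qualifies(case: tuple) -> bool:
    """A sampled case passes if it was analyzed after the cutoff date
    and qualifies as a no-suspect case."""
    _case_id, analysis_date, is_no_suspect = case
    return analysis_date > CUTOFF and is_no_suspect

def sample_and_check(cases: list, sample_size: int, seed: int = 0) -> bool:
    """Randomly draw a sample of cases and verify every one meets
    both Program requirements."""
    rng = random.Random(seed)
    sampled = rng.sample(cases, min(sample_size, len(cases)))
    return all(qualifies(c) for c in sampled)

# Hypothetical grantee case list:
cases = [("A-1", date(2000, 1, 5), True),
         ("A-2", date(2000, 3, 2), True)]
print(sample_and_check(cases, 2))   # True
```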
We recommend that the Assistant Attorney General, Office of Justice Programs: