The No Suspect Casework DNA Backlog Reduction Program

Audit Report No. 05-02
November 2004
Office of the Inspector General


Findings and Recommendations


  1. Program Impact and Achievement of Program Goals

We determined that the Program has been successful in funding the analysis of over 24,700 previously backlogged no-suspect cases, as projected by Program grantees. However, we were unable to determine whether the Program was achieving its mission of increasing laboratory capacity. Further, many grantees experienced lengthy delays in implementing their proposals and were not drawing down Program funds on a timely basis. We also determined that while the Program awards helped to increase the volume of no-suspect profiles uploaded to CODIS, all four of the individual grantees we audited experienced delays in uploading completed profiles. Finally, OJP had not developed substantive Program goals, and the Program's performance measurements were not adequate to assess whether it was achieving its stated mission.


Impact of the Program on Laboratory Capacity

As stated previously, the mission of the Program is "to increase the capacity of state laboratories to process and analyze crime-scene DNA in cases in which there are no known suspects, either through in-house capacity building or by outsourcing to accredited private laboratories." To accomplish this, OJP awarded approximately $28.5 million in funding to 25 states for the analysis of over 24,700 backlogged no-suspect cases during the first year of the Program.

We found that measuring the Program's progress was complicated by the lack of definitive data linking Program funding to trends observed in increased uploads of DNA profiles to NDIS from case evidence. For example, we collected NDIS upload statistics for each of the four grantees we audited to determine how the Program awards affected the number of complete21 profiles that those states were able to upload to NDIS prior to and during the award period. Those statistics are illustrated on the following graph:22

Figure 2: Forensic Profiles Uploaded to NDIS
Figure 2 summarizes the number of forensic profiles uploaded to NDIS.
Source: Program grantees

As the figure illustrates, all four of the grantees demonstrated a marked increase in total complete profiles analyzed and uploaded to NDIS after receiving their Program awards. However, since these increases were inclusive of both the no-suspect cases funded by the Program as well as other DNA cases that the laboratories were analyzing with local funding, we cannot conclusively state the extent to which this data establishes that the Program met its mission. For example, it is unclear from the data whether the increase in uploads is due to the Program funding, or whether it is because the laboratory hired, with its own funding, additional staff that helped increase productivity.

When considered in conjunction with delays in the drawdown of funding and delays in the upload of profiles, two issues we discuss later in this section, it becomes even more apparent that without better data a concrete determination about the Program's achievement of its mission is not possible. For example, for those of our four auditees that had not materially drawn down Program funding, the increase in productivity shown in the previous chart cannot be attributed to the Program.


Untimely Utilization of Program Funds

During our audit fieldwork, we noted that many of the grantees had drawn down very little of their award funds, or in some cases had not drawn down any funds at all.

As of May 31, 2004, only $11.6 million, or about 41 percent of the $28.5 million awarded from FY 2001 Program funds, had been drawn down by the 25 Program grantees. For the four grantees included in our audit, only $5.9 million of the $13.5 million awarded, or 44 percent, had been drawn down as of the same date. While these awards were made between July 2002 and September 2002, the largest grantee in terms of dollars awarded (Maryland), and two additional grantees (Delaware and Connecticut), had not drawn down any funds as of May 31, 2004. These three grantees received awards totaling nearly $5.3 million. The following chart illustrates the drawdown trends for this Program through May 2004:

Figure 3
[Figure 3 is not available electronically]
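
The drawdown percentages cited above (about 41 percent overall and 44 percent for the four grantees we audited) follow directly from the reported award and drawdown amounts. As a purely illustrative aid, and not part of the audit methodology, the short Python sketch below reproduces those rates from the rounded figures in this report:

    # Illustrative only: reproduces the drawdown rates cited above from the
    # rounded figures reported as of May 31, 2004.
    def drawdown_rate(drawn_down, awarded):
        """Return the percentage of awarded funds that has been drawn down."""
        return 100.0 * drawn_down / awarded

    # All 25 FY 2001 Program grantees combined: about 41 percent.
    print(f"All grantees:  {drawdown_rate(11_600_000, 28_500_000):.0f}%")

    # The four grantees included in our audit: about 44 percent.
    print(f"Four auditees: {drawdown_rate(5_900_000, 13_500_000):.0f}%")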

While the drawdown amounts are not a definitive indicator of specific grantee Program activities, we believe that drawdowns are an important indicator of overall grantee progress toward the achievement of proposed objectives.

For example, the award to the New York State Division of Criminal Justice Services (DCJS) in the amount of $5.04 million, with a term of one year, was made in September 2002. Yet, as of May 2004, only $500,000 had been drawn down, or less than 10 percent of the award amount. According to grantee officials, multiple reasons accounted for their delayed drawdowns, including the time it took to establish separate contracts with the co-grantees across the state. In many cases, these contracts were not finalized until August 2003, nearly a year after the 1-year award was made. Further, grantee officials in New York stated that amounts drawn down may not be the best indicators of progress actually being made. Because funds may have been spent or obligated, but not yet drawn down, they believed the amount of funds actually spent and obligated would provide a better gauge. However, as of April 2004, New York reported total funds spent and obligated of $2.2 million, which is still only 45 percent of the total awarded. Further, one co-grantee in the state of New York estimated that its program will not be completed until December 2004, or 27 months after the initial 1-year award was made.

In another example, the Texas Department of Public Safety (TXDPS), which had drawn down approximately $2 million of its $3.4 million award as of May 2004, cited delays in initiating contracts with the co-grantees in its state as a reason for delays in expending funds. Further, the Florida Department of Law Enforcement (FDLE), which had drawn down about $2 million of its $2.8 million award, stated that backlogs at its contractor laboratories (i.e., the contractor laboratories' inability to process all the cases they were receiving from various clients, which delayed results back to those clients) were preventing it from expending its remaining award funds. The FDLE anticipated completing drawdowns in December 2004. Finally, as of May 2004, the Ohio Bureau of Criminal Identification and Investigation (Ohio BCI&I) had drawn down approximately $1.4 million of its $2.3 million award. Officials at the Ohio BCI&I cited delays in the submission of no-suspect cases by law enforcement agencies, as well as the time needed for the laboratory to screen evidence for items most likely to produce viable DNA results.

In sum, grantee drawdowns are one gauge of the overall progress being made toward achieving grantees' proposed goals. Program awards were made for an initial period of one year, and the above examples illustrate that many grantees have not made timely progress in completing their proposed programs and have had to obtain extensions from the NIJ. Not only do these delays hinder the timely achievement of the Program's overall mission, but the obligated funds left unused by this Program could have been used by other programs or grantees with more immediate needs for the funding.


Profiles Not Uploaded to CODIS

An additional factor that affects the overall success of the Program is whether Program-funded profiles are being uploaded to CODIS. During our audit work at various state and local laboratories, we observed that approximately 2,538 of the DNA profiles that had resulted from Program-funded analysis had not been uploaded to CODIS. Specifically, we noted various laboratories in all four grantee states had received back data from their contractor laboratories for cases analyzed by those contractors, but that the resultant DNA profiles had not been uploaded to CODIS as of the time we reviewed the data.

There is always a delay between when the data is received from a contractor and when it is uploaded by the state to CODIS. This time lag occurs because, after receiving the contractor data, states must address the requirements of the Quality Assurance Standards for Forensic DNA Testing Laboratories (QAS), effective October 1, 1998, prior to uploading the data to CODIS. The QAS require that a forensic laboratory ensure that the data it receives back from its contractor meets certain quality standards. As part of this, the laboratory must conduct a technical and administrative review of each case analyzed by the contractor. However, as detailed below, grantees varied in their ability to address the QAS requirements in a timely manner.

To assess the reasons that might account for our observation of profiles not being uploaded to CODIS, we analyzed data provided by grantees and co-grantees. As of April 2004, 2,538 profiles from Program-funded cases returned to the grantees had not been uploaded to CODIS. We reviewed the reasons provided by grantees and co-grantees for this delay and summarized them in the following figure.23

Figure 4
Figure 4 summarizes reasons for profiles not being uploaded to CODIS.
Source: Program grantees and CODIS reports

 * "Mixture" refers to profiles that reflect DNA from multiple persons and are too complex to be appropriately included in CODIS. "Only Victim Profile" refers to those profiles where only the victim's DNA was found on the evidence. Victim DNA profiles were not permitted in NDIS.

The most common reason provided for profiles not being uploaded was "Awaiting Data Review." In its Solicitation for the No Suspect Casework DNA Backlog Reduction Program (FY 2001) (Solicitation), the NIJ required that profiles be "expeditiously uploaded into CODIS." While no standards or criteria govern how much time grantees are permitted before they should upload analyzed data to CODIS, profiles that have not been uploaded to CODIS cannot be compared and matched to other forensic and offender profiles, limiting the crime-solving benefits that those profiles can have.

We further examined this issue for seven grantees and co-grantees. We judgmentally selected 25 cases and, as part of a larger review of those cases, determined the length of time it took to upload the profiles once the DNA results were returned by contractor laboratories for each case where resultant profiles were uploaded. The results of that analysis are summarized as follows:

Figure 5: Average Days for Data Review
Figure 5 summarizes the average number of days it took for a profile to be uploaded to CODIS.
Source: Grantee case files and CODIS reports

These results illustrate the vast differences between the various grantees and co-grantees. For example, the Palm Beach Sheriff's Office and the Ohio BCI&I were able to conduct the reviews required by the QAS for upload to CODIS within an average of 9 days and 12 days, respectively, from the time the analyzed data was returned by the contractor laboratory. However, it took the FDLE's Jacksonville laboratory and the Fort Worth Police Department24 an average of 187 days and 122 days, respectively, to conduct these reviews and upload the data to CODIS.
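
The averages in Figure 5 are simply the mean number of days between the date a contractor laboratory returned results and the date the reviewed profile was uploaded to CODIS. The following Python sketch shows that calculation using hypothetical dates rather than the actual case data:

    from datetime import date
    from statistics import mean

    # Hypothetical (results_returned, profile_uploaded) date pairs for sampled cases;
    # the actual laboratory-by-laboratory averages appear in Figure 5.
    sampled_cases = [
        (date(2003, 6, 10), date(2003, 6, 19)),
        (date(2003, 7, 1), date(2003, 7, 15)),
        (date(2003, 8, 5), date(2003, 8, 12)),
    ]

    # Average number of days between return of results and upload to CODIS.
    average_days = mean((uploaded - returned).days for returned, uploaded in sampled_cases)
    print(f"Average days for data review and upload: {average_days:.0f}")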

Further, we noted many additional cases where data had not been reviewed and profiles had not been uploaded that exceeded the times illustrated above. For example, we noted cases for the FDLE's laboratories in Jacksonville and Tampa Bay where analysis results were returned by the contractor laboratories in June 2003 and August 2003, but the profiles had not been uploaded to CODIS when we conducted our review in March 2004. We believe that these data review delays are excessive and not in accordance with the intent of the Program. DNA profiles not reviewed cannot be uploaded to CODIS and therefore cannot be linked to other crime-scene evidence or offender profiles, undermining the mission of the Program.

The second most common reason, "No DNA," occurs when the screening of the evidence does not detect sufficient DNA to yield a viable sample for analysis. This outcome is not a deficiency to be corrected; particularly with old evidence from unsolved crimes, the DNA present on the evidence may have deteriorated over time and may not be of sufficient quantity to yield a DNA profile.

As discussed in the following section, the lack of program goals and objectives, combined with the previously discussed delays in utilizing Program funding and in uploading profiles to CODIS, led us to question whether OJP had established adequate performance measurements to monitor the Program's progress.


Program Goals and Performance Measurements

In response to the Government Performance and Results Act, which requires agencies to develop strategic plans that identify their long-range goals and objectives and establish annual plans that set forth corresponding annual goals and indicators of performance, OJP developed one performance measurement for the Program. The stated mission for the Program is "to increase the capacity of state laboratories to process and analyze crime-scene DNA in cases in which there are no known suspects, either through in-house capacity building or by outsourcing to accredited private laboratories." This mission directly supports the following Department strategic plan goal and objective:

  • Goal: To prevent and reduce crime and violence by assisting state, tribal, and local community-based programs.
  • Objective: To improve the crime fighting and criminal justice administration capabilities of state, tribal, and local governments.

We reviewed OJP's progress toward achieving the single performance measurement established for the Program: Number of DNA samples/cases processed in cases where there is no known suspect. For this measurement, OJP had set a goal of 24,800 samples/cases for FY 2002. However, due to various factors, including the events of September 11, 2001, disbursement of funding for this Program was delayed and not completed until September 2002, and OJP did not meet this measurement. The Program funded the analysis of 24,738 samples or cases in its first year. According to information provided by the NIJ, only 10,609 cases had been analyzed as of December 31, 2003. In FY 2003 and FY 2004, OJP established goals of 33,850 and 43,000 samples or cases, respectively.

Even though the targets established for the Program in FY 2002 were not achieved, we sought to further analyze the established performance measurement as it relates to the Program's mission. While its mission is to increase the capacity of state laboratories to process and analyze crime-scene DNA in no-suspect cases, the Program's performance measurement merely tracks no-suspect samples or cases that have been "processed." We concluded that this measurement does not gauge whether the Program is making progress toward the achievement of its stated mission.

When we discussed the performance measurement with Program management, officials stated that they had attempted to add the following data points to the performance measurement in FY 2003: 1) number of profiles entered into CODIS; 2) number of profiles entered into NDIS; 3) number of investigations aided; and 4) number of cases solved.

According to documentation provided by Program management, OJP's budget office informed them that they could not make changes to their performance measures since the measures had already been entered into the "Performance Measurement Table" and been approved. However, while these additional data points may have assisted Program management in monitoring certain Program achievements, the revised performance measurements still would not generate the type of data (i.e., laboratory capacity prior to and during the Program) that would allow Program management to track the Program's progress toward achieving its mission of increasing laboratory capacity.

In addition to assessing whether OJP had met the performance measurement it had established, we assessed whether there were other performance measurements that could be established that would provide decision-makers within the Department and Congress information on whether the Program was meeting its goals and mission. We concluded that the Program performance measurement does not address whether the Program is aiding in reducing the national backlog of no-suspect casework samples awaiting analysis. While reducing the backlog is not part of the official mission of the Program, monitoring this information would be useful in determining whether Program funding is having a positive effect on the national no-suspect casework backlog, or whether a decrease in the national no-suspect casework backlog has the beneficial effect of increasing laboratory capacity across the country.

In a report issued in November 2003, the General Accounting Office (GAO)25 cited concerns that performance measurements for many NIJ programs, including this Program, were inadequate to assess results.26 The report stated that the Program's one performance measurement was not outcome-based; rather, it was merely an intermediate measure. GAO recommended that the NIJ reassess the measures used to evaluate the Office of Science and Technology's progress toward achieving its goals and focus on outcome measures to better assess results where possible. Further, in a prior report issued by the Office of the Inspector General (OIG), deficiencies were noted relating to the adequacy of data being collected by OJP to monitor performance measurements for another DNA-related program.27

In addition, when we began our audit work in November 2003, we asked Program officials for the goals and objectives established for the Program. OJP officials responded that management personnel for the Program had recently changed, but those officials were unaware of any formal goals and objectives for the Program. In response to our inquiry, OJP officials developed the following goals and objectives for the Program:

  • Ensure that state and local forensic casework laboratories receive funding to reduce their no-suspect case backlogs;
  • Make future awards in a timely manner;
  • Ensure consistency among applicants;
  • Ensure funding drawdowns meet program and application goals;
  • Provide better award monitoring; and
  • Collect and report accurate statistics and performance measures.

In our judgment, none of these goals and objectives allow OJP to assess whether the Program is making progress toward achieving its mission of increasing the capacity of state laboratories to process and analyze no-suspect DNA from crime scenes. Some examples of such goals and objectives could include: 1) To increase grantee laboratory capacity by a certain percentage, and 2) To reduce grantees' no-suspect backlogs by a certain percentage.


Recommendations

    We recommend that OJP:

  1. Develop and implement procedures that will allow Program officials to more closely monitor grantee drawdowns as a means to ensure that adequate progress is being made toward the achievement of each grantee's goals and objectives.
  2. Ensure that timely uploads of Program-funded profiles are performed by all grantees.
  3. Develop Program goals and objectives that support the achievement of the Program's mission of increasing laboratory capacity, and implement a system to track these goals.
  4. Develop performance measurements that allow the monitoring of progress toward achieving the Program's mission, such as monitoring laboratory capacity prior to, during, and at the conclusion of the Program.

  2. Administration and Oversight of the Program

We reviewed OJP's administration and oversight of the Program, and determined that weaknesses existed in three areas: 1) OJP issued second-year Program grants to states that had not drawn down any of their first-year Program grant funds by the time the new awards were issued; 2) the requirements instituted by the Program for contractor laboratories performing no-suspect casework analysis were inconsistent with those required for state and local laboratories performing no-suspect casework analysis; and 3) OJP failed to ensure that the federal funds granted under the Program will benefit the national DNA database. These weaknesses hinder the ability of Program management to maximize Program accomplishments and ensure consistent operational quality of laboratories funded for no-suspect casework analysis.

In August 2001, OJP developed and issued Program requirements in the Program Solicitation. The Program Solicitation specified general grant guidelines and restrictions, as well as more specific requirements. Grantees were required to ensure that all analyses of no-suspect cases under the Program complied with the QAS, and that any profiles resulting from these analyses be uploaded expeditiously to CODIS. Further, the grantees were to ensure that their contracting laboratories:28

  • are accredited by the American Society of Crime Laboratory Directors/Laboratory Accreditation Board (ASCLD/LAB), or certified by the National Forensic Science Technology Center (NFSTC);
  • adhere to the most current QAS issued by the FBI Director;
  • have a Technical Leader located onsite at the laboratory;
  • provide quality data that can be easily reviewed and uploaded to CODIS;
  • have the appropriate resources to screen evidence (if applicable); and
  • only be paid for work that is actually performed.

We reviewed OJP's administration and oversight of the Program to determine if grants were made in accordance with applicable legislation, and whether OJP adequately monitored grantee progress and compliance with Program requirements. In addition, we assessed whether the Program-specific requirements instituted by OJP fully supported the Program's mission. We identified the following weaknesses in OJP's administration and oversight of the Program.


Additional Funds Awarded to Grantees not Drawing Down Initial Funds Timely

In FY 2003, OJP awarded grants for the second year of the Program, totaling $10.2 million, to six states that had drawn down none of their initial awards, and to one state (New Mexico) that had drawn down less than 1 percent of its initial award, as of the date the second-year grants were made. The initial awards to these seven states totaled $11.8 million. Further, for six of the seven states, the applications requested funding for purposes that were partially or completely identical to those identified in their initial award application.29

Table 3: FY 2003 Program Awards to States Unable to
Timely Use FY 2001 Grant Funds

State         FY 2001       FY 2001        FY 2003       FY 2003
              Award Date    Grant Amount   Award Date    Grant Amount
Maryland      09/05/2002    $5,048,669     09/24/2003    $2,072,362
New York      09/20/2002    $5,039,535     09/16/2003    $5,482,020
New Mexico    08/13/2002    $550,245       07/11/2003    $674,414
Oklahoma      08/22/2002    $500,000       07/11/2003    $244,500
New Jersey    08/07/2002    $286,805       06/10/2003    $1,272,254
Nebraska      09/10/2002    $226,494       07/11/2003    $125,086
Connecticut   08/05/2002    $117,163       09/10/2003    $346,758
Total                       $11,768,911                  $10,217,394
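
The totals in Table 3 are the sums of the state-level amounts. As a brief illustrative check (Python), using the figures from the table above:

    # Illustrative check that the Table 3 totals equal the sums of the state amounts.
    fy2001_awards = [5_048_669, 5_039_535, 550_245, 500_000, 286_805, 226_494, 117_163]
    fy2003_awards = [2_072_362, 5_482_020, 674_414, 244_500, 1_272_254, 125_086, 346_758]

    print(f"FY 2001 total: ${sum(fy2001_awards):,}")  # $11,768,911
    print(f"FY 2003 total: ${sum(fy2003_awards):,}")  # $10,217,394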

The two largest grantees in the initial award, Maryland and New York, had not drawn down any of their FY 2001 funds when OJP awarded them second-year funding. As shown in the table, both states received their second grant roughly a year after their initial award. Their applications for the second-year funds requested resources to pay for activities similar to those funded in their initial awards. For example, Maryland was funded in FY 2001 for the outsourcing of 3,704 no-suspect cases, and OJP awarded it funds for the outsourcing of an additional 500 no-suspect cases in the FY 2003 award. In New York, three laboratories (Monroe County, Nassau County, and the New York State Police) received funding for the outsourcing of cases in both FY 2001 and FY 2003.

Oklahoma, Connecticut, and New Jersey also had not drawn down any of their initial awards when they received second-year funding for activities similar to the first year. We noted that New Jersey, in particular, received a significant increase in its second-year grant even though it had failed to establish a pattern of drawing down its first-year Program funds efficiently. According to application documents, New Jersey requested this increase to outsource a significantly larger number of no-suspect cases than was requested in the first year (1,500 no-suspect cases in FY 2003 versus 220 in FY 2001).

While these states may have legitimate bases for requesting funding for additional cases based upon the number of cases in their backlogs, we question OJP's awarding of additional funds to states that had not established a pattern of drawing down their existing Program funds in a timely manner.

We noted that although Nebraska had not drawn down any of its initial award at the time it received additional Program funding, unlike the previous states mentioned, Nebraska significantly changed its funding request in its FY 2003 grant application. The initial award was provided to pay for personnel and consultant/contractual agreements so the Omaha Police Department could outsource the analysis of no-suspect cases. The FY 2003 award funded equipment and supplies for the Nebraska State Patrol Crime Laboratory to help it become ASCLD/LAB accredited. The significant variance in its two application requests may have provided OJP with appropriate justification for the FY 2003 grant award.

We identified one instance in which OJP intervened with a grantee that stated it was unable to draw down Program funds. OJP awarded an FY 2003 grant to the city of Albuquerque, New Mexico, which, at the time of the grant award, had drawn down less than 1 percent of its initial award. However, after receiving the second Program grant, Albuquerque grantee management communicated to OJP the significant problems it was experiencing in expending its initial award funds, problems that did not appear resolvable. Consequently, at the time of our fieldwork in May 2004, OJP management had begun taking steps to de-obligate both grants awarded to the city.

In light of these findings, we concluded that OJP should more closely monitor previous grantees' progress in using grant funds prior to awarding additional funding. Further, we recommend that OJP continue to pursue de-obligation of funds for Program grantees that have failed to draw down their Program funds in a timely manner and are unable to provide satisfactory evidence that they will do so in the near future.


Inconsistent Requirements for Laboratories Performing No-suspect Casework Analysis

As previously mentioned, the Solicitation issued by OJP to initiate the Program included several requirements for outsourcing analyses to contract laboratories.

During initial audit fieldwork conducted in December 2003, we determined that three of the six laboratories within the state of Ohio which were participating in the grant as co-grantees did not meet the requirements that were imposed upon the outsourcing or contract laboratories. The primary differences between the requirements imposed on state or local laboratories and contractor laboratories are that contractor laboratories are required to be accredited/certified by ASCLD/LAB or NFSTC and are required to have a technical leader onsite. We considered this to be a material inconsistency in the Solicitation requirements, since these co-grantee laboratories within the state of Ohio were being treated similarly to a contract laboratory and were being reimbursed on a flat-fee basis for each no-suspect case analyzed under the grant.

Specifically, we found that the Canton-Stark County, Cuyahoga County, and Mansfield Police Department laboratories all lacked either accreditation or certification. In addition, we found that these same three laboratories did not have a permanent technical leader onsite, even though they did have a technical leader available to them for onsite consultation. Since the requirements state that outsourcing laboratories must "have a technical leader that is located onsite at the laboratory where the testing is being performed," we concluded that these laboratories do not meet this requirement.

We designed our fieldwork in the other grantee states, including Florida, Texas, and New York, to include sufficient review to determine if similar deficiencies were noted with the co-grantee participants in those states. While we noted no exceptions in Florida or New York, we identified the following conditions in Texas:

  • For the TXDPS (Austin) laboratory, we found that the technical leader position had been vacant since November 2003. While the vacancy was posted, no one had been hired as of our fieldwork in March 2004, and the technical leader from the TXDPS's Houston laboratory was also serving as the technical leader in Austin. That technical leader met the qualification requirements for the position, but she was not onsite to accomplish her responsibilities at the TXDPS (Austin) laboratory. During this period, the laboratory was performing grant-funded in-house analysis of no-suspect cases.
  • For the TXDPS (McAllen) laboratory, we found that the technical leader position had been vacant since July 2003. The technical leaders of the Corpus Christi (from July 2003 through February 2004) and Lubbock (from February 2004 to the present) laboratories have been available to provide technical oversight to the McAllen laboratory. While both meet the qualifications for being a technical leader, neither is located onsite. During this period, the TXDPS (McAllen) laboratory was performing grant-funded in-house analysis of no-suspect cases.

None of the above-mentioned laboratories failed to meet the criteria imposed by OJP for state and local laboratories (i.e., compliance with the QAS), but they did fall short of the requirements that OJP imposes on contractor laboratories (see Administration and Oversight of the Program). However, as discussed previously, we believe that all laboratories, whether state, local, or contractor, should be held to the same standards.

To assess whether similar conditions might exist at other laboratories, we reviewed the grant files of the remaining 21 grantees to determine whether they, or their co-grantees, were outsourcing the analysis of no-suspect cases (making this issue not applicable for those laboratories) or completing the analysis in-house. If a laboratory was completing the analysis in-house, we reviewed the grant file records to determine, where possible, whether that laboratory was ASCLD/LAB accredited or NFSTC certified, and whether the laboratory had an onsite technical leader. In most instances, we were able to determine from OJP grant file documentation that grantees and co-grantees in each state met these requirements.

However, for the following states OJP grant file documentation was not sufficient to indicate whether grantees and/or co-grantees doing in-house analysis met the applicable requirements: Arizona, Delaware, Kansas, Maine, Michigan, and Missouri. We cite these states only as an indicator of the number of grantees and co-grantees that, similar to the conditions in Ohio and Texas, may not fully meet the same requirements being imposed upon the contractor laboratories.

We consider this to be a vulnerability within the Program's administration in that the level of scrutiny placed upon the contractor laboratories is not similar to that placed upon the state and local laboratories. Of particular concern is the issue of accreditation/certification. A laboratory's accreditation or certification signifies that an independent external organization has confirmed the laboratory's compliance with the QAS and the overall quality of its operations. By not requiring Program grantees to be accredited or certified, Program management has deprived itself of a valuable assurance of grantee compliance with Program requirements, including compliance with the QAS, thereby hindering its own administration of the Program. Therefore, we recommend that OJP ensure that future Program Solicitations require all laboratories - whether in-house or contractor - analyzing no-suspect cases to meet the same accreditation/certification requirements.


Failure to Ensure Program Funding to Support the National DNA Database

In the process of collecting information to complete an analysis of whether no-suspect cases were being uploaded into CODIS, we were informed of a complication that had developed at the Fort Worth Police Department (FWPD) that prevented the profiles resulting from their grant-funded analysis from being uploaded to all levels of CODIS.

Specifically, the FWPD – due to the closure of its DNA laboratory in mid-2002 – had hired both an analysis contractor laboratory to analyze the no-suspect cases and a data review contractor to review and upload the data to CODIS. The FWPD did not have CODIS access, nor were its staff qualified to perform the data review of the analysis contractor’s results.

In late December 2003, the TXDPS was informed by the FBI's NDIS Program Manager that the FWPD's data review contractor, which did have access to CODIS to upload missing persons profiles, could no longer serve as the agent in charge of uploading the FWPD's forensic profiles. Further, none of the profiles that this contractor reviewed after the point of notification could be uploaded without a separate review by staff of a CODIS-participating public laboratory.

However, at the time of our audit in March 2004, the FWPD was continuing to use the services of the data review contractor laboratory, since the NDIS Program Manager's decision did not prevent the profiles reviewed by that contractor from being uploaded to SDIS (i.e., the state level of CODIS). Consequently, the profiles were still searchable and could therefore aid investigations within the state.

The Solicitation for the Program did not clearly specify that laboratories are required to upload grant-funded profiles to NDIS (i.e., the national level of CODIS), if complete results are obtained. Rather, the Solicitation required that the grantees include in their applications a plan for, among other things, submission of profiles that result from grant-funded analysis to CODIS. Since “CODIS” is a term used generically to convey the entire database system of indexes at the local, state, and national levels, a grantee could argue that upload of profiles only to the local level, or to the local and state levels, meets the requirements of the Solicitation. In fact, FWPD management made such an argument to us regarding the profiles that resulted from the grant-funded analyses completed.

We disagree with such a conclusion. While the Solicitation and the DNA Analysis Backlog Elimination Act of 2000 make references to CODIS, we believe that federal funds awarded by OJP should be used for analysis when all viable (i.e., complete and allowable) resulting profiles will be uploaded to NDIS, thereby contributing to the crime-solving potential of the national database. Therefore, we encourage OJP to develop future Solicitations to clarify that the expectation of grantees is ultimately to upload all viable grant-funded profiles to NDIS.

Further, we recommend that OJP verify that the TXDPS has implemented the necessary measures to ensure that the FWPD’s grant-funded profiles eventually will be uploaded to NDIS. The FWPD’s Laboratory Manager stated during our fieldwork in March 2004 that he was aware that action would need to be taken to ensure that the profiles were uploaded to NDIS. However, he stated that the issue was much larger than just his laboratory, since he would most likely need to rely on assistance from another local laboratory within the state to perform the reviews for him. In addition, he said he would need assistance from the Texas CODIS Administrator to resolve the issue.

According to communications we have had with both the FWPD and the TXDPS since that time, the Manager of Field Laboratories at the TXDPS has begun action to resolve this matter. The resolution underway uses grant funds to hire a contract worker to review and upload to CODIS the profiles that were analyzed by Orchid Cellmark (Dallas) for the FWPD under its participation in the Program grant. This corrective action addresses our concern, since it will ensure that the profiles are uploaded to NDIS, not just to SDIS. However, OJP should ensure that this resolution is completed.


Recommendations

    We recommend that OJP:

  1. Monitor grantees' progress in drawing down grant funds prior to awarding them additional funding, and closely examine the reasons additional funding is requested. If funding is awarded, a justification supporting the decision should be carefully documented, specifically addressing the rationale for the untimely drawdowns.
  2. De-obligate funds for Program grantees that have failed to draw down their Program funds in a timely manner and are unable to provide satisfactory evidence that they will be able to do so in the near future.
  3. Ensure that Program requirements in future years stipulate that all laboratories analyzing no-suspect cases meet the same accreditation/certification requirements, regardless of whether the laboratory is private or public.
  4. Ensure that future Solicitations clarify that the expectation of grantees is ultimately to upload all viable grant-funded profiles to NDIS.
  5. Verify that the TXDPS has implemented the necessary measures to ensure that the Fort Worth Police Department's grant-funded profiles will be uploaded to NDIS.

  3. Grantee Oversight of Contractor Laboratories

In assessing the adequacy of grantee oversight of contractor laboratories, we identified four laboratories that had inadequate documentation to substantiate that oversight of their contractor laboratory met Program requirements. Six laboratories also had incomplete or outdated policies or procedures relating to the outsourcing of no-suspect cases. Without complete and current documented policies or procedures, laboratory management cannot ensure that all appropriate staff comply with established methods, and management is hindered in its ability to detect and respond to issues of non-compliance.

The structure of the Program places oversight responsibility on each grantee, whether that grantee is a primary or co-grantee, for any contractor laboratory it uses as part of its participation in the grant. Such oversight includes ensuring the adequacy of policies and procedures related to the outsourcing of no-suspect casework evidence by its own laboratory and by its contractor laboratory. Therefore, throughout this section we refer to primary grantees and co-grantees as simply "grantees." Our audits assessed the adequacy of grantee oversight of their contractors, as well as verification of the compliance and handling of no-suspect cases at various contractor laboratories.


Inadequate Contractor Oversight Documentation

The QAS require that laboratories conduct certain oversight of their contractor laboratories, and the extent of these activities varies between laboratories that are outsourcing casework analysis and those outsourcing analysis of convicted offender samples. For casework analysis, the QAS require that laboratories:

  • Ensure that the contractor laboratory certifies its compliance with the QAS. This requirement is contained within Standard 17.1. According to the FBI, the contractor laboratory must submit to annual audits to ensure compliance with the QAS, and must make the results of those audits available to the laboratories for which they perform analysis work.
  • Establish and use review procedures to verify the integrity of the data received from the contractor laboratory. This requirement is contained within Standard 17.1.1. The procedures implemented to comply with this requirement must include a review of the data received from the contractor, similar to the type of review that is conducted on the laboratory's own analysis results. In addition, according to guidance provided by the FBI regarding compliance with the QAS, an onsite visit should be conducted to verify the contractor laboratory's ability to provide quality data. These onsite visits should include an evaluation of any findings detected during the last audit of compliance with the QAS to ensure that all deficiencies noted were satisfactorily resolved.

Of the grantees we audited, we identified four that were outsourcing their no-suspect case analysis to contractor laboratories and that were unable to supply us with sufficient documentation to substantiate that they had met the QAS requirements for contractor oversight.

  • The Ohio BCI&I was unable to provide us with documentation that an onsite visit meeting the FBI's guidance was conducted as of the time of our fieldwork in December 2003. Specifically, the Grant Manager provided us with confirmation that an onsite visit was conducted, but that visit did not include a review of audit results to ensure that the contractor laboratory was fully complying with QAS requirements. Ohio laboratory management agreed with our assessment, and following our audit the Grant Manager provided us with documentation that an onsite visit meeting the requirements of the QAS was conducted. This documentation satisfactorily addressed this finding, and therefore we make no further recommendation regarding this deficiency.
  • The FDLE's Jacksonville Regional Operations Center DNA Laboratory, a co-grantee of the grant awarded to FDLE Headquarters in Tallahassee, had no documentation that an onsite visit of its contractor laboratory had been performed. The DNA Supervisor stated that she assumed one was performed by FDLE Headquarters in Tallahassee since the contract that exists between the FDLE and the contractor was implemented by the Tallahassee office. The Jacksonville DNA Supervisor could not say when the site visit might have been conducted, since she had been provided with no documentation of the visit, nor had she requested any.

    While the contract that exists between FDLE and the contractor laboratory may have been handled by FDLE Headquarters, records from each FDLE laboratory utilizing that contract must contain sufficient documentation to substantiate that the oversight of the contractor laboratory required by the QAS has been performed. Grantee management in both FDLE's Tallahassee and Jacksonville locations agreed with this assessment, and stated that they would take appropriate corrective action.

    Subsequent to our audit, the Supervisor of the Jacksonville DNA Laboratory provided us with the onsite visit reports supplied to her by the FDLE Investigative and Forensic Science Services Director in Tallahassee documenting that onsite visits had been conducted in 2002 and 2004. These reports, and the fact that the Jacksonville DNA Laboratory Supervisor now has the documentation of onsite visits conducted, satisfactorily address our Jacksonville audit finding. Therefore, no further recommendation will be made regarding the Jacksonville DNA Laboratory. However, we recommend that the FDLE Tallahassee Investigative and Forensic Science Services management implement a policy that will ensure documentation is provided to relevant FDLE system laboratories regarding contractor oversight in the future.
  • The Fort Worth Police Department, a co-grantee of the grant awarded to the TXDPS, had incomplete documentation to substantiate that its contractor laboratory complied with the QAS. At the time we conducted our audit in March 2004, the Laboratory Director, who was not in his current position when the outsourcing contract was implemented, was able to locate an onsite visit report from April 2002, copies of protocols and procedures that were supplied to them at the start of the contract, and accreditation documentation. However, this documentation contained no indication of a review of the contractor's QAS audits, or any indication that the contractor's on-going compliance with the QAS had been confirmed.
  • The Houston Police Department, a co-grantee of the grant awarded to the TXDPS, could not produce sufficient documentation that an onsite visit of its contractor laboratory meeting the requirements of the QAS had been conducted. Travel vouchers for a site visit to the contractor laboratory were provided, but no site visit report could be located. Therefore, we could not determine whether the site visit included the level of review required by the QAS.


Incomplete or Outdated Policies and Procedures

In addition to reviewing supporting documentation of contractor oversight, we also reviewed the policies and procedures in place that govern the transfer of evidence between the grantee laboratories and the contractor laboratories, and the tracking, safeguarding, and analysis of that evidence. We specifically examined whether sufficient policies and procedures were in place to ensure that the chain-of-custody was properly maintained throughout the transfer process, and to ensure that the policies, procedures, and facilities governing the storage, analysis, and tracking of the evidence were consistent with the QAS Sections 6 (Facilities) and 7 (Evidence Handling). Among other requirements contained in these sections, the QAS require a laboratory to have a facility that is designed to provide adequate security and minimize contamination (Standard 6.1), and to have and follow a documented evidence control system to ensure the integrity of physical evidence (Standard 7.1).30

Our audits revealed that three grantee laboratories and three contractor laboratories had incomplete or outdated policies or procedures regarding either chain-of-custody or evidence handling.

Grantee Laboratory Deficiencies

    Florida Department of Law Enforcement, Jacksonville Regional Operations Center

    We found that the FDLE Jacksonville Regional Operations Center DNA Laboratory had an evidence control system. However, while the system appeared adequate to address the processing of evidence within the laboratory, it did not contain any specific guidance for the outsourcing process.

    For example, the system did not contain evidence-handling policies that clearly described how the samples were being packaged by the DNA section for submission to the contractor laboratory, or how the chain-of-custody documentation was being maintained. Also, the electronic evidence tracking system was not designed to permit employees to check evidence items out of the system and submit them to the contractor for analysis. Instead, according to laboratory management, the tracking system stated that each outsourced item had been "returned to the submitter," which was the only option programmed into the computer for instances when evidence was leaving the laboratory. The laboratory maintained manual documentation of the chain-of-custody, but without a written policy explaining how this was to be handled, it was not clear how the chain-of-custody was being accounted for.

    Laboratory management cannot ensure that staff know and comply with the established procedures unless those procedures have been formalized in writing for staff reference. Therefore, while the procedures described to us for the outsourcing process appeared adequate to prevent loss or abuse to the evidence during this process, we recommend that these procedures be described fully in a formally approved and implemented written policy.

    Nassau County Police Department

    As co-grantees of the Program grant to the New York State DCJS, the Nassau County Police Department (NCPD), in cooperation with the Nassau County Office of the Medical Examiner (OCME), outsourced no-suspect cases to Orchid Cellmark in Germantown, Maryland. The outsourcing was handled so that the evidence was sent directly from the NCPD to the contractor, and the results of the analysis were reviewed by the OCME. Based upon these arrangements, the evidence handling portion of our audit was conducted strictly at the NCPD.

    Various procedures for tracking, handling, and storing the evidence were described to us by NCPD staff while we physically reviewed their facilities. These procedures appeared to be sufficient to account for and safeguard the evidence that was being outsourced, both prior to being sent out to the contractor and after it was returned by the contractor.

    However, we were not able to locate all of these procedures in documented policies. We were provided with evidence handling policies that constitute the laboratory's evidence control system required by QAS 7.1, but these policies did not detail all of the procedures staff stated were in use to minimize contamination and document the details of evidence being sent to the contractor. Further, those policies do not reflect the current electronic evidence tracking system in place. The laboratory did have separate procedures - albeit not formally completed - for the use of the electronic evidence tracking system. However, laboratory management cannot ensure that staff know and comply with the established procedures unless those procedures have been formalized in writing for reference.

    We recommend that all the current procedures in use in the outsourcing of no-suspect casework evidence be described in detail in formalized written policies.

    Houston Police Department

    The Houston Police Department's evidence control system did not address the policies or procedures used when transferring evidence between its laboratory and the contractor laboratory. We asked the Assistant Laboratory Director about these policies and procedures, and she responded that while the laboratory has procedures for this process, they are not contained in the current formalized policies. She further provided us with a written description of the procedures in use, as well as a form that is used to document the chain-of-custody of evidence items being sent to and received from the contractor laboratory.

    The procedures described by the Assistant Laboratory Director appear generally sufficient to address the transfer of evidence to the contractor laboratory. However, they lack detail regarding the process followed when the evidence is returned. Further, the procedures must be formalized in writing for laboratory management to be able to ensure that all staff comply with them.

    Therefore, we recommend that a comprehensive written policy be developed that contains all aspects of the outsourcing transfer of evidence, and that such a policy be formally approved and implemented.

Contractor Laboratory Deficiencies

    The Bode Technology Group, Inc.

    The Bode Technology Group, Inc.'s (Bode) chain-of-custody policy, part of its compliance with QAS 7.1, appeared adequate to track the movement of evidence within the laboratory. However, we determined that the way in which staff were applying the chain-of-custody policy was not sufficient to adequately document transfers within the laboratory.

    Specifically, the policy states that when a case has been processed it must be repackaged and returned to the custody of the Evidence Custodian. While this is being done in practice, the documentation does not reflect this transfer. Instead, the documentation shows that the laboratory personnel still have custody of the items, even while the items are in the Evidence Custodian's custody and control and are therefore inaccessible to laboratory personnel. While this situation does not pose a concern as to the safety or security of the evidence, it does raise a concern about the completeness of the chain-of-custody documentation maintained for casework clients.

    In discussing this issue with Bode management, they agreed with our finding. After our audit in January 2004, Bode personnel provided us with documentation that staff had been informed of the new procedure requiring them to formally return custody to the Evidence Custodian on the chain-of-custody form. They also provided us with copies of the new Standard Operating Procedures (SOPs) governing chain-of-custody, that include clarification of the policy. We consider this documentation sufficient to address this deficiency and therefore make no further recommendation regarding Bode's chain-of-custody procedures.

    In addition, we could not determine from a review of Bode's evidence control system which of the evidence handling SOPs applied to the high-throughput (i.e., high-volume) casework environment under which the no-suspect cases are processed. For example, despite a protocol for the photographing of evidence, it was evident during the laboratory tour, from what we observed and were told, that high-throughput casework items are not photographed. Consequently, it is not clear whether compliance weaknesses might exist in those areas where staff practices are inconsistent with a policy, since that policy may or may not apply to the high-throughput casework environment.

    Management agreed with this finding, and following our audit provided to us a revised Forensic Evidence Handling SOP that clarified which of the procedures applied to individual casework and which were applicable to all casework, including high-throughput. We consider this documentation sufficient to address this deficiency and therefore make no further recommendation regarding Bode's evidence handling procedures.

    Finally, we found that Bode's policies and procedures for cleaning and decontamination of the laboratory (in compliance with QAS 6.1, among other requirements) appeared to adequately address these topics, with one minor exception. The laboratory contains a windowed cutout in the wall, also referred to as a pass-through, between the pre-amplification room (considered a "clean" area) and the post-amplification room (considered a non-"clean" area). The pass-through allows the transfer of tube trays between the two rooms with minimal risk of cross-contamination. However, there were no policies or procedures regarding the cleaning of the pass-through. While this poses a limited risk to the laboratory, since contamination incidents are tracked and would reveal whether the pass-through area has caused contamination problems, we consider this to be a point of inconsistency with the remainder of Bode's policies. Therefore, we recommended that Bode management implement a policy for the cleaning of the pass-through, and they agreed that such a policy would be implemented.

    Orchid Cellmark, Germantown, Maryland

    Our examination of procedures of the Orchid Cellmark Laboratory in Germantown, Maryland, revealed that while the laboratory had an evidence control system as required by QAS 7.1, actual practices of staff were inadequate to ensure that evidence is properly secured immediately after it is delivered to the laboratory. Specifically, evidence arriving at the lab is received into the reception area, which is accessible to the general public. After the item is logged in by the receptionist, the item is not immediately secured or moved to a limited access area. Further, we observed that there are times when the receptionist takes a short break and the reception area may be briefly unattended.

    While management acknowledged this latter situation, they stated that for breaks of any length someone fills in for the receptionist. Our observations during fieldwork supported this assertion. In addition, management stated that they would be able to hear from their offices - which adjoin the reception area - when the door clicks open to signify that someone has entered, and would check the situation if the receptionist was momentarily absent. However, we question whether the security of the evidence should rely upon such methods.

    Therefore, while we acknowledge that the evidence packages arrive sealed, and while we acknowledge that the reception area is generally monitored by someone who is physically present, we believe that Orchid Cellmark’s evidence storage policies would be strengthened by requiring that evidence, after being received and logged in by the receptionist, be immediately placed in a limited access or secure area while awaiting the attention of technical personnel.

    Orchid Cellmark, Dallas, Texas

    We found that Orchid Cellmark (Dallas) policies and procedures for cleaning and decontamination of the laboratory (in compliance with QAS 6.1, among other requirements) appeared to adequately address these topics, with one minor exception. The laboratory contains a pass-through, similar to the one described previously at Bode, that allows transference of tube trays between the pre-amplification room and the post-amplification room. In addition, the pass-through is equipped with ultra-violet light that can be switched on to decontaminate the pass-through. However, we noted that there were no policies or procedures requiring the use of the ultra-violet light, nor an indication of how frequently this should occur.

    As at Bode, the omission of this information from the policies poses a limited contamination risk to the laboratory, and we consider it a point of inconsistency with the remainder of the laboratory's policies. Therefore, we advised Orchid Cellmark management to implement a policy for the cleaning of the pass-through, and they agreed to do so.


Recommendations

Our recommendations below reflect the structure of the Program in which the primary grantee in each state serves as a liaison between OJP and the co-grantees. Therefore, our recommendations to correct deficiencies at co-grantee laboratories are directed to each state's primary grantee.

Further, deficiencies at a contractor laboratory must be resolved by the grantee laboratory that used that contractor's services, a fact also reflected in our recommendations. Therefore, we addressed our recommendations for contractor laboratories to the relevant grantee personnel.

    We recommend that OJP:

Florida

  1. Ensure the FDLE, Investigative and Forensic Science Services, Tallahassee, implements a policy to routinely distribute a copy of contractor oversight documentation to all laboratories participating in its outsourcing contracts;
  2. Require the FDLE, Investigative and Forensic Science Services, Tallahassee, to ensure that the Jacksonville Regional Operations Center, as a co-grantee of the grant made to Florida, creates a comprehensive policy that contains the current procedures in use for the outsourcing of no-suspect casework evidence, and formally approves and implements that policy;
  3. Require the FDLE, Investigative and Forensic Science Services, Tallahassee, to ensure that the Bode Technology Group implements a policy for the cleaning of the pass-through that exists between the pre-amplification and post-amplification areas;

Texas

  1. Require the TXDPS to ensure that the Fort Worth Police Department, as a co-grantee of the grant made to Texas, begins maintaining records to substantiate its vendor's ongoing compliance with the QAS;
  2. Require the TXDPS to ensure that the Houston Police Department, as a co-grantee of the grant made to Texas, completes and documents an onsite visit to its contractor laboratory sufficient to meet the requirements of the QAS;
  3. Require the TXDPS to ensure that the Houston Police Department, as a co-grantee of the grant made to Texas, creates a comprehensive policy that contains the current procedures in use for the outsourcing of no-suspect casework evidence, and formally approves and implements that policy;
  4. Require the TXDPS, through the Fort Worth Police Department, as a co-grantee of the grant made to Texas, to ensure that Orchid Cellmark in Dallas, Texas, implements a policy for cleaning of the pass-through that exists between the pre-amplification and post-amplification areas;

New York

  1. Require the New York State DCJS to ensure that the NCPD, as a co-grantee of the grant made to New York, creates a comprehensive policy that contains the current procedures in use for the outsourcing of no-suspect casework evidence, and formally approves and implements that policy; and
  2. Require the New York State DCJS to ensure that Orchid Cellmark in Germantown, Maryland, implements a policy requiring that evidence, after being received and logged in by the receptionist, is immediately placed in a limited access or secure area while awaiting the attention of technical personnel.

  1. Allowability of Costs Charged to Program Awards
  2. We assessed the allowability of costs charged to Program awards by the four grantees we audited. While we found that they materially complied with most award requirements, we noted minor deficiencies at all four grantees and found costs charged to Program awards that were unallowable and/or unsupported. As a result, we questioned costs of $111,297 out of a total of approximately $13.5 million awarded, and made nine recommendations. In addition, we assessed whether selected grantees/co-grantees complied with Solicitation requirements pertaining to costs paid to contractor laboratories and found that one co-grantee was overpaying for the services received from its contractor laboratory. Consequently, we questioned $44,640 in unallowable costs out of a total award of approximately $5 million.

The first year of the Program was designed to provide states with funds to analyze backlogged no-suspect casework DNA, either through in-house analysis or outsourcing, and thereby to build laboratory capacity. We selected four grantees to audit, based on the award amount and on the amount of funds drawn down as of the start of our audit, and conducted a separate grant audit and issued a separate audit report for each.31 The four grantees were: 1) the Ohio Bureau of Criminal Identification and Investigation (Ohio BCI&I), 2) the Texas Department of Public Safety (TXDPS), 3) the Florida Department of Law Enforcement (FDLE), and 4) the New York State Division of Criminal Justice Services (DCJS). In addition, at each of these locations we assessed whether selected grantees/co-grantees complied with Solicitation requirements pertaining to costs paid to contractor laboratories. The specific work conducted at each site, including the scope and methodology of each audit, is detailed in Appendix I of this report.

The four grantees received a total of approximately $13.5 million to analyze 10,874 no-suspect cases and to build capacity in their labs. As of May 31, 2004, these grantees had drawn down approximately $5.9 million, or 44 percent of their awarded funds. The following is a summary of the findings from each of the audits of these four grantees.


Ohio Bureau of Criminal Identification and Investigation

The Ohio BCI&I is a division within the State of Ohio, Office of the Attorney General. The Ohio BCI&I was awarded $2,254,088 to analyze 3,068 no-suspect cases, to purchase supplies and equipment, and to identify old no-suspect cases for testing.

We reviewed the Ohio BCI&I's records to determine whether costs claimed for reimbursement were allowable, supported, and in accordance with applicable laws, regulations, guidelines, and terms and conditions of the award.

Our audit revealed that the Ohio BCI&I charged some unallowable costs to the award and did not have proper documentation to support all expenditures. As a result, we questioned $106,755 in costs that were unsupported or unallowable, or approximately 5 percent of the total funds awarded. Additionally, we noted that required financial status reports were not always submitted in a timely manner. We also found that the grantee drew down funds in excess of its immediate disbursement requirements.

Unsupported Costs

Salaries and fringe benefits for overtime worked on no-suspect casework were authorized to be paid from the award. To accomplish the goals outlined in its Program proposal, the Ohio BCI&I utilized co-grantees within the state of Ohio. The Canton-Stark County Crime Laboratory (Canton-Stark), one of these co-grantees, was approved for and received reimbursement from the Ohio BCI&I totaling $110,000. The funds for Canton-Stark were originally budgeted under the supplies budget category, but at Canton-Stark's request, Grant Adjustment Notice (GAN) 8 reallocated $95,497 from supplies to the personnel and fringe benefits budget category. However, after reviewing the personnel records, we found support for only $20,297 of the $95,497, resulting in questioned costs of $75,200.

The Miami Valley Regional Crime Laboratory (Miami Valley), another co-grantee used by the Ohio BCI&I, was authorized in the grantee's budget worksheet to pay overtime in the amount of $8,000 for the analysis of no-suspect cases. At the time of our audit, Miami Valley had been reimbursed a total of $8,000 for overtime by the Ohio BCI&I. After reviewing the payroll records, we found support for only $5,102 of the $8,000, resulting in questioned costs of $2,898.

In addition, Canton-Stark was reimbursed $14,503 for supplies by the Ohio BCI&I. Using information we received from Canton-Stark, we concluded that it had spent only $9,494 on allowable supplies related to the testing of no-suspect cases. As a result, we questioned the remaining $5,009 as unsupported.
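
For clarity, the unsupported amounts questioned above follow directly from the figures cited in this subsection; the calculations below are our own restatement of that arithmetic (Canton-Stark personnel, Miami Valley overtime, and Canton-Stark supplies, respectively), not amounts drawn from the co-grantees' records:

\[
\begin{aligned}
\$95{,}497 - \$20{,}297 &= \$75{,}200 \\
\$8{,}000 - \$5{,}102 &= \$2{,}898 \\
\$14{,}503 - \$9{,}494 &= \$5{,}009
\end{aligned}
\]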

Unallowable Costs

We found one transaction for $23,648 that was unallowable. The grantee purchased 20,000 buccal swabs from Bode with $23,648 of the award funds, but this purchase was not approved in the budget and the items did not relate to functions performed under the Program.
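
Taken together, the unsupported and unallowable amounts identified above account for the total questioned costs reported for this grantee; the reconciliation below is our own arithmetic check of those figures, not an amount drawn from the Ohio BCI&I's records:

\[
\$75{,}200 + \$2{,}898 + \$5{,}009 + \$23{,}648 = \$106{,}755
\]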

Untimely Financial Status Reports

We examined the Financial Status Reports (FSRs), which contain the actual expenditures and unliquidated obligations incurred for an award on a quarterly and cumulative basis. FSRs must be filed within 45 days of the end of each quarterly reporting period. We reviewed the FSRs for timeliness and accuracy, and found that 4 of the 5 FSRs were submitted between 5 and 19 days late.

Excess Drawdowns

Our review of total expenditures compared to drawdowns for the award found that the Ohio BCI&I had excess award funds totaling $201,674 on hand as of April 29, 2003. As of June 18, 2003, the Ohio BCI&I had excess award funds on hand totaling $236,578. Prior to our audit, the grantee realized that its methodology for drawing down funds was incorrect and, beginning in October 2003, it began to make smaller and more frequent drawdowns. Therefore, we did not make any recommendations regarding this matter.


Texas Department of Public Safety

The TXDPS was awarded $3,379,688 to analyze 3,160 no-suspect cases, and to pay for overtime, consultants for in-house analysis, and for outsourcing. In addition, funds were awarded to purchase equipment and supplies.

We reviewed the TXDPS's records to determine whether costs claimed for reimbursement were allowable, supported, and in accordance with applicable laws, regulations, guidelines, and terms and conditions of the award. We found that one Financial Status Report was inaccurate.

Inaccurate Financial Status Report

We reviewed the FSRs submitted for the period August 1, 2002 through December 31, 2003, for accuracy and timeliness. While the reports were submitted in a timely manner, one of the six reports reviewed incorrectly stated the total outlays.

The FSRs for the quarters ending December 2002 and December 2003 were overstated by $1,435 and $80,033, respectively. The FSRs for the quarters ending March, June, and September 2003 were understated by $346, $70, and $40, respectively. The TXDPS provided documentation showing that, with the exception of the December 2003 FSR, the discrepancies were due to temporary timing differences relating to when benefit expenditures were posted. The TXDPS did not agree that the FSR for the fourth quarter of 2003 was overstated by $80,033, but it agreed that the report was incorrect and submitted a revised report in May 2004.


Florida Department of Law Enforcement

The FDLE was awarded $2,795,086 to reduce the backlog of no-suspect cases in state and county crime laboratories, to analyze those cases using the 13 CODIS core loci, to expedite the entry of the resultant profiles into state and national CODIS networks, and to increase Florida's DNA analysis production capability and capacity. Included in this amount was funding for outsourcing of over 1,500 no-suspect cases, and funds to purchase equipment and supplies.

We reviewed the FDLE's records to determine whether costs claimed for reimbursement were allowable, supported, and in accordance with applicable laws, regulations, guidelines, and terms and conditions of the award.

Our audit revealed that the FDLE charged a relatively small amount of unallowable costs to the award. As a result, we questioned $4,542, or less than 0.2 percent of the total award. We also found that progress reports did not always accurately reflect actual Program activities. Finally, we noted a reportable condition relating to management controls over the approval of invoices from contractor laboratories.

Unallowable Costs

We found unallowable costs charged to the award by four co-grantees. First, the Broward County Sheriff's Office Crime Laboratory exceeded its allowable costs for salary and fringe benefits by $1,932 for five positions that were not in its approved budget.

Second, the Miami-Dade Police Department submitted a reimbursement request for equipment totaling $184 that was not approved by the Forensic Services Director. Also, a transaction totaling $100 was unallowable because the items purchased were not approved in the budget worksheet and were not related to functions performed under the award.

Third, a transaction totaling $786 charged by the Palm Beach County Sheriff's Office Crime Laboratory was unallowable because the purchased items were not approved in the budget worksheet and were not related to functions performed under the award. Specifically, the Sheriff's Office purchased a file cabinet for $241, office supplies for $469, and ink cartridges for $76.

Finally, a transaction totaling $1,540 charged to the award by the Indian River Community College Crime Laboratory was unallowable because the purchases were not approved in the budget worksheet. Specifically, the laboratory purchased a TLS PC Link Labeling System for $1,175, biodyne membrane for $214, and chemiluminescence reagent for $151.
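
The co-grantee amounts identified above reconcile to the total questioned costs for the FDLE award; the sum below is our own arithmetic check of those figures, not an amount drawn from the FDLE's records:

\[
\$1{,}932 + \$184 + \$100 + \$786 + \$1{,}540 = \$4{,}542
\]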

Inaccurate Progress Reports

We noted that the progress report for the period ending June 30, 2003, understated the number of CODIS hits by 14. In addition, the progress report for the period ending December 31, 2003: 1) overstated the number of cases outsourced to contractor laboratories by 1,867; 2) understated the number of cases uploaded into CODIS by 62 cases; and 3) overstated the number of CODIS hits by 16. Grantee officials concurred and stated that they would ensure that future reports were accurate.

Inadequate Controls Over Contractor Invoice Approval

We noted that FDLE's management controls over the approval of invoices from contractor laboratories were inadequate. Prior to our audit, FDLE officials did not have adequate procedures in place to ensure that the FDLE was charged only for the portion of work actually completed by contractor laboratories. During the audit, the Forensic Services Director revised the procedures to ensure that the FDLE was paying only for services actually performed by each contractor laboratory. Under the revised procedures, each Serology Supervisor is required to verify and certify each invoice for payment before the contractor laboratories are paid.


New York State Division of Criminal Justice Services

The New York State DCJS is the principal coordinating agency for criminal justice activities in the state of New York. The DCJS was awarded $5,039,535 to analyze 3,146 no-suspect cases, to upload the resulting profiles to CODIS, and to compare the profiles to the CODIS convicted offender database. Included in the award was funding for overtime, consultants for in-house analysis and for outsourcing, and equipment and supplies.

We reviewed the DCJS's records to determine whether costs claimed for reimbursement were allowable, supported, and in accordance with applicable laws, regulations, guidelines, and terms and conditions of the award.

Our audit revealed that the FSRs submitted to OJP did not always accurately reflect actual cumulative outlays. In addition, we found that the budget information submitted by one of the co-grantees was inadequate. Finally, we determined that billing arrangements between a co-grantee of the Program award to DCJS and its contractor laboratory were not consistent with Program requirements.

Inaccurate Financial Status Reports

We reviewed the four FSRs submitted by the DCJS and found the reports were submitted in a timely manner. However, we found that the FSRs underreported cumulative outlays incurred.

The underreporting occurred because some of the state's co-grantees did not report outlays to the DCJS in a timely manner. The DCJS acts as the executive agent for all of the state's co-grantees and, each quarter, completes and forwards to the NIJ a consolidated FSR covering all of them. The co-grantees report their outlays to the DCJS on a state financial reporting form that is similar to the federal FSR, and the DCJS relies on those state forms for the outlay information it submits. Consequently, for the federal FSR to be accurate, the state financial reporting forms must be submitted in a timely manner so that each co-grantee's outlays can be accurately reflected on the federal FSR.

The DCJS reported cumulative total outlays of $392,187 on the federal FSR for the period ending December 31, 2003. At the time of our audit, seven co-grantees had reported award outlays on the state's financial reporting form as of December 31, 2003, and three of the seven had not reported their outlays to the DCJS in a timely manner. As a result, we determined that cumulative total outlays for the period ending December 31, 2003, were actually $681,390, rather than the $392,187 reported on the federal FSR.
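
By our arithmetic, restating the two figures above rather than citing an amount from the DCJS's records, the cumulative outlays were underreported by:

\[
\$681{,}390 - \$392{,}187 = \$289{,}203
\]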

Inadequate Budget Documentation

We reviewed the financial records of each co-grantee and found that one had approved budget documentation that included only one rate of pay for personnel. However, we found expenditures for several different personnel categories. Therefore, we were unable to establish whether the personnel expenditures were approved in the co-grantee's budget and, as a result, we could not determine if that co-grantee accurately expended award funds.

Unallowable Costs

One aspect of our overall assessment of grant activities was to review controls over payments made to contractor laboratories, with a particular focus on compliance with the Program Solicitation, which requires that state applicants ensure that their contractor laboratories are paid only for the portion of the work that they perform. The Solicitation further states that: "funds from the Program cannot be used to pay laboratories for fully processing samples when certain steps (in the analysis process) were not performed. . . . The compensation given to the outsourcing laboratory should be fair, and directly reflect the effort and cost put forth by the laboratory in processing the case/sample."

While completing this portion of our work, we determined that one laboratory, a co-grantee under the grant awarded to the DCJS, was overcharged for the work actually performed by its contractor. Specifically, OJP approved the DCJS and its co-grantees to pay a flat rate per case for cases processed by their contractor laboratories, with the limitation that the cases would be screened by the grantee laboratories.32 One of these co-grantees was the Nassau County Police Department (NCPD). However, during the delay between New York's Program application and its award, the NCPD's operations had changed so that, by the time of the award, it no longer had the proper facilities or staff to screen the evidence itself. Consequently, the NCPD began sending unscreened evidence to the contractor for the no-suspect cases funded under the Program.

In addition, the NCPD worked in cooperation with the Nassau County OCME to complete the outsourcing process: the OCME was responsible for overseeing interactions with the contractor (e.g., QAS oversight and billing) and for reviewing the contractor's data. These cases were sent out under a contract that had been negotiated and administered by the New York State Police prior to the Program award. Since the OCME did not directly negotiate a price structure with the contractor, that structure did not reflect the fact that the OCME was not screening its cases before sending them out. Consequently, the OCME cases that were screened by the contractor and determined to be negative for DNA were charged the same price as cases for which complete analysis was required and full results were obtained. At the time we conducted our audit work in April 2004, the OCME had paid for complete DNA analysis of 48 cases that, in actuality, were only screened.

Although the Solicitation does not require laboratories to perform all screening in-house, we were concerned that the NCPD was paying a flat fee for cases analyzed, regardless of whether full analysis was completed. Such an arrangement violates the Program Solicitation's requirements and limitations on contractor payments. In addition, the high percentage of these cases that were negative for DNA made this issue more significant: we determined that analysis could not be completed, due to insufficient DNA, for 51 percent of the cases outsourced as of our fieldwork in April 2004.

Therefore, we questioned $44,640 as unallowable costs for the 48 cases for which the contractor had been paid for complete analysis when only screening was performed.33 In discussing this with the DNA laboratory management at the OCME, we were informed that the OCME had decided to use the existing New York State Police contract to avoid the significant delays that would have come with executing its own contract through the local procurement process. The DCJS's no-suspect grant point-of-contact, the Director of the Office of Forensic & Victim Services, stated that the DCJS's goal was to expeditiously outsource the no-suspect cases. Using the New York State Police contracts allowed the OCME to avoid negotiating separate contracts of its own. However, both the OCME and the DCJS management we spoke with agreed with our conclusions and stated that they would seek to remedy the situation with both the contractor and the NIJ.
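
As detailed in footnote 33, the questioned amount represents the difference between the full-analysis price actually paid and the estimated screening-only price for the 48 cases; the calculation below simply restates the figures from that footnote:

\[
48 \times \$1{,}180 - 48 \times \$250 = \$56{,}640 - \$12{,}000 = \$44{,}640
\]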


Recommendations

We issued to OJP a separate audit report for each of the four grantees audited; these reports contained a total of 13 recommendations.34 Because OJP is working with these grantees to respond to our audit findings, we will not provide additional recommendations to address those findings in this report. However, for the one issue in Nassau County that was not previously reported, we recommend that OJP:

  1. Ensure that the New York State DCJS remedies questioned costs of $44,640 for the Nassau County Police Department cases for which the contractor laboratory was paid for complete analysis but only screening was performed.


Footnotes

  21. A profile's completeness is determined by whether it contains all of the points of information that the FBI requires for a profile to be included in NDIS. Therefore, we included only complete profiles in our productivity calculations.

  22. Ohio did not join NDIS until November 2000 and had a major computer system malfunction in 2001, so no profiles went to NDIS in those years. However, the profiles Ohio uploaded to SDIS from 2000 through 2003 numbered 136, 558, 1,099, and 2,084, respectively.
  23. Due to the unique circumstances regarding the Fort Worth Police Department's inability to upload profiles to NDIS, we excluded its results from this analysis. This issue is further discussed in Finding II of this report.
  24. The Fort Worth Police Department contracted with the University of North Texas for the data review and upload to CODIS.
  25. Effective July 7, 2004, the General Accounting Office (GAO) became the Government Accountability Office. The acronym remains the same.
  26. GAO Report No. 01-198, titled Better Performance Measures Needed to Assess Results of Justice's Office of Science and Technology, dated November 2003.
  27. The prior OIG audit report, titled The Office of Justice Programs Convicted Offender DNA Sample Backlog Reduction Grant Program, Report No. 02-20, was issued in May 2002.
  28. See Appendix III for further information regarding Program-specific requirements.
  29. We excluded from this analysis states that had begun to draw down more than trace amounts of their grant funds by the time they were awarded their second-year grant.
  30. While our audit work was designed to review whether policies existed, we were dependent upon laboratory staff and management descriptions of the procedures in use for an indication of actual practices within the laboratory. Therefore, while we could confirm whether appropriate policies and procedures were in place, we could not attest to the ongoing practices of staff within the laboratory.
  31. Audit reports issued are identified in Appendix I of this report.
  32. To screen a case requires a laboratory to determine, through visual inspection and/or preliminary tests, which case samples are most likely to yield sufficient DNA for successful analysis. Screening the cases prior to sending them to a contractor laboratory generally means that there will be a greater level of success during analysis in obtaining a DNA profile from each sample.
  33. Questioned costs of $44,640 were calculated by multiplying the number of "screening only" cases (48) by the estimated price for screening only of $250 per case and subtracting that amount from the price actually paid (48 x $1,180), with the remainder being the portion we questioned. The $250 screening price was provided to us by the Executive Director of Orchid Cellmark (Maryland) as the upper end of the range of prices typically quoted to forensic casework contract clients.
  34. See Appendix I for specific information regarding these separately issued reports.