Follow-Up Audit of the Bureau of Alcohol, Tobacco, Firearms and Explosives
Forensic Science Laboratories Workload Management
Audit Report 06-15
Office of the Inspector General
The DOJ OIG found the ATF laboratories had slightly improved the processing times for forensic analysis since the time of the prior audit conducted by the Treasury OIG in 2001. However, approximately two-thirds of completed examinations continued to take more than 30 days to complete. Approximately 37 percent of the completed examinations we tested were completed within 30 days, compared with 33 percent found in the Treasury audit for FYs 1998 and 1999. The laboratories did reduce the percentage of examinations that took more than 90 days to complete, from 43 percent during the Treasury OIG audit to 33 percent during the DOJ OIG audit.
The improvements in timeliness of laboratory examinations were limited because the ATF had not yet accomplished several actions that were planned in 2001. These actions included implementing a revised priority system to expedite services more effectively, increasing the number of examiner positions at the forensic science laboratories, and implementing the planned information system that was expected to improve communications between field offices and the laboratories. The ATF also had not significantly reduced the size of its backlog of examination requests, causing incoming requests to be put on hold for analysis while backlogged requests were handled.
ATF field offices, and occasionally state and local authorities, submit evidence to the forensic laboratories along with information on evidence submission forms, including specific requests for examinations. Agents may submit several batches of evidence, called submissions, at different times during an investigation. Each submission may include multiple items to be tested, and each item may need more than one type of examination. For example, nearly all guns on which firearms examinations are performed are also tested for latent fingerprints.
Submissions received at the laboratories are initially processed by an evidence technician, who records the information in FACETS. FACETS produces an evidence control card that is used, in part, to track the location of evidence in the laboratory. The evidence control card is forwarded to a section supervisor, who assigns an examiner to perform the test. Most evidence submissions are processed on a first-in, first-out basis. Special Agents who need laboratory results quickly, or by a specific date, may request expedited service using the evidence submission form. If expedited service is requested, which was the case in approximately 30 percent of our sample, the section supervisor will review the supporting documentation to determine whether the circumstances justify moving the request ahead of other pending examination requests. The supervisor may contact the submitting agent to discuss the request. If the supervisor determines the submission warrants expedited service, then it will be processed ahead of other submissions.15
Tests on submissions that require more than one type of analysis are processed in the order determined appropriate by the laboratory. This is because one type of examination may destroy some evidence vital to another examination. For instance, explosives fragments are always examined by a chemist in the explosives section of the laboratory before a fingerprint specialist examines the fragments for latent prints. A fingerprint specialist always examines a firearm for latent prints before a firearms examination is performed. Examiners from the various disciplines routinely coordinate with one another, and with the submitting agents, in the handling of evidence. Occasionally, Special Agents cancel requests for examinations before the examinations are completed. This may happen as a result of a defendant’s pleading guilty, which makes the forensic examination unnecessary.
The Treasury OIG used a 30-day standard to define timely service in its audit report in 2001. It asserted that the goal was established by laboratory officials during the early 1980s as a target timeframe for completing examinations. However, the formal performance standard that existed in the Office of Laboratory Services’ Operating Plan for 2005 reflects the expectation that the 30-day goal would not be achieved for all examinations. The Operating Plan for 2005 called for requested examinations to be completed within 30 days in 30 percent of firearms cases, 35 percent of explosives cases, and 40 percent of arson cases during FY 2005. The plan also projected an increase of 5 percentage points for firearms and arson cases for each of the two subsequent fiscal years.
The Director of Laboratory Services told the DOJ OIG that he has adopted a long-term goal that was generally accepted among crime laboratory directors to complete all examinations within 30 days of receiving the evidence. We found that a 30-day turnaround time was commonly identified by the forensic community as a standard that forensic laboratories should try to achieve, but that generally was not met for 100 percent of examinations because of resource constraints. The following are examples of a 30-day standard.
We consider a goal of 30 days for the results of most forensic examinations a reasonable target toward which laboratories should work, but we recognize that the standard is not being met and is not a realistic expectation for all examinations under current conditions. We use it in this report to compare our findings with the Treasury OIG audit and to demonstrate how far the ATF is from meeting this long-term goal. The 30-day goal is also reasonable because of the potentially serious consequences of delayed forensic examinations, which include the costs of wasted investigative time and delayed trials, and the more serious possibility that additional crimes may be committed by offenders who are not identified and arrested quickly.
Timeliness for the Period October 1, 2003, through May 13, 2005
To evaluate timeliness, the DOJ OIG obtained a data extract from FACETS that included information on all forensic examinations completed from October 1, 2003, through May 13, 2005 (the date the extract was performed). The extract also included data on all examinations that were pending or incomplete as of May 13, 2005. The extract included data elements for the laboratory receipt date of evidence submissions and examination finish dates, which we used to determine how long each examination took. In order to determine the reliability of the data extracted from FACETS, we compared FACETS data values with the source documents in case files. We determined that the FACETS data was sufficiently accurate and reliable for our use.20 Our timeliness findings are primarily based on our analysis of the FACETS data.
The laboratory data we analyzed included information on 3,757 cases. These cases were associated with 4,905 evidence submissions and 5,733 requested examinations. Of the 5,733 requested examinations, 4,576 (nearly 80 percent) were completed and 1,157 (20 percent) were pending on the date of the extract. Excluded from these numbers were 591 examination requests that had been canceled at the time of the extract.21
As shown in the following table, of the total 4,576 examinations completed between October 1, 2003, and May 13, 2005, the DOJ OIG found that 1,705 (37 percent) were completed within 30 days and that 1,490 (33 percent) took more than 90 days to complete. The other 30 percent were completed between 1 and 3 months after receipt. The average turnaround time for these examinations was 95 days.22
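As an illustration of how turnaround times can be derived from receipt and finish dates such as those in the FACETS extract, the following sketch uses three hypothetical examination records; the actual extract fields and values are not reproduced here.

```python
from datetime import date

# Hypothetical examination records: (laboratory receipt date, finish date).
examinations = [
    (date(2004, 1, 5), date(2004, 1, 20)),   # 15 days
    (date(2004, 2, 1), date(2004, 3, 20)),   # 48 days
    (date(2004, 3, 1), date(2004, 7, 15)),   # 136 days
]

# Turnaround time is simply finish date minus receipt date, in days.
turnarounds = [(finish - receipt).days for receipt, finish in examinations]

within_30 = sum(1 for d in turnarounds if d <= 30)
over_90 = sum(1 for d in turnarounds if d > 90)
between = len(turnarounds) - within_30 - over_90

print(f"within 30 days: {within_30 / len(turnarounds):.0%}")   # 33%
print(f"31-90 days:     {between / len(turnarounds):.0%}")     # 33%
print(f"over 90 days:   {over_90 / len(turnarounds):.0%}")     # 33%
print(f"average:        {sum(turnarounds) / len(turnarounds):.0f} days")
```

The report's figures (37 percent within 30 days, 33 percent over 90 days, a 95-day average) follow from the same categorization applied to all 4,576 completed examinations.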
Examination Processing Time
The following chart compares the processing times for the three laboratories on completed examinations included in our FACETS extract. As the chart indicates, all three laboratories completed examinations at roughly the same rate.
We also evaluated how long the laboratories took to complete requested examinations by type. As the following chart shows, the highest percentages of examinations completed within 30 days were for arson and firearms. As discussed in the Backlog section of this report, arson and firearms had the smallest backlog of unexamined evidence in terms of backlog in months.
Percentage of Examinations Completed Within 30 Days
We compared the percentages of arson, firearms, and explosives examinations completed within 30 days with the targets established by the Office of Laboratory Services Operating Plan for FY 2005 for arson, firearms, and explosives cases.24 For arson and firearms examinations, the laboratories outperformed the established targets of 40 and 30 percent, respectively. However, only 25 percent of explosives examinations were completed within 30 days, compared to the target of 35 percent.
The preceding chart reflects data for all of FY 2004 and for FY 2005 through the date of our data extract, May 13, 2005. We also obtained summary data for all of FY 2005 from the Office of Laboratory Services. The differences between the two periods are compared in the following table.
Percentage of Examinations Completed Within 30 Days
For FY 2005, the reported percentages of arson and firearms examinations completed within 30 days were slightly lower than those reported in the preceding table for our data extract, which included all of FY 2004 and only a portion of FY 2005, but continued to meet the targets set in the Operating Plan. The percentage of explosives examinations completed within 30 days continued to fall below the 35 percent target. According to the Director of Laboratory Services, the Office of Laboratory Services was able to exceed its target in arson and firearms cases because it had sufficient staff in these disciplines and because a greater portion of these cases included requests for expedited service. Conversely, the Office of Laboratory Services was not able to meet its target for explosives cases because of a shortage of forensic chemists and fingerprint specialists.
Our analysis of expedited processing times is included in the Priority System section of this report. We did find that a larger percentage of expedited requests were processed within 30 days than was the case for all examination requests.
Improvement in Timeliness
The following chart compares, by percentages of completed examinations reviewed, the processing times in the Treasury OIG report to the processing times the DOJ OIG found. This comparison is not based on precisely comparable data since the Treasury Department’s audit included 323 completed examinations from cases in which all examinations had been completed, and the DOJ’s review included all 4,576 examinations completed during a specific period, regardless of the status of other examinations requested in each case.25 However, this comparison provides a general assessment of the changes between the 1998-1999 sample of completed examinations and the DOJ OIG’s extracted data for all completed examinations between October 1, 2003, and May 13, 2005.
Time to Complete Forensic Examinations
The Treasury OIG found that 67 percent of the sampled examinations were not completed within 30 days, compared with the DOJ OIG finding of 63 percent, and that 43 percent of the Treasury OIG’s sampled examinations took more than 90 days to complete, compared with the DOJ OIG finding of 33 percent. The total percentage of examinations completed within 90 days improved from 57 percent in 1998-1999 to 67 percent for the period covered in the DOJ OIG data extract. Despite this improvement, 63 percent of examinations, or almost two-thirds, were still not being completed within 30 days at the time of the DOJ OIG audit.
It is crucial that forensic examinations be of high quality to ensure reliable results. While the issue of quality was not addressed in the Treasury OIG’s 2001 audit, the DOJ OIG performed a limited assessment to ensure that the ATF had a quality assurance process in place.
We found the ATF had a quality assurance process that the Office of Laboratory Services followed to ensure the quality of its services. The three regional forensic laboratories are accredited and internal quality reviews are performed periodically by the Office of Laboratory Services.
The American Society of Crime Laboratory Directors/Laboratory Accreditation Board (ASCLD/LAB) operates an accreditation program for forensic laboratories to demonstrate that laboratory management, operations, personnel, procedures, equipment, physical plant, security, and safety procedures meet standards established by the organization. The standards are those the ASCLD/LAB has determined are appropriate to support valid forensic results. According to ATF officials, in 1985 the ATF forensic laboratories became the first federal forensic laboratories to be accredited by the ASCLD/LAB.
Currently, each of the three forensic science laboratories is accredited by the ASCLD/LAB in the disciplines of firearms and toolmarks, latent prints, and trace evidence, which includes the arson and explosives work performed by the laboratories. Additionally, since both the Washington and Atlanta laboratories perform questioned document analysis, they are also accredited in that discipline. Each accreditation covers 5 years.
We also reviewed the Office of Laboratory Services Policy and Procedures Guidelines to determine the ATF's policy regarding quality assurance reviews. According to ATF policy, a quality audit of each laboratory is to be conducted annually by a team selected by the Director of Laboratory Services and trained by the Quality Manager. Each audit is to verify that laboratory operations continue to comply with the requirements of its quality system. The reviews are performed using Laboratory Services Policy and Procedure Guidelines; discipline-specific methodology documents; and the principles, standards and criteria established in the ASCLD/LAB Accreditation Manual. Written reports are prepared following each audit, and each laboratory chief is given an opportunity to respond to the report and state the corrective actions taken.
The DOJ OIG reviewed the quality reports for each of the three forensic science laboratories for FYs 2004 and 2005. Each report addressed areas such as the handling and storage of evidence, laboratory security, calibration and maintenance of equipment, and compliance with the Office of Laboratory Services’ scientific methods and technical procedures, including proficiency testing, training, controls over reference standards, and peer review of work. Written responses addressing corrective actions were required on each finding. Based on our review of these reports, we concluded that the Office of Laboratory Services had a quality assurance process in effect and that the forensic science laboratories were being held to its standards.
The potential negative effects of forensic results taking more than 30 days have been identified by various studies (including the studies listed in footnote 19), but not quantified. The most serious potential consequences are that delays in identifying suspects and making arrests allow offenders additional opportunities to commit crimes, thereby endangering the public. It is also possible that the failure to provide results on a timely basis could hamper a defendant’s right to a speedy trial. Other potential negative consequences of delayed forensic results include wasted investigative time and delayed trials, both of which diminish the efficiency of the criminal justice system.
Interviews with Special Agents
The Treasury OIG reported in 2001 that it was unable to quantify the effect processing delays had on ATF cases. However, based on interviews with eight Special Agents from around the country, the Treasury OIG reported that agents indicated that they needed quicker turnaround times for their evidence submissions and told auditors that processing delays made it more difficult for them to track leads and locate witnesses. The agents also said that because of these delays, they accepted fewer cases requiring forensic examinations from state and local officials than they might otherwise accept, and often outsourced simple examinations to state and local laboratories.
The DOJ OIG also obtained information from Special Agents regarding their satisfaction with the service provided by the regional forensic laboratories. We randomly selected 14 Special Agents who had received laboratory reports within the 90-day period prior to the start of our audit. To assess the impact of examinations that took a long time to complete, we confined our selection to those reports that were dated 180 days or more after the exhibits were first received in the laboratories. Additionally, we randomly selected another 8 Special Agents who had been waiting for laboratory reports on uncompleted examinations for more than 180 days. We conducted telephone interviews with these Special Agents and asked questions about the effect processing delays had on investigations and whether the agents ever used state and local laboratories instead of the ATF laboratories.
Based on these interviews, we found that delayed receipt of laboratory reports did not adversely affect investigations in most instances, primarily because of the absence of a suspect or because of a confession. However, about 30 percent of respondents stated that they were less than fully satisfied with the timeliness of the service provided by the laboratories. Additionally, because of delays at ATF laboratories, more than half of the Special Agents we contacted told us that they had at least some laboratory examinations (primarily latent prints) performed at state and local laboratories.26 Laboratory officials told us that they were aware that agents occasionally use state and local laboratories for some examinations but were unsure how often this occurs. The ATF does not systematically track requests for services that are sent to other laboratories.
Laboratory Surveys of Special Agents
To survey customers about their satisfaction with the management of the Office of Laboratory Services workload, the ATF laboratories sent customer satisfaction cards with returned laboratory reports to Special Agents who submitted evidence for analysis. The cards asked respondents to rate the laboratory’s performance in three areas: service, timeliness, and reports/statements. Service could be rated as “very good,” “good,” “poor,” or “very poor.” Timeliness and reports/statements could be rated “very satisfied,” “satisfied,” “fairly satisfied,” or “not satisfied.” The cards also included a few blank lines for comments.27
The Treasury OIG reported in 2001 it was unable to obtain a representative sample of the customer satisfaction cards to review because laboratory officials had stopped sending them out for comment. According to the audit report, officials told the Treasury OIG that the ATF stopped using the cards because the feedback received showed a general dissatisfaction with the amount of time it took to obtain test results.28
In response to the Treasury OIG report, the ATF re-implemented the practice of sending customer satisfaction cards to recipients with laboratory reports. The Washington and Atlanta laboratories attached a blank copy of the card to each laboratory report mailed out. The San Francisco laboratory sent an electronic version of the card on a quarterly basis to field divisions for which a report was issued during the previous quarter.
We reviewed the customer satisfaction cards at the regional laboratories and matched the responses in the cards to the universe of examinations we were reviewing. Only 23 percent of the recipients of the 4,576 laboratory reports included in our universe returned the customer satisfaction cards to the laboratories. The following tables show the ratings for the examinations we reviewed.29
Responses in Customer Satisfaction Cards Service
Responses in Customer Satisfaction Cards Timeliness
Responses in Customer Satisfaction Cards Reports/Statements
As noted in the preceding tables, respondents were overwhelmingly satisfied with the service and reports provided by the regional forensic laboratories. However, a minority (10 percent) of respondents expressed at least some dissatisfaction with the time it took the laboratories to complete examinations and issue reports. The following are examples of negative comments about timeliness taken from the cards.
While the comments above are negative, they represent only 10 percent of respondents. The other 90 percent of respondents said that they were either “satisfied” or “very satisfied” with the timeliness of service. This suggests that some results are not needed by investigators within 30 days, which is discussed further in the Priority System section of this report.
We believe that customer satisfaction cards provide one method for obtaining customer feedback, but the current questions posed by the laboratories need to be revised. More specific questions should be asked, such as whether the analysis was received in time to be of assistance on a case, or if it was not, what the negative effect was on the progress of an investigation or the outcome of a case.
Based on interviews with Special Agents and our analysis of feedback on customer satisfaction cards, we found that most agents were satisfied with the timeliness of service provided by the regional forensic laboratories. However, approximately 10 percent of Special Agents who returned customer satisfaction cards to the regional forensic laboratories voiced dissatisfaction with the timeliness of service provided. We did not identify any systemic quality problems or any instance in which the most serious potential consequences of delayed results had occurred.
In its 2001 report, the Treasury OIG determined that processing delays were caused by the laboratories’ large backlogs of examination requests, the inability to hire sufficient staff to keep pace with the workload, and the staff’s other duties outside the laboratories that competed for time, such as providing training and crime scene assistance. The DOJ OIG assessed the status of these causes and the related corrective actions from the Treasury OIG’s audit, including the ATF’s plan to implement a new priority system. The DOJ audit also evaluated ratings and comments about the timeliness of laboratory services from ATF Special Agents in field offices. With the exception of duties outside the laboratories, we generally found that the same causes identified by the Treasury OIG were still contributing to the processing times for forensic examinations and that the ATF had not implemented some of the corrective actions that were intended to address the causes. The following sections of the report address the causes for processing delays and the status of ATF corrective actions.
The DOJ OIG audit assessed the backlog of uncompleted examination requests by grouping them according to the length of time each had been in the laboratories.30 The following chart shows the distribution of the backlog by the length of time the examination requests had been on hand. The numbers at the tops of the bars are the total numbers of examination requests that had not been completed as of May 13, 2005. The percentages shown beside each bar are percentages of the total backlog.
Length of Time Pending Examinations On Hand
As the preceding chart indicates, 64 percent of pending examinations had been in the laboratory more than 90 days. We also determined that 47 percent of all pending examinations were more than 180 days old and 23 percent were a year or more old.
We further analyzed the backlog by type of examination to determine whether it had increased or decreased, and we then estimated how long it would take to eliminate the backlog completely at the current staffing level. To conduct our analysis, the results of which are shown in the following tables, we used data from the Laboratory Services Workload Report for FY 2005, dated September 30, 2005.32 We calculated the average number of examinations received and completed each month (columns C and D) by dividing the totals on the Workload Report by 12. To measure the backlog in months of work for comparison with information in the Treasury OIG report, we calculated the “Backlog in Months” (column E) by dividing the backlog (column B) by the average number of examinations completed per month (column D); incoming work was not factored into this calculation.33 We then determined whether the backlog had increased or decreased during FY 2005 (positive and negative numbers in column F). If the backlog had decreased, we calculated the number of months it would take to eliminate it (column G) by dividing the total backlog (column B) by the decrease per month (column F). This assessment assumed the laboratories would continue to complete examinations at the FY 2005 monthly completion rates (column D).
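The column arithmetic described above can be sketched in Python as follows, using hypothetical workload figures; the actual Workload Report numbers are not reproduced here, and the function name `backlog_metrics` is ours.

```python
def backlog_metrics(backlog, received_per_year, completed_per_year):
    """Reproduce the workload-table columns for one discipline."""
    received_per_month = received_per_year / 12     # column C
    completed_per_month = completed_per_year / 12   # column D
    # Column E: backlog expressed as months of work at the current
    # completion rate, ignoring incoming work.
    backlog_in_months = backlog / completed_per_month
    # Column F: monthly change in the backlog (positive = growing).
    change_per_month = received_per_month - completed_per_month
    # Column G: months to eliminate the backlog, defined only when
    # the backlog is shrinking.
    months_to_eliminate = (
        backlog / -change_per_month if change_per_month < 0 else None
    )
    return backlog_in_months, change_per_month, months_to_eliminate

# Hypothetical discipline: 120 pending requests, 600 requests received
# and 624 examinations completed during the fiscal year.
e, f, g = backlog_metrics(120, 600, 624)
print(f"backlog in months: {e:.1f}")   # 120 / 52 per month = 2.3 months
print(f"change per month:  {f:+.1f}")  # 50 received - 52 completed = -2.0
print(f"months to clear:   {g:.0f}")   # 120 / 2 = 60 months (5 years)
```

As the report notes, this projection assumes the FY 2005 completion rate continues unchanged; a discipline whose backlog grew during the year (column F positive) has no finite elimination time under this method.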
Analysis of Examinations Backlog
As the preceding table indicates, the backlog in months for the three forensic laboratories together varied by discipline from about 2 to 8 months, and increased in the disciplines of arson and questioned documents. For the remaining four categories (explosives, firearms, prints, and trace), we project that it would take between 6 and 10 years to eliminate the backlog at the rate examinations were completed in FY 2005.
We compared the average number of examinations received and completed each month (columns C and D). The preceding table indicates that, collectively, the forensic laboratories completed at least as many examinations each month, on average, as they received for explosives, firearms, latent prints, and trace examinations. For arson and questioned document examinations, the shortfall resulted in increases in the number of months needed to eliminate the backlog. This suggests that, if the existing backlog could be eliminated, the laboratories should be able to provide forensic reports to field offices within 30 days in response to most examination requests.
We also analyzed the backlog at each of the three forensic science laboratories to identify any significant variations that might exist between them.
Washington: As demonstrated in the following table, the Washington laboratory’s backlog varied from about 2 to 13 months, increased in one of six categories (questioned documents), and remained constant in another category (arson). For the four remaining categories (explosives, firearms, prints, and trace), at the rate of reduction during FY 2005, it would take between 4 and 15 years to eliminate the backlog. While filling the vacant positions listed should help eliminate the backlog in all but one category (questioned documents), additional actions will be needed to eliminate the backlog completely in a more timely manner.34
Analysis of Examination Backlog
Atlanta: The Atlanta laboratory’s backlog varied from about 1 to 4 months (column E below), increased in two categories (arson and questioned documents), and remained constant in a third (prints) (column G below). In three other categories (explosives, firearms, and trace evidence), at the rate of reduction during FY 2005, it would take between 10 months and 6 years to eliminate the backlog. Since there were no vacant positions in Atlanta as of September 30, 2005, no additional assistance was planned.
Analysis of Examination Backlog
San Francisco: The San Francisco laboratory’s backlog varied from about 2 to 5 months and increased in four categories (arson, explosives, firearms, and trace). For the other category (fingerprints), at the rate of reduction during FY 2005, it would take more than 3 years to eliminate the backlog. Filling a vacant supervisory position and two vacant forensic chemist positions in San Francisco will help with the arson, explosives, and trace evidence backlog. However, no additional assistance is planned for the backlog of firearms and fingerprint examinations.
Analysis of Examination Backlog
The following chart compares the backlog in months for each discipline for each of the three forensic laboratories. Using data from the three preceding tables, we calculated the backlog in months (column E) by dividing the backlog (column B) by the average number of examinations completed in a month during FY 2005 (column D). As the chart indicates, the Washington laboratory had the largest examination backlog in all disciplines except for explosives and trace examinations, but it also performed between 1.5 and 1.7 times as many examinations as each of the other two forensic science laboratories.
The DOJ OIG audit compared the backlog reported in the Treasury OIG audit report as of September 30, 1998, with the backlog reported in the Laboratory Services Workload Report for FY 2005. The DOJ OIG found that the total backlog of 1,289 examination requests reported in the Treasury OIG audit had been reduced to 983 examination requests as of September 30, 2005. Although the backlog had decreased, it remained significant. The following chart compares the 1998 backlog reported by the Treasury OIG with the DOJ OIG findings for the backlog size at the end of FY 2005. In the chart, the backlog for the two audit periods is presented in months of work at the rate examinations were being completed in 1998 and 2005, respectively.
Comparison of Examination Backlog
As the preceding chart shows, when the DOJ OIG analyzed the differences in the backlog measured as months of work, the backlog in the disciplines of latent prints and questioned documents increased and the backlog in the disciplines of arson, explosives, and firearms decreased. According to the Director of Laboratory Services, the backlog in latent print and questioned documents examinations increased as a result of a shortage of qualified examiners and an increase in the number of latent print examinations requested.
Not all of the backlogged requests for examinations are performed, nor do they all need to be completed. Some requests are canceled based on changes in an ATF case. For example, a case may end with a plea bargain, making the forensic analysis unnecessary. Other requests for examination may wait in the backlog for so long that Special Agents use other laboratories for the forensic work. Eleven percent of the examination requests in our universe that were no longer pending had been canceled.
The DOJ OIG audit analyzed the time between the date of the examination requests and the date of the cancellations for 585 canceled examinations in the FACETS data extract.37 The following table shows, by number of days elapsed, the number and percentage of examinations that were canceled. As the table shows, almost half of the cancellations were for examination requests that were more than 6 months old. This supports anecdotal information from Special Agents that, in some instances, the delay in obtaining results was the reason for the cancellation. One agent indicated that he had canceled a latent print examination because of the delay in obtaining results from the ATF laboratory and then arranged for a local laboratory to perform the examination.
Number of Days before Examinations were Canceled
According to the Director of Laboratory Services, supervisors were required to screen cases over 6 months old on a monthly basis to eliminate the examinations no longer required. However, as one agent commented, this process may not have been effective, and there were no procedures in place to ensure that Special Agents would advise the laboratories in a timely fashion when the examinations were no longer needed.38
In response to the discussion of the backlog in the Treasury OIG audit report, ATF management identified a corrective action: providing each field division with a quarterly list of all cases with pending examinations so that field divisions could identify cases that could be deleted from the backlog. The ATF indicated it had provided each field division with a list of cases that were at least a year old as of May 1999, which resulted in 94 inactive cases being removed from the laboratories’ backlogs. The ATF repeated this process in November 2000 and removed an additional 85 cases. The DOJ OIG was unable to determine from the Treasury audit report the number of examinations associated with these cases, but the number of examinations is at least equal to the number of cases reported.
The DOJ OIG found that ATF management had not provided these reports to the field divisions since November 2000 because, according to the Director of Laboratory Services, FACETS does not produce such reports in a usable format. However, the Director anticipated that electronic communications between field divisions and the laboratories using the new information management system would facilitate the timely removal of backlogged examinations associated with inactive cases. As previously stated, however, the new system had been in the planning stages at least since the time of the Treasury audit report in 2001, and still had not been implemented by December 2005.
Backlogged examination requests add to processing delays by postponing the start of work on new evidence submissions. In addition, the growing backlog in some disciplines could further add to processing delays. The DOJ OIG concluded that, while filling vacant positions should provide some relief, laboratory officials need to develop plans that specifically address eliminating the backlog of examinations. The plans could include: (1) working with field divisions to prioritize the existing backlog, (2) eliminating requests for examinations that are no longer needed, and (3) outsourcing examinations to other laboratories until the existing backlog has been eliminated.
The Treasury OIG found that between FYs 1995 and 1998, the laboratories devoted significant resources to several high-visibility cases, including the Murrah Federal Building bombing in Oklahoma City, the Trans World Airlines Flight 800 investigation, and the Olympic Park bombing in Atlanta. According to the Treasury OIG, transferring resources to these cases adversely affected the amount of time that was available to laboratory employees for work on routine evidence submissions and had contributed heavily to the size of the backlog.
Recently, the laboratories again devoted significant resources to resource-intensive investigations, such as the Washington, D.C., sniper case. According to laboratory officials, during the first 2 weeks of the sniper investigation (in the Fall of 2002), the Washington laboratory performed over 100 firearms examinations, compared with an average of about 28 firearms examinations per month during FY 2004. These weapons were seized from other suspects or used in other crimes and were tested to determine if they were also used in the sniper case.
Exceptional resource-intensive cases can reasonably be expected to be part of the ATF laboratories’ workload for the foreseeable future. Accordingly, the laboratories need to plan for managing the workloads that are associated with resource-intensive cases while continuing to process more routine examinations. The DOJ OIG believes that proper planning will help keep the backlog of examination requests to a manageable level. Additionally, the Office of Laboratory Services also could consider entering into agreements and contracts with other laboratories to provide support when the demand for examinations is unusually high.
Personnel Ceiling and Allocation
Laboratory officials and the Treasury OIG identified inadequate staffing as one reason for processing delays, leading to large backlogs of uncompleted examinations. The Treasury OIG 2001 audit report recommended the ATF ensure that laboratories were adequately staffed. In response to the report, the ATF Director authorized an increase in the ceiling for the Office of Laboratory Services from 115 in FY 2000 to 134 positions for FY 2001.
During the DOJ OIG audit, ATF officials stated that the personnel ceiling for FY 2005 was 106 positions, which represents a net decrease of 28 positions from the reported FY 2001 level of 134. The net decrease may be explained by the fact that the Scientific Services Division maintained 28 positions within the Department of the Treasury when the ATF laboratories became part of the Department of Justice in 2003. However, while the FY 2005 number of authorized positions was 106, only 90 of these positions were filled by the end of the fiscal year. Budgeted positions and actual staffing levels at the laboratories as of September 30, 2005, are shown in the following table.
Office of Laboratory Services Staffing
To determine if the regional forensic laboratories experienced any increase in the number of positions to help process forensic examinations in a more timely manner, we obtained additional information about how positions were allocated within the Office of Laboratory Services for FYs 2000 and 2005, which are compared in the following table.
Positions in the Office of Laboratory Services
We found the number of positions available in FY 2005 for the regional forensic laboratories to perform ongoing work decreased by 2 from the 80 positions in the FY 2000 allocation, as indicated in the last row of the preceding table. Regardless of the reported ceilings (up to 134 for FY 2001), there has been a small net decrease in the staff resources available for the portion of the forensic mission that remained unchanged from 2001 to the present.
The number of vacancies in the laboratories also has not changed significantly between the two audits. At the time of the Treasury OIG’s audit, the regional forensic laboratories collectively had 10 vacancies out of an allocation of 80 positions, and it usually took from 9 months to 1 year to hire new employees. The vacancies represented approximately 12 percent of the positions allotted to the laboratories. The Treasury OIG reported that the difficulty in hiring experienced personnel resulted from the ATF’s inability to compete with the salaries and benefits offered by private industry and from the length of time needed to complete the hiring process, during which potential employees may find jobs elsewhere. In response to the 2001 Treasury OIG audit, ATF managers reported they had centralized all administrative personnel functions within the Office of Laboratory Services and were using recruitment incentives and a pay demonstration project in order to attract and retain highly qualified personnel.40
As of September 30, 2005, there were 11 vacant positions, representing approximately 14 percent of the positions authorized for the laboratories.41 Eight of the vacancies were at the Washington laboratory and three were in San Francisco; none were at the Atlanta laboratory. The vacancies occurred as a result of promotions, retirements, internal transfers, or resignations. According to the Director of Laboratory Services, the long hiring time resulted in part from the implementation of a new hiring process, which initially extended the time it took to hire replacement personnel. Further, laboratory officials stated that the three positions in San Francisco were difficult to fill because of the high cost of living there, and it was difficult to locate qualified personnel at a salary commensurate with their qualifications.
The DOJ OIG found that the length of time positions were vacant had increased slightly since the prior audit. We also found that positions filled by external candidates during FYs 2004 and 2005 were vacant for an average of 14 months, as shown in the following table.
Months to Fill Vacant Forensic Science Examiner Positions
During the DOJ OIG audit, ATF managers told us they were making a concerted effort to hire experienced personnel whose training period would be shorter, making them productive sooner. However, we found that the ATF’s efforts did not reduce the time it took to fill vacant positions. Long periods during which allocated positions were unoccupied contributed to the examination processing delays at laboratories where these vacancies existed. Accordingly, we believe the ATF should take additional measures to reduce the time it takes to fill vacant positions, such as requiring priority attention to filling all open laboratory positions.
The Treasury OIG cited the time examiners were spending performing duties outside the laboratories as a contributing factor to the laboratories’ inability to provide timely services. These outside duties included training, crime scene assistance, and expert testimony in court.
The DOJ OIG found that the number of days staff spent working on other duties had generally declined from FYs 1998 through 2002, but in FY 2003, this time spent on other duties almost doubled and stayed high for FY 2004, as shown in the following table. According to laboratory officials, national-interest investigations such as the Washington, D.C., sniper case were the cause for the dramatic increase from the FY 2002 total of 788 days to the 1,327 days in FY 2003. For FY 2005, the number of days per year spent on duties outside the laboratories again declined to a level similar to the earlier years.
Days Spent on Duty Outside the Laboratories
To put into perspective the number of days examiners spent working outside the laboratories, we determined that the outside duty days for FY 2005 represented about 4 positions for the year.42 This represented approximately 5 percent of the total staff time of the positions allocated to the regional laboratories for that year.
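The conversion from outside-duty days to positions can be sketched as simple arithmetic. The day total and workdays-per-position figures below are assumptions for illustration only (the exact FY 2005 day count appears in the table, which is not reproduced here); the 78-position regional allocation comes from the preceding staffing discussion.

```python
# Sketch of the staffing arithmetic: outside-duty days expressed as
# full-time positions and as a share of regional laboratory staff time.
outside_duty_days = 800        # assumed FY 2005 total (actual figure is in the table)
workdays_per_position = 208    # assumed paid workdays per position per year
regional_lab_positions = 78    # FY 2005 allocation (80 in FY 2000, less 2)

full_time_equivalents = outside_duty_days / workdays_per_position
share_of_staff_time = full_time_equivalents / regional_lab_positions

print(round(full_time_equivalents))    # about 4 positions
print(f"{share_of_staff_time:.0%}")    # about 5 percent
```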
In response to the Treasury OIG audit, ATF managers said the Office of Laboratory Services would coordinate all training requests and that laboratory supervisors would evaluate all ATF requests for laboratory assistance at crime scenes, other than the ATF’s National Response Teams’ (NRT) activities, to ensure there was a valid need to send personnel into the field. Based on the DOJ OIG’s interviews with the Director of Laboratory Services and the laboratory chiefs, we found that laboratories were managing the demands for these services reasonably well. The management of each of these outside duties is discussed below.
The ATF’s Office of Training and Professional Development (TPD) sponsors programs each year for ATF personnel and state and local government officials and requests instructor support for some of these programs from the Office of Laboratory Services. Additionally, Special Agents-in-Charge (SAC) of field divisions occasionally request instructor support for ATF-sponsored training events. During the DOJ OIG audit, the Director’s Office coordinated all training requests. For example, in one instance we reviewed, the Director’s Office advised a field office that due to current workloads, the Office of Laboratory Services was unable to provide instructor support outside of TPD-sponsored training events.
Crime Scene Assistance
In addition to the crime scene assistance provided as part of one of the ATF’s National Response Teams, SACs also make requests for crime scene assistance. In these instances, laboratory chiefs may approve or disapprove the SAC’s requests. The laboratory chiefs consider whether they have enough qualified staff to meet the request with the objective of minimizing the time examiners spend away from the laboratories. According to the laboratory chiefs we interviewed, laboratory personnel provide crime scene assistance when the laboratory chief agrees that assistance is needed or when required by higher authority.
Examiner appearances in court are based on the receipt of a subpoena. Section chiefs within the laboratories and examiners coordinate with the appropriate prosecutor regarding the necessity, timing, and length of the appearance.
Because laboratories were not adequately staffed to meet all demands immediately, evidence submissions were prioritized to maximize the laboratories’ effectiveness. The Treasury OIG found that the ATF did not develop formal criteria for determining which evidence submissions should receive expedited service, but instead used informal criteria. The ATF also did not establish a methodology for classifying non-expedited work, which accounted for more than two-thirds of all submissions. These examinations were completed on a first-in, first-out basis, which did not account for the varying degrees of importance of the non-expedited examinations. The Treasury OIG recommended that the ATF develop a priority system for incoming evidence submissions to support the ATF’s investigative priorities. In response, ATF management stated that a new priority system was under development.
However, the DOJ OIG found that the new priority system had not been implemented at the time of our audit, four years after it was proposed as a corrective action. According to laboratory officials, the project lapsed because of the need for extensive internal coordination and because of administrative delays related to the reorganization of the ATF under the Department of Justice. The new system was to include a revised evidence transmittal form, an ATF Directive to implement the system, and six priority classifications including date deadlines, significant incidents, and court requirements. Because the new system had not yet been implemented, we re-evaluated the priority issues raised in the Treasury OIG audit.
Effectiveness of the Priority System
We found that approximately 30 percent of the submissions in the FACETS data extract we used were submitted as expedited requests. Submitting agents who needed examination results quickly, or by a specific date, requested expedited service using the evidence transmittal form. Agents might specify a date by which the results were needed, but were not required to do so. A section supervisor reviewed requests for expedited service to determine whether the circumstances justified moving the request ahead of other pending examination requests, and might contact the submitting agent or the agent’s supervisor to discuss the request. If the section supervisor at the laboratory determined it was appropriate to provide expedited service, the submission was processed ahead of others. The DOJ OIG found that expedited examinations were generally completed in a more timely manner than other examinations, as shown in the following chart.
As shown in the preceding chart, 54 percent of expedited examinations were completed within 30 days, compared with 30 percent of non-expedited examinations. Seventeen percent of expedited examinations took more than 90 days to complete, compared with 39 percent of non-expedited examinations. However, while priority examinations were generally performed in a more timely manner than non-expedited ones, 46 percent of examinations for which expedited service was requested were not completed within 30 days. Thus, almost half of expedited examination requests were not completed on an accelerated basis, or even within the 30-day long-term timeliness goal.
Because multiple examinations may be requested on each submission of evidence, some of the expedited examination requests that were not completed within the first 30 days were not the first examination performed on a submission. We analyzed the information to determine how many of the expedited requests completed after the first 30 days were the first examination completed for the submission. Of the 636 (46 percent) expedited examination requests that were not completed within the first 30 days, 392 (62 percent) were the first examinations completed for the submission.
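As a consistency check on the figures in this paragraph, the counts and percentages can be verified directly; the total number of expedited requests in the sample is implied rather than stated, so it is derived below.

```python
# 636 expedited requests (46 percent of all expedited requests in the
# sample) were not completed within 30 days; 392 of those 636 were the
# first examination completed for their submission.
late_expedited = 636
first_completed_late = 392

implied_total_expedited = round(late_expedited / 0.46)   # total is implied, not stated
share_first = round(first_completed_late / late_expedited * 100)

print(implied_total_expedited)   # roughly 1,383 expedited requests in the sample
print(share_first)               # 62 percent, matching the report
```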
We recommend that the ATF develop and implement a priority classification system that ranks all incoming evidence submissions in a way that supports ATF investigative priorities.43
Authorization and Justification for Expedited Service
To qualify for expedited service, examination requests have to be authorized and justified. All requests for examinations are supposed to include a copy of the ATF’s Report of Investigation to assist laboratory personnel in performing the examination. To request expedited service, agents mark the evidence transmittal form and explain why expedited service is necessary. (See Appendix IV for a copy of the evidence transmittal form.)
The criteria to justify expedited service are informal. The laboratories generally expedite the processing of evidence submissions when:
The Treasury OIG judgmentally selected 284 case files to test the ATF’s priority system.44 These 284 case files included 614 examination requests on 390 evidence submission forms. Of the 390 evidence submissions, 116 included requests for expedited service, while 274 did not.
The Treasury OIG found that:
Based on these results, the Treasury OIG concluded that questionable priorities were being assigned to cases and that Special Agents were not always properly justifying their requests for expedited service.
The DOJ OIG analysis is based on a statistical sample of evidence submissions from all examinations completed between October 1, 2003, and May 13, 2005, or pending on May 13, 2005. Our sample included 465 evidence submissions from 440 cases, of which 220, 126, and 119 submissions were from the Washington, Atlanta, and San Francisco laboratories, respectively. Of the submissions we reviewed, 113 (24 percent) included requests for expedited service.
The DOJ OIG found that most of the expedited submissions in our statistical sample contained justifications. Only 3 of 113 expedited submissions (approximately 3 percent) did not include justifications. Of the 113 expedited submissions we reviewed, 36 submissions (approximately 32 percent) were not signed by a supervisor and 22 submissions (approximately 19 percent) did not meet the ATF’s informal criteria discussed above. The table below presents our results by laboratory.
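The percentages cited for the 113 expedited submissions follow directly from the counts, as the short check below shows (the counts themselves are taken from this paragraph).

```python
# Percentage checks for the 113 expedited submissions reviewed.
reviewed = 113
no_justification = 3           # submissions lacking a justification
no_supervisor_signature = 36   # submissions not signed by a supervisor
outside_informal_criteria = 22 # submissions not meeting the informal criteria

for count in (no_justification, no_supervisor_signature, outside_informal_criteria):
    print(round(count / reviewed * 100))   # prints 3, 32, 19 (percent)
```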
Submissions Identified as Expedited
We compiled information from the submissions we reviewed to determine how many expedited requests were associated with each of the informal criteria listed above. Using the informal criteria, we categorized the justifications included in these requests as follows.
The following table shows the number of requests and percentages by category, for the 113 requests for expedited service that we reviewed.
Justifications Included in Requests for Expedited Service
For each submission in our sample, we also determined whether a Report of Investigation was included in the case file. We found that in 42 percent of the case files we reviewed, the submitting agent did not furnish the laboratory a copy of the Report of Investigation. Laboratory personnel often needed the report to assist in the examination of the evidence. When the report was not available, examiners had to contact the agent for a copy, further delaying the process.
Presence of a Report of Investigation (ROI) In Case Files
In its 2001 audit report, the Treasury OIG recommended that the ATF Director ensure that Special Agents provide adequate justification and obtain proper supervisory signatures before submitting evidence transmittal forms to the laboratories.
In response to this recommendation, the Director of Laboratory Services and the Assistant Director of Science and Technology met with the 23 field division directors in November 2000 and discussed: (1) agents’ failures to include a Report of Investigation with evidence submissions, (2) cases being marked for expedited service without evidence of supervisory review or proper justification, and (3) the mislabeling of submitted evidence. The Treasury OIG also reported that field division directors said that laboratory supervisors should contact agents’ supervisors immediately when any of the above issues were encountered and that laboratory supervisors were planning to follow up on that advice.
Laboratory managers initiated two other actions in FY 2001 to educate submitting agents on evidence submissions. First, the Agents Guide to the ATF Laboratories was distributed to all new agents, was placed on the ATF Intranet, and was added to the 2001 edition of the ATF Reference Library CD-ROM. Second, laboratory managers developed an advanced training program for agents on collecting and submitting evidence to the laboratory. The training was to be held at three divisions during FY 2001. The Treasury OIG report noted that ATF management believed improved procedures and more agent education would shorten case turnaround times and ultimately, reduce case backlogs.
During the DOJ OIG audit, the Director of Laboratory Services told us he would address the SACs in October 2005 and again would discuss the requirement for Special Agents to correctly complete evidence transmittal forms and include a copy of the Report of Investigation with the evidence submission.
In the interim, we conducted 22 telephone interviews with a judgmental sample of ATF agents. The questions we asked included whether the agents were familiar with the ATF priority system for laboratory examinations. About one-third of the agents we interviewed were unfamiliar with the priority system for laboratory examinations.
Priority System Conclusion
The DOJ OIG found that the priority system in place during our audit resulted in shorter processing times for some expedited examination requests, but did not ensure that expedited examination requests were performed within the first 30 days and did not effectively identify examinations for expedited service without intervention by section supervisors in the laboratories. Section supervisors sorted out competing requests, not all of which were properly authorized or justified. Sometimes supervisors in the laboratories had to contact submitting agents to determine the basis for expedited service when information included with the evidence was insufficient to make this determination.
The problem described above occurred because requests for expedited service were based on informal criteria, the criteria were not always cited as the reasons for requesting expedited service, and the priority system did not account for the relative importance of examination requests submitted as non-expedited requests. We found that some agents were not familiar with the informal criteria. It is also likely that the long processing times for many examinations contributed to the number of expedited requests agents submitted. Agents may have been motivated to request more examinations on an expedited basis when they did not expect to receive results from a forensic examination for several months.
We recommend that the ATF develop and implement a priority system that would classify all evidence submissions into multiple tiers defined to support ATF priorities. The timeliness standard for different tiers could vary. For example, a three-tier system might establish timeliness goals for three separate classes as: (1) within 30 days, (2) within 90 days, and (3) as time permits. This lowest priority category might be confined to exhibits from investigations in which the examinations by themselves were unlikely to produce any leads, and which lacked any other leads and did not involve the loss of life.
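A tiered system of this kind amounts to a simple classification rule. The sketch below is purely hypothetical: the parameter names, criteria, and day targets are illustrative, drawn loosely from the example above, and do not represent any system the ATF has designed.

```python
# Hypothetical sketch of a three-tier priority classification for
# incoming evidence submissions. Criteria and targets are illustrative.
def assign_tier(*, date_deadline=False, court_requirement=False,
                loss_of_life=False, other_leads=False,
                exam_may_produce_leads=False):
    """Return (tier, timeliness goal) for an evidence submission."""
    if date_deadline or court_requirement:
        return 1, "within 30 days"
    if loss_of_life or other_leads or exam_may_produce_leads:
        return 2, "within 90 days"
    # No loss of life, no other leads, and the examination alone is
    # unlikely to produce leads: lowest priority.
    return 3, "as time permits"

print(assign_tier(court_requirement=True))   # (1, 'within 30 days')
print(assign_tier(loss_of_life=True))        # (2, 'within 90 days')
print(assign_tier())                         # (3, 'as time permits')
```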
The Treasury OIG 2001 report identified two specific problems in the area of case file management controls: (1) not all examiners recorded the number of hours spent analyzing evidence or the number of hours spent preparing reports, and (2) not all closed case files contained evidence control cards. The DOJ OIG audit found that ATF management had not implemented the new laboratory information management system that would produce accurate and meaningful reports.
The ATF’s case file management controls are identified in the Laboratory Services Policies and Procedures Guidelines (Guidelines). The Guidelines require that the laboratory evidence control specialist create a case file for any physical evidence accepted by the laboratory and that each case file contain an evidence control card for each evidence submission.45 The purpose of the evidence control card is to document the chain of custody for the evidence while it is at the laboratory. The Guidelines further state that examiners should initial the evidence control card, note the date of return to the evidence control specialist, and record the number of hours they spent examining and preparing the report on the card when they return evidence to the evidence control specialist. The Guidelines also specify that the evidence control cards will be retained by the laboratories after the evidence is returned to the submitting office. In response to the Treasury OIG’s findings, ATF management noted that adherence to the Laboratory Services Policies and Procedures Guidelines would be addressed in annual internal reviews of the three forensic science laboratories.
Hours Spent Analyzing Evidence or Preparing Reports
To evaluate the forensic science laboratories’ compliance with case file management controls, the Treasury OIG reviewed 159 closed case files, which contained the results of 397 examinations for 264 submissions. The Treasury OIG found that 57 percent of the examinations did not include a record of the number of hours that examiners had spent analyzing evidence or preparing reports. The Treasury OIG concluded that if case file management controls were not followed, laboratory managers would not be able to determine how much time employees spent conducting examinations or the number of hours required to complete cases.
The DOJ OIG followed up on the weaknesses reported by the Treasury OIG. We used the statistical sample previously discussed for testing the effectiveness of the priority system. Our sample included 442 completed examinations from the 465 evidence submissions included in our review. As the following table indicates, the DOJ OIG found that the hours spent analyzing evidence or preparing reports were not recorded for 14 percent of the examinations reviewed. Therefore, we found significant improvement in the percentage of records containing information on examiner hours spent analyzing evidence and preparing reports.
Completed Examinations That Did Not Include Information
Missing Evidence Control Cards
The Treasury OIG found that 19 percent of closed case files were missing evidence control cards. As the following chart indicates, the DOJ OIG found an evidence control card for every evidence submission reviewed. We concluded that the ATF took effective action to ensure that laboratory personnel complied with the two specific case file management controls identified as problems in the Treasury OIG audit.
Submissions with No Evidence Control Cards
Lack of New Information Management System
In response to the Treasury OIG audit report, ATF management also stated that acquisition of the new laboratory information management system would permit the development of accurate, meaningful reports to track examiner hours and eliminate the complex flow of documents that is often the cause of incomplete files cited in that report. As previously stated in this report, the new system had not been implemented at the time of the DOJ OIG audit. However, the new system is now projected to be in place by March 2006. According to a laboratory official, implementation of the new system was delayed because of problems modifying an existing commercial software package to meet the needs of the forensic science laboratories. The commercial software that had originally been selected when the ATF was part of the Treasury Department worked well for the regulatory functions performed on alcohol and tobacco products, but needed modification to meet the forensic laboratories’ needs. The modifications have required extensive testing and re-testing, causing the delay in implementation.
Extended processing times identified in the 2001 Treasury OIG audit report continued into the period audited by the DOJ OIG, with minor improvement. Two-thirds of completed forensic examinations continued to take more than 30 days to complete, and about one-third of examinations took more than 90 days. ATF laboratories were widely appreciated for the quality of their work, and we found that the Office of Laboratory Services was following its quality assurance program. However, some comments from Special Agents in field offices continued to reflect dissatisfaction with the processing times, and some agents used other laboratories to obtain more timely results.
It is crucial that forensic analyses be of high quality to ensure reliable results. The DOJ OIG found that the ATF had a quality assurance program in place and that the Office of Laboratory Services followed the program to ensure the quality of its services. The program included annual quality reviews of all laboratories and the ATF’s participation in a professional accreditation process.
Although the ATF implemented several corrective actions planned as a result of issues identified in the Treasury OIG audit, other corrective actions that could have had a significant impact on workload management were not implemented. For instance, the ATF did not increase the number of positions in the regional forensic laboratories, did not implement a new priority system, and did not implement a new information management system. The DOJ OIG audit also found that the ATF did not continue initial efforts to clear the backlog of requests for examinations that were no longer needed, and the time it took to fill examiner vacancies had not been reduced.
The DOJ OIG audit found that the staffing level in FY 2005 could manage the incoming workload of evidence, but not in combination with the existing examination backlog and the length of time it took to fill examiner vacancies, which remained a problem. The laboratories had no plan regarding how to clear the existing backlog, reduce the time it takes to fill vacancies, and manage resource-intensive cases so that routine work will not create a backlog of requests that cannot be addressed within a reasonable time. As a result, resource-intensive cases can be expected to continue to contribute to the existing backlog in the future.
The Treasury OIG found that the ATF did not develop formal criteria for determining which evidence submissions should receive expedited service, but instead, used informal criteria. The ATF also did not establish a methodology for classifying non-expedited work. Although the ATF designed a revised priority system, the DOJ OIG found that it had not been implemented and did not address all submissions.
Questionnaires such as customer satisfaction cards provide a method for obtaining customer feedback, but the current questions used by the laboratories do not request specific responses and need to be revamped. More specific questions should be included, such as whether the analysis was received in time to be of assistance on a case, or if it was not, what the negative impact was on the progress of an investigation or the outcome of a case.
Our recommendations focus on managing the incoming workload and existing examination backlog by developing and implementing a revised priority system and a plan to eliminate the backlog, and developing approaches to reducing the time it takes to fill examiner vacancies. Otherwise, the backlog, inadequate priority system, and vacant examiner positions will continue to interfere with the laboratories’ ability to handle the incoming workload of evidence on a timely basis. Serious consequences may occur if delays in identifying suspects, making arrests, and bringing offenders to trial allow offenders to commit additional crimes.
We recommend that the ATF: