The United States Marshals Service's Workforce Planning and Management
(Redacted for Public Release)
Audit Report 07-38
Office of the Inspector General
In its response to our draft audit report, the USMS concurred with each of our recommendations and discussed the actions it has already taken and others it will implement in response to our findings. In its response, the USMS also provided clarifications to portions of our report. Before addressing the USMS’s response to each of the OIG’s recommendations and the actions necessary to close those recommendations, we provide the following response related to the USMS’s clarifications to the draft report, using the same headings used by the USMS.
Strategic Planning Issues
In its response, the USMS stated that it began using quantitative models for the distribution of resources in 1986, not 1995 as cited by the OIG in Chapter 2. As noted in our draft report, we attributed this statement to a USMS headquarters official whom we interviewed during the course of our audit. Specifically, this individual, a senior USMS headquarters official involved in the USMS’s strategic planning process, provided us with a document that detailed the history of the USMS’s budgetary planning processes. This document stated that until 1986 the USMS’s budget requests were derived from surveys of district needs as perceived by U.S. Marshals and input from headquarters management. The document further stated that from 1987 to 1994 the budgetary requests were formulated solely by USMS headquarters program managers. Then, beginning in 1995, the USMS’s budget justifications were driven largely by quantitative workload and resource models.
Given the source and the documentation provided, we considered the 1995 date to be accurate. Moreover, during our audit no other evidence was provided indicating that 1995 was inaccurate. However, we have revised the report to reflect the information provided in the USMS’s response.
The USMS also commented on its use of Unit Performance Plans, stating that they were implemented in the latter half of 2006 and therefore do not yet contain a full year of data. We recognize that the Unit Performance Plans are a recent initiative intended to be used as tools for implementing the Strategic Plan. However, as we discuss in Chapter 2, USMS headquarters had not assigned responsibility for ensuring that the Unit Performance Plans were complete, accurate, and meaningful. Accordingly, we recommended that the USMS improve its strategic planning efforts by ensuring oversight of the Unit Performance Plan initiative, and the USMS concurred with this recommendation.
Workforce Management Planning
In its response, the USMS provided clarifying comments on its workforce planning models, time recordkeeping system (the USM-7), program and project codes associated with the USM-7, and utilization of contract guards.
District Budget and Workforce Equalization Models
The USMS stated that the District Budget Model (DBM) figures are not adjusted to incorporate future events as stated in Chapter 2. The USMS further commented that future events are taken into account in resource planning but do not affect the DBM. However, our discussion on this topic refers to USMS resource planning. The OIG is not suggesting that the DBM figures are inaccurate or that future events are factored into the DBM calculations. Rather, since the DBM is a historical model, it does not account for future events that may require additional resources. Therefore, in some instances the DBM figures for certain district offices may have to be adjusted to account for new initiatives.
In addition, the USMS believed the report statement regarding the USMS Director’s suspension of the DBM and Workforce Equalization Model (WEM) suggested his dissatisfaction with these models. The OIG did not intend that this statement be read to suggest that the USMS Director was dissatisfied with the models. In fact, the OIG based this report statement on a January 6, 2006, memorandum from the then-Acting USMS Director, which specifically states that use of the DBM and WEM was retired “…to ensure a more balanced approach to how human and financial resources are allocated…”
In its response, the USMS commented that it does not agree with the OIG’s statement that the USM-7 is the only source of empirical USMS resource utilization data. The OIG understands the USMS’s position that other resource data exists and acknowledges that the USMS has several resource-related data sources. However, in this part of the report, the OIG was referring to the actual reporting of time by USMS employees, which is tracked through the USM-7. Therefore, the OIG has modified the body of the report on page 14 to state that the USM-7 is the only source of empirical employee resource utilization data.
Program and Project Codes
The USMS stated that it does not use program codes as defined by the OIG. Instead, the USMS stated that it relies on project codes for tracking its time utilization by mission activity. The OIG recognizes that the USMS does not utilize its program codes to track time utilization. In fact, in the report we refer to remarks from USMS headquarters officials who characterized the existing program codes as meaningless. However, during our audit these same USMS officials informed us that the program codes contained in the USM-7 data file correlated to USMS mission activities, which are also the first two characters of USMS project codes. We believe that it may be worthwhile for the USMS to utilize the program codes because managers would be able to obtain a snapshot of employee utilization for each of the agency’s broad mission areas. Therefore, we suggested in the report that the USMS could streamline its analysis of human resource utilization through the expanded use of program codes.
The USMS response also provided clarification on the report’s discussion of multiple project codes that can be associated with similar activities. In its clarification, the USMS explained that special assignments have specific codes because they are funded differently than other USMS mission activities. Thus, the USMS requires different codes for tracking purposes. The USMS further explained that these project codes associated with special assignments are tasks that cannot be completed with district resources and require that additional human and financial resources be provided from headquarters.
First, our report states that a single, specific project code could be associated with multiple, broad program codes. During our analysis of the USM-7 data, we requested assistance from the USMS in defining the specific mission activity (program code) to which each project code applied. In response, the USMS indicated that some project codes were associated with one mission activity in certain instances but another mission activity in other instances. In our report, we used the project code TERROTHP (Track Terrorism Activity) as an example. At times, this project code corresponds to the “In Court with Prisoners” mission activity, while in other instances it applies to the “Judicial/Other Protection” mission activity. Other examples include project codes JUDCONFP (Judicial Conferences), MAJ1000P (Management and Administration – Judicial Security Division), and MAJ3000P (Headquarters Security). The OIG does not disagree with the USMS’s need to track the costs of specialized missions. However, the OIG’s concern is that the USMS’s current time-reporting system is cumbersome and can affect the accuracy of the data collected.
The USMS stated in its response that contract employees perform narrowly defined tasks that are specified by contract and are monitored by Contracting Officer’s Technical Representatives. Our report discloses that the USMS uses a large number of contract employees in support of its various mission activities. During fieldwork, USMS officials explained that a contractor can be used on different activities to include securing, processing, and transporting federal prisoners.
We do not dispute that a specific contract employee may perform narrowly defined tasks. However, because contractors procured through national contracts, excluding Court Security Officers (CSO), do not record their time in a manner similar to USMS operational personnel, our concern is that the USMS is unable to completely define its total workload or the total level of effort expended in each mission area in which these contractors are utilized. Different contract employees may be performing different tasks and sometimes their work can overlap with that of USMS employees. If the USMS does not have a process for capturing the number of hours contractors are spending on each task, it cannot ascertain the total number of hours expended on each mission activity. To this end, we recommended that the USMS ensure that it has a reliable, standardized process of tracking, by activity, the time of contractors procured through national vendor contracts (other than CSOs). As reflected in its response to this recommendation on page 81, the USMS concurred with our recommendation.
Our report discusses the allocation of operational and administrative positions to USMS district offices, as well as the utilization of those resources. We found that since the USMS does not allocate positions by mission or program area, it does not have a benchmark against which it can measure each district’s use of resources. More importantly, we found that the USMS does not routinely review how the utilization of its human resources impacts all aspects of USMS operations.
The USMS commented that it does track all workload, accomplishments, and time utilization data associated with these resources. However, during our review several management officials at USMS headquarters and the district offices we visited stated that they do not review overall resource utilization reports related to the activities and personnel for which they are responsible. Further, our analysis of USMS resource utilization data raised concerns about the accuracy of the time charged to various activities, which would have been evident if the USMS had been regularly reviewing, not just tracking, its employee utilization. As a result, we recommended that the USMS regularly generate and review resource utilization reports, and the USMS concurred with our recommendation.
As noted in the USMS’s response, Footnote 38 of the draft report stated that each employee assigns time to 1,607 project codes. The USMS commented that we report in Chapter 2 that no more than 250 project codes were active in any given fiscal year. However, the OIG did not make this determination. This statement was offered by a USMS official at the conclusion of our audit. This USMS official could not provide us with any documentation to support the number of active project codes during any one fiscal year. The 1,607 figure is the number of project codes in the data file that the USMS provided to us as evidence of the codes in existence at some point between FYs 2000 and 2005. The USMS, though, could not identify the specific project codes available for use during each fiscal year of our review period.
As discussed in Chapter 2, we were able to verify the actual number of project codes to which time was charged by employees for each fiscal year. Although employees recorded time to no more than 231 project codes during any given fiscal year of our review period, the USMS could not prove that only these project codes were available for use during this timeframe. In light of the USMS’s comments, we have adjusted the language in our report to clarify the discussion of the number of USMS project codes.
The USMS also commented on Footnote 39, which states that the “Protection of the Judicial Process” is not 1 of the 18 program codes or mission activities used by the USMS. In its response, the USMS stated that it does not use program codes as the OIG identified them and noted that individual project codes will frequently affect more than one USMS mission area.
During our audit, the USMS provided us with an overview of the USM-7 project codes, including the construct of a project code and what each piece of the code represents. According to USMS documentation, the first 2 characters of each project code reflect 1 of 18 USMS mission activities, and USMS officials called these 2-character identifiers “program codes.” During our analysis of the USM-7 data, we requested assistance from the USMS in defining the specific mission activity to which each project code applied. In providing this information, the USMS noted some instances in which a determination could not be made regarding the specific mission activity to which a time record applied. At that time, the USMS explained that some project codes could not be associated with a specific mission activity because the codes encompassed multiple activities. In these instances, the USMS categorized the mission activity as “Protection of the Judicial Process” because the tasks performed applied to various areas of this decision unit. Therefore, we used this description for these records in our analysis and included the resulting utilization figures as part of the Judicial and Courthouse Security decision unit.
In its response, the USMS disagreed with the number of FY 2004 potential threats reflected in our report. Specifically, the USMS stated that the correct number of potential threats during FY 2004 is 553, not 678 as depicted in Exhibit 3-5. We do not agree that the correct figure should be 553 and believe that we have accurately reported the FY 2004 level of potential threat investigations as 678. The USMS has not provided the OIG with any information supporting the 553 figure. Moreover, the USMS Office of Protective Intelligence (OPI) informed us that the USMS’s threat-related workload totaled 674 potential threats in FY 2004. As explained in Chapter 3, the difference between our computed 678 threats and the OPI’s 674 was the timing of data entry and other factors, such as the correction of data entry errors and training issues.
At the conclusion of our audit, a senior USMS headquarters official voiced concern with the accuracy of the number of potential threat figures contained in our working draft report. We had subsequent discussions with USMS officials from the OPI to identify the difference between our computed figures and those reported by the OPI. As explained in Footnote 48 of Chapter 3, our analyses concentrated on district office workload, while the OPI’s figures were based on its workload (which was a headquarters perspective). Thus, the figures presented in Exhibit 3-5, including the 678 potential threat investigations in FY 2004, reflect those incidents reported to USMS district offices during each fiscal year of our review period. The OPI officials agreed with our plan to present OIG-computed figures in this chart. Further, it was at the OPI’s request that we included a footnote (Footnote 48) explaining these varying workload perspectives.
In its response, the USMS also referred to the OIG’s computation of 24 FTEs utilized on threat investigations during FY 2005, which the USMS does not dispute. However, the USMS believes that this figure is underreported for various reasons, including project code definitions and organizational changes. Specifically, the USMS commented that it found personnel in the OPI had not appropriately recorded their time to the correct project code. As discussed in Chapter 3, we state that a senior OPI official believed the USMS had underreported the time charged to threat investigations. However, we believe that the USMS’s resource utilization data illustrate that the USMS should regularly review the utilization of its employees and examine its level of effort in certain areas, particularly protective investigations. As noted on page 82, the USMS concurred with our recommendation.
In addition to its comments on OPI resource utilization, the USMS response included several points of clarification on its workload related to protective investigations. The USMS agreed that the number of potential threats has increased. However, the USMS believes that the increase in OPI staff reflects that this increased workload is being addressed. The OIG does not discount that OPI’s increase in staffing has likely resulted in more resources focusing on the USMS’s increasing threat-related workload. However, as noted in the preceding paragraph, USMS data only reflected a small number of FTEs addressing this area.
Additionally, the USMS stated in its response that it agrees with the anecdotal information provided by a local USMS official we interviewed regarding the district’s threat-related workload. However, the USMS believed that our report suggests that attention was not given to every threat made against a USMS protectee. The OIG disagrees that the report suggests such an inference. Our intention in this section of the report was to discuss this USMS official’s viewpoint on the district’s operations and to offer additional information regarding the USMS’s overall effort expended on threat-related work.
Further, as discussed in Chapter 3, we identified a backlog in USMS headquarters’ review of potential threats. This assessment was based upon whether headquarters had assigned a mosaic rating to each potential threat. At the conclusion of our audit, a senior USMS headquarters official stated that this backlog had been resolved as of March 2007. During subsequent discussions with OPI officials, they further explained the process undertaken by the USMS in addressing each of these cases, which mirrors what is noted in the USMS’s response. Given that the OIG did not learn about this process until the conclusion of our review, we did not confirm that the backlog had, in fact, been resolved. Instead, we adjusted our report to include the USMS’s assertion that the backlog had been resolved as of March 2007, as well as the reason why assigning mosaic ratings to cases was not always necessary. The OIG believes that our report appropriately addresses this area.
During our review of potential threats reported to the USMS, we also identified delays from when potential threats were reported to USMS district offices to when they were entered into the Warrant Information Network (WIN). As noted in its response, the USMS does not believe that the OIG adequately explained these delays. Specifically, the USMS asserted that the majority of these cases were reported to USMS district offices in September and then not entered until the following fiscal year. The USMS also stated that all 20 of the cases shown in Exhibit 3-6 as being reported to a district office in FY 2004 but not entered until FY 2005 were reported to a district office in September 2004. However, based upon our analysis of USMS data, only 13 of the 20 records were reported to a district office in September 2004. The other 7 cases were reported to a district office between March 2004 and August 2004.
The USMS also commented that this same situation applies to FY 2006 and implied that all 40 cases shown as being reported to a district office in FY 2005 but not entered until FY 2006 were reported to a district office in September 2005. However, our analysis provided different results. Specifically, we determined that 20 of these 40 cases were reported to a district office between January 2005 and August 2005. Therefore, not all of the delays were due to the change in fiscal years from September to October. The OIG understands that information from these cases may not be entered instantaneously into the WIN. However, our concern is that a delay in recording case-specific data hinders the USMS headquarters’ ability to compare a potential threat against other similar occurrences. As a result, a district office may not be privy to a key piece of information that it could use to help prevent a potential threat from being executed.
Following Exhibit 3-6, we present two possible explanations offered by the OPI on why the delays in entering cases in the WIN occurred: data entry errors and lack of training. In its response, the USMS refers to only the data entry errors and further states that the OPI proved this reasoning to be true. When we were provided with this information, we were told that it was the OPI’s belief that these reasons accounted for the delays. Moreover, we were not provided with any supporting documentation to verify that these explanations were accurate. As a result, the OIG believes that the report appropriately and accurately addresses this topic.
In its response, the USMS stated that based upon anecdotal information the OIG concluded the USMS does not address its fugitive mission correctly because it fails to assign sufficient FTEs to address fugitive apprehension cases. The OIG disagrees that the report reaches this conclusion. Instead, we presented the viewpoint of personnel from the majority of the USMS district offices we visited, who informed us that their fugitive mission activity had suffered over the past few years. The OIG points out in the report that these anecdotal statements conflict with our review of USMS empirical data. As a result, we recommended that the USMS regularly review the utilization of its personnel to ensure that each mission activity of the USMS is being appropriately addressed, and the USMS concurred with this recommendation.
The USMS also commented in its response that the three district offices experiencing a decrease in the number of FTEs addressing fugitive warrants were ones in which Regional Fugitive Task Forces (RFTF) had been established between FYs 2000 and 2005. The USMS further commented that district offices no longer had to devote as many of their resources to fugitive apprehension work because the RFTFs assumed the bulk of the fugitive investigation workload. We do not discount that this is a very likely reason for the decrease in these offices’ FTEs. However, it does not explain why the Central District of California’s FTEs addressing fugitive warrants increased from FY 2000 to FY 2005 when an RFTF was also established within this district during our review period.
As discussed in Chapter 3, the USMS established directives on the types of duties contract guards are allowed to perform, as well as those they are prohibited from performing. Our review of USM-7 data reflected that contract guards had recorded time to restricted duties. In its response, the USMS stated that the OIG discussed these matters with USMS Assistant Directors and that these individuals were not the best sources for responding to this issue.
The OIG inquired about this matter with the Assistant Directors because they are responsible for overseeing USMS operations for distinct program areas and should have extensive knowledge about their respective areas. Moreover, if uncertain as to why this occurred, these individuals should know who in the USMS to ask for clarification. In fact, at least one Assistant Director provided us with a possible explanation after asking his staff for information related to the use of contract guards and the recording of their time. We believe, therefore, that the Assistant Directors were a valid source and their varied explanations accurately represented the situation.
Finally, in its response the USMS presented an explanation as to why contract guards recorded time to restricted duties, an explanation we also provided in our report. In brief, contract guards are often used to backfill for deputies who are assigned to special details. These contract guards, in turn, are told to record their time to the special details to which the deputies have been assigned, not the tasks that the guards are actually performing. The OIG disagrees with this method of timekeeping, as well as the USMS’s assertion that any other method would result in a misreporting of resource utilization. Instead, we believe that the current method used by the USMS overstates the actual amount of time expended on these special details and understates the time spent on other tasks. For example, a contract guard is hired to backfill for a deputy assigned to a Supreme Court detail. The deputy spends 40 hours protecting Supreme Court Justices and during this detail records his time accordingly. The contract guard, in turn, spends 40 hours transporting prisoners, yet he records all of his time to the Supreme Court detail. When determining the total level of effort expended by the USMS employee and contract guard on specific activities, the USMS will show that these 80 hours were spent on a Supreme Court detail and no time was spent on transporting prisoners.
The last clarification presented by the USMS in its response pertains to Footnote 58. This footnote discusses that Detention Enforcement Officers (DEO) are eligible to apply for Deputy U.S. Marshal (DUSM) positions. We further explain that the majority of DUSM positions are not filled from the DEO ranks. The USMS agreed that this statement is factual, but provided additional insight as to why this occurs. The USMS stated that the DEOs are viable candidates for DUSM positions; however, there are fewer DEOs in the USMS than the number of vacant DUSM positions needing to be filled in a given year. The OIG included this footnote in the report for informational purposes only. We were not commenting on the process or the quality of the USMS’s workforce.
Status of Recommendations
Resolved. In its response to our draft report, the USMS concurred with our recommendation to ensure its strategic planning efforts are improved through oversight of its Unit Performance Plan initiative and stronger promotion of the strategic plan by district management. The USMS stated that it will advance the goals contained in its Strategic Plan and the Unit Performance Plans through Strategic Advancement Forums, which the Director established on May 11, 2007. In addition to these Forums, the Director also created the Financial Management Steering Committee, which is to develop a Tactical Plan by leveraging the Unit Performance Plans. Finally, the Director established periodic formal reviews.
To close this recommendation, please provide us with documentation on each of these recently established initiatives, including how each will ensure the USMS’s strategic planning efforts are improved. Additionally, please provide the Financial Management Steering Committee’s Tactical Plan and its effect on improving the oversight of the Unit Performance Plans. Further, please provide evidence that district management has more strongly promoted the Strategic Plan among district personnel.
Resolved. The USMS concurred with our recommendation to improve its time reporting system and ensure the integrity of system data by: (1) allowing for the tracking of time by the minimum number of project codes necessary, and (2) implementing an automated control to ensure that all records entered into the time reporting system contain an active project code. The USMS stated that it will use the fewest codes it believes necessary to meet multiple reporting requirements. Additionally, the USMS commented that although its payroll interface does not allow invalid project codes to be processed in its time reporting system, the current DOJ time reporting system does allow the use of invalid codes. As a result, the USMS stated that it will work with DOJ’s Justice Management Division (JMD) to address this issue, as well as consider the availability of a third-party time reporting system that might better fulfill this role.
To close this recommendation, please provide evidence that the USMS has reviewed its list of project codes and identified, with explanation, those it believes to be necessary for reporting requirements. Additionally, please provide evidence that the systems involved in time reporting do not allow the use of invalid project codes.
Resolved. In its response, the USMS concurred with our recommendation to ensure that it has a reliable, standardized process of tracking, by activity, the time of contractors procured through national vendor contracts (other than Court Security Officers). Specifically, the USMS stated that it will develop a system to track the human resource utilization of this workforce.
To close this recommendation, please provide us with support that the USMS has implemented a system for tracking the human resource utilization of contractors procured through national vendor contracts, including sample utilization reports generated from this system.
Resolved. The USMS responded that it concurs with our recommendation to implement adequate automated controls into the WIN to ensure that: (1) warrants that have valid warrant closing dates are in a closed status, and (2) fugitive warrants are assigned a proper execution code when closed. The USMS commented that it currently has automated controls in place to prevent warrants having valid warrant closing dates from being in a status other than closed. The USMS also noted in its response that it now has an automated control that prevents fugitive warrants from being assigned a non-fugitive warrant execution code. However, the USMS commented that errors may still occur due to programming bugs and data transmission errors. As a result, the USMS performs manual data reviews and regularly informs employees about the importance of accurately entering data into the WIN. For example, the USMS stated that in January 2007 the Assistant Director of the Investigative Services Division formalized regionalized support to the district offices through the Criminal Information Branch, which routinely reviews the WIN data for accuracy. Moreover, the USMS announced in April 2007 that it will convene a data validation and compliance review workgroup.
To close this recommendation, please provide us evidence to support the statement that the USMS has implemented specific automated controls to prevent the specified types of errors from occurring. Further, please provide us with documentation on the guidance provided to USMS employees on the importance of WIN data accuracy. In addition, please provide support for manual WIN data reviews, including the regional support formalized by the Investigative Services Division in January 2007, as well as the data validation and compliance review workgroup announced in April 2007.
Resolved. In responding to our draft report, the USMS concurred with the OIG’s recommendation to perform regular reviews of the Prisoner Tracking System (PTS) to ensure the accuracy of the information contained in the system. The USMS remarked that it implemented an error tracking program in response to the OIG’s review of the PTS. Additionally, the USMS stated that it created controls to stop erroneous data from entering the database. Finally, the USMS indicated that it will combine and review data from all 94 separate databases to ensure accuracy and completeness.
To close this recommendation, please provide us with evidence of specific controls implemented to prevent data entry errors, including the error tracking program created by the USMS. In addition, please provide documentation pertaining to the USMS’s plan for combining and reviewing PTS data from all 94 databases.
Resolved. The USMS concurred with our recommendation to review alternative options for assigning prisoner identification numbers within the PTS to ensure that all prisoner movements are accurately tracked. According to its response, the USMS is developing policy changes that will strengthen the tracking of prisoner movements and eliminate the use of “0” as a prisoner identification number. Moreover, the USMS stated that it will ensure that any prisoner movement analyses include all records.
To close this recommendation, please provide us with documentation on the development and dissemination of policy changes regarding the assignment of prisoner identification numbers. Moreover, please provide evidence that all prisoner movements are accounted for in USMS analyses.
Resolved. In its response to our draft report, the USMS concurred with our recommendation to generate and regularly review resource utilization reports to ensure USMS resources are being used as intended. The USMS stated that it will create regularly generated reports, which will then be reviewed by USMS management.
To close this recommendation, please provide us with evidence that the USMS has created all-encompassing resource utilization reports and that they are routinely reviewed by USMS management.
Resolved. The USMS concurred with our recommendation to ensure that there is an adequate number of staff familiar with the data systems to allow for continuity in the assessment of the USMS workload. To close this recommendation, please provide additional information on how the USMS plans to familiarize an adequate number of staff with the various data systems, as well as confirmation that the USMS has provided sufficient training to multiple personnel so that they are adept in the functionality of the data systems.
Resolved. In its response to our draft report, the USMS concurred with our recommendation to develop a formalized training program for USMS operational personnel selected to be Field Training Officers (FTO) to ensure that they have the adequate knowledge, skills, and abilities to instruct new staff. The USMS stated that district offices currently select FTOs with at least 5 years of criminal investigative experience. Further, the USMS stated that current policy requires that FTO assignments be rotated among senior deputy marshals and that this rotation be based on the specific expertise of senior personnel. The USMS commented that the Training Academy will develop an FTO training program deliverable that will be provided in DVD/CD-ROM format and disseminated to every district office for viewing by assigned FTOs.
To close this recommendation, please provide us with the DVD/CD-ROM developed by the Training Academy containing formalized instruction for FTOs. In addition, provide evidence that this training program has been disseminated to all USMS district offices and that all FTOs have viewed the instructional materials.
Resolved. The USMS concurred with our recommendation to ensure that Criminal Investigator Deputy U.S. Marshals (CIDUSM) attend the Advanced Deputy Training course within the timeframes prescribed by the USMS. The USMS stated it will change the prescribed timeframes for attending this course to more accurately reflect the amount of time it will take to conduct this training for eligible CIDUSMs.
To close this recommendation, please provide us with documentation regarding the revised timeframes for completing the Advanced Deputy Training course, as well as executive management’s agreement that these timeframes are acceptable. Additionally, please provide evidence that the USMS has identified when all CIDUSMs are required to attend the Advanced Deputy Training and confirmation that those CIDUSMs who are currently overdue have been scheduled for and completed this training.
Resolved. In its response to our draft report, the USMS concurred with our recommendation to ensure that newly appointed USMS supervisors attend USMS supervisory training within a reasonable period of time following their promotion. The USMS commented that it currently has two supervisory programs for its supervisors, administrative officers, and headquarters inspectors and that it is conducting each of these programs at least twice during FY 2007. Moreover, the Training Academy established a committee to evaluate the effectiveness of these programs and assess alternatives or modifications to them. Finally, the USMS stated that it is researching additional external management training programs that it could use for new supervisors immediately following their promotion until the next USMS supervisory training program is offered.
To close this recommendation, please provide evidence that the USMS is tracking the training activities of its supervisors, particularly newly promoted individuals, and that supervisors are completing this training within a reasonable period of time following their promotion. Further, please provide documentation on the Training Academy’s assessment of the current supervisory programs, as well as the USMS’s attempt to find additional external management training programs.
Resolved. The USMS concurred with our recommendation to establish a procedure to periodically review the training of Detention Enforcement Officers (DEO) to identify and rectify any backlog of untrained DEOs that exists. Specifically, the USMS stated that it will, at a minimum, review on a quarterly basis the roster of newly hired DEOs to ensure that each DEO is formally trained within 1 year of being hired. Moreover, the USMS commented that it currently has 17 DEOs who require training and are scheduled to attend this training in August 2007.
To close this recommendation, please provide us with documentation regarding the quarterly reviews conducted by the USMS related to the training of DEOs and the USMS’s efforts to ensure that DEOs receive formalized training during their first year of employment. Further, please provide evidence that the 17 DEOs who had not yet attended the basic DEO training completed it in August 2007 as scheduled.
Resolved. In its response, the USMS concurred with our recommendation to establish a continuing education program for DEOs. The USMS stated that the Training Academy will formalize the opportunity for DEOs to attend advanced training programs at the Federal Law Enforcement Training Center (FLETC), as well as attend external training programs. Additionally, the response indicated that the Training Academy will consider developing an advanced DEO training program, which will provide refresher training in several areas, including firearms, search and restraints, and defensive tactics.
To close this recommendation, please provide documentation on the Training Academy’s efforts to develop an advanced DEO training program, including the instruction to be provided at the training and the timeframes in which DEOs are to attend. Additionally, please provide documentation on the USMS’s availability of external training programs for DEOs and how DEOs are notified of these programs. Finally, please provide us with the new DEO training policy that identifies DEO training requirements.
Resolved. The USMS responded that it concurred with our recommendation to ensure that training funds are effectively managed and that significant surpluses are avoided. Specifically, the USMS stated that the Training Academy and Management and Budget Division will allocate funds under newly established project codes that more accurately reflect non-training expenses, such as response equipment and costs for firearms. Moreover, the USMS commented that at the beginning of FY 2007 it changed the methodology used for forecasting basic training class costs, which is projected to reduce the surpluses resulting from these courses.
To close this recommendation, please provide us supporting documentation related to the USMS’s allocation of training funds under newly created project codes, as well as the newly implemented methodology for reducing the surpluses in training funds. In addition, please provide evidence that the USMS avoided a significant surplus of training funds at the end of FY 2007.
Resolved. In its response, the USMS concurred with our recommendation to follow up with DOJ’s plans for establishing a Department-wide system to record employee training, as well as to consider developing an interim centralized system to track the training of each USMS employee. The USMS stated that it is actively working with JMD on a training recordkeeping system. Further, the Training Academy is looking at acquiring a Training Management System, which would automate many Academy functions, including curriculum management, testing and evaluation, and reporting of various data. Additionally, the USMS envisions this system to interface with district offices in order to track mandated in-district training and firearms qualifications.
To close this recommendation, please provide evidence of the USMS’s ongoing discussions with JMD on a Learning Management System, including any resulting decisions to establish and use a Department-wide system. Additionally, please provide evidence of the USMS’s research on purchasing a Training Management System and of other actions taken to effectively track its employee training activities.