Office Of Justice Programs Technical Assistance And Training Program
Audit Report No. 04-40
Office of the Inspector General
Our audits of 21 TA&T grants disclosed weaknesses in the OJP’s monitoring efforts. Grantees were reimbursed for unallowable and unsupported costs. OJP grant managers did not ensure that all required Financial Status Reports and Progress Reports were submitted on time, nor that other monitoring and closeout requirements were observed. In total, we identified approximately $5.2 million in questioned costs and funds that could be put to better use out of the nearly $78 million in grants examined.13 We attribute these weaknesses to: (1) the lack of training for OJP grant managers in the areas of timely and accurate report submission, allowable costs, grant monitoring, and grant closeout procedures; and (2) the OJP’s automated system for grant management not functioning at full capacity. In addition, based on our audit results, the OIG initiated a criminal investigation to examine the expenditures and business practices of one grantee.
The OJP has awarded some grants strictly for TA&T purposes and others that combined TA&T with other objectives. Our audits examined only grants that were exclusively for TA&T; the universe of such grants included 158 grants totaling $312.5 million. Because the BJA and the OJJDP awarded 92.5 percent of the total amount, or $289 million, we concentrated our audit on grants awarded by those two bureaus.
We judgmentally selected 21 grants for audit, totaling $77.7 million or 25 percent of the universe of funding.14 Our audit sample included 10 BJA grants totaling $28.1 million and 11 OJJDP grants totaling $49.6 million with grant award dates ranging from FY 1995 to FY 2002. We examined the monitoring efforts of the OJP and the affected bureaus and concluded that, for all of the grants in our sample, those monitoring efforts were deficient.
Grant monitoring is an essential management tool to ensure that grantees are properly expending funds and that the objectives of the grant program are being implemented. Based on the results of our 21 grant audits, we concluded that weaknesses in the OJP’s monitoring had permitted a wide range of discrepancies to occur among grantees. In fact, for the 21 grants that we audited, grant managers performed site visits for only 8 grantees.
One weakness was the lack of documentation of monitoring. We reviewed the grant managers’ files for each of the 21 grants that were audited. The grant files generally contained monitoring plans, but the grant managers did not consistently follow OJP requirements and document their monitoring efforts. Among the closed grants in our sample, one grant file did not contain documentation showing compliance with closeout procedures.15 According to the OJP Grant Managers Manual, grant managers are to notify the grantee of closeout procedures 30 to 60 days before the end of the grant. However, the grant files that we reviewed did not contain any evidence that this requirement was fulfilled.
According to the OJP Grant Managers Manual, the bureau or program office and the Office of the Comptroller (OC) should coordinate activities throughout the monitoring process by preparing an annual monitoring plan, scheduling site visits, and conducting “team monitoring” or joint site visits. However, when we reviewed both the OC’s grant files and the grant managers’ files, we found little evidence that such coordination actually occurred. In addition, our interviews with the grant managers disclosed that, except for the notification letter they receive when the OC schedules a site visit for financial monitoring, there is very little coordination between the program offices and the OC.
Despite the lack of evidence that grant managers were complying fully with the established monitoring plans, we did find that some grant monitoring occurred. For example, the files contained evidence that communication occurred between grantees and grant managers in the form of reports, faxes and letters, and oral and e-mail communications on specific issues and problems or requests for information. However, much of the communication between grantees and grant managers was not documented in accordance with OJP requirements.
The potential adverse effects of the weaknesses in the OJP’s monitoring and oversight of grantees are demonstrated in one particular audit. We found that the grantee’s management of grant funds was inadequate, the grantee maintained poor accounting records, and the grantee generally failed to exercise oversight over the sub-grantees. In addition, the grantee appeared to lack the requisite knowledge to administer the grant and to train other organizations as required by the terms of the grant. Examples of findings from this grant audit include:16
In our opinion, this grantee exhibited significant internal control weaknesses and poor fiscal management, and did not effectively or adequately manage the grant. We discussed this grantee with BJA officials and learned that they had not conducted a site visit during at least the past ten years, nor had the OC conducted a financial review of the grantee during the same period. Further, the BJA officials stated that they assumed the grantee was knowledgeable of grant requirements because, “they have been receiving grants for a long time.”
In addition to the preceding grantee’s poor grant management and oversight, we found that the BJA was generally unaware of the actions of other sub-grantees.
We performed the 21 individual grant audits to determine whether reimbursements for costs claimed under the grants were allowable, supported, and in accordance with applicable laws, regulations, guidelines, and the terms and conditions of the grants. The audits resulted in 8 BJA dollar-related findings with $3.2 million in related questioned costs, and 26 OJJDP dollar-related findings with questioned costs and funds that could be put to better use amounting to $2 million. Summaries of those questioned costs and findings are shown in the following tables (See Appendices III and IV for additional details).
Summary of Questioned Costs and Findings – BJA Grants
We found many violations of essential grant and accounting requirements in our audits of the 21 grants (See Appendix II for additional details). For example:
Our 21 grant audits also resulted in a number of significant non-dollar-related findings.20 For example:
However, in our judgment the most significant non-dollar-related findings involved the timeliness and accuracy of grantee Financial Status and Progress Reports. A summary of findings pertaining to this area is described in the following section.
Financial Status Reports – According to the OJP Financial Guide, each grantee is required to submit a Financial Status Report (FSR) to the awarding agency within 45 days of the end of each calendar quarter. We reviewed the FSRs throughout the grant periods for the audited grants and determined that 10 of the grantees submitted a total of 22 late and 10 inaccurate quarterly FSRs. The untimely reports were submitted as many as 60 days after the due date.
Untimely and/or Inaccurate Financial Status Reports
Progress Reports – According to the OJP Financial Guide, Progress Reports must be submitted within 30 days after the end of the reporting periods (June 30 and December 31). Progress Reports are supposed to describe in a narrative fashion information relevant to the performance of a plan, program, or project. We reviewed the Progress Reports throughout the grant periods for the 21 audited grants and determined that 13 grantees submitted a total of 43 reports late. The untimely reports were submitted as many as 170 days after the due date. In addition, 10 reports for these grantees could not be located.
Untimely or Missing Progress Reports
In our judgment, the failure to enforce the timely and accurate submission of FSRs and Progress Reports compromises the OJP’s ability to ensure the proper use of grant funds, and increases the risk that the OJP will fund projects that are ineffective or failing to meet their objectives. The OJP can help address this issue by providing grant managers with training on the timely and accurate submission of reports, allowable costs, grant monitoring, and grant closeout procedures.
Another contributing factor to the weaknesses we identified is that not all of the key elements for monitoring grant activity have been implemented in the OJP’s automated system for managing grants. The OJP’s Grants Management System (GMS) was initiated in December 1998 as a pilot program to streamline the solicitation, application, and award of grants. If it functioned at full capacity, the GMS would provide "one-stop," full life-cycle support for all of the OJP’s grant management efforts. This, in turn, would improve the efficiency of grant monitoring efforts, improve access to information, and enhance search and reporting capabilities. While the OJP has mandated that the GMS be used by its various components, several of the modules of the GMS were not fully operational during our audit period. At the beginning of our audit, we were told by OJP officials that the GMS was being implemented in phases and that it would be fully functional by the end of 2003. However, we were subsequently informed that the enhanced GMS, which will include all modules to manage grants from beginning to end, is not scheduled to be fully operational until September 30, 2004. In our judgment, the OJP’s lack of systematic data to support grant management monitoring efforts is attributable, in part, to the lack of full GMS implementation.26
We recommend that the OJP ensure that:
The OJP is not collecting sufficient data to measure the performance of TA&T grants. Further, the OJP does not play a role in developing grantees' performance or outcome measures for program evaluation purposes, nor does it have specific requirements that grantees must adhere to in developing performance measures. As a result, for the 21 grants that we audited, it was not possible to assess the impact of the TA&T program and determine whether the grants were achieving their intended purposes.
According to the OJP, grant evaluation assesses the effectiveness of an ongoing program in achieving its objectives, relies on the standards of project design to distinguish a program's effects from those of other forces, and seeks to improve programs through a modification of current operations. Program evaluations are critical because they can be used to improve existing programs and provide policymakers and program managers with information for future program development. In addition, evaluations are used to assess how well programs have been implemented, and the extent to which funded activities have achieved their stated goals.
Program evaluation is especially important to the Department of Justice because, through the OJP, it administers over $6 billion in grants. Without proper evaluation, the OJP cannot determine whether the grants it awards are an appropriate use of Department funds. In addition, program evaluations provide policymakers and managers with information about which programs are successful and which programs are inefficient.
The OJP is responsible for collecting data to report on performance measures and for evaluating the performance of all programs. OJP officials told us that applicants are notified in the OJP’s solicitations that they are required to collect and report data that measures the results of their grant(s). However, we found that for the 21 grantees audited, the OJP did not collect and report the appropriate data to measure program results. Moreover, the grant files we reviewed showed no indication that the OJP grant managers participated in developing program measures.
We determined that the OJP relies on grantees’ semi-annual categorical Progress Reports to determine if projects have been successful. Although these reports give the OJP an outline of grantees’ activities, productivity, and self-assessment, this method of evaluation may not produce definitive results. Moreover, grantees’ self-assessments cannot be considered objective measures of accomplishment. In addition, agencies that fund their own evaluations may be in a position to exert undue influence that jeopardizes the objectivity of the findings. For example, an agency funding an evaluation of itself may select an evaluator who is likely to produce the results desired by the agency.
Our review of 21 TA&T grants indicated that grantees generally perform self-assessments through participant evaluations. For example, some grantees conduct training for criminal justice practitioners addressing new criminal justice issues. At the conclusion of the training, grantees might request that participating practitioners complete an evaluation form to assess the training. The grantee then compiles and summarizes the information from all of the evaluation forms in an effort to measure the success of the training provided. We consider this an insufficient form of evaluation because there is no assurance that respondents will give this kind of questionnaire more than cursory attention or provide candid responses. In our judgment, in addition to the self-assessment, grantees should use outside consultants (following the methodology described in the next paragraph) to evaluate their presentations and provide specific commentary to the grantees addressing how they could improve their training.
We also determined that the OJP does not work with its grantees to develop useful program evaluations. We asked the TA&T grantees in our sample to respond to a questionnaire about program evaluation. Twenty of the grantees stated that the OJP did not play a role in developing performance or outcome measures after making the grant award. In addition, we found that the OJP has no specific requirements to which the grantee must adhere in developing performance measures. We believe this lack of specific requirements results in the OJP having insufficient data to measure program performance.
In the 21 grants we audited, the OJP did not have the necessary information to determine whether the program was successful in meeting its intended purpose.27 When we discussed the OJP’s lack of a formal evaluation of grantee success in implementing program objectives and goals, we were told by senior BJA and OJJDP officials that OJP does not require grant managers to formally evaluate the success or failure of a grant. Instead, the officials said that the OJP relies solely on the Progress Reports, even though the reports almost always indicate the grant is achieving its stated objectives. Generally, grant managers review grant files before grants are renewed, but no formal evaluation is prepared to support the renewal of a grant, nor is such an evaluation required.
Besides the evaluation methods listed by the grantees in response to our questionnaire, our audit disclosed that three grantees hired outside contractors to evaluate their grant programs. While independent evaluations can be helpful, without the OJP’s participation the evaluation design and scope may not be comprehensive. For example, the Boys and Girls Club requested that program recipients evaluate the training provided to them through surveys developed and evaluated by the Policy Studies Associates (PSA) organization. This evaluation process was designed to collect data about program implementation, the participants' experiences, and positive training outcomes. The PSA used a combination of participant surveys, site visits, and telephone interviews in its data collection efforts. While these methods of evaluation can be useful in measuring program implementation and participants’ experiences, they do not measure post-training impacts or program outcomes. Had the OJP collaborated with the grantee and PSA, a more comprehensive evaluation could have been developed to measure these outcomes.
In an effort to develop an overall grant program evaluation system, the OJP has sponsored a series of focus group meetings for Technical Assistance (TA) recipients (e.g., individuals in state and local agencies, local courts, community-based organizations, and the U.S. Attorney’s offices), TA providers, and the OJP staff.28 The focus group participants identified 10 factors as obstacles to the effective delivery of TA:
The OJP stated that it intends to improve its program evaluation efforts. In testimony prepared for the House Judiciary Committee, the OJP’s Principal Deputy Assistant Attorney General said that part of the OJP’s new vision, "is an increased emphasis on measuring the results of the programs we fund and on focusing OJP resources on what works."29 The statement went on to say that the OJP now requires evaluation components in all OJP discretionary grant programs, and is setting aside 10 percent of program funding to ensure evaluations are built into OJP programs from the outset. Moreover, OJP discretionary grant recipients are now required, as part of their grant conditions, to participate in a national or local program evaluation so that the effectiveness of these programs will be measured. During our review, we did not find evidence that these requirements had been implemented. In fact, our audit disclosed that only three grantees hired outside contractors to evaluate their grant programs (See Appendices V and VI).
We recommend that the OJP: