Office Of Justice Programs Technical Assistance And Training Program

Audit Report No. 04-40
September 2004
Office of the Inspector General

Findings and Recommendations

  2. Our audits of 21 TA&T grants disclosed weaknesses in the OJP’s monitoring efforts. Grantees were reimbursed for unallowable and unsupported costs. OJP grant managers did not ensure that all required Financial Status Reports and Progress Reports were submitted on time, and other monitoring and closeout requirements were not observed. In total, we identified approximately $5.2 million in questioned costs and funds that could be put to better use out of the nearly $78 million in grants examined.13 We attribute these weaknesses to: (1) the lack of training for OJP grant managers in the areas of timely and accurate report submission, allowable costs, grant monitoring, and grant closeout procedures; and (2) the OJP’s automated system for grant management not functioning at full capacity. In addition, based on our audit results, the OIG initiated a criminal investigation to examine the expenditures and business practices of one grantee.

The OJP has awarded some grants strictly for TA&T purposes and others that combined TA&T with other objectives. Our audits examined only grants that were exclusively for TA&T; the universe of such grants included 158 grants totaling $312.5 million. Because the BJA and the OJJDP awarded 92.5 percent of the total amount, or $289 million, we concentrated our audit on grants awarded by those two bureaus.

We judgmentally selected 21 grants for audit, totaling $77.7 million or 25 percent of the universe of funding.14 Our audit sample included 10 BJA grants totaling $28.1 million and 11 OJJDP grants totaling $49.6 million with grant award dates ranging from FY 1995 to FY 2002. We examined the monitoring efforts of the OJP and the affected bureaus and concluded that, for all of the grants in our sample, those monitoring efforts were deficient.

The OJP’s Monitoring of 21 Selected Grants

Grant monitoring is an essential management tool to ensure that grantees are properly expending funds and that the objectives of the grant program are being implemented. Based on the results of our 21 grant audits, we concluded that weaknesses in the OJP’s monitoring had permitted a wide range of discrepancies to occur among grantees. In fact, for the 21 grants that we audited, grant managers performed site visits for only 8 grantees.

One weakness was the lack of documentation of monitoring. We reviewed the program managers’ grant files for each of the 21 grants that were audited. The grant files generally contained monitoring plans, but the grant managers did not consistently follow OJP requirements and document their monitoring efforts. Of the closed grants in our sample, one grant file did not contain documentation showing compliance with closeout procedures.15 According to the OJP Grant Managers Manual, grant managers are to notify the grantee of closeout procedures 30 to 60 days before the end of the grant. However, the grant files that we reviewed did not contain any evidence that this requirement was fulfilled.

According to the OJP Grant Managers Manual, the bureau or program office and the Office of the Comptroller (OC) should coordinate activities throughout the monitoring process by preparing an annual monitoring plan, scheduling site visits, and conducting “team monitoring” or joint site visits. However, when we reviewed both the OC’s grant files and the grant managers’ files, we found little evidence that such coordination actually occurred. In addition, our interviews with the grant managers disclosed that except for the notification letter they receive when the OC schedules a site visit for financial monitoring, there is very little coordination between the program offices and the OC.

Despite the lack of evidence that grant managers were complying fully with the established monitoring plans, we did find that some grant monitoring occurred. For example, the files contained evidence that communication occurred between grantees and grant managers in the form of reports, faxes and letters, and oral and e-mail communications on specific issues and problems or requests for information. However, much of the communication between grantees and grant managers was not documented in accordance with OJP requirements.

The potential adverse effects of the weaknesses in the OJP’s monitoring and oversight of grantees are demonstrated in one particular audit. We found that the grantee’s management of grant funds was inadequate, the grantee maintained poor accounting records, and the grantee generally failed to exercise oversight over the sub-grantees. In addition, the grantee appeared to lack the requisite knowledge to administer the grant and to train other organizations as required by the terms of the grant. Examples of findings from this grant audit include:16

  • The grantee conducted prohibited lobbying activities using grant funds.
  • One of the grantee’s selection factors for sub-grantees was the connection between the sub-grantees and members of Congress, even though the sub-grants were supposed to be awarded competitively.
  • The grantee billed the OJP for salary costs never paid to employees.
  • Although the grantee budgeted for travel and staff expenses for site visits to its 36 sub-grantees, it did not adequately perform this task. We found that 67 percent of the site visits conducted were made to 6 sub-grantees in the grantee’s local area and 6 other sub-grantees were never visited.
  • The grantee charged unallowable costs to the grant, such as hotel in-room movie rentals, taxi cabs to restaurants, excessive telephone usage ($500 over a 2-day period), and, in one case, the replacement cost for a lost cell phone owned by the daughter of the project director.
  • The grantee charged social gatherings such as a Christmas party to the grant, and labeled a sunset cruise on a yacht as a training meeting for reimbursement purposes; in addition, alcohol was served at both the Christmas party and on the cruise.
  • The grantee failed to properly monitor sub-grantees, maintain appropriate documentation, and take action to recover funds when sub-grantees failed to perform.
  • The grantee’s files were in complete disarray.

In our opinion, this grantee exhibited significant internal control weaknesses and poor fiscal management, and did not effectively or adequately manage the grant. We discussed this grantee with BJA officials and learned that they had not conducted a site visit during at least the past ten years, nor had the OC conducted a financial review of the grantee during the same period. Further, the BJA officials stated that they assumed the grantee was knowledgeable of grant requirements because, “they have been receiving grants for a long time.”

In addition to the preceding grantee’s poor grant management and oversight, we found that the BJA exhibited a general lack of awareness of the actions of other sub-grantees.

  • In one instance, after we learned that a significant number of supporting documents were missing for a sub-grantee, we found that the executive director of that organization had recently been terminated for malfeasance, including destruction or removal of accounting and administrative records. While the grantee was aware of the termination, it did not take action to obtain supporting documentation for the sub-grantee’s expenditures. The grantee was aware of problems surrounding this particular sub-grantee, but failed to provide adequate oversight.
  • In another instance involving alleged embezzlement by a sub-grantee, the BJA failed to follow up to determine whether grant funds were at risk after it was notified of the alleged embezzlement.

Results of the 21 Grant Audits

We performed the 21 individual grant audits to determine whether reimbursements for costs claimed under the grants were allowable, supported, and in accordance with applicable laws, regulations, guidelines, and the terms and conditions of the grants. The audits resulted in 8 BJA dollar-related findings with $3.2 million in related questioned costs, and 26 OJJDP dollar-related findings with questioned costs and funds that could be put to better use amounting to $2 million. Summaries of those questioned costs and findings are shown in the following tables (see Appendices III and IV for additional details).

Summary of Questioned Costs and Findings – BJA Grants

Grantee/Grant Number | Award Amount | Questioned Costs [17] | Number of Dollar-Related Findings
American Prosecutors’ Research Institute (2000-PP-CX-K001) | $2,061,559 | – | 0
Search Group, Inc. (1999-MU-MU-0005) | $2,500,000 | $29,602 | 2
National Council of Juvenile & Family Court Judges (98-MU-VX-K016) | $2,904,655 | – | 0
National American Indian Court Judges Association (2000-IC-VX-0026) | $1,442,112 | $31,921 | 4
Fund for the City of New York | $1,839,269 | – | 0
Doe Fund, Inc. (2001-DD-BX-0055) | $1,897,800 | $24,832 | 1
Grantee’s name withheld due to ongoing investigation | $3,162,580 | $3,162,580 | 1 [18]
Strategic Information Technology Center (University of Arkansas #1) | $6,700,000 | – | 0
School Violence Resource Center (University of Arkansas #2) | $3,995,600 | – | 0
Inter-Tribal Integrated Justice Pilot Project (University of Arkansas #3) | $1,562,900 | – | 0
Subtotal – BJA | $28,066,475 | $3,248,935 | 8
   Source: Office of the Inspector General Grant Audit Reports

Summary of Questioned Costs and Findings – OJJDP Grants

Grantee/Grant Number | Award Amount | Questioned Costs and Funds to Better Use [19] | Number of Dollar-Related Findings
National Center for Missing and Exploited Children (2000-MC-CX-K021) | $10,993,363 | – | 0
Development Services Group, Inc. (1999-JB-VX-K001) | $5,377,201 | – | 0
Florida Atlantic University (95-JN-FX-0024) | $2,018,869 | $199,221 (FBU: $20,419) |
Boys and Girls Clubs of America (98-JN-FX-0007) | $9,275,000 | $437,885 | 5
Constitutional Rights Foundation (2001-JS-FX-008) | $1,066,400 | – | 0
National Court Appointed Special Advocate Association (2002-CH-BX-K001) | $3,823,500 | – | 0
Children’s Advocacy Center for the Pikes Peak Region (2001-MU-MU-K002) | $1,124,343 | $17,975 | 3
Suffolk University (1999-JS-FX-0001) | $5,060,685 | $25,279 (FBU: $68,905) |
Children’s Hospital (2000-CI-FX-K001) | $1,286,115 | $351,484 | 3
Fox Valley Technical College #1 (98-MC-CX-K010) | $7,263,359 | $777,090 | 3
Fox Valley Technical College #2 (98-MC-CX-K003) | $2,298,701 | $15,768 | 4
Subtotal – OJJDP | $49,587,536 | $1,914,026 | 26
Total BJA and OJJDP Questioned Costs and Funds to Better Use | $77,654,011 | $5,162,961 | 34
   Source: Office of the Inspector General Grant Audit Reports

Grant Expenditures

We found many violations of essential grant and accounting requirements in our audits of the 21 grants (See Appendix II for additional details). For example:

  • Eight grantees claimed and were reimbursed for costs that were not supported by their accounting records ($1,534,649);

  • Two grantees claimed and were reimbursed for expenditures that were not included in the approved grant budget ($178,405);

  • Five grantees claimed and were reimbursed for costs that were not allowed under the grant ($123,322);

  • Suffolk University had program income of $68,905 that was neither used to reduce future drawdowns nor returned to the federal government; and

  • Children’s Hospital drew down excess funds ($30,595) and transferred excessive funds between budget categories without written approval from the OJP ($59,903).

Our 21 grant audits also resulted in a number of significant non-dollar-related findings.20 For example:

  • The Fund for the City of New York incorrectly budgeted certain costs, e.g., compensated employee leave;

  • The internal controls over authorization and approval of grant expenditures at the Doe Fund, Inc. were inadequate; and

  • The University of Arkansas did not maintain complete and accurate inventory records for property purchased under grants to its Strategic Information Technology Center and its Inter-Tribal Integrated Pilot Justice Project.

However, in our judgment the most significant non-dollar-related findings involved the timeliness and accuracy of grantee Financial Status and Progress Reports. A summary of findings pertaining to this area is described in the following section.

Financial Status and Progress Reports

Financial Status Reports – According to the OJP Financial Guide, each grantee is required to submit a Financial Status Report (FSR) to the awarding agency within 45 days of the end of each calendar quarter. We reviewed the FSRs throughout the grant periods for the audited grants and determined that 10 of the grantees submitted a total of 22 late and 10 inaccurate quarterly FSRs. The untimely reports were submitted as many as 60 days after the due date.

Untimely and/or Inaccurate Financial Status Reports

Grantee | Number of Reports Required | Number of Late Reports | Number of Inaccurate Reports
National American Indian Court Judges Association | 13 | 6 | 0
Fund for the City of New York | 18 | 3 | 0
Doe Fund, Inc. | 7 | 2 | 0
School Violence Resource Center (University of Arkansas) | 12 | 0 | 2
Florida Atlantic University | 32 | 1 | 0 [21]
Boys and Girls Clubs of America | 22 | 0 | 1
Children’s Advocacy Center for the Pikes Peak Region | 10 | 1 | 7
Suffolk University | 18 | 2 | 0
Children’s Hospital | 14 | 2 [22] | 0
Grantee’s name withheld due to ongoing investigation | 12 | 5 | 0
Total | 158 | 22 | 10
   Source: Office of the Inspector General Grant Audit Reports

Progress Reports – According to the OJP Financial Guide, Progress Reports must be submitted within 30 days after the end of the reporting periods (June 30 and December 31). Progress Reports are supposed to describe in a narrative fashion information relevant to the performance of a plan, program, or project. We reviewed the Progress Reports throughout the grant periods for the 21 audited grants and determined that 13 grantees submitted a total of 43 reports late. The untimely reports were submitted as many as 170 days after the due date. In addition, 10 reports for these grantees could not be located.

Untimely or Missing Progress Reports

Grantee | Number of Reports Required | Number of Late Reports | Number of Missing Reports
National Council of Juvenile and Family Court Judges | 10 | 4 [23] | 0
National American Indian Court Judges Association | 7 | 2 | 1
Fund for the City of New York | 9 | 3 [24] | 1
Doe Fund, Inc. | 4 | 3 | 0
Grantee's name withheld due to ongoing investigation | 8 | 4 | 4
School Violence Resource Center (University of Arkansas) | 6 | 1 | 0
Inter-Tribal Integrated Justice Pilot Project (University of Arkansas) | 3 | 2 | 0
Development Services Group, Inc. | 9 | 1 | 0
Florida Atlantic University | 15 | 9 | 3
Boys and Girls Clubs of America | 11 | 8 | 1
Constitutional Rights Foundation | 3 | 1 | 0
Children's Advocacy Center for the Pikes Peak Region | 5 | 2 | 0
Suffolk University | 9 | 2 | 0
Children's Hospital | 7 | 1 [25] | 0
   Source: Office of the Inspector General Grant Audit Reports

In our judgment, the failure to enforce the timely and accurate submission of FSRs and Progress Reports compromises the OJP’s ability to ensure the proper use of grant funds, and increases the risk that the OJP will fund projects that are ineffective or failing to meet their objectives. The OJP can help address this issue by providing grant managers with training on the submission of timely and accurate reports, allowable costs, grant monitoring, and grant closeout procedures.

Another contributing factor to the weaknesses we identified is that not all of the key elements for monitoring grant activity have been implemented in the OJP’s automated system for managing grants. The OJP’s Grants Management System (GMS) was initiated in December 1998 as a pilot program to streamline the solicitation, application, and award of grants. Functioning at full capacity, the GMS should provide “one-stop,” full life-cycle support for all of the OJP’s grant management efforts. This, in turn, would improve the efficiency of grant monitoring efforts, improve access to information, and enhance search and reporting capabilities. While the OJP has mandated that the GMS be used by its various components, several GMS modules were not fully operational during our audit period. At the beginning of our audit, OJP officials told us that the GMS was being implemented in phases and would be fully functional by the end of 2003. However, we were subsequently informed that the enhanced GMS, which will include all modules needed to manage grants from beginning to end, is not scheduled to be fully operational until September 30, 2004. In our judgment, the OJP’s lack of systematic data to support grant management monitoring efforts is attributable, in part, to the lack of full GMS implementation.26


We recommend that the OJP ensure that:

  1. Grant managers receive annual training on OJP's requirements governing the submission of timely and accurate reports, allowable costs, grant monitoring, and grant closeout procedures.
  2. The GMS is brought up to full functioning capacity as soon as possible and grant managers are trained to use the system.

  3. The OJP is not collecting sufficient data to measure the performance of TA&T grants. Further, the OJP does not play a role in developing grantees' performance or outcome measures for program evaluation purposes, nor does it have specific requirements that grantees must adhere to in developing performance measures. As a result, for the 21 grants that we audited, it was not possible to assess the impact of the TA&T program and determine whether the grants were achieving their intended purposes.

According to the OJP, grant evaluation assesses the effectiveness of an ongoing program in achieving its objectives, relies on the standards of project design to distinguish a program's effects from those of other forces, and seeks to improve programs through a modification of current operations. Program evaluations are critical because they can be used to improve existing programs and provide policymakers and program managers with information for future program development. In addition, evaluations are used to assess how well programs have been implemented, and the extent to which funded activities have achieved their stated goals.

Program evaluation is especially important to the Department of Justice because, through the OJP, it administers over $6 billion in grants. Without proper evaluation, the OJP cannot determine whether the grants it awards are an appropriate use of Department funds. In addition, program evaluations provide policymakers and managers with information about which programs are successful and which programs are inefficient.

The OJP is responsible for collecting data to report on performance measures and for evaluating the performance of all programs. OJP officials told us that in its solicitations, applicants are notified that they are required to collect and report data that measures the results of their grant(s). However, we found that for the 21 grantees audited, the OJP did not collect and report the appropriate data to measure program results. Moreover, the grant files we reviewed showed no indication that the OJP Grant Managers participated in developing program measures.

Grantee Evaluation Methods

We determined that the OJP relies on grantees’ semi-annual categorical Progress Reports to determine if projects have been successful. Although these reports give the OJP an outline of grantees’ activities, productivity, and self-assessment, this method of evaluation may not produce definitive results. Moreover, grantees’ self-assessments cannot be considered objective measures of accomplishment. In addition, agencies that fund their own evaluations may be in a position to exert undue influence that jeopardizes the objectivity of the findings. For example, an agency funding an evaluation of itself may select an evaluator who is likely to produce the results desired by the agency.

Our review of 21 TA&T grants indicated that grantees generally perform self-assessments through participant evaluations. For example, some grantees conduct training for criminal justice practitioners addressing new criminal justice issues. At the conclusion of the training, grantees might request that participating practitioners complete an evaluation form to assess the training. The grantee then compiles and summarizes the information from all of the evaluation forms in an effort to measure the success of the training provided. We consider this an insufficient form of evaluation because there is no assurance that respondents will give this kind of questionnaire more than cursory attention or provide candid responses. In our judgment, in addition to the self-assessment, grantees should use outside consultants (following the methodology described in the next paragraph) to evaluate their presentations and provide specific commentary to the grantees addressing how they could improve their training.

We also determined that the OJP does not work with its grantees to develop useful program evaluations. We asked the TA&T grantees in our sample to respond to a questionnaire about program evaluation. Twenty of the grantees stated that the OJP did not play a role in developing performance or outcome measures after making the grant award. In addition, we found that the OJP has no specific requirements to which the grantee must adhere in developing performance measures. We believe this lack of specific requirements results in the OJP having insufficient data to measure program performance.

In the 21 grants we audited, the OJP did not have the necessary information to determine whether the program was successful in meeting its intended purpose.27 When we discussed the OJP’s lack of a formal evaluation of grantee success in implementing program objectives and goals with senior BJA and OJJDP officials, they told us that the OJP does not require grant managers to formally evaluate the success or failure of a grant. Instead, the officials said that the OJP relies solely on the Progress Reports, even though the reports almost always indicate that the grant is achieving its stated objectives. Generally, grant managers review grant files before grants are renewed, but no formal evaluation is prepared to support the renewal of a grant, nor is such an evaluation required.

Besides the evaluation methods listed by the grantees in response to our questionnaire, our audit disclosed that three grantees hired outside contractors to evaluate their grant programs. While independent evaluations can be helpful, without the OJP’s participation the evaluation design and scope may not be comprehensive. For example, the Boys and Girls Clubs of America requested that program recipients evaluate the training provided to them through surveys developed and evaluated by the Policy Studies Associates (PSA) organization. This evaluation process was designed to collect data about program implementation, the participants' experiences, and positive training outcomes. The PSA used a combination of participant surveys, site visits, and telephone interviews in its data collection efforts. While these methods of evaluation can be useful in measuring program implementation and participants’ experiences, they do not measure post-training impacts or program outcomes. Had the OJP collaborated with the grantee and PSA, a more comprehensive evaluation could have been developed to measure these outcomes.

The OJP's Program Evaluation Efforts

In an effort to develop an overall grant program evaluation system, the OJP has sponsored a series of focus group meetings for Technical Assistance (TA) recipients (e.g., individuals in state and local agencies, local courts, community-based organizations, and the U.S. Attorney’s offices), TA providers, and the OJP staff.28 The focus group participants identified 10 factors as obstacles to the effective delivery of TA:

  • Limitations of time and resources;
  • Lack of information about available TA and resources;
  • Inability to select preferred type of TA, or to select individual TA providers;
  • Lack of information about the particular situations in recipient jurisdictions, previous TA work projects, and available work products;
  • Requirements of state sign-off for certain types of TA;
  • Lack of mechanisms for accountability and feedback;
  • Lack of commonly shared expectations regarding what constitutes effective TA;
  • Lack of a research base about what constitutes effective TA;
  • Lack of diversity in the pool of persons used as TA providers; and
  • Limitations on the permissible scope of the OJP TA and categorical funding limitations.

The OJP stated that it intends to improve its program evaluation efforts. In testimony prepared for the House Judiciary Committee, the OJP’s Principal Deputy Assistant Attorney General said that part of the OJP’s new vision, "is an increased emphasis on measuring the results of the programs we fund and on focusing OJP resources on what works."29 The statement went on to say that the OJP now requires evaluation components in all OJP discretionary grant programs, and is setting aside 10 percent of program funding to ensure evaluations are built into OJP programs from the outset. Moreover, OJP discretionary grant recipients are now required, as part of their grant conditions, to participate in a national or local program evaluation so that the effectiveness of these programs will be measured. During our review, we did not find evidence that these requirements had been implemented. In fact, our audit disclosed that only three grantees hired outside contractors to evaluate their grant programs (See Appendices V and VI).


We recommend that the OJP:

  1. Develop performance or outcome measures for TA&T grants.


  13. See Appendices II, III, and IV for a breakdown of our dollar-related findings and for definitions of questioned costs and funds to better use.

  14. Our criteria for selection were: (a) the grant was awarded solely for the purpose of providing TA&T; (b) the awarding bureau was either the BJA or the OJJDP; (c) the grant amount was over $1 million; (d) the sample represented a range of grant periods; and (e) the sample provided geographic distribution.

  15. Closeout of a grant is a process by which the OJP determines that the grantee and the OJP have completed all applicable administrative actions and all required work on the project. Upon expiration of a grant, the OJP grant manager and the OC are responsible for timely and proper closing of the grant. The grantee whose files did not contain documentation of compliance with closeout procedures is the Search Group, Inc. The files for three additional grantees did not contain closeout documentation but the grantees were subsequently awarded grant extensions.

  16. The results of our audit of this BJA grant caused us to question the entire grant ($3,162,580 over the life of the grant, April 6, 2000, through December 31, 2003), prompted an investigation by the OIG Investigations Division, and resulted in a suspension of grant funding by the OJP.

  17. Questioned costs are expenditures that do not comply with legal, regulatory, or contractual requirements, or are not supported by adequate documentation at the time of the audit, or are unnecessary or unreasonable. Questioned costs may be remedied by offset, waiver, recovery of funds, or the provision of supporting documentation.

  18. The number of dollar-related findings is not yet final given the ongoing investigation.

  19. Funds to Better Use are future funds that could be used more efficiently if management took actions to implement and complete audit recommendations.

  20. See Appendix III for details regarding the following grantees: Fund for the City of New York; Doe Fund, Inc.; and University of Arkansas grants #1 and #3.

  21. We determined that the grantee failed to report program income received from the grant on the FSRs as required. However, we could not determine with certainty when the grantee should have started to report program income.

  22. The untimely reports were submitted 42 and 57 days after the due date. The OJP failed to date-stamp 11 of the reports when received; consequently, we could not determine the timeliness of their submission.

  23. One of the four Progress Reports was submitted 170 days after the due date.

  24. We concluded that at least three of the nine progress reports were filed late and it is possible that the remaining six were also late. Because of the incomplete records on the part of both the Fund and the OJP, we could not determine whether the filing of seven reports on October 8, 2002, was the Fund’s first or second submission for these reports.

  25. The untimely report was submitted 75 days late. Three other reports were not date-stamped by the OJP; therefore, we could not determine the timeliness of their submission.

  26. In April 2004, grantees had the option of submitting their FSRs electronically through the web-based SF 269 application. However, beginning with the first reporting period in FY 2005, all grantees are required to submit their FSRs electronically.

  27. See Appendices V and VI for grantee evaluation methods.

  28. The themes and recommendations that emerged from this research are documented in a report produced by the Justice Management Institute in Denver, Colorado, entitled Improving the Effectiveness of Technical Assistance—A Report on Focus Group Meetings of Criminal Justice Practitioners, Technical Assistance Providers, and OJP Staff.

  29. Statement of Tracy A. Henke, Principal Deputy Assistant Attorney General, Office of Justice Programs, Before the Subcommittee On Crime, Committee on the Judiciary, U.S. House of Representatives, Concerning Office of Justice Programs Oversight (March 14, 2002).