Select Application Controls Review of the Federal Bureau of Prisons' Sentry Database System

Report No. 03-25
July 2003
Office of the Inspector General

Appendix III


The application control guidelines used for this audit were obtained from the GAO's FISCAM. The information below details the sections from the FISCAM used during our review of SENTRY.


Application controls are the structure, policies, and procedures that apply to separate, individual application systems, such as accounts payable, inventory, payroll, grants, or loans. An application system is typically a collection or group of individual computer programs that relate to a common function. In the federal government, some applications may be complex comprehensive systems, involving numerous computer programs and organizational units, such as those associated with benefit payment systems. For the purposes of this document, application controls encompass both the routines contained within the computer program code, and the policies and procedures associated with user activities, such as manual measures performed by the user to determine that data were processed accurately by the computer.

Application controls help make certain that transactions are valid, properly authorized, and completely and accurately processed by the computer. They are commonly categorized into three phases of a processing cycle: input, processing, and output.

Some guides provide additional categories of application controls. For example, data origination is a breakout of input controls to focus on source documents and their need for authorization and proper preparation and control. Also, data storage and retrieval focuses on access to and use of data files and protecting their integrity.

Instead of using the phases of a processing cycle, this document uses control categories that better tie in with the Specific Control Evaluation Worksheets (SCE) found in the FISCAM. The SCE is used to document the controls evaluation and is prepared for each significant accounting application. Included on the SCE are columns for recording the control objectives and control techniques being evaluated, including whether the related transactions are authorized, complete, valid, and accurate. The control objectives and techniques addressed in this chapter are consistent with other guidance, but our categorization, tying to the SCE, is the following:


Only authorized transactions should be entered into the application system and processed by the computer. Assessing authorization controls involves evaluating the entity's success in performing each of the following critical elements:

Critical Elements:

Data should be authorized before it is entered into the application system. Federal financial management systems are often characterized as large, complex "legacy" systems and often involve a multitude of documents that flow through various work steps. Paper source documents still play a significant role in originating data that enter application systems in the federal government. These source documents should fall under control measures so that unauthorized transactions are not submitted to and processed by the application. Also, data, whether from a source document or not, should undergo an independent or supervisory review prior to entering the application.

Source documents are controlled and require authorizing signatures.

Control over source documents should begin even before data is recorded on the document. Access restrictions over blank source documents should prevent unauthorized personnel from obtaining a blank source document, recording unauthorized information, and inserting the document in the flow with authorized documents and possibly causing a fraudulent or malicious transaction to occur. Use of pre-numbered source documents could help identify unauthorized documents that fall outside the range of authorized numbers for documents being prepared for data entry.

Key source documents for an application should require an authorizing signature, and the document should provide space for the signature by an authorized official.

For batch application systems - i.e., source documents are processed in batches - the source documents should be collected together and a batch control sheet should be prepared for individual batches. The control sheet should have space for recording the date, a batch control number, the number of documents in the batch, a control total for a key field in the documents, and the identification of the user submitting the batch. Establishing control over batches helps detect unauthorized modifications to a document and prevents unauthorized documents from being entered into the application system. The document counts and control totals also help to determine whether all transactions are completely entered and processed by the computer. The following sections are also important to ensuring all transactions are authorized, particularly when the application system is designed such that transactions are entered individually instead of in batches.
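To make the batch-control idea concrete, the following is a minimal Python sketch of verifying a batch against its control sheet. All field names here are hypothetical illustrations, not drawn from SENTRY or the FISCAM:

```python
# Illustrative sketch of batch control verification; field names are hypothetical.

def verify_batch(control_sheet, documents):
    """Compare a batch control sheet against the documents actually entered.

    control_sheet: dict holding the document count and a control total over a
    key amount field, as recorded by the submitting user.
    documents: list of dicts, one per source document entered.
    Returns a list of discrepancies (empty if the batch balances).
    """
    problems = []
    if len(documents) != control_sheet["document_count"]:
        problems.append(
            f"document count {len(documents)} != control sheet "
            f"{control_sheet['document_count']}"
        )
    entered_total = sum(d["amount"] for d in documents)
    if entered_total != control_sheet["control_total"]:
        problems.append(
            f"control total {entered_total} != control sheet "
            f"{control_sheet['control_total']}"
        )
    return problems

# A balanced batch produces no discrepancies.
sheet = {"batch_number": 101, "document_count": 2, "control_total": 350}
docs = [{"doc_id": 1, "amount": 200}, {"doc_id": 2, "amount": 150}]
print(verify_batch(sheet, docs))  # -> []
```

A missing or altered document changes the count or total, so the batch is flagged before any further processing.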

Supervisory or independent reviews of data occur before entering the application system.

Providing supervisory or independent review of data before entering the application system helps prevent the occurrence of unauthorized transactions. A data control unit is effective for this purpose and this function has evolved as technology has advanced. With earlier systems, source documents were batched in the user department and sent to a data control unit that was organizationally under the information systems department. This unit monitored data entry and processing of the documents, seeing that all batches were received, entered, and processed completely. In addition, personnel in this unit verified that each source document was properly prepared and authorized before the data on the document was entered into the system.

This function has migrated to the user department as it gained access to application systems through computer terminals. Several or more personnel in the user department may now enter source documents into a transaction file that is not released for processing until a supervisory or independent review occurs. A user department control unit may have the responsibility to see that entered transactions are supported by a source document that contains a valid authorizing signature. Also, supervisors in the user department may hold this responsibility. These application systems may have a separate authorization screen accessed by computer terminal by control unit or supervisory personnel. After verifying the input transactions, the control unit or supervisory personnel enter the required authorization and release the data for further processing.

The integrity of application data can be compromised by unauthorized personnel who have unrestricted access to data entry terminals, as well as by authorized users who are not restricted in what transactions they can enter. Without limits, unauthorized personnel and authorized users could enter fraudulent or malicious transactions. To counter this risk, both physical and logical controls are needed to restrict data entry terminals to authorized users for authorized purposes. This section provides an overview of controls relevant to restricting data entry terminals and limiting users in what transactions they can enter. Any work done in this section should be done in conjunction with the other two sections.

Data entry terminals are secured and restricted to authorized users.

Data entry terminals should be located in physically secure rooms. When terminals are not in use, these rooms should be locked, or the terminals themselves should be capable of being secured to prevent unauthorized use. Supervisors should sign on to each terminal device, or authorize terminal usage from a program file server, before an operator can sign on to begin work for the day. Each operator should be required to use a unique password and identification code before being granted access to the system.

Data entry terminals should be connected to the system only during specified periods of the day, corresponding with the business hours of the data entry personnel. Each terminal should automatically disconnect from the system when not used after a specified period of time.

Where dial-up access is used to connect terminals to the system, connection should not be completed until the system calls back to the terminal. These terminals should generate a unique identifier code for computer verification. Such procedures help limit access to known, authorized terminals.

On-line access logs should be maintained by the system, such as through the use of security software, and should be reviewed regularly for unauthorized access attempts. All transactions should be logged as they are entered, along with the terminal ID that was used, and the ID of the person entering the data. This builds an audit trail and helps hold personnel accountable for the data they enter.

Users are limited in what transactions they can enter.

It is not enough to restrict access to data entry terminals to authorized users, as these users may still enter unauthorized transactions if they are not limited in what transactions they can enter. Limits can be accomplished through authorization profiles. One authorization profile level can be placed over the terminal so that only specified transactions can be entered from a given terminal. For example, a terminal in a payroll office may be granted authorization so that payroll information, such as employee time and attendance and pay withholdings, could be entered from that terminal. However, to effect a separation of duties, this terminal could be denied authorization to enter personnel actions, such as hirings that would create a new employee pay record, or promotions. These latter transactions are normally restricted to a personnel or human resources office.
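A terminal-level authorization profile amounts to a lookup of permitted transaction types. The sketch below is a hypothetical illustration; the terminal identifiers and transaction codes are invented:

```python
# Hypothetical sketch of terminal authorization profiles; the terminal IDs
# and transaction codes are invented for illustration only.

TERMINAL_PROFILES = {
    "PAYROLL-01": {"TIME_ATTENDANCE", "PAY_WITHHOLDING"},
    "HR-01": {"NEW_HIRE", "PROMOTION"},
}

def transaction_allowed(terminal_id, transaction_type):
    """Permit a transaction only if the terminal's profile lists it."""
    return transaction_type in TERMINAL_PROFILES.get(terminal_id, set())

# The payroll terminal may enter time and attendance data...
print(transaction_allowed("PAYROLL-01", "TIME_ATTENDANCE"))  # True
# ...but not personnel actions, preserving a separation of duties.
print(transaction_allowed("PAYROLL-01", "NEW_HIRE"))         # False
```

An unknown terminal maps to an empty profile, so its transactions are rejected by default.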

Authorization profiles can also be established for user personnel. For example, personnel authorized to initiate payment transactions in an accounts payable system can be denied authorization for initiating transactions that would add or change a record on the authorized vendor master file. If one employee had the capability to initiate both types of transactions, the employee could potentially cause a fraudulent transaction by creating a vendor master record and initiating a payment that would be sent to the specified address or bank account controlled by the employee.

Before the auditor can rely on authorization profiles to reduce the audit risk, the auditor must determine the adequacy of the general controls over the profiles. That is, if the general controls are not effective in preventing unauthorized changes to the data matrix or table that constitutes the profile, the auditor should not rely upon this control.

An effectively controlled application system will also have authorization type controls to monitor data as it is processed. Two such controls include the use of master files and exception reporting that help determine the validity of transactions. These controls require computer programs to perform the validity checks and involve a process commonly referred to as data validation and editing. Many of the programmed checks in this process also concern the validity and accuracy of data fields in a transaction record, including whether a data field has a valid code, such as a pay withholding code used in a payroll application system. This section focuses on checks to determine the validity of a transaction. A more detailed discussion of data validation and editing, focusing on checks to determine the validity and accuracy of data fields, appears later in this appendix.

Master files help identify unauthorized transactions.

A master file is a computer file that contains account and/or reference information that is integral to application systems, such as a payroll master file containing authorized employees and pay data. Master files and their approved records can help identify unauthorized transactions. For example, an accounts payable system should have a master file of approved vendors. As payment transactions are processed, they would be compared with this file, and any payment for a vendor not on the file would be rejected and investigated by supervisory personnel, or by personnel specifically assigned this responsibility who do not also have responsibility for initiating vendor payments. Using this process, there is greater assurance that all transactions not rejected are authorized and valid payments.
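The vendor master file check reduces to a membership test against the approved list. The following Python sketch uses hypothetical vendor identifiers for illustration:

```python
# Illustrative sketch: screening payment transactions against an approved
# vendor master file. Vendor IDs and record layouts are hypothetical.

approved_vendors = {"V100", "V200", "V300"}

def screen_payments(payments):
    """Split payments into accepted and rejected-for-investigation lists."""
    accepted, rejected = [], []
    for p in payments:
        if p["vendor_id"] in approved_vendors:
            accepted.append(p)
        else:
            rejected.append(p)  # routed to personnel for investigation
    return accepted, rejected

payments = [
    {"vendor_id": "V100", "amount": 500},
    {"vendor_id": "V999", "amount": 250},  # not on the master file
]
accepted, rejected = screen_payments(payments)
```

Rejected items would be reported to personnel independent of payment initiation, consistent with the separation of duties described above.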

Exceptions are reported to management for their review and approval.

An exception report lists items requiring review and approval. These items may be valid, but exceed parameters established by management. Implementation of this control may vary, such that one system may print checks and have them routed to management to be released after their approval, and another system may hold the transaction in a suspense account until management enters an authorizing indicator, thus triggering the disbursement.

Before the auditor can rely on these controls to reduce the audit risk, the auditor must, as in the previous section, determine the adequacy of the general controls over these controls. That is, these controls would be rendered ineffective if the general controls do not prevent unauthorized changes to the master files and exception criteria, and to the program code responsible for performing the file and criteria comparisons with transaction data.


All authorized transactions should be entered into and completely processed by the computer. Assessing the controls over completeness involves evaluating the entity's success in performing each of the critical elements listed below.

Critical Elements:

A control for completeness is one of the most basic application controls, but is essential to ensure that all transactions are processed, and missing or duplicate transactions are identified. The most commonly encountered controls for completeness include the use of record counts and control totals, computer sequence checking, computer matching of transaction data with data in a master or suspense file, and checking of reports for transaction data.

Record counts and control totals.

In general, user-prepared totals established over source documents and data to be entered can be carried into and through processing. The computer can generate similar totals, track the data from one processing stage to the next, and verify that the data was entered and processed as it should have been. For example, a file of valid transactions (i.e., transactions that pass data validation and editing) can contain a control record showing the record count and control totals for the file. As the file is processed through a job (or job step), the computer can calculate a record count and control totals for the transactions processed. The computer-calculated amounts are compared with the amounts in the control record. Agreement in the amounts provides evidence that the processing was done accurately and completely. Disagreement indicates that a problem has occurred and needs to be investigated and rectified. On-line or real-time systems, where transactions are not entered as a batch, can still utilize this technique by establishing record counts and control totals over transactions entered during a specific time period, such as daily.
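The control-record comparison described above can be sketched in a few lines of Python. The record layout is hypothetical:

```python
# Sketch of run-to-run control totals: a control record written with a file
# is reconciled against totals recomputed at the next processing step.
# The "amount" field is a hypothetical key field for the control total.

def make_control_record(records):
    """Compute a record count and control total over a file's records."""
    return {"record_count": len(records),
            "control_total": sum(r["amount"] for r in records)}

def reconcile(control_record, records):
    """Return True when recomputed totals agree with the control record."""
    return make_control_record(records) == control_record

transactions = [{"amount": 100}, {"amount": 250}, {"amount": 75}]
control = make_control_record(transactions)   # written when the file is created

print(reconcile(control, transactions))       # True: processed completely
print(reconcile(control, transactions[:-1]))  # False: a record is missing
```

Any dropped, duplicated, or altered record changes the recomputed count or total, so the disagreement surfaces before the next processing step.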

Computer sequence checking.

This control begins by providing each transaction with a unique sequential number. Some transactions originate on source documents with preassigned serial numbers. This number should be entered into the computer along with the other data on the transaction. The computer can identify numbers missing from the sequence and provide a report of missing numbers. The missing numbers should be investigated to determine whether they are numbers for voided source documents, or are valid documents that may have been lost or misplaced.

For transactions not on source documents with preassigned serial numbers, the computer can assign a unique sequential number as the data is entered. At a later point in processing, such as when transaction data updates a master file, the computer can verify that all numbers are accounted for. Again, missing numbers are reported for investigation.

Sequence checking is also valuable in identifying duplicate transactions. For example, two transactions with the same preassigned serial number for a source document would indicate that the transaction had been erroneously entered a second time. As another example, a file of sequential numbers for purchase orders could help prevent paying for the purchase more than once. After the purchased goods and vendor's bill are received, a payment transaction with the purchase order number would be matched with the file containing all purchase order numbers, and an indicator for the payment would be recorded on the file for that purchase. The payment indicator would cause following payment transactions for the same purchase order to be rejected and reported for investigation.
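Both uses of sequence checking, finding gaps and finding duplicates, can be illustrated with a short Python sketch; the serial numbers are hypothetical:

```python
# Illustrative sequence check: report missing and duplicate serial numbers
# from a run of preassigned (or computer-assigned) sequential numbers.

from collections import Counter

def sequence_exceptions(serial_numbers):
    """Given the serial numbers entered, report gaps and duplicates."""
    counts = Counter(serial_numbers)
    full_range = range(min(serial_numbers), max(serial_numbers) + 1)
    missing = [n for n in full_range if n not in counts]
    duplicates = sorted(n for n, c in counts.items() if c > 1)
    return missing, duplicates

entered = [1001, 1002, 1002, 1004, 1005]
missing, duplicates = sequence_exceptions(entered)
print(missing)     # [1003] -- a voided, lost, or misplaced document
print(duplicates)  # [1002] -- a transaction entered a second time in error
```

Each reported number would be investigated to determine whether it represents a voided document, a lost document, or an erroneous second entry.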

Computer matching of transaction data.

This control involves matching transaction data with data in a master or suspense file. Unmatched items from both the transaction data and master or suspense file are reported for investigation. For example, a payroll system may be designed so that each employee's time and attendance sheet is matched to the employee's master pay record. Each time sheet that does not match with a master pay record is reported to determine whether it represents a valid employee and the master pay file needs to be updated. Each master pay record that does not receive a match is reported to determine whether a valid employee exists and a time sheet must be found or created so that the employee will receive pay on time. Also, master pay records with more than one time sheet are reported, which indicates a duplicate time sheet exists for one employee.

As another example, before initiating a payment, a vendor's invoice could be matched with a file containing records detailing goods received. Invoices not matched could be reported to show goods not received, and no invoices would be paid until a match occurred.
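The payroll matching example above can be sketched as follows. The record layouts and employee identifiers are hypothetical:

```python
# Sketch of computer matching: time sheets against a master pay file.
# Employee IDs and record layouts are hypothetical.

def match_time_sheets(time_sheets, master_pay_ids):
    """Report unmatched time sheets, unmatched master records, and duplicates."""
    sheet_ids = [t["employee_id"] for t in time_sheets]
    unmatched_sheets = [i for i in sheet_ids if i not in master_pay_ids]
    unmatched_master = sorted(i for i in master_pay_ids if i not in sheet_ids)
    duplicates = sorted(i for i in set(sheet_ids) if sheet_ids.count(i) > 1)
    return unmatched_sheets, unmatched_master, duplicates

sheets = [{"employee_id": "E1"}, {"employee_id": "E2"}, {"employee_id": "E2"}]
master = {"E1", "E2", "E3"}

unmatched_sheets, unmatched_master, duplicates = match_time_sheets(sheets, master)
print(unmatched_sheets)  # []     -- every sheet found a pay record
print(unmatched_master)  # ['E3'] -- an employee with no time sheet
print(duplicates)        # ['E2'] -- a duplicate time sheet to investigate
```

All three exception lists would be reported for investigation, as described in the narrative above.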

Checking reports for transaction data.

This activity involves checking each individual transaction with a detailed listing of items processed by the computer to verify that the transaction submitted was indeed processed. While an effective method, it is time-consuming and costly. Therefore, it is normally used with low-volume but high-value transactions, such as updating master files.

Reconciliations show the completeness of data processed at points in the processing cycle.

An application system is a collection or group of individual computer programs that relate to a common function. As data is entered into and processed through these programs, reconciliations of record counts and control totals at various points help make certain that all the data was processed completely for the programs relative to the reconciliation. For example, control over a batch (a collection) of source documents may require a user to establish a record count and control total over the batch and record the amounts on a batch control sheet. The control information on the batch control sheet would be entered into the computer along with the information on each source document. The computer would compute a similar record count and control total for the batch as the data is entered. For the reconciliation, the computer would compare the computed amounts with the entered amounts from the batch control sheet. Agreement in the amounts indicates all data was completely entered. A disagreement may indicate some data is missing, an amount was entered incorrectly, or the batch control information was calculated or entered incorrectly. Batches with disagreements are commonly referred to as a "batch-out-of-balance." These should not undergo further processing until the disagreements are investigated and resolved. The record counts and control totals for batches in agreement are usable for reconciliations during later processing, as discussed below.

For applications where transactions are entered individually as they occur, this concept is still of use, as a record count and control total could be established over transactions entered during a specific time period, such as daily. Files should contain record count and control total information so that the computer can verify processing completeness as it progresses. Computer tape files would contain this information in a "trailer label" record that exists at the end of all data records on the tape. A disk file would contain this information in a control record. A program creating the file calculates and records the control information on the file. As a subsequent program processes the file, the computer calculates similar information and reconciles what it calculated with what was recorded on the file. Agreement in the amounts indicates all data was completely processed. This control information is commonly referred to as "run-to-run control totals."

As systems have become more integrated over the years, a file produced by one application may be used in another application. It is important to reconcile control information between the sending and receiving applications.

Performing the comparison of control numbers is commonly referred to as balancing, and should be done automatically by the computer, although some older systems may rely on manual balancing procedures. The control numbers for the balancing at key points should be documented, such as being printed on a control totals balance report, and should be reviewed by the data processing control group that monitors the completeness and accuracy of processing.

Reconciliations show the completeness of data processed for the total cycle.

Reconciliations should occur periodically that verify the completeness of data processed for a given cycle, such as daily, weekly, or relative to the processing cycle - for example, monthly for an accounts payable system. A control register is an effective tool to use in this process. Such reconciliations monitor the completeness of transactions processed, master files updated, and outputs generated, such as cash disbursements.

To illustrate with updating a master file, control information for this file should be recorded in the control register at the start of the cycle. Control information for the transactions entered that will update the master file should be reconciled with the control information over both accepted and rejected transactions. Control information for the accepted transactions that update the master file should be entered in the control register and added to the control information for the beginning master file. Control information for the updated master file should then be reconciled to the control register and should equal the sum of the beginning master file and accepted transactions. Another example illustrates reconciliation over disbursements for an accounts payable system. A vendor master file may contain a data field to record month-to-date payments. A total of all the vendors' month-to-date payments in the master file should be reconciled with, and should equal, the total for all the checks written during the month to those vendors.
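The master-file portion of the cycle reconciliation reduces to a simple balance equation. The figures in this sketch are hypothetical:

```python
# Illustrative sketch of a control register reconciliation for one master
# file update cycle. All figures are hypothetical.

def master_file_in_balance(beginning_total, accepted_txn_total, updated_total):
    """The updated master file total should equal the beginning master file
    total plus the control total of the accepted transactions."""
    return updated_total == beginning_total + accepted_txn_total

# Control register entries for one update cycle.
beginning_master_total = 10_000   # recorded at the start of the cycle
accepted_txn_total = 1_500        # accepted (not rejected) transactions
updated_master_total = 11_500     # recomputed from the updated master file

print(master_file_in_balance(beginning_master_total,
                             accepted_txn_total,
                             updated_master_total))  # True: cycle balances
```

A False result would indicate lost or extraneous updates and would be investigated before the next cycle begins.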


The recording of valid and accurate data into an application system is essential to provide for an effective system that produces reliable results. Assessing the controls for valid and accurate data involves determining the entity's success in achieving each of the critical elements listed below.

Critical Elements:

Well-designed data entry processes can contribute to the entry of accurate and valid data. On the other hand, inadequacies in this area can contribute to data entry errors. The focus here includes source document design, preformatted computer terminal data entry screens, key verification, and the use of automated entry devices.

Source documents are designed to minimize errors.

Special purpose forms should be used that help the preparer to initially record data correctly and in a uniform format. This also facilitates the entry of data at a later stage. For example, rather than just providing a blank ("_________") for a social security number, a well-designed form would include the following to record the number: "___-__-____." For each type of transaction, the source document should provide a unique code or identifier, which should be preprinted on the document for data entry if it supports only one transaction type. The application computer programs use the transaction type for selecting the processing to be performed on the transaction. When several or more codes are options for identifying a data field's purpose, such as a payroll withholding, the options should be preprinted on the source document. A short list of options could appear under or near the data field, and a longer list could appear on the back of the document.

Preformatted computer terminal screens guide data entry.

Using preformatted computer terminal screens for data entry helps increase data accuracy at the point of entry. The computer screen (and the associated program code) prompts the terminal operator for data by field. Programmed routines allow the data to be checked or edited as it is keyed. After the data has been entered and passes the programmed edits, the computer screen prompt moves to the next data field indicating to the terminal operator the next data to be entered.

Key verification increases the accuracy of significant data fields.

For paper intensive source document environments found in large government transaction operations, key verification is a common technique still used to increase the accuracy of significant data fields. For this technique, after initial entry of transaction data, a separate individual reads the same source document and keys data into a machine that checks the results of keystrokes with what was originally keyed. Data that is keyed differently is reviewed to determine the correct data. As an example, the Internal Revenue Service (IRS) uses key verification to ensure that certain data from tax returns have been entered correctly. This technique's effectiveness is reduced if the original data entry person is also the one performing the key verification, or if the key verifier is located next to or in the proximity of the original data entry person, thereby negating a separation of duties in performing this function.

Automated entry devices increase data accuracy.

The use of automated entry devices (e.g., optical or magnetic ink character readers) can reduce data error rates, as well as speed the entry process. The IRS's use of preprinted labels, showing the taxpayer's name, address, and social security number is such an example. This information can be entered without keying the data, which ensures a more accurate and faster process.

A crucial control activity involves identifying erroneous data at the point it enters the application system, or at some later point during the processing cycle. This is accomplished in a process that is commonly called data validation and editing. Programmed validation and edit checks are key to this process, and are generally performed on transaction data entering the system, as well as data prior to updating master files, and data resulting from processing.

Programmed validation and edit checks identify erroneous data.

Programmed validation and edit checks are, for the most part, the most critical and comprehensive set of controls in assuring that the initial recording of data into the system is accurate. These controls are built as early as possible in the input process, and provide extensive coverage over as many data fields as a user feels a need to control. This approach is used extensively in both batch and on-line environments.

Programmed validation and edit checks can effectively start as the data are being keyed in at a computer terminal using preformatted computer screens. For example, an alphabetic character entered for a numeric field can be rejected as it is keyed. Also, data involving quantities or values can be checked to ensure they fall within reasonable predetermined limits, or within the range of a set of numbers. Further, key fields, such as a loan account number, or parts number in an inventory system, could employ a check digit to help validate that the number is being entered correctly. The check digit is an additional number contained in the key field, which is determined by a formula from the other numbers of the key field. The computer recalculates the check digit using the formula with the numbers entered and compares the calculation with the check digit entered. Agreement between the check digit entered and the recalculated check digit provides support that all the numbers were entered correctly with no transposition errors.
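The check-digit calculation described above can be sketched with a simple weighted modulus-10 formula. This particular formula is one of many possible schemes, chosen here only for illustration; real systems would use whatever formula their application prescribes:

```python
# Illustrative check-digit routine using a simple weighted modulus-10
# scheme. The formula is an invented example, not a prescribed standard.

def compute_check_digit(digits):
    """Weight each digit by its position, sum the products, take modulus 10."""
    weighted = sum((i + 2) * d for i, d in enumerate(digits))
    return weighted % 10

def valid_key_field(key_field):
    """The last digit of the key field is the check digit over the rest."""
    *body, check = [int(c) for c in key_field]
    return compute_check_digit(body) == check

account = "12345"
keyed = account + str(compute_check_digit([1, 2, 3, 4, 5]))  # "123450"

print(valid_key_field(keyed))     # True: all digits entered correctly
print(valid_key_field("213450"))  # False: a transposition is detected
```

Because adjacent positions carry different weights, transposing two neighboring digits changes the weighted sum, so the recalculated check digit no longer agrees with the one entered.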

Programmed validation and edit checks may also occur after data has entered the application. For example, transaction data may enter the processing cycle from another application and should be subjected to these checks. This should occur before updating master files, and should be performed early in the data flow to reduce the processing associated with incorrect data. Some of these later checks may focus on determining the validity of a transaction data field. For example, a benefit payment system may compare the transaction's disability type code to a table of valid codes. Other checks may focus on determining the validity of the transaction itself, such as comparing vendor invoices with an approved vendor file, and with a file on purchase orders and goods received.

These checks also help provide that data recorded in key fields on master files are accurate and valid. One check, known as relationship editing, compares data in a transaction record with data in a master record for appropriateness and correctness before updating the master record. As an example, a personnel action to effect a promotion for an employee on a master pay file will first establish a match between the transaction record and pay record based on the employee's social security number. However, before posting the new grade level and salary to the pay record, the computer may ensure that the names in the transaction record and pay records agree, and that the old grade level in the personnel action is the same grade level as the existing grade level in the pay record. Only after agreement with both items will the pay record be updated.
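The relationship-editing example above can be sketched as follows; the record layout and field names are hypothetical:

```python
# Sketch of relationship editing before posting a promotion to a master
# pay record. Field names and values are hypothetical.

def post_promotion(transaction, pay_record):
    """Update the pay record only if the name and old grade level agree."""
    if transaction["name"] != pay_record["name"]:
        return False, "name mismatch"
    if transaction["old_grade"] != pay_record["grade"]:
        return False, "grade mismatch"
    pay_record["grade"] = transaction["new_grade"]
    pay_record["salary"] = transaction["new_salary"]
    return True, "posted"

pay_record = {"ssn": "000-00-0001", "name": "DOE, JANE",
              "grade": 11, "salary": 60000}
action = {"ssn": "000-00-0001", "name": "DOE, JANE",
          "old_grade": 11, "new_grade": 12, "new_salary": 68000}

print(post_promotion(action, pay_record))  # (True, 'posted')
```

A transaction that matched on social security number but disagreed on name or old grade would be rejected for investigation rather than posted.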

The total transaction should undergo data validation and editing, and all fields in error should be identified before the transaction is rejected from further processing.

Tests are made of critical calculations.

Data resulting from processing routines, such as critical calculations, should also be tested to ensure the results are valid. For example, limits and reasonableness checks would help identify erroneous results before they cause some negative impact. Unusual items could be held and reported for management review and approval. Through such means, disbursements exceeding a certain amount could be routed for a manager's review and approval prior to release of the disbursement.

Before the auditor can rely on the entity's data validation and editing checks, discussed in this and the previous sections, to reduce the audit risk, the auditor must determine the adequacy of the general controls over these checks. To be effective, the general controls should protect the program code and any related tables associated with the validation and edit routines from unauthorized changes.

Overriding or bypassing data validation and editing is restricted.

Many systems allow data validation and edit routines to be bypassed, which could allow the system to accept and process erroneous data. Using the bypass capability (sometimes referred to as an override) should be very limited and closely controlled and monitored by supervisory personnel. For example, each override should be automatically logged and reviewed by supervisors for appropriateness and correctness.

Transactions detected with errors need to be controlled to ensure that they are corrected and reentered in a timely manner. During data entry, particularly with more modern systems, an error can be identified and corrected at the data entry terminal. With errors identified later in the data processing cycle, however, the connection to the data entry terminal has generally been broken, so errors cannot be communicated in real time back to the personnel entering the data for immediate correction. An automated error suspense file is an essential element in controlling these data errors, and the errors need to be effectively reported back to the user department for investigation and correction.

Rejected transactions are controlled with an automated error suspense file.

Rejected transactions should be controlled using an automated error suspense file. Transactions entered into this file should be annotated with:

Record counts and control totals should be developed automatically as erroneous transactions are processed to the suspense file, and used to reconcile against the transactions successfully processed. A control group should be responsible for controlling and monitoring the rejected transactions.

The suspense file should be purged of the related erroneous transaction as the correction is made. Record counts and control totals for the suspense file should be adjusted accordingly. Periodically, the suspense file should be analyzed to determine the extent and type of transaction errors being made, and the age of uncorrected transactions. This analysis may indicate a need for a system change or some specific training to reduce future data errors.
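The suspense-file mechanics described above (annotated entries, record counts and control totals, purging on correction, and aging analysis) can be sketched as a small class; the names and fields are illustrative, not the actual file layout:

```python
import datetime

class SuspenseFile:
    """Minimal sketch of an automated error suspense file: rejected
    transactions enter with error codes and a date, are purged when
    corrected, and can be aged for periodic analysis."""

    def __init__(self):
        self.records = {}

    def add(self, txn_id, txn, error_codes):
        """Enter a rejected transaction, annotated with its errors."""
        self.records[txn_id] = {
            "txn": txn,
            "errors": error_codes,
            "entered": datetime.date.today(),
        }

    def purge(self, txn_id):
        """Remove a transaction once its correction is reentered."""
        self.records.pop(txn_id, None)

    def record_count(self):
        return len(self.records)

    def control_total(self, field):
        """Control total over a numeric field of the suspended transactions."""
        return sum(r["txn"][field] for r in self.records.values())

    def aging(self, as_of=None):
        """Days each uncorrected transaction has sat on the file."""
        as_of = as_of or datetime.date.today()
        return {tid: (as_of - r["entered"]).days
                for tid, r in self.records.items()}
```

Record counts and control totals from the file can then be reconciled against the totals for successfully processed transactions, and the aging figures support the periodic analysis of uncorrected items.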

General controls should protect the suspense file from unauthorized access and modification, in order for the auditor to be able to rely on this control technique to reduce audit risk.

Erroneous data are reported back to the user department for investigation and correction.

Systems that allow user groups to enter data at a computer terminal often edit data as it is entered, and generally allow immediate correction of errors as they are identified. Error messages should clearly indicate what the error is and what corrective action is necessary. Errors identified at a later point in processing should be reported to the user originating the transaction for correction.

Some systems may use error reports to communicate to the user department the rejected transactions in need of correction. More modern systems give user departments access to a file containing the erroneous transactions. Using a computer terminal, users can initiate corrective actions. Again, error messages should clearly indicate what the error is and what corrective action is necessary. The user responsible for originating the transaction should be responsible for correcting the error. All corrections should be reviewed and approved by supervisors before being reentered into the system, or released for processing if corrected from a computer terminal.

Output can be in several forms, including printed reports, data accessible on-line by users, and computer files that will be used in a later processing cycle, or by other programs in the application. Output should be reviewed and control information should be reconciled to determine whether errors occurred during processing. Various reports are typically produced by application systems that, if reviewed, help maintain the data's accuracy and validity. Production and distribution of these reports need to be controlled, and to be effective, they need to be reviewed by the user.

Control output production and distribution.

Someone should be assigned responsibility for seeing that all outputs are produced and distributed in accordance with the requirements and design of the application system. In larger organizations with mainframe computer environments, this responsibility is typically assigned to a data control group within the information systems department. This group, or some alternative, should maintain a schedule by application that shows the output products produced, when they should be completed, who the recipients are, how many copies are needed, and when they are to be distributed. The group should review output products for general acceptability and reconcile control information to determine the completeness of processing.

Printed reports should contain proper identification, including a title page with the report name, time and date of production, and the processing period covered by the report. Reports should also have an "end-of-report" message to positively indicate the end of a report. Without this type of message, a report with pages missing at the end may go undetected.

Controls and procedures are needed to ensure the proper distribution of output to authorized users. Without control over distribution, users may not receive needed output in a timely manner, and unauthorized persons may gain access to output containing privacy or sensitive information. Each output should be logged, manually if not done automatically, along with the recipients of the output, including outputs that are transmitted to a user's terminal device. For these transmissions, the computer system should automatically check the output message before displaying, writing, or printing it to make sure the output has not reached the wrong terminal device. In the user department, outputs transmitted should be summarized daily and printed for each terminal device, and reviewed by supervisors.
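The identification and end-of-report checks described above can be illustrated with a short sketch; the marker text is a hypothetical convention, not one prescribed by FISCAM:

```python
END_OF_REPORT = "*** END OF REPORT ***"  # assumed end-of-report marker

def report_is_complete(lines):
    """Quality check on a printed report rendered as a list of lines:
    it must open with identifying title information and close with the
    end-of-report message, so missing trailing pages are detected."""
    if not lines:
        return False
    has_title = lines[0].strip() != ""
    has_end_marker = lines[-1].strip() == END_OF_REPORT
    return has_title and has_end_marker
```

A data control group could apply such a check before releasing output, and apply it again to any rerun produced to correct an output error.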

Occasionally, errors may be identified in output products requiring corrective action, including possibly rerunning application programs to produce the correct product. A control log of output product errors should be maintained, including the corrective actions taken. Output from reruns should be subjected to the same quality review as the original output.

Reports showing the results of processing are reviewed by users.

The user department has ultimate responsibility for maintaining data quality, and should review output reports for data accuracy, validity, and completeness. Some typical reports that are commonly produced for review by users include the following:

A control totals balance report lists the control fields and the totals calculated by the computer to show the results of processing. If similar figures were predetermined and entered with the data submitted for processing, the report will also identify agreements and variances.
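A control totals balance report of this sort might be produced as in the following sketch, where the field names and the variance calculation are illustrative assumptions:

```python
def balance_report(control_fields, computed, predetermined=None):
    """Build a control totals balance report: for each control field,
    list the total calculated by the computer, and when a predetermined
    total was submitted with the data, show the variance against it."""
    report = []
    for field in control_fields:
        row = {"field": field, "computed": computed[field]}
        if predetermined and field in predetermined:
            row["predetermined"] = predetermined[field]
            row["variance"] = computed[field] - predetermined[field]
        report.append(row)
    return report
```

A zero variance indicates the computed and predetermined totals agree; a nonzero variance flags the control field for user follow-up.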


Example of items to cover: