Duplicates

The Duplicates analytic test identifies duplicate or potentially fraudulent transactions or records within a company's financial statements or accounting records. It can be used to uncover errors, anomalies, and fraudulent activities such as the double counting of revenue, expenses, or assets.

This analytic test can be useful to identify:

  • Possible attempts to manipulate the financial statements

  • Financial reporting areas that have lower accuracy or precision

  • Fictitious transactions

  • Systemic issues in how a department processes transactions

  • Improperly configured system settings that may need further evaluation and correction

  • Failure of the data validation process

  • Weak internal controls

  • Weak monitoring and detection activities

  • Poor training

  • A risk of material misstatement due to fraud

  • Transactions accidentally posted in duplicate

  • Redundant processes or lack of coordination among departments

  • Problems with reconciliations

Fields used for analysis

The following fields are required for this analysis:

  • Reference field(s) - Unique field(s) used to create a unique transaction ID, such as the Entry ID field in the general ledger dataset. These columns do not appear in the results themselves but are used to identify the transactions included in the results. This field is predefined in the test and cannot be modified.

  • Duplicate field(s) - One or more fields that will be compared for duplicates.

  • Different field(s) - One or more fields that must differ between the duplicate entries. These fields are optional and are only used for analytics where you are looking for duplicates with one field different. For example, if you are looking for the same payment made on different dates, the payment field would be the duplicate field and the date field would be the different field. A sketch of both cases follows this list.
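To make the two cases concrete, here is a minimal pandas sketch. The sample data and its column names (Entry ID, Amount, Entered Date) are illustrative assumptions rather than the product's actual data model, and the reading of "different" as a group-level check (a duplicate group qualifies when the different field takes more than one value) is also an assumption.

    import pandas as pd

    # Hypothetical journal-entry data; the column names are illustrative only.
    df = pd.DataFrame({
        "Entry ID":     ["JE-001", "JE-002", "JE-003", "JE-004"],
        "Amount":       [500.00, 500.00, 120.00, 500.00],
        "Entered Date": ["2024-01-05", "2024-01-09", "2024-01-05", "2024-01-05"],
    })

    # Duplicate field only: every row that shares its Amount with another row.
    dupes = df[df.duplicated(subset=["Amount"], keep=False)]

    # Duplicate field plus a different field: same Amount, but the matching
    # rows span more than one Entered Date.
    dates_per_group = dupes.groupby("Amount")["Entered Date"].transform("nunique")
    same_amount_diff_date = dupes[dates_per_group > 1]

Here dupes contains JE-001, JE-002, and JE-004 (three entries of 500.00), and same_amount_diff_date keeps that group because it spans two entered dates.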

Parameters

There are no user-set parameters for this test.

Test configurations

Note: The configurations available to you vary depending on the product you’re using.

The following configurations are available for this test (a sketch after the list shows how a configuration corresponds to field selections):

  • Duplicate Entries - By Account ID, Identifier Type, Entry Number, Entered Date, Entered By and Amount

  • Duplicate Entries - By Originating Document Type and Originating Document Number

  • Duplicate Entries - By Document Type and Document Number

  • Duplicate Entries - By different Entered By User

  • Duplicate Amount by Identifier

  • Duplicate Entered Date and Amount by Document Type

  • Duplicate Entries - By Description and Amount

  • Different JEs: same Account Number, Amount and Entry Date

  • Duplicate Entries - By Account Number, Amount, Document Number and Entry Date

  • Different Entry Date: Same Account Number, Amount, Document Number

  • Duplicate Invoice Numbers

  • Duplicate Supplier Invoice Dates

  • Duplicate Cheque Numbers

  • Duplicate Supplier Invoice Amounts

  • Duplicate Transaction

  • Duplicate Transaction with different invoice date

  • Duplicate Transaction different entered date
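Each configuration is essentially a preset choice of duplicate field(s) and, where applicable, different field(s), as described under Fields used for analysis. The mapping below is a hypothetical illustration of that idea only; the actual field lists behind each configuration are defined inside the product.

    # Hypothetical illustration; the real field lists are product-defined.
    CONFIGURATIONS = {
        "Duplicate Invoice Numbers": {
            "duplicate_fields": ["Invoice Number"],
            "different_fields": [],
        },
        "Duplicate Entries - By different Entered By User": {
            "duplicate_fields": ["Account Number", "Amount"],  # assumed
            "different_fields": ["Entered By"],
        },
    }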

Technical specifications

When you run the Duplicates analytic test, the following steps are performed (a simplified sketch of the full sequence follows the steps):

  1. If needed, apply filters to the data so that only a subset is used for the analysis. If no filter is applied, the analysis runs on the entire data file. This step can also be performed last instead of first. Note that the ability to set filters is not currently available; it will be added in a later version of the test.

  2. Validate that the necessary reference fields have been selected. If no reference fields have been selected, create a unique reference field. If the required fields are already available, this step is not necessary.

  3. Validate that at least one field has been selected for the duplicate analysis.

  4. Check whether an optional different field has been selected.

  5. Check whether a date range field has been selected.

  6. Core test:

    • Only duplicate fields selected - Search for records that match on every duplicate field and export any matches to the result file.

    • Different field selected - Search for records that match on the duplicate fields but differ on the field selected as different. Export any matches to the result file.

  7. Add an additional column that groups the duplicates together. For example, the first set of duplicates is assigned group 1 in this column and the second set is assigned group 2.
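Taken together, the steps resemble the pandas sketch below. This is a simplified reconstruction under stated assumptions, not the product's implementation: the function name run_duplicates_test, the row_filter callable, and the generated Entry ID fallback are all hypothetical, the different-field logic uses the same group-level interpretation as the earlier sketch, and the date range check from step 5 is omitted for brevity.

    import pandas as pd

    def run_duplicates_test(df, duplicate_fields, different_fields=None,
                            row_filter=None):
        # Step 1 (assumed interface): optionally filter to a subset of the data.
        if row_filter is not None:
            df = df[row_filter(df)]

        # Step 2: fall back to a generated unique reference field when the
        # dataset has no Entry ID column (hypothetical fallback).
        if "Entry ID" not in df.columns:
            df = df.assign(**{"Entry ID": range(1, len(df) + 1)})

        # Step 3: at least one duplicate field must be selected.
        if not duplicate_fields:
            raise ValueError("Select at least one duplicate field.")

        # Step 6, core test: keep rows that match on every duplicate field.
        matches = df[df.duplicated(subset=duplicate_fields, keep=False)]

        # Step 6, different-field variant (step 4 decides whether it applies):
        # keep only groups where each different field has more than one value.
        for field in different_fields or []:
            distinct = matches.groupby(duplicate_fields)[field].transform("nunique")
            matches = matches[distinct > 1]

        # Step 7: add a column numbering each duplicate group 1, 2, 3, ...
        matches = matches.copy()
        matches["Duplicate Group"] = matches.groupby(duplicate_fields).ngroup() + 1
        return matches

For example, run_duplicates_test(df, ["Amount"], ["Entered Date"]) reproduces the "same payment made on different dates" case from Fields used for analysis.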