Looking Forward To Retirement: Managing Duplicate Reporting Under CAT

The Consolidated Audit Trail (CAT), created by the SEC’s Rule 613, is estimated to be the world’s largest repository of options and equities trading data, collecting thousands of terabytes a year from U.S. self-regulatory organizations (SROs) and broker-dealers. 

One question that has been raised is whether any existing reporting systems will be eliminated once the CAT is operational.1

Currently, the industry spends an estimated $1.6 billion2 a year running related regimes such as the Order Audit Trail System (OATS), Electronic Blue Sheets (EBS) and Large Trader Reporting (LTR). The SROs involved in the CAT proposal have been specifically tasked with analyzing existing regimes3 to identify and eliminate systems made duplicative by the implementation of CAT. The Commission has put in place a four-step plan for Participants to undertake a gap analysis of duplicative systems and present their findings to the Commission for review and implementation. Until that plan is complete, firms can expect to satisfy both old and new reporting regimes simultaneously for up to 2.5 years after CAT go-live, at an estimated cost of $55 million4. Regulatory reporting teams may therefore see a significant impact on their initiatives, and firms will need to consider several factors when planning their CAT projects in order to mitigate potential strains on reporting obligations across their organizations.

Quality Is Key

There are a number of drivers that could prompt the retirement of duplicative reporting regimes, but the key condition will be that the CAT contains complete and accurate data that allows regulators to conduct surveillance and investigations of the markets. Firms should ensure that the data reported to the CAT Processor is of high quality and accurately reflects their positions and activities. 

There are a number of factors that can affect data quality, including the management of multiple source systems capturing data in varying standards and formats, poor-quality proprietary and industry-specific reference data, and ill-fitting business rules and logic used to transform data into the format required for reporting. In implementing a new CAT reporting framework, organizations should ensure that the chosen solution provides complete visibility and transparency over the reporting process. Every touch point that could threaten data quality should be easily accessible, reportable and maintainable. Reporting error rates may play a part in defining a benchmark for CAT data quality, so being able to effectively view, analyze and remediate data-quality issues through an efficient exception-management process will support key data-integrity initiatives.  
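As an illustration only, the kind of exception-management check described above might look like the following sketch. The field names, validation rules and error-rate benchmark are hypothetical, not drawn from the actual CAT technical specifications.

```python
# Hypothetical sketch of a pre-submission data-quality check.
# Field names and rules are illustrative only, not the CAT spec.

REQUIRED_FIELDS = {"order_id", "symbol", "event_timestamp", "quantity"}

def validate_record(record: dict) -> list:
    """Return a list of data-quality exceptions for one report record."""
    exceptions = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        exceptions.append("missing fields: %s" % sorted(missing))
    qty = record.get("quantity")
    if qty is not None and qty <= 0:
        exceptions.append("non-positive quantity: %s" % qty)
    return exceptions

def error_rate(records: list) -> float:
    """Share of records with at least one exception -- a simple stand-in
    for the kind of error-rate benchmark regulators might apply."""
    failed = sum(1 for r in records if validate_record(r))
    return failed / len(records) if records else 0.0
```

Routing each failed record, with its exception list, into a remediation queue is what makes the issues "accessible, reportable and maintainable" in practice.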

Streamline Where Possible

Another way to manage duplicative reporting is to streamline the reporting process. Streamlining can cover the team operating the reporting, the governance structure, the compliance oversight, and the technology supporting the obligation. Dual reporting will introduce complexities such as “conflicting reporting requirements, varied correction [rules]…, legal and compliance confusion, costs of maintenance of duplicative reporting systems… .”5

Consolidation is a way to reduce costs, share knowledge across a wider audience, reduce maintenance effort, and introduce consistency in management. It could also improve data quality, which, as noted above, is a driver for retiring duplicative regimes. For example, an exception identified in CAT could very well be an issue in OATS and vice versa, so combining exception-management efforts would result in remediation under both regulations. 

Although the technical specifications for submitting data to the CAT Processor have not been finalized, there is still a question as to whether data will be required in a uniform or non-uniform format. It is unclear whether there will be an opportunity to combine the OATS and CAT data elements into a single dataset for dual reporting, which would support consolidation. There is also a question as to whether any changes will be made to duplicative systems, or whether updates could be made to OATS and other regimes alongside CAT, requiring a dual maintenance effort. Technology can play a big part in streamlining data capture, normalization, validation and reformatting to meet both the OATS and CAT data requirements. It will be up to firms to decide whether this re-modelling is worth the effort, and the final point in this article adds a compelling factor to that discussion.
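The "capture once, format twice" idea behind that streamlining can be sketched as follows. Both output layouts here are invented for illustration; neither reflects the real OATS or CAT specifications, and the point is only that a single normalized internal model can feed both regimes.

```python
# Illustrative only: one normalized internal event, two regime-specific
# output formats. Field mappings are hypothetical, not the OATS/CAT specs.

def normalize(raw: dict) -> dict:
    """Normalize a source-system event into a single internal model."""
    return {
        "order_id": raw["id"],
        "symbol": raw["sym"].upper(),
        "qty": int(raw["qty"]),
        "ts": raw["ts"],  # assumed already captured in UTC
    }

def to_oats(event: dict) -> dict:
    """Format the internal model for a hypothetical OATS-style layout."""
    return {"OrderID": event["order_id"], "Symbol": event["symbol"],
            "Shares": event["qty"], "Timestamp": event["ts"]}

def to_cat(event: dict) -> dict:
    """Format the same internal model for a hypothetical CAT-style layout."""
    return {"orderKey": event["order_id"], "symbol": event["symbol"],
            "quantity": event["qty"], "eventTimestamp": event["ts"]}

event = normalize({"id": "A1", "sym": "xyz", "qty": "200",
                   "ts": "2017-01-03T09:30:00Z"})
oats_record, cat_record = to_oats(event), to_cat(event)
```

Because both records derive from one normalized event, a fix to the capture or normalization logic automatically benefits both submissions, which is the consolidation payoff described above.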

Do Not Neglect Existing Data Issues – Reconciliation Is Worth It

CAT data must meet minimum standards of accuracy and reliability. This requirement has been seen as an incentive for “accurate CAT reporting because it could potentially allow industry members to retire redundant, and costly to maintain, systems sooner”6. But does the attention on CAT mean firms may lose focus on accurate OATS data? 

It is important to note that when a regulation is decommissioned, it does not necessarily mean that analysis of the historically collected data ceases. Firms can still expect to be investigated based on information reported under the old regime and penalized for any identified malpractice. 

Firms should make a concerted effort to ensure data consistency between OATS and CAT, and should consider implementing data reconciliation procedures to avoid discrepancies. The OATS reporting framework should benefit from the investments made in supporting CAT; where firms identify faults in the reporting process, these should be rectified in OATS to avoid submitting inaccurate data under that regime. 
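A reconciliation of this kind could, at its simplest, compare the records submitted under each regime order by order. The sketch below is a minimal illustration under assumed inputs: records keyed by order ID with flat field dictionaries, which is a simplification of either regime's real layout.

```python
# Hypothetical reconciliation between OATS and CAT submissions,
# keyed by order ID. Input shapes and field names are illustrative.

def reconcile(oats: dict, cat: dict, fields: list) -> dict:
    """Compare per-order records submitted under two regimes.

    oats / cat map order_id -> {field: value}. Returns a map of
    order_id -> list of (field, oats_value, cat_value) mismatches;
    orders present under only one regime get a ("presence", ...) entry.
    """
    issues = {}
    for order_id in oats.keys() | cat.keys():
        a, b = oats.get(order_id), cat.get(order_id)
        if a is None or b is None:
            issues[order_id] = [("presence", a is not None, b is not None)]
            continue
        diffs = [(f, a.get(f), b.get(f))
                 for f in fields if a.get(f) != b.get(f)]
        if diffs:
            issues[order_id] = diffs
    return issues
```

Each discrepancy surfaced this way is a candidate fault in one reporting chain or the other, feeding exactly the kind of remediation the paragraph above recommends.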

In 2015, FINRA fined a firm for reporting violations spanning more than eight years7. The quality of OATS data should therefore not be overlooked, as it is unknown when enforcement under this regime will end. 

Food For Thought

Following public consultation, the Commission believes the revised plan could significantly shorten the duplicate reporting period and reduce servicing costs8. We must also consider that, before OATS can be retired, all relevant OATS data must be available in CAT, all FINRA members must be reporting to CAT, and all asset classes must be in scope of CAT9. The bottom line is that the turning point will come when CAT data quality reaches a sufficient level. Until then, firms will need to be resourceful in dealing with multiple reporting obligations, as the Commission and SROs will continue to rely on information collected through these regimes until “sufficiently complete, accurate and reliable data is available through CAT”.10


1. http://www.catnmsplan.com/cat_faq/index.html#q_14
2. https://www.sec.gov/rules/sro/nms/2016/34-79318.pdf F.Costs
3. FINRA OATS, CATS and associated rules, NYSE Rule 410(b), PHLX Rule 1022, CBOE Rule 8.9, EBS and associated rules, CHX BrokerPlex reporting Rule 5.
4. https://www.sec.gov/rules/sro/nms/2016/34-79318.pdf F.Costs
5. https://www.sec.gov/rules/sro/nms/2016/34-79318.pdf b. System Retirement and Duplicative Reporting Costs
6. https://www.sec.gov/rules/sro/nms/2016/34-79318.pdf 2. Proposed Alternative Approaches to System Retirement
7. http://www.valuewalk.com/2015/07/finra-fines-goldman-sachs-1-8-million-f...
8. https://www.sec.gov/rules/sro/nms/2016/34-79318.pdf b. System Retirement and Duplicative Reporting Costs
9. https://www.sec.gov/rules/sro/nms/2016/34-79318.pdf b. Retirement of Systems Required by SEC Rules
10. https://www.sec.gov/rules/sro/nms/2016/34-79318.pdf 1. Timing