Quinnie Luong, senior product manager at Fidelity Investments in Boston, Massachusetts, considers the investment industry’s need to step up the pace of standardisation of corporate action data and processes.
As corporate actions and complex events continue to rise, and the underlying data is communicated in a variety of formats, data transformation is essential to normalise information for downstream processing. The industry continues to face pressure to improve data quality and timeliness and to reduce risk. Preserving issuer data in the communication flow could be a very promising step towards speeding up data standardisation and alleviating the pressure the industry currently faces.
While a high degree of automation is possible when parsing data within a single-source data model, or when issuer data preservation is implemented, it is challenging to parse data efficiently from multiple sources into a single composite Golden Record. It is even more challenging when the data files are in a proprietary or non-standardised format and contain unique service provider elements. This results in more manual validation to address data variation from issuers, depositories, exchanges, custodians, agents and data vendors.
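The multi-source consolidation problem described above can be sketched in code. The following is a minimal illustration only, assuming a hypothetical source-priority rule and invented field names rather than any actual vendor schema: the composite record takes each field from the highest-priority source that supplies it, and any field on which sources disagree is flagged for manual validation.

```python
# Hypothetical sketch of composing a "Golden Record" from multiple sources.
# The source-priority order and field names are illustrative assumptions.

SOURCE_PRIORITY = ["depository", "exchange", "custodian", "vendor"]

def build_golden_record(announcements):
    """Merge per-source announcements into one composite record.

    announcements: dict mapping source name -> dict of field -> value.
    Returns (golden_record, discrepancies), where discrepancies lists
    fields whose sources disagree and therefore need manual validation.
    """
    golden, discrepancies = {}, []
    fields = {f for a in announcements.values() for f in a}
    for field in sorted(fields):
        values = {src: a[field] for src, a in announcements.items() if field in a}
        if len(set(values.values())) > 1:
            discrepancies.append((field, values))
        # Take the value from the highest-priority source that supplied it.
        for src in SOURCE_PRIORITY:
            if src in values:
                golden[field] = values[src]
                break
    return golden, discrepancies
```

In this sketch a conflicting record date would still resolve to the depository's value, but the conflict itself is surfaced for a reviewer, mirroring the manual-validation step the article describes.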
Typically, financial firms have staff dedicated to interpreting the information received. This manual process delays the flow of information and increases the risk of errors due to misinterpretation. Therefore, data quality, timeliness and cost reduction remain high-priority corporate actions initiatives. Firms should look to address these priorities internally. They may also consider forming partnerships with service providers that have the expertise, scalability, automation and SWIFT capabilities to streamline processes and perform data validation, allowing individual firms to focus on their core business. Such partnerships may help drive the industry forward in improving data quality and timeliness and conforming to industry standards.
Current industry efforts to improve data quality require innovative thinking to develop new processes and technologies. Ideas include machine learning tools that automate data interpretation in different formats and perform data matching and enrichment.
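As a toy illustration of the data-matching idea, an incoming free-text announcement could be linked to a known event type. This sketch uses a simple textual-similarity heuristic from Python's standard library, not an actual machine learning model, and the event codes and labels are assumptions for illustration:

```python
# Link a free-text announcement to a known event type by similarity.
# Uses difflib's SequenceMatcher as a stand-in for a trained model;
# the code/label table is a hypothetical example.

from difflib import SequenceMatcher

KNOWN_EVENTS = {
    "DVCA": "cash dividend",
    "SPLF": "stock split forward",
    "MRGR": "merger",
}

def match_event_type(description):
    """Return the event code whose canonical label is most similar."""
    def score(label):
        return SequenceMatcher(None, description.lower(), label).ratio()
    return max(KNOWN_EVENTS, key=lambda code: score(KNOWN_EVENTS[code]))
```

A production system would of course need far more than string similarity, but the shape of the task, mapping variably worded announcements onto a standard taxonomy, is the same.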
The key benefit of investing in these technologies is automating the resolution of data discrepancies rather than maintaining staff to fix them. The demand for operational efficiency, scalability and risk mitigation creates a constant battle to secure technology funding.
Key success measures in data quality projects include:
Achieve event completeness
Standardise the Golden Record
Provide accurate data sets
The extent of automation is dictated by its ability to process large volumes, handle event complexity (e.g. mandatory versus voluntary events) and use data for multiple purposes (e.g. processing and analytics). Spending on corporate actions automation projects is on the rise for custodian banks, outsourcing providers and mid-to-large asset management firms. Smaller firms are not spending on automation projects because they have more manageable volumes and need fewer staff to support them.
Adoption enables automation
Another area of focus to improve data quality is adopting a standard reporting tool set. Currently, ISO 15022 and ISO 20022 are the co-existing market standard formats for corporate actions reporting.
Adopting the ISO messaging standards enables the automation of corporate actions communication and processing, which improves efficiency, accuracy and timeliness. Since there is no regulatory oversight to mandate the use of ISO standards, adoption remains voluntary. Nevertheless, industry-wide adoption is needed to improve data quality, increase operational efficiency, develop data standards and implement system upgrades or new technologies.
Today, most mid-to-large size firms have the capability to use ISO 15022 for corporate actions. Key benefits of using these standard messages include increased straight-through processing (STP), greater scalability and conformity with industry standards. But a high degree of non-compliant practice still takes place, partly because corporate actions message standards are subject to interpretation and can vary by source.
For example, a rate or price can be expressed as a percentage or an amount. This requires additional transformation matching rules to normalise the sender’s choice into a single format. The data fields available in the message standards are somewhat limited and often fail to account for common data attributes that are unique at the individual market level. For example, a Publication Date is required for a U.S. Lottery event but is currently not available in the ISO data fields. As a result, a workaround is created to communicate the information. It relies on the national market practice group to enforce conformity.
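A transformation rule of the kind described might look like the following sketch, which assumes hypothetical unit codes (`PCT`, `AMT`) and a notional face value; real matching rules would depend on the message fields actually received:

```python
# Illustrative transformation rule: normalise a rate that a sender may
# express either as a percentage of face value or as a cash amount.
# The unit codes and the face-value default are assumptions, not a
# definition from the ISO standards.

from decimal import Decimal

def normalise_rate(value, unit, face_value=Decimal("1000")):
    """Return the rate as a cash amount per unit of face value."""
    value = Decimal(value)
    if unit == "PCT":   # sender expressed a percentage of face value
        return face_value * value / Decimal("100")
    if unit == "AMT":   # sender expressed a cash amount directly
        return value
    raise ValueError(f"unknown rate unit: {unit}")
```

Normalising the sender's choice at ingestion, as here, is what allows downstream systems to compare and match values from different sources in a single format.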
Narrative worth noting
Another area worth noting is the allowance to use narrative text. This should be limited to information that cannot be provided in a structured field. Overuse of unstructured narrative text has hindered STP and impacted data quality. This impact could be lessened by a cutover to ISO 20022 and leveraging the use of extensions.
Globally, pockets of ISO 20022 adoption for corporate actions are emerging. In the U.S., the DTCC Corporate Actions Transformation project, and in the Asia Pacific region, market infrastructure providers such as exchanges or depositories lead the way.
Although data type restrictions were introduced to ensure interoperability between ISO 15022 and 20022, the newer messaging standard can support the placement of structured text. This add-on capability is embedded within the 20022 XML extensions and can be transmitted as structured, fielded elements. The benefit to data consumers is greater STP. Unfortunately, this capability carries a higher technology cost to maintain, because sources may use different extensions, which requires more customised mapping at the source level.
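A rough sketch of consuming such an extension follows, using an invented namespace, an invented message fragment and invented element names (e.g. a `PblctnDt` field for the U.S. lottery Publication Date mentioned earlier) purely for illustration; real extensions differ by source, which is exactly why per-source mapping is needed:

```python
# Extract source-specific structured fields from an ISO 20022-style
# extension block. The namespace, element names and sample message are
# hypothetical stand-ins, not an actual scheme.

import xml.etree.ElementTree as ET

SAMPLE = """\
<Document xmlns:x="urn:example:caev:ext">
  <Xtnsn>
    <x:PblctnDt>2024-04-15</x:PblctnDt>
    <x:LtryNb>7</x:LtryNb>
  </Xtnsn>
</Document>
"""

def extract_extension_fields(xml_text, ns="urn:example:caev:ext"):
    """Return {field name: value} for every element in the extension namespace."""
    root = ET.fromstring(xml_text)
    prefix = "{%s}" % ns
    return {
        el.tag[len(prefix):]: el.text
        for el in root.iter()
        if el.tag.startswith(prefix)
    }
```

Because each provider may define its own namespace and element set, a consumer ends up maintaining one such mapping per source, which is the maintenance cost the paragraph above describes.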
Parsing and transforming
There are case studies that show the success of ISO 20022 adoption in the corporate actions arena. On closer inspection, however, the data producers primarily received data from a single source (e.g. issuers), and the data was parsed and transformed based on ISO standards. This means there were no data discrepancies from conflicting sources to resolve. As a result, data quality and timeliness improved. Firms that want to benefit from these improvements now have a clear incentive to develop ISO 20022 capabilities.
What is still required?
Market practitioners need to maintain an open mind towards improving message standards. Accommodations need to be made for unique market nuances so that the use of narrative text can be limited. Significant progress has been made defining market practice guidelines at the individual market level (e.g. in the U.S.), and the ongoing work led by the Securities Market Practice Group and the National Market Practice Groups has helped. However, a larger effort is needed to examine the message standards and to develop rules for areas open to interpretation. To eliminate redundancy and misuse, a review of the repeatable data placements and available data formats is also required.
Another major issue is the lack of uniformity in how corporate actions data is communicated in distribution. At the top of the information flow, issuer data currently published in paper-based formats may be enhanced to support ISO standards. Exchanges or depositories should have the infrastructure to capture and publish event information in formats that support ISO standards (e.g. 20022).
Next, the flow of information is passed on to the intermediaries (agents, custodians, data vendors) in a variety of formats (e.g. proprietary files and 15022). Data is then validated by the intermediaries before being sent to end subscribers in a variety of formats (e.g. fax, web portal, 15022). Even with the use of ISO standards, there are so many touch points in the data distribution cycle that quality and timeliness are likely to be affected.
As the industry expands and event complexity increases, more engagement between third-party service providers, custodians, large asset management firms and the issuer community is needed to improve transparency. This engagement may also lead to the standardisation of corporate action announcements by event type. Furthermore, the ongoing demand to preserve issuer data using a standard tool set could speed up the pace of data standardisation and improve data quality.
Automation drives standardisation
Until data flow is transformed and sourced directly from the issuers to all interested parties in the distribution cycle with minimal manual work, data quality and timeliness will continue to be problematic.
As a financial firm's business grows and it diversifies into new asset classes, maintaining a lean operation without outsourcing or investing in complex technologies will be a challenge. There is a need for data parsing tools that handle different formats and machine learning tools that perform complex data matching to provide enriched event data. This transformation is necessary to keep pace with the increase in corporate actions volume and the processing of complex events.
Ultimately, standardisation and automation are needed to improve the quality and timeliness of data. The consequences of living with multiple standards, less than optimal levels of automation, and manual processes are costly. Operational costs will remain high, STP will not be achieved, and controlling risk will become more difficult as volumes and complexity increase. Although many of the industry initiatives to standardise and automate corporate actions are ongoing and longer term, much work can be done in parallel. Accelerating these initiatives, while exploring new technologies, is essential to moving the industry forward in the standardisation of corporate action data and processes.