The Case for Externalisation in Regulation and Technology

By Fiona Hamilton, Vice President Europe and Asia, Volante Technologies

In the past two months I have attended three capital markets conferences: FIX EMEA in London, ISITC in Boston and ISITC Europe in London. While the FIX event was much more front-office focused and the ISITC meetings more middle and back office, what was striking was that the subject matter of the presentations and panel sessions was broadly similar: the panels looked at regulation and technology, or, in the latter case, more specifically at blockchain/distributed ledger technology.


In some cases the two subjects were even being conflated, with the suggestion that distributed ledger technology might be the answer to the ever-increasing regulatory burden, which seems to stretch even the almost mythical, panacea-like qualities of that golden child. Regulation obviously covers more than just increased or enhanced trade and transaction reporting; collateral management is just one example. If, however, we look just at transaction reporting, does the premise have merit, and if so, what hurdles would market participants have to overcome within their own business to adopt that approach?


There is one facet common to almost all new and enhanced regulation, and that is the need for more accurate, detailed and timely recording and sharing of data. This was very clearly shown in a presentation by Dr Anthony Kirby, Chair of the Regulation Working Group of ISITC Europe, at their recent meeting. As can be seen in his diagram below, apart from TDII, all of the regulations shown have either a medium or high data impact.

“Much of the wailing and gnashing of teeth being done over the subject was in actual fact about how to access, aggregate and format the data into the required formats to share with the regulators.”

Indeed, data impact was the subject of many discussions at these events, whether it be the requirement to report additional data items such as trader ID or portfolio manager ID, or to start reporting on asset classes that have not previously been covered by market regulation. Obviously that information needs to be recorded somewhere, and if it is not already catered for it will have to be added to transaction records somewhere along the trading and settlement lifecycle. It is pretty clear that you can't report something you haven't recorded, so there is no getting away from that element of work. What was surprising, though, was that much of the wailing and gnashing of teeth over the subject was in actual fact about how to access, aggregate and format the data into the required formats to share with the regulators. It strikes me as odd that this should still be such a pain point: it is not as if regulation only started happening in 2016; the industry has had to cope with this increasing burden since the financial crisis of 2008. There can only be one possible reason, and that is the traditional silo approach to the trading lifecycle, or more specifically, to the applications that support it.


For a trade to reach final settlement it will typically have passed through a number of systems: OMS, trading platform, risk analytics, middle office confirmation processing, settlement or portfolio management system, and so on. Typically at each hand-off, from front to middle and middle to back office, the trade is enriched with the data associated with that process. The issue, however, is that not all of the information is passed down the line. Most systems only require or support the data items that are salient to that operation, so, to use a technical term, the process is not 'lossless'. It is therefore easy to see the challenge, and the impact that adding new fields in the front office, or having to report on existing ones for the first time, could have on downstream systems. The question is: does it have to be that hard?
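To make the point concrete, here is a minimal sketch in Python of a lossy hand-off chain. The system names and field names are purely illustrative assumptions, not any particular vendor's data model.

```python
# A minimal sketch (field names are illustrative only) of why the classic
# front-to-back hand-off is not 'lossless': each system keeps only the
# fields it needs, so data captured upstream never reaches the point
# where reporting happens.

front_office_trade = {
    "trade_id": "T-1001",
    "isin": "GB00B03MLX29",
    "quantity": 50_000,
    "price": 27.15,
    "trader_id": "JB1",           # captured at order entry...
    "portfolio_manager": "PM-7",  # ...but rarely needed downstream
}

def middle_office_handoff(trade):
    # Confirmation processing only carries the economics of the trade.
    keep = ("trade_id", "isin", "quantity", "price")
    return {k: trade[k] for k in keep}

def back_office_handoff(trade):
    # Settlement needs even less.
    keep = ("trade_id", "isin", "quantity")
    return {k: trade[k] for k in keep}

settled = back_office_handoff(middle_office_handoff(front_office_trade))
print(settled)  # trader_id and portfolio_manager are gone by settlement
```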


[Diagram: regulatory initiatives and their data impact]

Source: ISITC Europe General Meeting 25.04.16


Most of the cost and time involved in the 'waterfall' approach described above lies in the invasive changes required to every system in the chain: not only altering the hand-off interfaces, but often raising costly change requests with product vendors, which can take time to be delivered and then require full regression testing, which in itself consumes resources, time and budget. If there are brand new data items then all of this is pretty unavoidable; but if the data already exists, why change the systems at all, rather than simply accessing the data and aggregating it across the silos by externalising the functionality? This non-invasive approach also facilitates the creation of golden copy transaction records that can be persisted to a data warehouse, which can then drive multiple different reporting requirements, both regulatory and for internal business purposes. The formats required by the different regulators are also externalised, so that the internal systems do not have to interpret or understand them. This means that new regulatory channels can be supported rapidly, with minimal impact on internal systems, which is increasingly important as many new regulations have short timeframes between being agreed and being mandated.
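As a rough illustration of that externalised approach, the sketch below aggregates read-only views from hypothetical silos into a golden copy and keeps the regulator-specific formatting outside the internal systems. The silo names, fields and pipe-delimited output are assumptions made purely for illustration, not a real reporting format.

```python
# A hedged sketch of the externalised approach: read from each silo,
# merge into a 'golden copy' per trade, persist it, and keep the
# regulator-specific layout outside the internal systems.

from collections import defaultdict

# Read-only views extracted from each silo, keyed by trade_id (illustrative).
oms_view        = {"T-1001": {"trader_id": "JB1", "portfolio_manager": "PM-7"}}
confirm_view    = {"T-1001": {"isin": "GB00B03MLX29", "price": 27.15}}
settlement_view = {"T-1001": {"quantity": 50_000, "settle_date": "2016-05-03"}}

def build_golden_copy(*silo_views):
    # Aggregate across silos without modifying any of them.
    golden = defaultdict(dict)
    for view in silo_views:
        for trade_id, fields in view.items():
            golden[trade_id].update(fields)
    return dict(golden)

def to_regulator_format(trade_id, record):
    # The regulator-specific layout lives here, not in the trading systems.
    row = {"trade_id": trade_id, **record}
    fields = ["trade_id", "isin", "quantity", "price", "trader_id"]
    return "|".join(str(row.get(f, "")) for f in fields)

warehouse = build_golden_copy(oms_view, confirm_view, settlement_view)
for trade_id, record in warehouse.items():
    print(to_regulator_format(trade_id, record))
```

Supporting a second regulatory channel would, in this sketch, mean adding another formatting function over the same warehouse records rather than touching the upstream systems.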


Another benefit of this approach arises where, for example, internal or proprietary identifiers are insufficient for reporting. A trading system might record a trader ID of 'JB1' for Joe Bloggs, and that is fine within that organisation; it doesn't help the regulator, however, if another organisation uses the same trader ID, this time for Josephine Bloggs. So it is likely that a unique identifier would have to be used, such as a national insurance, tax or ID number. Again, with an externalised architecture this becomes a relatively simple process of transforming or enriching the data, either by implementing a simple lookup table in the data warehouse or by retrieving the information from an in-house HR system.
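A minimal sketch of that enrichment step, again with purely illustrative identifiers and lookup values, might look like this:

```python
# Enriching a proprietary trader ID into a unique identifier via a lookup
# table held alongside the data warehouse. The table contents and the
# 'trader_national_id' field are illustrative assumptions only.

TRADER_ID_LOOKUP = {
    "JB1": {"name": "Joe Bloggs", "national_id": "AB123456C"},
}

def enrich_trader_id(golden_record):
    enriched = dict(golden_record)
    mapping = TRADER_ID_LOOKUP.get(golden_record.get("trader_id"))
    if mapping:
        enriched["trader_national_id"] = mapping["national_id"]
    return enriched

record = {"trade_id": "T-1001", "trader_id": "JB1"}
print(enrich_trader_id(record))
# {'trade_id': 'T-1001', 'trader_id': 'JB1', 'trader_national_id': 'AB123456C'}
```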


So keeping up with regulation, in terms of both the data and the reporting, doesn't need to be a Herculean task taking many years and a significant budget. Implementing a flexible, externalised architecture also acknowledges that the current wave of regulation isn't the end: regulatory change should be viewed as an ongoing process, and as such you don't want to have to change your systems unless you absolutely have to.


So what has that got to do with blockchain or distributed ledger?

The spectacular rise of distributed ledger technology, both as a subject of intense debate and in terms of the backing it has received through proof-of-concept projects at many of the large financial institutions around the world, is undisputed. What is less clear is what will come out of all of this activity. So far many of the regulators seem to be keeping a cautious watching brief rather than wading in wholeheartedly. Certainly the technology is going to have an impact, and indeed already has in areas where there was previously no viable solution, such as non-listed securities issuance; but whether it scales up to cope with the volumes necessary to support global equities trading, or with the long lifecycles of corporate action activity in equities and fixed income, remains to be seen. It is perhaps not surprising then, given that some current regulatory reporting in the OTC derivatives clearing world involves the daily transfer of gigabytes of data to regulators, that they will probably let the industry do the serious tyre kicking via its proofs of concept before getting off the fence.

“It seems that pretty much every day a new provider of distributed ledger technology comes into existence – there are hundreds of them out there. However, history suggests that they can disappear as quickly as they appeared.”

What can be said with some degree of certainty is that most of the regulations in the previous diagram are unlikely to be based on a distributed ledger, given their timeframes; but that is not to say that they won't migrate to it over time, or that new and unforeseen regulation won't utilise it. So once again market participants may, at some point, be faced with the challenge of interfacing their internal systems to this new technology. The same challenges discussed in relation to the data requirements apply. Do you invasively change your internal systems to natively understand the relevant APIs, with the inherent risks and costs, or do you leverage the same architecture as suggested for externalising data aggregation and communication?


To me the answer is clear for exactly the same reasons as previously asserted. In fact, the case is potentially even stronger for the following reason.


It seems that pretty much every day a new provider of distributed ledger technology comes into existence; there are hundreds of them out there. However, history suggests that they can disappear as quickly as they appeared, so which horse do you back? Their technology might be great and your architects may have made the right decision, but market forces do not necessarily mean that the company has a future. If you have invasively changed your systems to interface to their API, it could be a very expensive piece of bad luck if they disappear overnight. In an externalised architecture, however, it should be a relatively trivial task to map the same data to a different provider's API, and the same would be true if a regulator decided to migrate from a file-based mechanism to a distributed ledger at a later date: it would simply require a different mapping. Indeed, examples of this already exist in the payments world, where intelligent routing and formatting can take a payment record and decide whether to route it via a SWIFT message or via Ripple, a blockchain-based API. It is easy to see that adding new destinations from the same set of data becomes a task of adding the API interface and mapping the data to it.
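By way of illustration only, the sketch below shows how routing and formatting might be externalised so that adding a new destination, or swapping one distributed ledger provider for another, becomes a matter of adding one mapping function. The message layouts are invented for the example and are not the real SWIFT or Ripple formats.

```python
# A hedged sketch of externalised routing and formatting: the same internal
# record can be mapped to different destination formats, and a new rail is
# just one more entry in the routing table. Layouts below are illustrative.

def to_swift_like(payment):
    # Invented SWIFT-style text layout for illustration.
    return (f":20:{payment['reference']}\n"
            f":32A:{payment['value_date']}{payment['currency']}{payment['amount']}")

def to_ledger_api_like(payment):
    # Invented ledger-API-style payload for illustration.
    return {
        "tx_ref": payment["reference"],
        "amount": {"value": payment["amount"], "currency": payment["currency"]},
        "value_date": payment["value_date"],
    }

ROUTES = {"swift": to_swift_like, "ledger": to_ledger_api_like}

def route(payment, destination):
    # Swapping providers or adding a destination only changes ROUTES.
    return ROUTES[destination](payment)

payment = {"reference": "P-42", "amount": "1000,00",
           "currency": "EUR", "value_date": "160503"}
print(route(payment, "swift"))
print(route(payment, "ledger"))
```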


So, putting it simply, whether the fear is having to report ever more detailed information or having to keep up with the technology it must be reported through, implementing an externalised architecture that in effect insulates core or legacy systems won't make those challenges disappear completely, but it certainly means they shouldn't be the cause of continued wailing or gnashing of teeth.
