Asset managers are tackling a bevy of new data-related requirements this year, driven by clients, boards, global regulators and the FinTech disruptors. Many buy-side firms have responded by embarking on operational and technological transformation projects. Some have focused on data access and delivery, upgrading legacy data architectures using hubs, virtualization, warehouses, or data-as-a-service (DaaS); others are tackling costs and efficiency by taking advantage of newer technologies such as machine learning, robotics, and various forms of artificial intelligence (AI). As they institute these initiatives, investment managers also discover new bedrock needs: enhanced data governance, integrity and controls.
Gresham Technologies’ Jan Dinger and Scott Knous of Olmstead Associates sat down with EDM Council’s Executive Director, John Bottega, to assess these shifting pressures, and discuss why being more data-centric can help firms execute their transformation initiatives with a greater probability of success.
JB: What are your thoughts on the key business and data management challenges facing today’s firms?
SK: Well, the heat is on and firms are feeling the pressure. In the coming year, firms must be ready with a comprehensive, flexible solution, and while many have a plan to achieve their definition of "ready", the question is: what does that actually look like?
They know they need to address regulations, for example, but each one demands dexterity across different operational functions – whether legal entity identifiers (LEIs) for MiFID II, privacy concerns for GDPR, or deeper fund-level reporting. The challenge is that legacy data architectures and reporting solutions on the buy-side were not designed to aggregate, transform, store and report data from the enterprise in a way that matches the diversity and depth of these new rules.
Firms also need greater insight into where their business—whether portfolio investments, clients, or sub-entities—faces risk exposure and they need to be proactive in mitigating that risk.
Expense pressures abound on the buy-side, but even more so when '40 Act funds or advisors are forced to spend on infrastructure. That means either expenses rise, or they trim elsewhere. At the same time, they're developing new products, new distribution tactics, cross-selling existing clients and eyeing mergers to boost revenue.
Lastly, client retention. The competitive landscape has gotten more intense over the last few years and investment firms risk losing clients to competitors who deliver a higher customer service level. This is not just about responsiveness, but about having a 360-degree view of the client, detecting their preferences and habits, and putting that intelligence to use. Once again, much of this comes down to capturing, storing and managing your client data. It goes beyond a CRM system today.
JB: Those are big subject areas. How are firms looking at these challenges? The EDM Council, in our advocacy with our members, clearly supports the data-centric approach. Is this approach recognized by Gresham's clients?
JD: Yes, I think our clients understand the value in letting data drive you to the best solution. Really, they’re looking for innovation and a true technology partner. Our clients find value in direct access to subject-matter and implementation expertise, flexible integration with existing platforms in the data estate, and tools to automatically identify and match to any data format.
At the core of the challenges they face is ensuring that the data, whether structured or unstructured, is fit for purpose. At Gresham, we describe this as assuring the data has the proper level of integrity for the task, including the granularity, availability, lineage (auditability) and controls around it.
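To make this concrete, here is a minimal sketch of what a per-record integrity check might look like. The field names, the required-field set, and the sample records are all hypothetical illustrations, not Gresham's actual implementation; the point is simply that completeness, validity, and lineage (here, a `source` field identifying where a value originated) can be asserted programmatically before data is used downstream.

```python
# Hypothetical required fields for a trade record, including a lineage field.
REQUIRED_FIELDS = {"trade_id", "quantity", "price", "source", "as_of"}

def integrity_check(record):
    """Return a list of integrity issues for one record: completeness
    (all required fields present and non-null) plus a simple validity rule."""
    issues = []
    for field in sorted(REQUIRED_FIELDS):
        if record.get(field) is None:
            issues.append(f"missing:{field}")
    if record.get("quantity") is not None and record["quantity"] < 0:
        issues.append("invalid:quantity")
    return issues

good = {"trade_id": "T1", "quantity": 100, "price": 9.5,
        "source": "OMS", "as_of": "2018-01-02"}
bad = {"trade_id": "T2", "quantity": -5, "price": None,
       "source": "OMS", "as_of": "2018-01-02"}
```

A record that passes returns an empty issue list; anything else is routed to an exception workflow rather than silently propagated.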
As many large managers come to face the new requirements you mention, the data transformation and reporting required demands stitching together disparate spreadsheets and siloed datasets, or transitioning off a large (and often expensive) legacy system—if not both. The institutional energy required is profound. Furthermore, wrong moves in this foundational area can delay necessary work for a data warehouse or reporting engine build-out downstream. Getting this right at an enterprise level is both crucial and hard.
JB: Do you see expectations for the supply-side of the market changing with respect to how data is managed, and what value is now expected from investments in improved data management?
SK: Well, another reason that firms are now paying greater attention to data integrity and control is because, frankly, they can—and they must. Operations, technology and data managers have long been trying to build business cases to modernize systems and optimize processes to lessen technical debt and better manage their firms' shared data. Today, with heightened visibility of data issues there are fewer business cases to prove; in fact, there is increasing collaboration and project sponsorship from the business itself. The availability of faster, cheaper technologies, and a footrace among FinTech companies to own the data lifecycle, is also providing buy-side firms many more choices at a relatively reasonable price point.
JB: Would you say that has put more pressure on internal technology teams, and their providers, to up their game?
SK: Absolutely. Because of budgetary constraints, managers have tended to spend only what they feel they need to get by. As a result, some buy-side firms have developed technology or processes designed to address a seemingly isolated business problem because doing so was faster and cheaper at the time. However, because these solutions were not built with a holistic view, they collectively result in a sprawling, ineffective data estate. Today, firms are putting these issues under a microscope—not only to reduce that technical debt, but to extract more from their data.
There are two key factors that firms must address as they select an external partner. First, they need to get their arms around their data. They need a holistic view of their data, or a data topology — essentially a heat map of who is using the data, and how it is being accessed, processed and applied. The topology also depicts the extent to which each business area relies upon different datasets to do their job. This leads to the second factor: scoping the shared data or, as it is commonly known, enterprise data. Understanding and prioritizing the shared data will clarify the scope of data to be managed by the third-party solution. Defining scope is a prerequisite for properly testing and evaluating the provider’s understanding of your data.
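The topology exercise described above can be sketched in a few lines. Everything here is a hypothetical illustration, assuming the firm can log which business area touches which dataset: aggregate those observations into a map from dataset to its consumers (the "heat map"), then treat any dataset used by multiple areas as candidate shared, or enterprise, data.

```python
from collections import defaultdict

# Hypothetical usage log: (business_area, dataset) pairs observed across the firm.
usage_log = [
    ("portfolio_mgmt", "security_master"),
    ("portfolio_mgmt", "positions"),
    ("compliance", "security_master"),
    ("compliance", "lei_registry"),
    ("client_reporting", "positions"),
    ("client_reporting", "security_master"),
]

def build_topology(log):
    """Aggregate usage into a dataset -> set-of-consuming-areas map."""
    topology = defaultdict(set)
    for area, dataset in log:
        topology[dataset].add(area)
    return topology

def shared_datasets(topology, min_consumers=2):
    """Datasets relied on by multiple business areas: candidate enterprise data."""
    return sorted(d for d, areas in topology.items() if len(areas) >= min_consumers)
```

On the sample log, `security_master` and `positions` surface as shared data, which is exactly the scope a third-party solution would be evaluated against.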
JB: How does Gresham’s platform help clients meet these requirements, Jan?
JD: For Gresham, our process always starts there, with scope—analyzing the requirement, identifying the possible sources of data and just as importantly, possible gaps—and then implementing the solution. That diligence unleashes the full potential of the platform’s flexible open (versus fixed) schema and N-way reconciliation capabilities, while optimizing the number of controls in play; it also reflects our clients’ need for process prioritization and implementation design.
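For readers unfamiliar with the term, N-way reconciliation generalizes the classic two-source match to any number of systems. The sketch below is a simplified illustration, not Gresham's platform logic: three hypothetical sources report position quantities keyed by trade ID, and any key whose values disagree, or that is missing from a source, is flagged as a break.

```python
def reconcile(sources):
    """N-way reconciliation: for each key seen in any named source, compare
    the value reported by every source; a missing key (None) or any
    disagreement is flagged as a break."""
    keys = set().union(*(s.keys() for s in sources.values()))
    breaks = {}
    for key in sorted(keys):
        values = {name: s.get(key) for name, s in sources.items()}
        if len(set(values.values())) > 1:
            breaks[key] = values
    return breaks

# Hypothetical position quantities from three systems, keyed by trade ID.
internal = {"T1": 100, "T2": 250, "T3": 75}
custodian = {"T1": 100, "T2": 250}
fund_acct = {"T1": 100, "T2": 240, "T3": 75}

# T2 disagrees across sources; T3 is missing from the custodian feed.
breaks = reconcile({"internal": internal, "custodian": custodian,
                    "fund_acct": fund_acct})
```

In practice the comparison runs over many fields with tolerances and transformation rules per source, but the structure is the same: scope the sources first, then let the controls flag the exceptions.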
JB: Can you give us some of the real-world implications?
JD: While it’s true managers are looking for solutions that are quick to deploy and don’t require significant development or reengineering of existing systems, achieving data integrity and control is about accuracy and calibration. Many of our clients’ projects require us to enter midstream, so to speak—sometimes in the middle of a multi-year transformation or integration. So for us, it’s about expertise rather than being fastest out-of-the-gate. Today, large investors prioritize the ability to interface and integrate with existing systems, such as trading and middle/back office systems, and third parties such as custodians, TAs or sub-advisors. This makes achieving strong data integrity particularly tricky. In addition, they want full auditability and a customizable, front-end view of the workflow being executed across the platform, which, for our largest capital markets and asset management clients, involves the processing of millions of data points on a daily basis. They invest in us because we have the experience and the platform to tackle this still relatively new, but increasingly crucial, discipline for enterprise data management.
JB: Wrapping up, what is your advice for 2018?
SK: I think it would be that if you look at today’s environment—the growing demands on data, the heightened attention paid to tech transformation by the business, and the sophistication of options available—any large asset manager seeking a pure reconciliation system should strongly reconsider. Today’s challenges require a technology partner who curates intelligence; facilitates data movement, transformation and workflow; and prioritizes data integrity, all in addition to core reconciliation functionality.
John Bottega is the Executive Director of the EDM Council, a 501(c)(6) trade association and leading advocate for the development and implementation of data content standards.
Jan Dinger is Sales Director for North America at Gresham Technologies (GHT.L), a global leader in financial data integrity.
Scott Knous is Managing Director and Head of Data Management at Olmstead Associates, a data-centric strategic investment firm.