Following is the second part in a two-part guest post from Branden Jones, Global Head of Marketing at Liquid Holdings Group, Inc. based in New York, NY. To read Part One, click here.
In this age of data management, with its new demands for cross-office functionality, operational models must be able to house, curate, and reconcile information sets as they arrive. Funds must not only actively manage a growing universe of market data but also tackle performance reporting, risk projections, disaster planning, and partitioned client data.
To successfully, and simultaneously, manage these activities, funds must have a data operational model that supports automation, where it makes sense:
- Continuous processing, as an underlying system
- Consistent normalization, across the board
- Historical, since-inception view
- Defensive measures, to protect the operation
Real-time, continuous action is the new normal in today’s hedge fund reality. Funds are expected to understand, identify, and take advantage of opportunities as they occur. From a data standpoint, however, “real-time” is only a point on a larger continuum of activity, observed when a participant captures a single event in time. Continuous processing is the underlying current that accepts and captures, or rejects, data inflows and outflows. As pressure increases from both investors and regulators, managers should rely on continuous, automated services, processes, and technology to support their business, not as a viewable snapshot but constantly, throughout the lifespan of the fund.
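The accept-or-reject idea above can be sketched in a few lines. This is a minimal, hypothetical illustration (the `TradeRecord` fields and validation rules are assumptions, not part of any actual fund system): each inbound record is either captured or set aside for review as it flows through.

```python
from dataclasses import dataclass

@dataclass
class TradeRecord:
    # Hypothetical minimal schema for an inbound trade event
    symbol: str
    quantity: int
    price: float

def process_stream(records):
    """Accept records that pass basic validation; reject the rest.

    In a real continuous-processing pipeline this would run against a
    live feed; here it simply partitions an iterable of records.
    """
    accepted, rejected = [], []
    for rec in records:
        if rec.symbol and rec.quantity != 0 and rec.price > 0:
            accepted.append(rec)
        else:
            rejected.append(rec)
    return accepted, rejected
```

The point of the sketch is that validation is continuous and automatic: every event is classified on arrival rather than in a periodic batch review.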
While the amount of data has increased, the types of data and their origins have multiplied as well. Systems that previously recognized only one or two sources are now challenged with ferrying more complex information sets from counterparties, exchanges, fund administrators, and prime brokers. Normalization is the process that guarantees safe passage of these data packets, regardless of origin, as the data converges on its intended destination(s) within the fund infrastructure. Consistent data, through consistent ongoing normalization, translates into accurate pricing and valuations for real-time and forward-looking portfolio management, as well as precise analysis and reporting for investors.
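Normalization of this kind can be pictured as mapping each source’s raw field names onto one canonical schema. The sketch below is purely illustrative, and the source names and field mappings are invented for the example; a real implementation would also handle type conversion, missing fields, and source-specific quirks.

```python
# Hypothetical field maps: each source labels the same facts differently
FIELD_MAPS = {
    "prime_a": {"tkr": "symbol", "qty": "quantity", "px": "price"},
    "exchange_b": {"Symbol": "symbol", "Size": "quantity", "LastPx": "price"},
}

def normalize(record, source):
    """Translate a source-specific record into the canonical schema."""
    mapping = FIELD_MAPS[source]
    return {canonical: record[raw] for raw, canonical in mapping.items()}
```

Once every inflow passes through the same mapping layer, downstream pricing, valuation, and reporting code only ever sees one consistent schema, regardless of which counterparty or venue the data came from.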
The need to investigate and utilize historical, security-level data unique to the fund is key to the success of the business. Arming a fund with since-inception data allows the manager to transform the most unique and granular drivers of past performance into the underpinnings of actionable, forward-looking initiatives across alpha generation, risk management, investor insights, and compliance.
While data trafficking, shaping, and viewing are relatively benign activities, true data management requires a fourth component: the ability to uncover and recover from adverse events, and the broader protection of investor interests. A solid wall that prevents commingling of client data within the underlying architecture keeps critical, proprietary data safe. When it comes to planning for the unplanned, such as adverse events in both the digital and physical worlds, automated services can give a fund a second life without interruption. Cloud technology offers the best option for housing a fund’s data infrastructure: it provides secure, convenient access as well as automated, virtual backup systems that shield the business from physical hardware and environmental risks like earthquakes, floods, or outages. Thus, it is not only important how data is managed but where it is managed.
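The “solid wall” between clients can be made concrete with a small sketch. This is a hypothetical illustration of the partitioning principle only (the `PartitionedStore` class and client identifiers are invented for the example): each client’s data lives in its own partition, so one client’s reads can never reach another’s records.

```python
class PartitionedStore:
    """Toy sketch of per-client data partitioning."""

    def __init__(self):
        # One isolated partition per client identifier
        self._partitions = {}

    def write(self, client_id, key, value):
        self._partitions.setdefault(client_id, {})[key] = value

    def read(self, client_id, key):
        # A client can only reach its own partition; other clients'
        # data is structurally invisible, not merely access-controlled.
        return self._partitions.get(client_id, {}).get(key)
```

The design point is that isolation is built into the architecture itself rather than enforced by filtering a shared pool after the fact.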
To continue reading the white paper, please visit http://liquidholdings.com/whitepapers/newreality.html.