Blog Entries from 07/2014
Competition in the financial services industry continues to intensify, and to differentiate themselves, firms must build and maintain robust, manageable, scalable and reliable technology infrastructures. Increasingly, it's not just emerging managers opting for cloud solutions; established hedge funds and alternative investment firms are also shifting gears from traditional on-premise IT infrastructures to cloud services.
If you missed our webinar yesterday on Why the Billion Dollar Club is Going Cloud, read our recap below or scroll down to watch the full webinar replay, featuring Eze Castle’s Managing Directors Bob Guilbert and Vinod Paul.
The Business Case for the Cloud: Why Established Firms are Making the Move
Across the industry, established firms that have been in business for several years are moving away from physical infrastructures and adopting the cloud. Traditionally, investment firms would allocate substantial capital budgets to build on-premise Communication (Comm.) Rooms. These cost-intensive infrastructures can take months to build out, and specific expenses can vary depending on a firm’s unique needs. For example, at minimum, investment firms require file services, email capabilities, mobility services and remote connectivity, as well as disaster recovery and compliance. Beyond those, many firms also require systems and applications such as order management systems (OMS), customer relationship management tools (CRM), and portfolio management or accounting packages.
We’ve seen the face of the financial services industry change dramatically over the last few years, with emerging technologies, investor transparency demands and growing competition fueling firms to assess their operations and focus on the health and success of the overall business. But perhaps beyond any of these trends, the focus on industry regulations and compliance efforts may be the most significant in changing the way financial services firms do business.
This year alone, we’ve seen regulatory initiatives dominate headlines and leave firms scrambling to comply, notably the SEC’s cybersecurity guidelines released this spring and the official implementation of the Alternative Investment Fund Managers Directive (AIFMD), which went into effect last week. Also becoming official this month is the Foreign Account Tax Compliance Act, or FATCA, which requires U.S. persons to report financial accounts held outside of the United States and requires foreign financial institutions (notably banks) to report on accounts and assets held by their U.S. clients.
To identify non-compliance, the Internal Revenue Service is requiring financial institutions with foreign entities and foreign financial institutions (FFIs) to disclose information about U.S. clients with balances over $50,000. The law threatens a steep 30 percent withholding tax on payments for non-compliant FFIs.
There is also a significant cost for firms to implement compliance procedures and reporting standards to meet the legislative requirements of FATCA. It is reported that implementation costs average between $100,000 and $500,000 depending on firm size and are expected to amount to roughly $8 billion USD a year for financial institutions alone (not including costs to the private sector, IRS and foreign entities).
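To make the stakes concrete, the flat 30 percent withholding can be sketched in a few lines. This is an illustration only; the function name and payment figures are hypothetical, not drawn from the legislation:

```python
# Illustrative arithmetic only: the flat 30% FATCA withholding applied to a
# U.S.-source payment made to a non-compliant foreign financial institution
# (FFI). Function name and figures are hypothetical examples.
FATCA_WITHHOLDING_RATE = 0.30

def fatca_withholding(payment: float, ffi_compliant: bool) -> float:
    """Return the amount withheld from a withholdable payment."""
    return 0.0 if ffi_compliant else payment * FATCA_WITHHOLDING_RATE

# A $1,000,000 payment to a non-compliant FFI forfeits $300,000 to withholding.
print(fatca_withholding(1_000_000, ffi_compliant=False))
print(fatca_withholding(1_000_000, ffi_compliant=True))
```

In other words, a single $1 million payment to a non-compliant institution loses $300,000 off the top, which puts the reported $100,000-$500,000 implementation costs in perspective.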
Your hedge fund's information security plan likely includes details on where information is stored, how it is accessed and who can access it. But a critical and often overlooked component of this plan is how and why data is destroyed when it is no longer needed. Including data destruction procedures in your written information security plan (WISP), or as a separate document, is vital to ensuring your firm’s sensitive data and intellectual property do not fall into the wrong hands. Unfortunately, in today’s technology-driven, cyber-aware environment, simply hitting the delete key is not enough.
There are a few different scenarios that warrant secure data destruction measures:
- Changing service providers
- Retiring a service/product
Your methods and policies for secure destruction may vary according to the above scenarios, or they may be standard across the firm. Your hedge fund should also consider if there are any regulatory implications. Do you need to maintain/archive data for a prescribed period of time in order to comply with state, federal or other compliance or auditing standards?
In any case, you’ll want to evaluate a variety of destruction methods up front to ensure your firm’s confidential data (e.g. investment portfolio, investor contact information, etc.) is thoroughly destroyed, preventing unwanted breaches or thefts.
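As a simple illustration of why "hitting the delete key is not enough," one basic software-level approach is to overwrite a file's contents before removing it. The sketch below is illustrative only and is not a certified sanitization method; on SSDs and journaling file systems, wear-leveling and copy-on-write can leave remnants, so physical destruction or certified wiping services remain the stronger options for truly sensitive media:

```python
# A minimal sketch of software-level secure deletion: overwrite a file's
# bytes with random data before unlinking it, so the contents are not
# recoverable from a naive read of the freed disk blocks. Illustrative
# only -- not a substitute for certified media sanitization.
import os

def overwrite_and_delete(path: str, passes: int = 3) -> None:
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # replace contents with random bytes
            f.flush()
            os.fsync(f.fileno())       # force the overwrite to disk
    os.remove(path)                    # only then remove the directory entry
```

A plain `os.remove` (or the delete key) only unlinks the file; the underlying bytes stay on disk until overwritten, which is exactly what forensic recovery tools exploit.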
When most people envision Business Continuity Planning (BCP) and testing, they conjure up images of conference rooms, hardcopy documents, projectors and key personnel. But real-world disruptions rarely unfold so neatly.
In recent memory, many situations have disrupted businesses, whether by natural disaster or human interference. In either case, people need to be able to reestablish essential business functions, communicate, and make decisions as quickly and easily as possible.
Although many organizations do an annual BCP review, the big question is whether they truly test the process, ease of accessibility, and the time it takes an organization/leadership group to go from unsure about the situation to confidently executing a thoughtful game plan.
What can make a considerable difference in terms of functionality and familiarity with the plans and recovery procedures is to practice -- not only verbally in the conference room setting, but also by taking time to troubleshoot and brainstorm to determine what works and what may need a second look. There is a lot that can be learned from being unplugged and “kicked” out of the conference room and asked to assume a role outside of the comfort zone. This can be done simply by taking away some of the accepted norms during a test. The following scenario illustrates issues that arise when the accepted norms are chipped away.
We are excited to debut our newest video that explains why the network powering a cloud service matters and should be evaluated closely.
As background for why we created this video: in today’s interconnected financial world, investment firms have global interests and a global presence, making fully on-premise IT infrastructure a thing of the past. Cloud service providers have a variety of capabilities, each designed to serve a specific set of needs, which makes it crucial for businesses to critically evaluate the network behind a cloud and what it can deliver. Not all clouds are created equal.
Our ECI Link Financial Network is a global private cloud network built for the financial industry. With data centers in the US, UK and Asia, it enables organizations to efficiently leverage a single provider for all their global infrastructure needs.
Now on to the video -- let us show you why ECI Link is THE single converged network built to power today’s buy-side firms' trading operations.
Following is the second part in a two-part guest post from Branden Jones, Global Head of Marketing at Liquid Holdings Group, Inc. based in New York, NY. To read Part One, click here.
In this age of data management—this new state of cross-office functionality—operational models must be able to house, curate, and level-off information sets as they happen. Funds must not only actively manage a growing universe of market data but also tackle performance reporting, risk projections, disaster planning, and partitioned client data.
To successfully, and simultaneously, manage these activities, funds must have a data operational model that supports automation, where it makes sense:
- Continuous processing, as an underlying system
- Consistent normalization, across the board
- Historical, since inception view
- Defensive measures, to protect the operation
Real-time, continuous actions are the new normal in today’s hedge fund reality. Funds are expected to understand, identify, and take advantage of opportunities as they occur. However, from a data standpoint “real-time” is only a point on a larger continuum of activity that occurs when a participant observes or captures a single event in time. Continuous processing is the underlying current that accepts and captures, or rejects data inflows and outflows. As pressures increase from both investors and regulators, managers should rely on continuous, automated services, processes, and technology to support their business, not only as a viewable segment, but constantly, throughout the lifespan of the fund.
Following is the first part in a two-part guest post from Branden Jones, Global Head of Marketing at Liquid Holdings Group, Inc. based in New York, NY.
This is the year for big data. Across industries, firms have unprecedented amounts of both public and private information sets – from user profiles and consumer habits to business outputs and proprietary algorithms. But access to data, or information at large, does not guarantee a valuable yield. Jonathan Shaw, managing editor of Harvard Magazine notes, “The [data] revolution lies in improved statistical and computational methods, not in the exponential growth of storage or even computational capacity.” Data is ubiquitous but not intrinsically valuable – it needs to be smartly processed, not just farmed.
For hedge funds, data processing is the quiet, invisible process that moves through the trade lifecycle—accessed from external entities like exchanges and brokers, modified and adjusted in execution, and at times, frozen in snapshots for an increasingly complex group of investors and regulators. Greater operational credibility and regulatory compliance are required than ever before, with increased scrutiny of the secret buy-side manna that goes along with it.
Smarter data management can be expensive and time-consuming as funds seek to keep up with regulatory, compliance, and transparency requirements while navigating through a sea of market opportunities. Good fund management starts and ends with precise, accurate data management. Truly taking advantage of data, and smarter computational methods, requires not only shedding the skin of outdated models, but categorically understanding a whole new data ecosystem, with new methods of processing, through selective automation and augmented observation. Once that new data ecosystem has been embraced, fund managers can spend their time mastering alpha generation and capital building initiatives.
One of the first questions on the SEC’s cybersecurity questionnaire for financial firms asks firms to "indicate whether they conduct periodic risk assessments to identify cybersecurity threats, vulnerabilities and potential business consequences", and if so, who conducts them and how often. Clearly the goal behind this question is to ensure that firms are taking a proactive approach to security. But what exactly does this assessment entail?
Here’s a quick overview.
The type of risk assessment typically associated with information technology/security is an external vulnerability assessment. Essentially, this is the process of identifying and categorizing vulnerabilities related to a system or infrastructure. Typical steps associated with a vulnerability scan or assessment include:
- Identifying all appropriate systems, networks and infrastructures;
- Scanning networks to assess susceptibility to external hacks and threats;
- Classifying vulnerabilities based on severity; and
- Making tactical recommendations around how to eliminate or remediate threats at all levels.
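The classification step above can be sketched with the standard CVSS v3 severity bands, which many scanning tools use to rate findings. This is a minimal illustration; the finding names and scores below are hypothetical examples, not output from any real scan:

```python
# A minimal sketch of the "classify vulnerabilities by severity" step,
# bucketing findings by the standard CVSS v3.x qualitative rating scale
# (None 0.0, Low 0.1-3.9, Medium 4.0-6.9, High 7.0-8.9, Critical 9.0-10.0).
# Finding names and scores are hypothetical examples.

def cvss_severity(score: float) -> str:
    """Map a CVSS v3.x base score (0.0-10.0) to its qualitative rating."""
    if score == 0.0:
        return "None"
    if score <= 3.9:
        return "Low"
    if score <= 6.9:
        return "Medium"
    if score <= 8.9:
        return "High"
    return "Critical"

# Hypothetical scan findings: name -> CVSS base score.
findings = {
    "exposed-telnet-service": 9.8,
    "self-signed-certificate": 5.3,
    "server-banner-disclosure": 2.6,
}
classified = {name: cvss_severity(score) for name, score in findings.items()}
print(classified)
```

Ranking findings this way lets the final step of the assessment, the remediation recommendations, address Critical and High items first rather than treating every finding with equal urgency.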