Last week we explored what big data is and what its implications are for the hedge fund industry. Diving a little deeper, today we are looking at storage considerations for the endlessly growing volumes of data with which companies must cope.
We create 2.5 quintillion(!) bytes of data every day, so it is not surprising that big data is pushing past the limits of today's storage infrastructure and creating new challenges for companies. NetApp has pinpointed three areas where storage is faltering: complexity, speed and volume.
- Complexity. Data is no longer just text and numbers; it is about real-time events and shared infrastructure. Information is now linked, high fidelity, and made up of multiple data types, so applying standard algorithms for search, storage, and categorization is becoming far more complex and inefficient.
- Speed. How fast is the data coming in? High-definition video, media streamed over the Internet to player devices, and slow-motion surveillance video all have very high ingestion rates. Businesses have to keep pace with this flow to make the information useful and to drive faster business outcomes, or, in the military, to save lives.
- Volume. All collected data must be stored in a location that is secure and always available. With such high volumes of data, IT teams have to decide what counts as "too much data." For example, they might flush all data each week and start over the following week. But for many applications this is not an option, so more data must be stored for longer periods of time, without increasing operational complexity. This is where infrastructure can quickly buckle under sheer volume.
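The volume trade-off above comes down to simple arithmetic: ingestion rate times retention window equals required capacity. A minimal sketch, using purely hypothetical figures (the article gives none), shows how quickly retention decisions translate into terabytes:

```python
# Back-of-envelope storage sizing. All numbers are illustrative
# assumptions, not figures from this article or from NetApp.

def required_capacity_tb(ingest_gb_per_day: float, retention_days: int) -> float:
    """Raw capacity (in TB) needed to retain `retention_days` of data,
    before replication, snapshots, or compression."""
    return ingest_gb_per_day * retention_days / 1024  # 1 TB = 1024 GB

# E.g., a firm ingesting 500 GB/day of market data, retained for ~90 days:
print(round(required_capacity_tb(500, 90), 1))  # ~44 TB of raw capacity
```

Doubling the retention window or the ingestion rate doubles the footprint, which is why "flush weekly" is tempting and why it fails for applications that need historical depth.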
Not surprisingly, NetApp also has a solution to the hurdles big data is creating for companies. Dubbed the 'ABCs' of big data, its approach addresses challenges in three key areas: analytics, bandwidth and content.
- Analytics. Analytics is about gaining insight: taking advantage of the data explosion and turning raw data into high-quality information that enables deeper business insights and better decision-making. To do this, companies should look for storage solutions that improve response times for ad-hoc and real-time queries while also delivering overall storage performance gains.
- Bandwidth. To leverage big data, companies need better performance for very fast workloads and high-bandwidth financial applications. Large financial database applications process and analyze large amounts of data in real time, and executing these intense, real-time processes requires high-bandwidth storage.
- Content. This focuses on the need to provide boundless, secure, scalable data storage. Content solutions must enable the storage of virtually unlimited amounts of data, so that companies can store as much data as necessary and have the ability to find it when they need it.
To hear more about what is driving big data, check out this Forbes interview with NetApp President and CEO Tom Georgens.