With data multiplying in volume and complexity at an exponential rate, outdated data management poses a serious business risk. Understanding the current state of your data storage, and developing cost-effective strategies to cope with future growth, is key to preventing data overload, explains Guy Clapperton.
Data is one of a business's most valuable assets. But with digital data doubling every two years - more than a zettabyte was generated in 2011, says global market intelligence firm IDC - it's hardly surprising that, in a recent IBM CIO survey, only 15 per cent of companies believed their data to be well managed.
What's driving the data explosion?
Increasingly stringent and complex regulation around financial reporting, privacy and security means that businesses are required to hold more information for compliance reasons.
Market dynamics are at work too. Companies create prodigious amounts of transactional data around their operations, and being able to access this quickly, on both personal and business devices and in great depth, gives them a competitive edge. Even everyday interactions between businesses, customers and suppliers produce so-called 'exhaust data'. Business continuity is also a factor, with growing demand for minimal or zero downtime.
Managing the time bomb
Greg Bailey, Head of Storage at BT, believes the first step for businesses concerned about data overload is to establish what information is truly mission-critical and how long it must be stored. The compliance/governance team and/or HR may have a crucial role to play here.
"Each industry sector presents unique data management challenges, as not all data is created equal," explains Bailey. "For example, marine architects are required to store ships' plans for 50 years; television production companies need nine times more storage for High Definition files; pharmaceutical companies need complex systems to track and prove the provenance of their products."
With a more complete understanding of the business data, and how it's created, says Bailey, the next step is to find out whether the right tools are in place to handle it. "Typically, IT staff spend 70 per cent of their time managing information and 30 per cent innovating - it should be the other way around."
Understanding how employees are using the data storage infrastructure, and communicating company policy clearly to ensure that it is being used correctly, is also key.
How to do more with less
Bailey suggests a number of technologies and strategies. An audit of the technologies a business is using is essential: because storage is relatively inexpensive, businesses typically buy more capacity without any data management strategy in place. However, capacity is only part of the picture. It is also important to consider backup, archiving and, most importantly, recovery - being unable to recover data is a serious business risk. Often, applications and data are stored on the same tier of disks regardless of whether they are mission-critical, so identifying what is archivable is key. The next step is to move that data off expensive, fast disk, where it can then be fully policy-managed, indexed and searched.
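The "identify what is archivable" step above can be sketched in a few lines. This is a minimal illustration, not any specific vendor tool: it assumes an age-based policy (files untouched for a year are archive candidates) and simply walks a directory tree checking last-access times. The threshold, and the idea of using access time at all, are assumptions for the sake of the example.

```python
import time
from pathlib import Path

ARCHIVE_AGE_DAYS = 365  # assumed policy: untouched for a year -> archivable


def find_archivable(root, age_days=ARCHIVE_AGE_DAYS):
    """Yield (path, size) for files not accessed within age_days."""
    cutoff = time.time() - age_days * 86400
    for path in Path(root).rglob("*"):
        if path.is_file() and path.stat().st_atime < cutoff:
            yield path, path.stat().st_size
```

A real policy engine would, of course, fold in the compliance rules discussed earlier (retention periods per industry sector) rather than relying on age alone.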
Having established what data a business holds, the next step is to review what storage is allocated to which application, and to ensure that performance-critical applications sit on the fastest and most available storage, says Bailey. This might require drilling down to the level of Input/Output Operations Per Second (IOPS). Such analysis reveals, at a granular level, what should sit where in the storage estate, and on which protocol it should reside.
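The IOPS-driven review described above amounts to a placement decision: match each workload to the cheapest tier that still meets its measured demand. The sketch below assumes hypothetical workload figures and illustrative tier capabilities (the names and numbers are invented for the example), and assigns each application to the slowest tier that can still satisfy its peak IOPS.

```python
# Hypothetical measured workloads: application -> peak IOPS demand.
WORKLOADS = {"orders-db": 12000, "email-archive": 150, "file-share": 800}

# Assumed tier capabilities, fastest first (figures are illustrative only).
TIERS = [("ssd", 50000), ("sas-15k", 5000), ("nearline-sata", 500)]


def place(workloads, tiers):
    """Assign each app to the slowest (cheapest) tier meeting its IOPS need."""
    placement = {}
    for app, iops in workloads.items():
        for name, cap in reversed(tiers):  # iterate cheapest-first
            if cap >= iops:
                placement[app] = name
                break
    return placement
```

Running `place(WORKLOADS, TIERS)` puts the transactional database on SSD while the archive lands on cheap near-line disk - exactly the tiering outcome the audit is meant to justify.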
Assessing multiple, even global, storage systems allows a business to look at duplication, poor LUN utilisation and performance degradation. It can also highlight where charge back can be used to recover storage costs from application or departmental users, or where storage can be reclaimed.
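A chargeback and reclamation pass like the one described can be sketched as a simple report over allocation records. The records, the per-gigabyte rate and the 25 per cent utilisation threshold below are all assumed figures for illustration: departments are billed for allocated (not merely used) capacity, and poorly utilised LUNs are flagged for reclamation.

```python
# Hypothetical allocation records: (department, lun, allocated_gb, used_gb).
ALLOCATIONS = [
    ("finance", "lun01", 500, 120),
    ("marketing", "lun02", 200, 190),
    ("finance", "lun03", 300, 30),
]
COST_PER_GB = 0.10  # assumed monthly rate per allocated GB


def chargeback(allocations, rate, reclaim_threshold=0.25):
    """Bill each department for allocated capacity; flag under-used LUNs."""
    bills, reclaim = {}, []
    for dept, lun, alloc, used in allocations:
        bills[dept] = bills.get(dept, 0.0) + alloc * rate
        if used / alloc < reclaim_threshold:
            reclaim.append(lun)
    return bills, reclaim
```

Billing on allocation rather than usage is a deliberate choice in this sketch: it gives departments an incentive to surrender capacity they requested but never filled.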
Beyond this, says Bailey, which of the many options a business uses for optimising data storage - virtualisation, compression, de-duplication, thin provisioning, storing data in the public, private and hybrid cloud - will depend on its unique needs. At this point, the business would work with its IT provider to create a coherent storage strategy.
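Of the optimisation options listed, de-duplication is the easiest to illustrate: if many data blocks are identical, only one copy of each needs to be kept. The sketch below estimates the potential saving by content-hashing blocks and counting unique ones - a simplified model, since production systems de-duplicate at fixed or variable block boundaries with far more sophistication.

```python
import hashlib


def dedup_ratio(blocks):
    """Return the fraction of blocks that must actually be stored
    after de-duplication (1.0 means no duplicates, so no saving)."""
    if not blocks:
        return 1.0
    unique = {hashlib.sha256(b).hexdigest() for b in blocks}
    return len(unique) / len(blocks)
```

A ratio of 0.5 on a sample, for instance, suggests that half the raw capacity could be reclaimed - the kind of figure that would feed into the storage strategy drawn up with the IT provider.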
With a trusted strategic partner, consolidation can deliver not only cost efficiencies and greater productivity but also greater business agility, with the capability to respond more swiftly to the challenges and opportunities of the marketplace. Over the next decade, says IDC, the number of servers and storage systems will grow by a factor of 75; the IT professionals needed to manage them will grow by a factor of just 1.5. Scalability will be crucial, says Bailey. Businesses need to think about their storage needs over a three- to five-year horizon and to work with a trusted partner to develop and maintain efficient, cost-effective systems.
Guy Clapperton is a freelance business and technology journalist.